No, Integer and String are different types. To convert an integer to a String, use String.valueOf(integer), or Integer.toString(integer) for the primitive, or integer.toString() if it's an Integer object.
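A minimal sketch of all three (variable names are just placeholders):
int primitive = 42;
Integer boxed = 42;
String s1 = String.valueOf(primitive);   // "42"
String s2 = Integer.toString(primitive); // "42"
String s3 = boxed.toString();            // "42"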
No. Every object can be cast to a java.lang.Object, not to a String. If you want a string representation of an object, you have to invoke its toString() method; this is not the same as casting the object to a String.
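A small sketch (names are made up) of that difference:
Integer number = 42;
Object o = number;               // fine: every object is an Object
String s = number.toString();    // "42" - a string representation, not a cast
// String bad = (String) number; // compile error: an Integer can never be a String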
You can't explicitly cast anything to a String that isn't already a String. You should use either:
"" + myInt;
or:
Integer.toString(myInt);
or:
String.valueOf(myInt);
I prefer the second form, but I think it's a personal choice.
Edit: OK, here's why I prefer the second form. The first form, when compiled, may instantiate a StringBuffer (in Java 1.4) or a StringBuilder (in 1.5); one more thing to be garbage collected. As far as I could tell, the compiler doesn't optimise this away. The second form also has an analogue, Integer.toString(myInt, radix), that lets you specify whether you want hex, octal, etc. If you want to be consistent in your code (purely aesthetically, I guess), the second form can be used in more places.
Edit 2: I assumed you meant that your integer was an int rather than an Integer. If it's already an Integer, just call toString() on it and be done.
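For instance, a quick sketch of the radix overload and the instance method:
int myInt = 255;
String hex = Integer.toString(myInt, 16); // "ff"
String oct = Integer.toString(myInt, 8);  // "377"

Integer boxed = 255;
String dec = boxed.toString();            // "255"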
Casting is different from converting in Java, to use informal terminology.
Casting an object means that the object already is what you're casting it to, and you're just telling the compiler about it. For instance, if I have a Foo reference that I know is a FooSubclass instance, then (FooSubclass)foo tells the compiler, "don't change the instance, just know that it's actually a FooSubclass."
On the other hand, an Integer is not a String, although (as you point out) there are methods for getting a String that represents an Integer. Since no instance of Integer can ever be a String, you can't cast an Integer to a String.
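As a small sketch with the Foo/FooSubclass names from above (hypothetical classes):
class Foo {}
class FooSubclass extends Foo {}

Foo foo = new FooSubclass();
FooSubclass sub = (FooSubclass) foo; // same instance, the compiler just learns its real type
// String s = (String) foo;          // compile error: a Foo can never be a String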
In your case you don't need a cast, you need to call toString():
Integer i = 33;
String s = i.toString();
//or
s = String.valueOf(i);
//or
s = "" + i;
Casting. How does it work?
Given:
class A {}
class B extends A {}
(A)
 |
(B)
B b = new B(); //no cast
A a = b; //upcast with no explicit cast
a = (A)b; //upcast with an explicit cast
b = (B)a; //downcast
A and B are in the same inheritance tree, so we can also do this:
a = new A();
b = (B)a; // again downcast. Compiles but fails later, at runtime: java.lang.ClassCastException
The compiler must allow things that might possibly work at runtime. However, if the compiler knows with 100% certainty that the cast couldn't possibly work, compilation will fail.
Given:
class A {}
class B1 extends A {}
class B2 extends A {}
    (A)
   /   \
(B1)   (B2)
B1 b1 = new B1();
B2 b2 = (B2)b1; // B1 can't ever be a B2
Error: Inconvertible types B1 and B2.
The compiler knows with 100% certainty that the cast couldn't possibly work. But you can cheat the compiler:
B2 b2 = (B2)(A)b1;
but it still fails at runtime:
Exception in thread "main" java.lang.ClassCastException: B1 cannot be cast to B2
In your case:
     (Object)
     /      \
(Integer)  (String)
Integer i = 33;
//String s = (String)i; - compiler error
String s = (String)(Object)i;
at runtime: Exception in thread "main" java.lang.ClassCastException: java.lang.Integer cannot be cast to java.lang.String