
C#/.NET: Difference between int and Int32

Question by Guest | 2016-09-27 at 19:40

There seem to be two different ways to declare an integer in C#. You can either use int or you can use Int32:

int i = 1;
Int32 i = 1;

Internally, both types are supposedly completely identical, because "int" seems to be defined as "System.Int32".

Can someone explain to me why these two forms exist and which of them I should ideally use?

Best Answer

Indeed, int and Int32 are internally the same: "int" is an alias for "System.Int32", so both constructs lead to the same code after compilation, and there is no difference in performance either.
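You can easily see this for yourself. A minimal sketch (variable names are mine; it assumes a using System; directive at the top of the file):

int a = 1;
Int32 b = a;  // no cast or conversion needed: both are the same type

// Both report the identical runtime type:
Console.WriteLine(a.GetType()); // System.Int32
Console.WriteLine(b.GetType()); // System.Int32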

Customarily, you should use the alias in everyday code, that is, the lowercase variant (which is also faster to type):

int i = 1;

I would only use Int32 in cases in which it is explicitly necessary to have a 32-bit integer. If the exact integer type does not play a role (for example, when an integer is only needed for a small loop between 0 and 5 or something like that), I would always use the default integer type "int".
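For instance, a minimal counting loop in which the bit width of the counter is completely irrelevant:

for (int i = 0; i < 5; i++)
{
    Console.WriteLine(i); // the exact width of i plays no role here
}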

Handling it like that also puts you on the safe side for the future. Of course, it is conceivable that at some point the "default" type will be changed and "int" will suddenly be defined as System.Int64 internally. If you have always declared your integers explicitly as 32 bit in the cases in which that is absolutely necessary, your code will not cause any problems after such a change.
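A minimal sketch of such a case, assuming a hypothetical binary file "data.bin" whose format stores a length field in exactly 32 bits. Since the width is part of the file format here, declaring the variable as Int32 documents that intent:

using System;
using System.IO;

using (var reader = new BinaryReader(File.OpenRead("data.bin")))
{
    Int32 length = reader.ReadInt32(); // always reads exactly 4 bytes
    Console.WriteLine(length);
}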

By the way, besides "int", there are also other integer aliases in C# you can use:

Type      Alias for
short     System.Int16
ushort    System.UInt16
int       System.Int32
uint      System.UInt32
long      System.Int64
ulong     System.UInt64

Even types like string (= System.String), bool (= System.Boolean), char (= System.Char) or object (= System.Object) are defined in a similar way.
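All of these mappings can be verified with typeof; a small sketch in which every comparison prints True:

using System;

Console.WriteLine(typeof(short)  == typeof(Int16));   // True
Console.WriteLine(typeof(ushort) == typeof(UInt16));  // True
Console.WriteLine(typeof(int)    == typeof(Int32));   // True
Console.WriteLine(typeof(uint)   == typeof(UInt32));  // True
Console.WriteLine(typeof(long)   == typeof(Int64));   // True
Console.WriteLine(typeof(ulong)  == typeof(UInt64));  // True
Console.WriteLine(typeof(string) == typeof(String));  // True
Console.WriteLine(typeof(bool)   == typeof(Boolean)); // True
Console.WriteLine(typeof(char)   == typeof(Char));    // True
Console.WriteLine(typeof(object) == typeof(Object));  // True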

By the way, when declaring a string, I would recommend using the lowercase alias in all cases; when declaring an integer, it depends on whether the bit size is important or not, as I said. Keep in mind that the assumption that short is a 16-bit integer, int a 32-bit integer and long a 64-bit integer can vary from language to language, which is why these names are traditionally a source of confusion. Especially when using "exotic" integer sizes (when they are used for a special reason), an exact declaration is therefore all the more important.
2016-09-28 at 18:03


