[C#] What is the difference between int, Int16, Int32 and Int64?


Question

What is the difference between int, System.Int16, System.Int32 and System.Int64 other than their sizes?


Answers

The only real difference here is the size. All of the types here are signed integers; they differ only in how many bytes they occupy (and hence in the range of values they can represent):

  • Int16: 2 bytes
  • Int32 and int: 4 bytes
  • Int64: 8 bytes
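A quick sketch to confirm the sizes and resulting ranges (the class name is illustrative; sizeof and the MinValue/MaxValue constants are standard C#):

    using System;

    class SizeDemo
    {
        static void Main()
        {
            Console.WriteLine(sizeof(short)); // 2 -- short is an alias for Int16
            Console.WriteLine(sizeof(int));   // 4 -- int is an alias for Int32
            Console.WriteLine(sizeof(long));  // 8 -- long is an alias for Int64

            // The size determines the representable range.
            Console.WriteLine($"{Int16.MinValue} .. {Int16.MaxValue}"); // -32768 .. 32767
            Console.WriteLine($"{Int32.MinValue} .. {Int32.MaxValue}"); // -2147483648 .. 2147483647
            Console.WriteLine($"{Int64.MinValue} .. {Int64.MaxValue}");
        }
    }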

There is one small difference between Int64 and the rest: on a 32-bit platform, assignments to an Int64 storage location are not guaranteed to be atomic, whereas atomicity is guaranteed for all of the other types.
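A minimal, illustrative sketch of how to get atomic 64-bit access even in a 32-bit process, using the standard Interlocked class (the field and method names here are made up for the example):

    using System.Threading;

    class AtomicityDemo
    {
        // A plain read or write of this 64-bit field may tear in a 32-bit
        // process: another thread could observe half of a written value.
        static long _value;

        static void Writer()
        {
            // Interlocked.Exchange performs an atomic 64-bit write.
            Interlocked.Exchange(ref _value, long.MaxValue);
        }

        static long Reader()
        {
            // Interlocked.Read performs an atomic 64-bit read.
            return Interlocked.Read(ref _value);
        }
    }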





Int = Int32 --> what the original 16-bit compilers called long

Int16 --> the original 16-bit int

Int64 --> the data type that became available with 64-bit systems

In C#, "int" is essentially kept for familiarity and backward compatibility. Using the explicitly sized type names (Int16, Int32, Int64) can make our programs more precise about the width we intend.

---------------

One more thing I noticed along the way: there is no class named Int analogous to Int16, Int32 and Int64. All the helpful functions like TryParse for integers come from Int32 (and because int is an alias for Int32, int.TryParse is the same method).
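A small sketch showing that the two spellings bind to the same method (the class name is illustrative):

    using System;

    class ParseDemo
    {
        static void Main()
        {
            // int is an alias for System.Int32, so these two calls
            // resolve to exactly the same TryParse method.
            bool ok1 = int.TryParse("123", out int a);
            bool ok2 = Int32.TryParse("123", out int b);
            Console.WriteLine($"{ok1} {a} {ok2} {b}"); // True 123 True 123

            // Int16 and Int64 expose their own TryParse as well.
            Int16.TryParse("42", out short s);
            Int64.TryParse("42", out long l);
        }
    }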




According to Jeffrey Richter (one of the contributors to the development of the .NET Framework) in his book 'CLR via C#':

int is a primitive type allowed by the C# compiler, whereas Int32 is the Framework Class Library type (available across languages that abide by the CLS). In fact, int translates to Int32 during compilation.
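A minimal sketch that verifies this at runtime (the class name is illustrative):

    using System;

    class AliasDemo
    {
        static void Main()
        {
            // The keyword int and System.Int32 name the same type;
            // the compiler emits System.Int32 in both cases.
            Console.WriteLine(typeof(int) == typeof(Int32)); // True
            Console.WriteLine(typeof(int).FullName);         // System.Int32
        }
    }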

Also,

In C#, long maps to System.Int64, but in a different programming language, long could map to Int16 or Int32. In fact, C++/CLI does treat long as Int32.

In fact, most (.NET) languages won't even treat long as a keyword and won't compile code that uses it.

I have seen this author, and much of the standard literature on .NET, prefer the FCL types (e.g., Int32) to the language-specific primitive names (e.g., int), mainly because of such interoperability concerns.




Nothing. The sole difference between the types is their size (and, hence, the range of values they can represent).




EDIT: This isn't quite true for C#, a tag I missed when I answered this question. If there is a more C#-specific answer, please vote for that instead!


They all represent integer numbers of varying sizes.

However, there's one very tiny difference.

Int16, Int32 and Int64 all have a fixed size.

The size of an int depends on the architecture you are compiling for. The C spec only requires an int to be at least as large as a short; in practice it's usually the word width of the processor you're targeting, which is probably 32 bits, but you should be aware that it might not be.
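For contrast, in C# (the tag of this question) int is always exactly 32 bits on every platform; only the pointer-sized types track the process bitness. A minimal sketch (the class name is illustrative):

    using System;

    class FixedWidthDemo
    {
        static void Main()
        {
            // In C#, int is always exactly 4 bytes, on every platform.
            Console.WriteLine(sizeof(int)); // 4

            // Only the pointer-sized types follow the process bitness.
            Console.WriteLine(IntPtr.Size); // 4 in a 32-bit process, 8 in a 64-bit one
        }
    }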