Last post Oct 23, 2014 06:45 AM by DMW
Oct 20, 2014 06:52 AM | ts-alan
As written in MSDN:

If you want a numeric real literal to be treated as decimal, use the suffix m or M. Without the suffix m, the number is treated as a double and generates a compiler error.

I have a question: why is this necessary? Why does decimal have the suffix M? What purpose were the compiler developers pursuing?
Oct 20, 2014 07:22 AM | PatriceSc
Because "without the suffix m, the number is treated as a double". Isn't that a sufficient reason?

If you meant "why is the default double rather than decimal": this is likely to minimize changes compared to C-like languages. Introducing a different default would make things more confusing when porting across languages, or when a developer uses multiple languages, for likely no benefit at all.

If you meant "why is a type needed at all": exactly as variables are typed, a literal value needs a type as well, to ensure type safety; you can't just create an "untyped" numeric value.

Otherwise, please be explicit about why you think it shouldn't be necessary, because from my point of view the primary reason is given as part of the question.
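The point about typed literals is easy to see directly in C#. Here is a minimal sketch (the class and variable names are my own); every literal carries a type, chosen by its suffix, and the compiler rejects an unsuffixed real literal where a decimal is expected:

```csharp
using System;

class LiteralTypes
{
    static void Main()
    {
        // Every numeric literal has a type of its own:
        Console.WriteLine(1.0.GetType());   // System.Double  (default for real literals)
        Console.WriteLine(1.0f.GetType());  // System.Single  (f suffix)
        Console.WriteLine(1.0m.GetType());  // System.Decimal (m suffix)

        // Without a suffix, a real literal is a double, and there is no
        // implicit double -> decimal conversion, so this would not compile:
        // decimal price = 19.99;          // error CS0664
        decimal price = 19.99m;            // OK: the literal itself is a decimal
        Console.WriteLine(price.GetType()); // System.Decimal
    }
}
```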
Oct 20, 2014 12:04 PM | gerrylowry
@ts-alan, welcome to the forums.

My best guess is that you're new to programming in general.
If you're a fan of "The Big Bang Theory", you might agree that Sheldon would likely be pleased if I start with an analogy related to trains. Imagine a single track, two oncoming trains, and a siding. At either end of the siding is a mechanical switch that allows a train to move onto the alternate railway track.
[West] TrainA ===========================  TrainB [East]
                 w \\_________________// e
TrainA is eastbound; TrainB is westbound. If switch w or switch e is not thrown at the appropriate time, the trains will have a head-on collision.

The switches w and e are like binary switches in a computer. Like those switches, computers are not yet very smart, and they are definitely not psychic. For that reason we need ways of communicating information and directions to a computer. We code our instructions in human-readable source code that is translated into binary information that causes the logic switches in a computer to function appropriately.
ts-alan, a key point here is that inside the computer there is no internal way of knowing what is on the main track or the siding ... TrainA could have been a freight train and TrainB a passenger train, and the siding would never have known. Inside a modern digital computer, it's all binary storage. One program might use an area of storage for instructions, another program might use the same area of storage for storing poems, yet another program might use that same area again for the first 10 prime numbers, et cetera. It is important that programmers communicate what type of data is being used.
http://msdn.microsoft.com/en-us/library/ya5y69ds.aspx "Built-In Types Table (C# Reference)"
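To make the "it's all binary storage" point concrete, here is a small sketch (the byte values are my own example) showing the same four bytes of raw storage read as two entirely different values, depending only on the type we declare them to be:

```csharp
using System;

class SameBitsDifferentTypes
{
    static void Main()
    {
        // Four bytes of raw storage; the bytes alone don't say what they mean.
        byte[] raw = { 0x00, 0x00, 0x80, 0x3F };

        // The declared type tells the runtime how to interpret them:
        Console.WriteLine(BitConverter.ToSingle(raw, 0)); // 1 (as an IEEE-754 float)
        Console.WriteLine(BitConverter.ToInt32(raw, 0));  // 1065353216 (as a 32-bit int)
    }
}
```

(This assumes a little-endian machine, which is what BitConverter expects on typical x86/x64 hardware.)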
ts-alan, this answer may surprise you: the 'm' or 'M' suffix is not always necessary.
System.Decimal decimalNumberA = 1;
Behind the scenes, this is the IL code produced by the translation:
IL_0002: newobj System.Decimal..ctor
IL_0007: stloc.0 // decimalNumberA
IL_0008: ldloc.0 // decimalNumberA
IL_0009: box System.Decimal
IL_000E: call System.Object.GetType
IL_0013: call System.Console.WriteLine
adding the 'm' produces identical results:
System.Decimal decimalNumberA = 1m; // m suffix on the literal
Often, even when the suffix is unnecessary for the compiler, it serves as a visual aid for the humans who read the code.
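The two declarations can be tried side by side; a small sketch (variable names are my own) confirming that both produce a System.Decimal and the same value:

```csharp
using System;

class SuffixOptionalHere
{
    static void Main()
    {
        // An int literal converts implicitly to decimal, so no suffix is needed...
        decimal decimalNumberA = 1;
        // ...and the suffixed form compiles to the same thing:
        decimal decimalNumberB = 1m;

        Console.WriteLine(decimalNumberA.GetType()); // System.Decimal
        Console.WriteLine(decimalNumberB.GetType()); // System.Decimal
        Console.WriteLine(decimalNumberA == decimalNumberB); // True
    }
}
```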
However, because of some internal conversions, there are times when the suffix is necessary. Compare:
Decimal d1 = (Decimal)1000000000000.345; // loss of precision will occur
Decimal d2 = 1000000000000.345m;
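A runnable sketch of that comparison. Note that the exact value printed for d1 depends on the double-to-decimal conversion, which keeps at most 15 significant digits, so the two values end up unequal:

```csharp
using System;

class SuffixMatters
{
    static void Main()
    {
        // The right-hand side is a double literal first, then cast to decimal,
        // so precision that double cannot hold is already gone:
        decimal d1 = (decimal)1000000000000.345;

        // Here the literal is a decimal from the start; nothing is lost:
        decimal d2 = 1000000000000.345m;

        Console.WriteLine(d1);
        Console.WriteLine(d2);       // 1000000000000.345
        Console.WriteLine(d1 == d2); // False
    }
}
```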
Bottom line: even when it is not required to make the compiler happy, it's probably a good idea to use the suffix 'm' or 'M' as a reminder to human readers of your code that your literal is intended to be System.Decimal.
Oct 23, 2014 06:45 AM | DMW
Why does decimal have the suffix M?
Apocryphally, "M" is used because it stands for Money, and decimal is the appropriate type to use for money because of its precision.
Of course, if/when the US national debt increases beyond the level that a decimal can hold, I guess Microsoft will have to introduce a new type, probably with the suffix "emm", for "even more money".