If I run
var num = 23; var n = num.toString(); console.log(n)
it logs 23 as expected, but if I apply toString() directly to a number literal, like
var n = 15.toString(); console.log(n)
it throws an error:
Uncaught SyntaxError: Invalid or unexpected token.
I noticed it also works fine for decimal values stored in the num variable (like .3, .99, 2.12, 99.1, etc.). Could someone please help me understand the difference and how this function works?
We see this as an integer with a function being called on it.
The parser doesn't. The parser sees an attempt to declare a floating-point literal: in the ECMAScript grammar, a decimal literal can be an integer part followed by a dot and optional fractional digits. So when it reads 23., it assumes you are writing a floating-point literal, and the toString that follows the dot is an unexpected token.
If you really want to call toString() on the literal 23, the way to do it is to isolate the literal so the parser finishes reading the number first:
(23).toString(); //interprets 23 as literal
23..toString(); //interprets 23. as literal
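As a sketch, here are a few equivalent ways to end the numeric literal before the property access begins (all produce the same result):

```javascript
// Several ways to let the parser finish the numeric literal
// before the dot is read as a property access. All four call
// Number.prototype.toString on the value 23.
const a = (23).toString();  // parentheses end the literal
const b = 23..toString();   // first dot is the decimal point, second is property access
const c = 23 .toString();   // a space also ends the literal
const d = 23.0.toString();  // an explicit fractional part works too

console.log(a, b, c, d); // "23" "23" "23" "23"
```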
JavaScript coerces freely between numbers and strings, which is why this works:
var foo = "The answer is " + 42; // "The answer is 42"
So does this:
var bar = "39" - 0 + 3; // 42 (subtraction forces "39" to a number)
var baz = "39" + 3; // "393"!! (+ with a string operand concatenates)
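To avoid those coercion surprises, a minimal sketch of converting explicitly before doing arithmetic or concatenation:

```javascript
// Explicit conversions make the intent unambiguous,
// instead of relying on how + and - coerce their operands.
const n = Number("39") + 3; // 42: convert to a number first, then add
const s = String(39) + 3;   // "393": concatenation, but now intentional

console.log(n, s); // 42 "393"
```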
However you can do this...
var n = (15).toString(); console.log(n);
... and it will work.
Thanks @apsillers for the explanation. I didn't know that: the first dot on a number is treated as part of the number, hence the problem.
I can't explain why, but if you do 1.1.toString() it works. Interesting.
Also, another way to cast to a string:
23 + '';
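A short sketch of this and a couple of other common number-to-string conversions:

```javascript
// Three ways to turn a number into a string.
const viaConcat = 23 + '';     // implicit coercion: + with a string concatenates
const viaString = String(23);  // explicit conversion
const viaTemplate = `${23}`;   // template literal interpolation

console.log(viaConcat, viaString, viaTemplate); // "23" "23" "23"
console.log(typeof viaConcat); // "string"
```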
When you store 23 into num, the value is assigned first, so calling num.toString() is unambiguous.
When you call 23.toString() directly, the parser thinks it is 23(decimal point) followed by some word toString, which doesn't make sense.
So what you have to do is add another decimal point afterward to let it know the number is 23.0.
What you get then is 23.(invisibleZeroHere).toString(), a.k.a. 23..toString().
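A small sketch demonstrating this: the first dot ends the literal, so the second dot is an ordinary property access (toString also accepts an optional radix, shown here as an aside):

```javascript
// "23." is a complete literal with an empty fraction, equal to 23.
console.log(23. === 23);     // true
console.log(23..toString()); // "23" (second dot is the property access)
console.log(23..toString(2)) // "10111" (toString with radix 2 gives binary)
```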