If I type 019 > 020 in the JavaScript console (tested in both Chrome and Firefox), I get the answer true.

This is due to 020 being interpreted as an OctalIntegerLiteral (equals 16) whereas 019 is apparently being interpreted as a DecimalLiteral (and equals 19). As 19 is greater than 16, 019 > 020 is true.
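For reference, this is what I observe in the console (in non-strict code; I assume strict mode may reject these literals altogether):

019;        // 19
020;        // 16
019 > 020;  // true, since 19 > 16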
What puzzles me is why 019 is interpreted as a DecimalLiteral in the first place. Which production is it? DecimalIntegerLiteral does not allow 019:
DecimalIntegerLiteral ::
    0
    NonZeroDigit DecimalDigits_opt
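To double-check, here is my own rough translation of that production into a regular expression (just an approximation, not the spec's grammar):

console.log(/^(0|[1-9][0-9]*)$/.test("019")); // false - neither "0" nor NonZeroDigit DecimalDigits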
OctalIntegerLiteral also does not allow 019 (as 9 is not an octal digit):
OctalIntegerLiteral ::
    0 OctalDigit
    OctalIntegerLiteral OctalDigit

OctalDigit :: one of
    0 1 2 3 4 5 6 7
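The same kind of rough check for OctalIntegerLiteral (again my own approximation, not taken from the spec):

console.log(/^0[0-7]+$/.test("020")); // true  - 0 followed by octal digits
console.log(/^0[0-7]+$/.test("019")); // false - 9 is not an octal digit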
So from what I see in the specification, 019 should actually be rejected; I don't see why it is interpreted as a decimal integer.

I guess there's some kind of compatibility rule in place here, but I have failed to find a formal definition. Could anyone please help me with this?
(Why I need this: I'm developing a JavaScript/ECMAScript parser for Java with JavaCC and have to pay special attention to the specification - and to deviations from it.)