parseInt now passes all of the tests in 15.1.2.2-1 (except that it
still has the .length property, which is a different bug), so I'll
close the bug.
Still possibly at issue is whether we conform to the ECMA language
about decimal numbers with too many digits to represent exactly in a
double.  I treat decimal digits after the 20th as zero, but there
could be some floating-point rounding wackiness going on.  In
particular: are we doing the right thing for numbers that are powers
of 2 larger than 2^54, but still exactly representable in a double?
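One way to poke at this from the shell (a sketch; the expected
results assume IEEE-754 doubles and the digits-after-the-20th-are-zero
rule described above):

    // Digits after the 20th are treated as zero, so these two should
    // parse to the same double:
    print(parseInt("12345678901234567890999") ==
          parseInt("12345678901234567890000"));        // expect true

    // 2^70 is a power of two bigger than 2^54 but exactly
    // representable in a double; zeroing its 21st and 22nd decimal
    // digits changes it by far less than one ulp, so it should still
    // round to exactly 2^70:
    print(parseInt("1180591620717411303424") == Math.pow(2, 70));  // true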
Array.prototype.sort(comparefn) was casting the result of the compare
to an int, which broke when the compare function returned (ECMA-valid)
unusual double values, e.g. fractions that truncate to zero.  These
now get clamped to -1, 0, or 1.
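For example (a sketch; a comparefn may legally return any double):

    // With the old int cast, return values of +/-0.1 and +/-0.2
    // truncated to 0, so every pair compared as equal and the array
    // could come back unsorted.  Clamping to -1/0/1 keeps the sign:
    var a = [0.3, 0.1, 0.2];
    a.sort(function (x, y) { return x - y; });
    print(a);   // expect 0.1,0.2,0.3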
Added support in the JavaScript shell for the Unix #! script hack: if
the first line read by the shell (from a file, not interactively)
starts with #, the line is treated as a comment.
This should make #!/usr/bin/js work...
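For instance, a minimal two-line script (assuming the shell binary is
installed at /usr/bin/js and the file is marked executable):

    #!/usr/bin/js
    print("hello from a js script");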
Fixed a case where 16-bit character strings were being passed to a
printf-style char format; added a js_DeflateString call.  Thanks to
gcc 2.8.1 for catching this - it complained about "char format,
different type arg (arg 4)" - which means that it looked at the
(printf-style) format string and checked the argument's type against
it.  Wow.
We now get a localized string from the OS for toLocaleString.  The
time struct used to interface to the OS time-formatting functions
only takes a 16-bit year, so we map to an equivalent year (for
getting the timezone string) or clamp for years outside that range.
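A quick smoke test for the out-of-range case (a sketch; the exact
output is locale- and OS-dependent):

    // Year 275760 is near the top of the Date range and well outside
    // a 16-bit year field; the timezone lookup should fall back to an
    // equivalent in-range year instead of overflowing:
    var d = new Date(275760, 0, 1);
    print(d.toLocaleString());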
reserved words, and changed the versioning check that previously
applied to 'export' to accept any 'ecma' version... which means that
export becomes a keyword for the default version. Does this mean
we'll need to unreserve all the Java keywords?  Not sure we want to do
that...
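Concretely, at the default version something like this should now be
a syntax error rather than parse as an ordinary identifier (a
sketch):

    var export = 5;    // 'export' is now a reserved word here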
Fixed the error message given when a reserved word is used as an
identifier; it was only printing the first character of the
identifier, because it expected 8-bit chars but was being called with
a 16-bit representation of the offending keyword.