JavaScript oddities

JavaScript is one of the most widely used languages, and thanks to Node.js it is increasingly common as a server-side language too. Despite this, the language was originally put together in ten days by Brendan Eich in 1995. JavaScript has since been standardised and heavily extended; however, like most languages, it has its share of quirks and oddities...

parseInt and null

The parseInt function parses a string and returns an integer:

>> parseInt('3')
3

It's normally best practice to specify a radix (the base in mathematical numeral systems) when parsing integers:

>> parseInt('3', 10)
3

This is very useful if you need to deal with non-decimal systems like hexadecimal:

>> parseInt('ff', 16)
255

The first parameter to parseInt should be a string; however, because JavaScript is weakly typed, passing a non-string value won't raise an error. Instead the value is converted to a string and parsed:

>> parseInt(false, 10)
NaN

This can lead to some odd behaviour:

>> parseInt(null, 24)
23

So what's going on? Well, null is first converted to the string "null". Once converted, the first character (n) is parsed as 23; however the next character, u, along with the remainder of the string, is discarded, because u is not a valid digit in base 24.

Trying to parse false also produces similar behaviour:

>> parseInt(false, 16)
250
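If you want to avoid these surprises, one option is to validate the input yourself before handing it to parseInt. Here's a minimal sketch (parseIntStrict is a hypothetical helper, not a built-in):

```javascript
// Sketch: a stricter integer parser that rejects inputs parseInt would
// silently mangle. parseIntStrict is an illustrative name, not a standard API.
function parseIntStrict(value, radix = 10) {
  if (typeof value !== 'string') {
    throw new TypeError('parseIntStrict expects a string');
  }
  const result = parseInt(value, radix);
  // Re-serialise and compare to catch partially parsed input like '3x'
  if (Number.isNaN(result) || result.toString(radix) !== value.toLowerCase().trim()) {
    throw new RangeError(`'${value}' is not a valid base-${radix} integer`);
  }
  return result;
}

parseIntStrict('ff', 16);    // 255
// parseIntStrict(null, 24); // TypeError instead of the surprising 23
// parseIntStrict('3x', 10); // RangeError instead of silently returning 3
```

The round-trip check is deliberately strict: anything parseInt would only partially consume (or coerce from a non-string) is rejected instead of producing a half-parsed number.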

Lexicographic sorting

The sort method in JavaScript can be used to arrange elements of an array in place:

>> ['z', 'a', 'm'].sort()
Array [ "a", "m", "z" ]

However, unlike languages such as Python, JavaScript sorts elements lexicographically by default: each element is converted to a string and the strings are compared. This can produce some unexpected results with numbers:

>> [1,3,11].sort()
Array [ 1, 11, 3 ]

You can easily override this behaviour using a comparison function:

>> [1,3,11].sort(function(a, b){return a - b;})
Array [ 1, 3, 11 ]
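The default order is essentially what you get by comparing the String() form of each element. A rough sketch of an equivalent comparator (lexicographic is just an illustrative name; this ignores details like undefined holes):

```javascript
// Sketch: the default sort order roughly corresponds to comparing the
// String() representations of the elements.
const lexicographic = (a, b) => {
  const sa = String(a);
  const sb = String(b);
  return sa < sb ? -1 : sa > sb ? 1 : 0;
};

[1, 3, 11].sort(lexicographic);   // [1, 11, 3] -- same as the default
[1, 3, 11].sort((a, b) => a - b); // [1, 3, 11] -- numeric comparison
```

Since "11" sorts before "3" as a string, the numeric comparator is the only way to get the order you probably wanted.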

Large numbers

Unlike many other languages, JavaScript has only one numeric type: a 64-bit double-precision floating-point number (IEEE 754). There is no dedicated type for integers! For most use cases this is fine; however, it can result in some unexpected behaviour. For example, long numbers lose precision:

>> console.log(111111111111111111)
111111111111111100

The Number.MAX_SAFE_INTEGER constant can be used to find the largest integer you can safely work with:

>> Number.MAX_SAFE_INTEGER
9007199254740991

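Alongside the constant, Number.isSafeInteger can be used to check whether a particular value is still exactly representable; for example:

```javascript
// Number.isSafeInteger reports whether a value is an integer that a
// 64-bit double can represent exactly, i.e. within +/-(2^53 - 1).
console.log(Number.isSafeInteger(9007199254740991)); // true  (2^53 - 1)
console.log(Number.isSafeInteger(9007199254740992)); // false (2^53)

// Beyond the safe range, distinct integers collapse onto the same double
console.log(9007199254740992 === 9007199254740993); // true
```

The last line is the real danger: past the safe range, comparisons between different integers can silently succeed.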
Addition and concatenation

In JavaScript the + operator only really works on numbers and strings. If you try to add a value of any other type, it is first converted to a primitive. For example, if you try to add two arrays you end up with an empty string:

>> [] + []
""

This happens because each array is first converted to a string, then the two strings are concatenated:

>> String([])
""
>> String([]) + String([])
""
>> ['hello'] + ['world']
"helloworld"

An empty object, when converted to a string, ends up as [object Object], and adding it to an array has a fairly predictable result:

>> String({})
"[object Object]"
>> [] + {} // effectively "" + "[object Object]"
"[object Object]"

Addition is normally commutative, however something strange happens if you swap the two operands above:

>> {} + []
0

So what's actually going on? Well, the code above is not performing an addition at all. You can see this by using the Object constructor directly, which doesn't show the same behaviour:

>> new Object() + []
"[object Object]"

Instead {} is being interpreted as an empty code block and ignored. This leaves + [], where the plus is actually the unary plus operator. This operator attempts to convert its operand ([]) into a number, which for an empty array results in 0:

>> + []
0

You can also see similar behaviour if you try to evaluate {} + {}:

>> {} + {}
NaN
>> +{}
NaN
>> Number({})
NaN

Or if you try to use + twice:

>> "foo" + + "bar" // effectively "foo" + (+"bar")
"fooNaN"
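All of the conversions above follow the same rule: when an object meets the + operator it is converted to a primitive, trying valueOf first and falling back to toString. A small sketch (withValueOf is just an illustrative name):

```javascript
// For the + operator, objects are converted to a primitive: valueOf is
// tried first, and toString is the fallback if valueOf returns an object.
const withValueOf = {
  valueOf() { return 42; },
  toString() { return 'ignored'; }
};
console.log(withValueOf + 1); // 43 -- valueOf wins

// Plain arrays and objects inherit a valueOf that returns the object itself,
// so the engine falls back to toString -- which is why [] + [] is "".
console.log(String([])); // ""
console.log(String({})); // "[object Object]"
```

Defining valueOf is how objects like Date opt into sensible numeric behaviour; plain arrays and objects never do, so + always falls through to their string forms.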

Note: I originally came across this oddity watching a lightning talk called "Wat" by Gary Bernhardt. It's well worth watching if you've not seen it before.

typeof null

JavaScript has both null and undefined. Running typeof against undefined behaves as expected:

>> typeof(undefined)
"undefined"

However null behaves a little bit differently:

>> typeof(null)
"object"

This is actually a holdover from the first implementation of JavaScript, in which values were represented as a type tag plus a value. The type tag for objects was 0, and null was represented by the NULL pointer, so null's type tag also read as an object. This oddity is documented in the MDN web docs for typeof.

Interestingly, a fix for this behaviour was proposed, but it never made it into JavaScript because it broke backwards compatibility with existing websites.
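Because of this quirk, typeof can't be used to distinguish null from a genuine object; comparing against null directly is the reliable check. For example:

```javascript
// typeof cannot tell null apart from a real object...
console.log(typeof null); // "object"
console.log(typeof {});   // "object"

// ...so compare against null directly with strict equality
console.log({} === null);   // false
console.log(null === null); // true

// Loose equality treats null and undefined as equal (and nothing else),
// which is why x == null is a common "null or undefined" check
console.log(null == undefined); // true
console.log(0 == null);         // false
```

The strict check is unambiguous; the loose x == null idiom is a deliberate use of coercion, so it's worth a comment whenever you reach for it.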