Yes, they are. They are actually hash tables internally, so you can use not only large integers but also strings, floats, or other objects. All keys get converted to strings via toString() before being added to the hash. You can confirm this with some test code:
<script>
  var array = [];
  array[0] = "zero";
  array[new Date().getTime()] = "now";
  array[3.14] = "pi";

  for (var i in array) {
    alert("array[" + i + "] = " + array[i] + ", typeof(" + i + ") == " + typeof(i));
  }
</script>
Notice how I used for...in syntax, which only gives you the indices that are actually defined. If you use the more common for (var i = 0; i < array.length; ++i) style of iteration then you will obviously have problems with non-standard array indices.
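For example, a counting loop over the same array never reaches the entries stored under non-integer keys (a sketch continuing the snippet above):
// The counting loop only walks integer indices 0 .. array.length - 1,
// so an entry stored under a key like 3.14 is simply never visited.
for (var i = 0; i < array.length; ++i) {
  alert("array[" + i + "] = " + array[i]);
}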
How exactly JavaScript arrays are implemented differs from browser to browser, but they generally fall back to a sparse implementation - most likely the same one used for property access of regular objects - if using an actual array would be inefficient.
You'll have to ask someone with more knowledge about specific implementations to answer what exactly triggers the shift from dense to sparse, but your example should be perfectly safe. If you want to get a dense array, you should call the constructor with an explicit length argument and hope you'll actually get one.
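For example (this is a hint to the engine, not a guarantee):
var dense = new Array(1000); // asks for length 1000 up front; whether the
                             // backing store is actually dense is up to the engine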
See this answer for a more detailed description by olliej.
JavaScript objects are sparse, and arrays are just specialized objects with an auto-maintained length property (which is actually one larger than the largest index, not the number of defined elements) and some additional methods. You are safe either way; use an array if you need its extra features, and an object otherwise.
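A quick illustration of that length behaviour (variable names are mine):
var a = [];
a[9] = "x";
console.log(a.length);              // 10: one larger than the largest index
console.log(Object.keys(a).length); // 1: only one element is actually defined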
You could avoid the issue by using a plain JavaScript object, which is designed for this sort of thing. You can treat it as a dictionary, yet the "for ... in" syntax will let you grab all the entries.
var sparse = {}; // not []
sparse["whatever"] = "something";
The answer, as is usually true with JavaScript, is "it's a bit weirder...."
Memory usage is not defined and any implementation is allowed to be stupid. In theory, const a = []; a[1000000]=0; could burn megabytes of memory, as could const a = [];. In practice, even Microsoft avoids those implementations.
As Justin Love points out, the length attribute is one more than the highest index set. BUT it's only updated if the index is an integer.
So, the array is sparse. BUT built-in operations like spreading into Math.max() and "for ... of" iteration will walk through the entire range of integer indices from 0 to length - 1, visiting many that return 'undefined'. BUT 'for ... in' loops might do as you expect, visiting only the defined keys.
Here's an example using Node.js:
"use strict";
const print = console.log;
let a = [0, 10];
// a[2] and a[3] skipped
a[4] = 40;
a[5] = undefined; // which counts towards setting the length
a[31.4] = 'ten pi'; // doesn't count towards setting the length
a['pi'] = 3.14;
print(`a.length= :${a.length}:, a = :${a}:`);
print(`Math.max(...a) = :${Math.max(...a)}: because of 'undefined values'`);
for (let v of a) print(`v of a; v=:${v}:`);
for (let i in a) print(`i in a; i=:${i}: a[i]=${a[i]}`);
giving:
a.length= :6:, a = :0,10,,,40,:
Math.max(...a) = :NaN: because of 'undefined values'
v of a; v=:0:
v of a; v=:10:
v of a; v=:undefined:
v of a; v=:undefined:
v of a; v=:40:
v of a; v=:undefined:
i in a; i=:0: a[i]=0
i in a; i=:1: a[i]=10
i in a; i=:4: a[i]=40
i in a; i=:5: a[i]=undefined
i in a; i=:31.4: a[i]=ten pi
i in a; i=:pi: a[i]=3.14
But. There are more corner cases with Arrays not yet mentioned.
Basically, it walks the array's indexed entries while decrementing a counter that starts at the length value, then returns the hardened !! boolean of the resulting number (if the counter is decremented all the way to zero, every index is populated and the array is not sparse). Charles Merriam's caveats above should be considered as well, and this code doesn't address them, but they apply to hashed string entries, which can happen when assigning elements with arr[var] = (something) where var wasn't an integer.
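A minimal sketch of that approach (the function name is mine, and like the snippet described it doesn't handle hashed string keys):
function isSparse(arr) {
  var remaining = arr.length;                 // one slot expected per index
  for (var key in arr) {
    if (arr.hasOwnProperty(key)) remaining--; // tick off each defined own key
  }
  return !!remaining; // anything left over means some index was never assigned
}

isSparse([0, 10, 20]); // false - fully populated
isSparse([0, , 20]);   // true  - hole at index 1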
A recent answer to that post has a link to this deep dive into how V8 tries to optimize arrays by tagging them to avoid (re-)testing for characteristics like sparseness: https://v8.dev/blog/elements-kinds. The blog post is from Sept '17 and the material is subject to some change, but the breakdown to implications for day-to-day development is useful and clear.
Sparseness (or denseness) can be confirmed empirically for NodeJS with the non-standard process.memoryUsage().
Sometimes node is clever enough to keep the array sparse:
Welcome to Node.js v12.15.0.
Type ".help" for more information.
> console.log(`The script is using approximately ${Math.round(process.memoryUsage().heapUsed / 1024 / 1024 * 100) / 100} MB`)
The script is using approximately 3.07 MB
undefined
> array = []
[]
> array[2**24] = 2**24
16777216
> array
[ <16777216 empty items>, 16777216 ]
> console.log(`The script is using approximately ${Math.round(process.memoryUsage().heapUsed / 1024 / 1024 * 100) / 100} MB`)
The script is using approximately 2.8 MB
undefined
Sometimes node chooses to make it dense (this behavior might well be optimized in future):
> otherArray = Array(2**24)
[ <16777216 empty items> ]
> console.log(`The script is using approximately ${Math.round(process.memoryUsage().heapUsed / 1024 / 1024 * 100) / 100} MB`)
The script is using approximately 130.57 MB
undefined
Then sparse again:
> yetAnotherArray = Array(2**32-1)
[ <4294967295 empty items> ]
> console.log(`The script is using approximately ${Math.round(process.memoryUsage().heapUsed / 1024 / 1024 * 100) / 100} MB`)
The script is using approximately 130.68 MB
undefined
So, to get a feel for the original AIX kernel bug with a dense array, the denseness might perhaps need to be forced with a range-alike:
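One possible range-alike (a sketch, not necessarily what the author had in mind): Array.from with an explicit length assigns every index, so the result cannot stay sparse.
// Every index 0 .. 2**24 - 1 is assigned an actual value up front.
const denseArray = Array.from({ length: 2**24 }, (_, i) => i);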