During my years of Python development, I've always been amazed at how much faster things become if you manage to rewrite code that loops through your ndarray element by element, using numpy functions that work on the whole array at once. More recently I've been switching more and more to Node, and I'm looking for something similar. So far I have turned up a few things, none of which look promising.
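For concreteness, here is the kind of per-element loop I mean, sketched in plain Node with a `Float64Array` standing in for an ndarray (the random data is just a made-up stand-in for illustration):

```js
// The per-element style I'd like to escape: every entry is touched in JS-land.
const data = new Float64Array(1000000);
for (let i = 0; i < data.length; i++) data[i] = Math.random();

// e.g. scale and shift every element by hand
const out = new Float64Array(data.length);
for (let i = 0; i < data.length; i++) {
  out[i] = 2 * data[i] + 1;
}
// In numpy this whole loop collapses to `out = 2 * data + 1`,
// executed in C over the entire array at once.
```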
As far as I know, npm packages can be written in C++, so I'm wondering why there are no numpy-like libraries for Node. Is there just not enough interest in Node yet from the community that needs that kind of power? Is there hope that ES6 features (such as list comprehensions) will allow JavaScript engines to automatically vectorise native JS code to C++ speeds? Or am I possibly missing something else?
Edit, in response to close votes: Note that I'm not asking "what is the best package to do xyz". I'm just wondering whether there is a technical reason there is no such package for Node, a social reason, or no reason at all and there is simply a package I missed. Maybe, to avoid overly opinion-based criticism, here is a concrete task: I have about 10000 matrices that are 100 × 100 each. What's the best (correction: a reasonably fast) way to add them all together?
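For reference, here is a minimal plain-Node baseline I'd be comparing any library against, assuming each matrix is stored as a flat `Float64Array` (the random fill is just a hypothetical stand-in for real data):

```js
// Baseline: sum 10000 matrices of 100x100, each a flat Float64Array.
const ROWS = 100, COLS = 100, COUNT = 10000;
const SIZE = ROWS * COLS;

// Fabricate some input matrices (stand-in for real data).
const matrices = [];
for (let m = 0; m < COUNT; m++) {
  const a = new Float64Array(SIZE);
  for (let i = 0; i < SIZE; i++) a[i] = Math.random();
  matrices.push(a);
}

// Accumulate element-wise into a single result matrix.
const sum = new Float64Array(SIZE); // zero-initialized
for (let m = 0; m < COUNT; m++) {
  const a = matrices[m];
  for (let i = 0; i < SIZE; i++) sum[i] += a[i];
}
```

Presumably a numpy-like package would need to beat this double loop, e.g. by doing the inner accumulation in C++.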
Edit 2: After some more digging, it turns out I was googling for the wrong thing. Google for "node.js scientific computing" and you'll find links to some very interesting notes.
Basically, as far as I understand now, no one has bothered so far. Also, since there are some major omissions in JS TypedArrays (such as 64-bit integers), it might be hard to add good support just by using npm packages, without hacking the engine itself, which would defeat the purpose. Then again, I haven't researched that last statement any further.
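To illustrate the 64-bit gap: the built-in typed views stop at 32-bit integers, and plain JS numbers are IEEE doubles, so exact integer arithmetic silently runs out past 2^53 (a quick sketch):

```js
// TypedArray views cover 8/16/32-bit integers and 32/64-bit floats...
const ok32 = new Int32Array(4);    // fine
const ok64 = new Float64Array(4);  // fine

// ...but there is no 64-bit integer view, and ordinary numbers are
// doubles, so exact integers end at 2^53:
const big = Math.pow(2, 53);
console.log(big + 1 === big); // true: 2^53 + 1 is not representable
```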