Mozilla is working to bring parallelism to JavaScript in order to take full advantage of hardware performance. Dave Herman, chief researcher and strategy director at Mozilla Research, mentioned in a recent blog post that they are experimenting with this in their JavaScript engine, SpiderMonkey.
They are trying to add a more flexible and powerful parallel-computing primitive to JavaScript through a bottom-up, incremental approach. Herman first noted:
What I'm talking about is not "concurrency" ... JavaScript's asynchronous concurrency model is very popular and successful, and promises, ES6 generators, and the upcoming async/await are making it better and better.
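The distinction Herman draws can be seen in a small sketch of that asynchronous model (helper names such as `delay` and `main` are illustrative, not from the post):

```javascript
// A minimal sketch of JavaScript's asynchronous concurrency model:
// tasks interleave on a single thread, preserving run-to-completion,
// with promises and async/await coordinating the results.
function delay(ms, value) {
  return new Promise(resolve => setTimeout(() => resolve(value), ms));
}

async function main() {
  // The two timers overlap in wall-clock time, yet every line of
  // JavaScript here still executes on one thread, never in parallel.
  const [a, b] = await Promise.all([delay(10, "first"), delay(20, "second")]);
  return `${a} / ${b}`;
}

main().then(result => console.log(result)); // logs "first / second"
```

Concurrency of this kind hides latency; it does not, by itself, put more cores to work.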
Rather, this is about parallel computation that unlocks the potential of the hardware (GPUs, SIMD instructions, and multi-core processors). Web Workers have done some work toward multi-core parallelism, but worker threads are completely isolated and can only communicate through postMessage.
A more radical approach would be to give JavaScript a fully multithreaded data model like Nashorn's. However, that requires the host Java program to carefully synchronize with the script; otherwise the JavaScript application loses its run-to-completion guarantee. Moreover, such a change would involve a large amount of standardization and implementation effort, which in turn brings ecosystem risk.
On the other hand, Mozilla Research and Intel Labs have spent several years experimenting with deterministic parallel-computing APIs (known as River Trail, or PJS). But that proved a very difficult path, because it is hard to find a high-level model generic enough to fit a wide range of parallel programs.
As a result, they introduced the SharedArrayBuffer type. Unlike PJS, its built-in locking mechanism brings new locking patterns to worker threads, and likewise some objects may be subject to data races. Unlike Nashorn, however, this can only happen to objects that opt in to using shared memory as their backing store. If you create an object that does not use a shared buffer, you can be sure it will never have a data race. Dave says this is a relatively conservative approach, but one that should meet the needs of many scenarios. In fact, this approach was explored a few years ago: at last year's JSConf, Nick Bray of Google's PNaCl team also demonstrated a prototype of shared buffers in Chrome.
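The opt-in nature of the design can be sketched as follows, assuming the API shape of the draft specification (SharedArrayBuffer backing a typed array, with Atomics operations for race-free access):

```javascript
// A SharedArrayBuffer's memory can be made visible to multiple worker
// threads (by posting `sab` to a worker); Atomics operations then give
// race-free reads and writes on that shared memory.
const sab = new SharedArrayBuffer(4 * Int32Array.BYTES_PER_ELEMENT);
const shared = new Int32Array(sab);

Atomics.store(shared, 0, 42);
console.log(Atomics.load(shared, 0)); // 42

// An ordinary ArrayBuffer, by contrast, is never shared between
// threads, so objects backed by it can never exhibit data races.
const local = new Int32Array(new ArrayBuffer(8));
local[0] = 1;
```

Only the explicitly shared buffer opts in to the new hazards; everything else keeps JavaScript's existing guarantees.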
Currently, Dave and his team are testing the SharedArrayBuffer API in SpiderMonkey and drafting its specification. A prototype implementation is already available in the Firefox Nightly build, and interested readers can download it to try out and provide feedback.
Mozilla is testing JavaScript parallel computing in SpiderMonkey