NEC launches a new model of its vector supercomputer

NEC Corporation has announced the launch of the new "SX-Aurora TSUBASA C401-8" vector engine for use in its "SX-Aurora TSUBASA" vector supercomputer, offering 2.5 times the computing performance and twice the power efficiency of previous models.

The new engine increases the core count from the previous 10 to 16 and adds a new L3 cache to achieve faster computation. In addition, the adoption of a state-of-the-art manufacturing process has improved power efficiency.

The new system will be utilised for large-scale scientific computing at Tohoku University's Cyberscience Center. Tohoku University is scheduled to begin operating the system in August 2023. Featuring a total of 4,032 Vector Engines (VEs) and a total theoretical computing performance of approximately 21 PFLOPS (petaflops), it will be the world's highest-performance vector-based supercomputer system.
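As a rough, illustrative check on the reported figures (this calculation is not from the announcement, and the ~21 PFLOPS total is approximate), dividing the system's theoretical peak by its number of Vector Engines gives the per-engine peak:

```python
# Back-of-the-envelope calculation using the figures reported for
# Tohoku University's system; both inputs are approximate.
total_pflops = 21          # approximate theoretical peak of the full system
num_vector_engines = 4032  # number of VEs in the system

# Convert PFLOPS to TFLOPS (x1000) and divide across the engines.
per_ve_tflops = total_pflops * 1000 / num_vector_engines
print(f"~{per_ve_tflops:.2f} TFLOPS per Vector Engine")
```

This works out to roughly 5.2 TFLOPS per Vector Engine, consistent with the claimed 2.5x gain over the previous 10-core generation.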

Tohoku University has already adopted previous generations of the SX Series for a wide variety of applications. In manufacturing, these include the design of aircraft and power-generation turbines, where large-scale numerical fluid simulation plays an important role. Other applications include disaster-mitigation simulations, such as predicting damage from tsunami or river inundation, as well as simulations useful for daily living, such as heat-stroke risk assessment. Moreover, the university has started to explore quantum computing applications with the help of the SX Series, and going forward it will continue to expand its contributions to a wide range of research and development.

NEC developed the new power-saving SX-Aurora TSUBASA by packing in many card-type VEs that combine the LSI technologies, high-density packaging technologies, and high-efficiency cooling technologies that NEC has cultivated over many years of supercomputer development.