Japan announces AI supercomputer

The Tokyo Institute of Technology has announced plans to build a new supercomputer designed to accelerate artificial intelligence (AI) research.

The new system, known as TSUBAME3.0, is expected to deliver 12 petaflops of double-precision performance, more than twice that of its predecessor, TSUBAME2.5.

‘Artificial intelligence is rapidly becoming a key application for supercomputing,’ said Ian Buck, vice president and general manager of Accelerated Computing at Nvidia. ‘Nvidia’s GPU computing platform merges AI with HPC, accelerating computation so that scientists and researchers can drive life-changing advances in such fields as healthcare, energy, and transportation.’

The system will use Nvidia’s latest Pascal-based Tesla P100 GPUs to reach roughly 12.2 petaflops, a figure that would place it among the world’s ten fastest systems according to the latest TOP500 list, released in November 2016.
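
As a rough back-of-the-envelope check (the article does not give a GPU count, so the count below is an assumption drawn from public reports of the system’s configuration), that double-precision figure is consistent with a few thousand P100s at Nvidia’s published per-GPU peak:

    # Hypothetical sanity check of the quoted double-precision figure.
    # GPU_COUNT is an assumption, not a number from the article; the per-GPU
    # peak is Nvidia's published value for the Tesla P100 (SXM2) part.
    P100_FP64_TFLOPS = 5.3   # double-precision peak per GPU, in teraflops
    GPU_COUNT = 2160         # assumed accelerator count for TSUBAME3.0

    gpu_only_petaflops = GPU_COUNT * P100_FP64_TFLOPS / 1000
    print(f"GPU-only double-precision peak: {gpu_only_petaflops:.1f} petaflops")
    # Roughly 11.4 petaflops from the GPUs alone; the host CPUs would
    # account for the remainder of the quoted ~12.2 petaflops.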

TSUBAME3.0 is designed with AI computation in mind and is expected to deliver more than 47 petaflops of performance for AI workloads. Operated alongside TSUBAME2.5, the combined performance is expected to be close to 64.3 petaflops, making it Japan’s highest-performing supercomputer for AI applications.
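
The combined figure also works out as simple addition, assuming (an inference, not a number given in the article) that TSUBAME3.0 contributes about 47.2 petaflops for AI workloads and TSUBAME2.5 supplies the balance in reduced precision:

    # Hypothetical arithmetic behind the quoted combined AI figure.
    tsubame3_ai_petaflops = 47.2   # assumed contribution from TSUBAME3.0
    combined_petaflops = 64.3      # combined figure quoted in the article

    tsubame25_share = combined_petaflops - tsubame3_ai_petaflops
    print(f"Implied TSUBAME2.5 contribution: {tsubame25_share:.1f} petaflops")
    # About 17.1 petaflops, in line with TSUBAME2.5's widely quoted
    # single-precision peak.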

Once up and running this summer, TSUBAME3.0 is expected to be used for education and high-technology research at Tokyo Tech, and be accessible to outside researchers in the private sector. It will also serve as an information infrastructure centre for leading Japanese universities.

Tokyo Tech’s Satoshi Matsuoka, a professor of computer science who is building the system, said, ‘Nvidia’s broad AI ecosystem, including thousands of deep learning and inference applications, will enable Tokyo Tech to begin training TSUBAME3.0 immediately to help us more quickly solve some of the world’s once unsolvable problems.’
