Open-source software used for neural simulation on K supercomputer

Researchers have used a Japanese supercomputer to carry out what they describe as the largest general neuronal network simulation ever.

The simulation, by the RIKEN HPCI Program for Computational Life Sciences, the Okinawa Institute of Science and Technology Graduate University (OIST) in Japan, and Forschungszentrum Jülich in Germany, was made possible by the development of novel data structures for the simulation software, NEST.

The relevance of the achievement for neuroscience lies in the fact that NEST is open-source software, freely available to every scientist in the world. The team used the K supercomputer to carry out the simulation.

The team, led by Markus Diesmann in collaboration with Abigail Morrison, both now with the Institute of Neuroscience and Medicine at Jülich, succeeded in simulating a network consisting of 1.73 billion nerve cells connected by 10.4 trillion synapses. The program recruited 82,944 processors of the K computer, and took 40 minutes of computing time to simulate a single second of neuronal network activity in real, biological time.
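To put those figures in perspective, a few ratios can be derived directly from the numbers reported above; the input values come from the article, while the derived quantities are simple arithmetic rather than claims made by the researchers:

```python
# Back-of-envelope figures from the simulation reported above.
# The raw numbers are taken from the article; the ratios below
# are derived arithmetic, not statements by the research team.

neurons = 1.73e9          # nerve cells in the simulated network
synapses = 10.4e12        # synaptic connections between them
processors = 82_944       # K computer processors recruited
wall_clock_s = 40 * 60    # 40 minutes of computing time...
biological_s = 1          # ...per second of biological time

print(f"Synapses per neuron:   {synapses / neurons:,.0f}")
print(f"Neurons per processor: {neurons / processors:,.0f}")
print(f"Slowdown vs real time: {wall_clock_s / biological_s:,.0f}x")
```

The roughly 6,000 synapses per neuron is in line with estimates for cortical tissue, and the 2,400-fold slowdown illustrates how far even a petascale machine remains from real-time whole-brain simulation.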

Although the simulated network is huge, it represents only one per cent of the neuronal network in the brain. The nerve cells were randomly connected, and the simulation itself was not intended to provide new insight into the brain; the purpose of the initiative was to test the limits of the simulation technology developed in the project and the capabilities of K. In the process, the researchers gathered experience that will guide them in the construction of novel simulation software.

The researchers say the achievement gives neuroscientists a glimpse of what will be possible in the future, with the emergence of exascale computers.

'If petascale computers like the K computer are capable of representing one per cent of the network of a human brain today, then we know that simulating the whole brain at the level of the individual nerve cell and its synapses will be possible with exascale computers, hopefully available within the next decade,' said Diesmann.
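Diesmann's projection can be sanity-checked with a rough scaling sketch, under the strong assumption that the required compute grows roughly linearly with network size; the ~10-petaflop figure for K is an added assumption, not stated in the article:

```python
# Rough check of the petascale-to-exascale scaling argument.
# Assumption 1: compute cost scales roughly linearly with the
#               number of neurons and synapses simulated.
# Assumption 2: the K computer delivers ~10 petaflops (not stated
#               in the article; based on its published peak class).

fraction_simulated = 0.01   # one per cent of the brain's network
k_petaflops = 10            # assumed peak of the K computer
exa_petaflops = 1000        # one exaflop = 1,000 petaflops

print(f"Network scale-up needed: {1 / fraction_simulated:.0f}x")
print(f"Exascale vs K:           {exa_petaflops / k_petaflops:.0f}x")
```

Both ratios come out at about 100x, which is why exascale machines are seen as the natural threshold for cell-level whole-brain simulation.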
