
Nvidia announces containerised applications for HPC and deep learning

At the SC17 conference, held this week in Denver, Colorado, Nvidia announced that it has begun optimising applications for HPC and for the visualisation of HPC workflows, alongside partnerships with all of the major cloud providers.

Nvidia also announced updates to its Nvidia GPU Cloud (NGC) platform, which will now provide containerised HPC, deep learning and HPC visualisation applications.

The company is targeting many of the top HPC applications, including RELION, a software package for processing cryo-electron microscopy data, a technique whose development won the Nobel Prize in Chemistry in 2017.

During the first day of the conference, Nvidia announced that the top 15 HPC applications, and 70 per cent of the top 50, are now optimised for GPU acceleration. The figures come from an Intersect360 report, which described Nvidia as 'critical to the future of scientific computing.'

‘GPU computing has reached a tipping point in the HPC market that will encourage continued increases in application optimisation,’ wrote Addison Snell and Laura Segervall of Intersect360.

In a pre-show talk, Nvidia founder and CEO Jensen Huang noted that every major computer maker and cloud service has turned to the Nvidia Volta architecture to accelerate data-intensive workloads.

Nvidia has now begun to containerise HPC and HPC visualisation applications to enable more users to take advantage of GPU acceleration. As part of the NGC container registry, the company has released software and tools that allow scientists to deploy these applications and visualisation tools efficiently on cloud-based GPU infrastructure.
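As a rough illustration of that workflow, the sketch below pulls a containerised HPC application from the NGC registry and launches it with GPU access via the Docker command line. It is not Nvidia's documented procedure: the image name and tag (nvcr.io/hpc/gromacs:2018), the '--runtime=nvidia' flag and the 'gmx' entry point are assumptions made for illustration, and the exact values depend on the NGC catalogue entry and on having the Nvidia container runtime installed on the host or cloud instance.

# Minimal sketch: pull a containerised HPC application from the NGC
# registry and run it with GPU access using the Docker CLI.
import subprocess

# Assumed example image; NGC HPC containers are hosted under the nvcr.io
# registry, but the repository name and tag here are illustrative only.
IMAGE = "nvcr.io/hpc/gromacs:2018"

def run(cmd):
    # Echo the command, then execute it, raising an error if it fails.
    print("$ " + " ".join(cmd))
    subprocess.run(cmd, check=True)

# Fetch the containerised application from the registry.
run(["docker", "pull", IMAGE])

# Launch it with GPU access. '--runtime=nvidia' assumes the Nvidia container
# runtime (nvidia-docker2) is installed; 'gmx --version' simply checks that
# the GROMACS binary inside the container starts correctly.
run(["docker", "run", "--rm", "--runtime=nvidia", IMAGE, "gmx", "--version"])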

The world’s top 15 HPC applications, all GPU accelerated, are GROMACS, ANSYS Fluent, Gaussian, VASP, NAMD, Simulia Abaqus, WRF, OpenFOAM, ANSYS, LS-DYNA, BLAST, LAMMPS, AMBER, Quantum Espresso and GAMESS.

‘Today, one of the biggest market dynamics is the advent of AI,’ noted Intersect360 in the report. ‘Many organisations are looking to deep learning techniques to bring AI advancements to their products, services, or operations. These algorithms often rely on GPUs, to the extent that AI has become a major growth driver for NVIDIA.’

 
