Interactive Supercomputing wins NSF grant

Interactive Supercomputing has received a grant from the National Science Foundation (NSF) for a software development project that seeks to enable scientists to run their simulations transparently on parallel architectures. The NSF grant comes on the heels of a similar government research grant this month from Oak Ridge National Laboratory.

The National Science Foundation grant funds a joint project between ISC and Northeastern University called 'Commercial grade automatic and manual parallelisation and performance tools'. ISC and Northeastern will develop toolkits that parallelise the algorithms and models produced in popular desktop Very High Level Language (VHLL) environments, such as Python and MATLAB, and will compare the resulting efficiency and code quality against customised codes developed in more traditional programming languages such as C and C++. The goal is to enable NSF-funded scientists and engineers to tap into the capabilities of parallel processing to solve huge computational problems, while minimising development time.

The application suite provides a range of algorithms and techniques that help engineers and scientists understand physics-based wave and signal interactions beneath the surfaces of objects. These surfaces may include the ocean, the ground, human skin or a human cell. A common feature of all these applications is that they process large image and sensor datasets, and a lack of computational processing power has consequently hindered research on many of these problems.

