NEWS

Programming difficulty is killing engineers' productivity

While supercomputers are thought to accelerate engineering and scientific discovery, a new study sponsored by Interactive Supercomputing suggests that the difficulty of programming them is increasingly becoming one of researchers’ biggest productivity killers.

The Development of Custom Parallel Computing Applications study, conducted by the Simon Management Group, surveyed more than 500 users of parallel high-performance computers (HPCs) across industries including education, government, aerospace, healthcare, manufacturing, geosciences, biosciences and semiconductors. The report examines the software tools in use, probes application development environments, practices and limitations, and catalogues critical issues and bottlenecks.

The study indicates that writing parallel code, programming efficiency, translation, debugging and the limits of HPC software are the most frequently cited bottlenecks across all industries. Respondents indicated an urgent need to shorten the application development time for custom algorithms and models.

The largest category of respondents (42.3 per cent) said that a typical project takes six months to complete, yet nearly 20 per cent of respondents’ projects consume two to three years of their time.

The majority of parallel application prototypes (65 per cent) are developed in very high-level languages (VHLLs) such as Matlab, Mathematica, Python, and R. While C and Fortran are frequently used to prototype, respondents overwhelmingly said they would prefer to work with an interactive desktop tool if the prototype could be easily bridged to work with HPC servers.
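By way of illustration only (the study itself names no tools or code), the gap respondents describe looks something like the hypothetical Python sketch below: the desktop prototype is a single line of vectorised NumPy, while a parallel port of the same calculation must add explicit domain decomposition and message passing, shown here with mpi4py purely as an example of that extra machinery.

    import numpy as np
    from mpi4py import MPI  # assumption: an MPI stack is installed on the cluster

    def prototype_mean_energy(data):
        # Desktop prototype: one line of vectorised NumPy.
        return float(np.mean(data ** 2))

    def parallel_mean_energy(data):
        # HPC port of the same calculation: each rank takes a slice of the
        # array, then partial sums are combined with a collective reduction.
        comm = MPI.COMM_WORLD
        chunk = np.array_split(data, comm.Get_size())[comm.Get_rank()]
        local_sum = float(np.sum(chunk ** 2))
        return comm.allreduce(local_sum, op=MPI.SUM) / data.size

    # Run with, for example: mpiexec -n 4 python example.py
    if __name__ == "__main__":
        samples = np.random.default_rng(0).standard_normal(1_000_000)
        print(parallel_mean_energy(samples))

Even in this toy case, the parallel version forces the researcher to think about process ranks, data partitioning and collective communication rather than the science; it is this translation step that respondents want an interactive desktop tool to handle for them.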

The disconnect stems from the fact that desktop computers cannot meet the processing and memory requirements of the huge data sets that many scientific and engineering problems analyse. The problem is only getting worse: according to the study, the median data set in a technical computing application today ranges from 10 to 45GB and is expected to swell to 200 to 600GB within just three years.

'This study demonstrates that programming tools have not kept pace with the advances in the computing hardware and affordability of high-performance computers,' said Peter Simon, president of Simon Management Group.
