Scientists and engineers find that computing is an important aspect of their day-to-day work, but for the most part, the computing itself is just a means to an end; that is, computing enables their scientific research and engineering development. Of course, some scientists become fascinated by computing itself, and become as expert in programming as they are in their scientific discipline.
Over the decades, we’ve learned how to exploit radio waves across the entire frequency spectrum; it’s hard to believe that, in years past, many of these frequencies were considered useless. Each frequency band dictates antennas of a different length and often with unusual requirements. To help find the best solution for any application, engineers turn to electromagnetic simulations.
The high-performance computing world moves very quickly. In early 2004, the world’s most powerful computer was the Japanese Earth Simulator, clocking in with a peak performance of an amazing 40 teraflops (one teraflop being 10¹², or a trillion, floating point operations, or calculations, per second). At the same time, the most powerful computer in the United States was the ASCI-Q machine at Los Alamos National Laboratory, with a peak speed of 20 teraflops. Oak Ridge National Laboratory (ORNL) was far behind, with an IBM Power3 system of just over three teraflops.
To talk about the weather is as English as afternoon tea and scones. The phrase ‘what a wet winter we’re having!’ is an instant ice-breaker in any social gathering in the UK. But will HPC change British behaviour? While HPC may not be a topic of discussion at bus stops, it has a vital role to play in predicting the weather. Using HPC, meteorologists can model climates (the underlying conditions) and predict weather (the atmospheric conditions at a given time) with a high degree of accuracy.
First- and second-generation GPGPUs for HPC applications have been based to a large extent on the architecture from which the chips get their name: graphics. While current GPGPUs boost some HPC applications tremendously (see references), they haven’t been optimised for the data-crunching involved in scientific processing and have displayed a few weak points. These weak points are now being eliminated.
The complexities of drug discovery and development, and the competitiveness of the pharmaceutical market, have led to a work ethos in which outsourcing is becoming more important. A pharmaceutical company has to be competent in such a diverse range of fields, from analytical chemistry to manufacturing and marketing, that outsourcing aspects of its work to contract research organisations (CROs) and other service organisations offers clear advantages in terms of the specialist expertise these organisations provide.
It’s fashionable to scoff at Thomas Robert Malthus’ predictions, made two hundred years ago, that human populations would grow until stopped by famine, disease or ‘moral restraint’. He wrote before the arrival of modern scientific crop research or contraception, and it’s unfair to blame Malthus for not foreseeing those breakthroughs. However, he was essentially right: the food supply has expanded but remains finite, and contraception has not fundamentally disrupted the shape of the population growth curve, which is asymptotically approaching the vertical.
Gemma Church finds out how astronomers are using simulations to investigate the extremities of our universe
Turning data into scientific insight is not a straightforward matter, writes Sophia Ktori
The Leibniz Supercomputing Centre (LRZ) is driving the development of new energy-efficient practices for HPC, as Robert Roe discovers
William Payne investigates the growing trend of using modular HPC, built on industry standard hardware and software, to support users across a range of both existing and emerging application areas
Robert Roe looks at developments in crash testing simulation – including larger, more intricate simulations, the use of optimisation software, and the development of new methodologies through collaboration between ISVs, commercial companies, and research organisations