It is widely accepted that the electronic laboratory notebook (ELN) is an essential tool in today’s dynamic scientific R&D landscape. Improving R&D efficiency means optimising the everyday tasks of scientists. ELNs allow scientists to focus on the wet work rather than the documentation, while at the same time enabling them to share experimental results and learn from institutional knowledge and prior art.
Our facility is the national HPC resource for Poland, and we have two main functions: to provide HPC resources for scientists, and to conduct research into computational science itself. We also store and provide access to all national scientific research content.
We strive to be truly interdisciplinary and open-access in all we do. Our aim is to foster open models in all areas – software, data, publishing, as well as open information services such as weather forecasting.
The planet’s changing climate and freaky weather are hitting the headlines more and more these days. Hurricane Katrina, Antarctica’s rising temperatures – and even floods in north Cornwall in the UK – show that Earth’s weather is not only erratic, but can be devastating.
But how can you predict the unpredictable? Simulations of any weather system must take into account a massive number of variables and contributing factors, making such modelling seemingly impossible. Or so it seemed until a few years ago.
Although we’re excited about the power of dual- and quad-core CPUs, consider the potential of an integrated circuit with many hundreds of cores – one is already in your desktop or laptop PC in the form of a GPU, intended to accelerate graphics processing. In addition, board-level GPU products are becoming available, as are preconfigured ‘personal supercomputers’.
It doesn’t surprise us that state-of-the-art optics software has contributed to the development of impressive projects such as futuristic space telescopes or the cockpit illumination of next-generation airliners, but we often overlook how such software lets us enjoy the benefits of everyday products, whether automobile headlamps or LED flashlights. By examining those applications, you can gain an appreciation of the multitude of areas where this class of software can be applied.
Collecting scattered light
The clean-up surrounding large-scale disasters can involve massive amounts of manpower and resources, and take months or years to conclude. Victims from the attacks on the World Trade Center were still being identified as late as April 2008, more than seven years after the Twin Towers collapsed. The remains were identified through DNA analysis, and updated technology is being used to re-extract DNA to make new identifications.
As you read this, a new US president will be in his first weeks at the head of an administration informed by respected earth scientists including John Holdren, Jane Lubchenco, and Steven Chu. The words ‘earth science’ usually evoke those disciplines concerned with the lithosphere (particularly geology, seismology, and vulcanology), but public concern is rising about the effects of human interaction with the other three spheres as well – and all sectors of the earth sciences are intensive consumers of computing resources.
After losing a fortune in the South Sea Company bubble of 1720, Sir Isaac Newton said: ‘I can calculate the motion of heavenly bodies, but not the madness of people.’
As the world sits on the edge of a financial precipice, it may be a good time to reflect on the fact that little fundamental research has been done on how financial markets work. Clearly there are non-linearities and very large numbers of variables, but logic would suggest that such situations are better handled with the help of powerful computers than by the ‘gut instincts’ of traders alone.
Gemma Church finds out how astronomers are using simulations to investigate the extremities of our universe
Turning data into scientific insight is not a straightforward matter, writes Sophia Ktori
The Leibniz Supercomputing Centre (LRZ) is driving the development of new energy-efficient practices for HPC, as Robert Roe discovers
William Payne investigates the growing trend of using modular HPC, built on industry standard hardware and software, to support users across a range of both existing and emerging application areas
Robert Roe looks at developments in crash testing simulation – including larger, more intricate simulations, the use of optimisation software, and the development of new methodologies through collaboration between ISVs, commercial companies, and research organisations