The drug discovery process produces data at an astonishing rate, yet the ability to organise this information across organisations has, until recently, been limited at best. Why does an overarching, commercially available informatics platform for drug discovery exist today only in partial implementations? Generally, installations consist of vendor or in-house applications specific to only one or a few functions of the drug discovery process, and are typically built on the vendor's internal expertise.
December 2008/January 2009
As chief of the NASA Advanced Supercomputing (NAS) Division at Ames Research Center, and project manager for NASA’s High-End Computing Capability Project, I am privileged to lead a distinctive supercomputing facility that supports the computational requirements of NASA’s key mission areas: aeronautics research, exploration systems, space operations, and science.
In a world where one hundredth of a second can make or break a race, supercomputing has become the secret weapon of Formula 1.
Broadly speaking, there are four main areas that affect the car's performance: the driver, the engine, the tyres, and the aerodynamics. With FIA restrictions on engine development, and teams limited to a single tyre supplier, it is in aerodynamics that engineers can make a difference.
In our report on the grid computing facilities at CERN, we noted that computational tasks are being farmed out to scientists around the world, not only because it gives these experts immediate access to never-before-seen data, but also because the CERN computers have reached the point where a new power-generating facility would soon have to be built for any further server expansion. A major limiting factor has become cooling capability. HPC suppliers universally agree that this situation is not unique, and many data centres are running into the same problem.
Technology transfer means an astonishing number of things to different people, in almost as many contexts – from large-scale international cooperation programmes, through the internal corporate dissemination of methodologies, to the local reusability of information. Underpinning it all, regardless of scale or purpose, are assessments of data-analytic needs and benefits – indeed, data analysis, or the data itself, may often be the very technology being transferred.
The fundamentals of acoustics engineering have changed little over the decades. For instance, at its heart, a speaker still consists of a magnet, coil and some sort of diaphragm. Scientific software, however, has allowed researchers in acoustics to make some interesting discoveries and come up with innovative products and techniques.
Sophia Ktori investigates the use of informatics software to increase data integrity in the laboratory
Tim Gillett reports from PRACEDays 2016, held in May in the city of Prague
Robert Roe investigates the motivation behind the architectural changes to Europe's fastest supercomputer, Piz Daint, housed at the Swiss National Computing Centre
Robert Roe discusses the merits of the latest storage technologies, including a push by storage providers to develop end-to-end platforms featuring intelligent data management systems
As AMD launches its latest FirePro GPU, Robert Roe investigates a new suite of open-source tools, released by the company, that converts code from CUDA into C++