Hungary data centre inaugurated

CERN and the Wigner Research Centre for Physics have inaugurated the Hungarian data centre in Budapest, marking the completion of the facility hosting the extension for CERN computing resources.

About 500 servers, 20,000 computing cores and 5.5 petabytes of storage are already operational at the site. The dedicated, redundant 100 Gbit/s circuits connecting the two sites have been operational since February 2013 and are among the first transnational links at this distance. The capacity at Wigner will be managed remotely from CERN, substantially extending the capabilities of the Worldwide LHC Computing Grid (WLCG) Tier-0 and bolstering the business continuity of CERN’s infrastructure.

WLCG’s mission is to provide global computing resources to store, distribute and analyse the more than 25 petabytes of data generated annually by the Large Hadron Collider (LHC). It is a tiered global system, with the Tier-0 at CERN as its central hub.

'The experiments’ computing resource needs will increase significantly when the LHC restarts in 2015. Hosting computing equipment at the Wigner Centre to extend the Tier-0 capabilities of CERN’s data centre is essential for dealing with this expected increase, and for the success of our physics programme. The remote capacity will also contribute to business continuity for critical systems in case of a major issue at the CERN site,' said CERN director-general Rolf Heuer.

'A number of sciences currently face exponential data growth. This innovative approach with Wigner could point the way for research centres to run their services in the future.'

