One billion collisions processed by Gordon

Gordon, the supercomputer launched last year by the San Diego Supercomputer Center (SDSC) at the University of California, San Diego, recently completed its most data-intensive task so far – rapidly processing raw data from almost one billion particle collisions as part of a project to help define the future research agenda for the Large Hadron Collider (LHC).

Under a partnership between a team of UC San Diego physicists and the Open Science Grid (OSG) – a multi-disciplinary research collaboration funded by the US Department of Energy and the National Science Foundation – Gordon has been providing auxiliary computing capacity by processing massive data sets generated by the Compact Muon Solenoid (CMS). The CMS is one of two large general-purpose particle detectors at the LHC used by researchers in the search for the elusive Higgs particle.

'This exciting project has been the single most data-intensive exercise yet for Gordon since we completed large-scale acceptance testing back in early 2012,' said SDSC director Michael Norman, who is also an astrophysicist involved in research studying the origins of the universe. 'I'm pleased that we were able to make Gordon's capabilities available under this partnership between UC San Diego, the OSG and the CMS project.'