
University deploys InfiniBand network

The University of Oklahoma (OU) Supercomputing Center for Education and Research (OSCER) has deployed QLogic InfiniBand adapters and switches as part of its new Linux cluster, known as Sooner. OU researchers will use Sooner to study and model tornadoes, in hopes of improving early warning systems and reducing the financial toll and the number of lives lost to these powerful windstorms.

OU Information Technology's director of supercomputing, Henry Neeman, and his OSCER colleagues installed QLogic 7200 Series double data rate (DDR) InfiniBand adapters, based on the QLogic TrueScale architecture, along with QLogic SilverStorm 9000 Series InfiniBand core and edge switches, in Sooner during the 'in-place' transition from OU's previous cluster, TopDawg.

'OSCER serves more than 400 students, faculty, and staff in virtually every science and engineering discipline, as well as in medicine and business,' Neeman said. 'As we transitioned from our previous HPC cluster to Sooner, we were challenged with maintaining existing levels of service to the rest of the university with minimal down time. We successfully tackled that challenge and look forward to helping researchers as they attempt to uncover clues to the mysteries of tornado behaviour.'

Sooner consists of 534 compute nodes connected through QLogic 7200 Series DDR adapters in a 2:1 fat-tree topology. The core of this network, the director-class QLogic SilverStorm 9240 288-port DDR switch, connects to 37 QLogic SilverStorm 9024 24-port DDR InfiniBand edge switches dispersed across 28 racks. The November 2008 Top500 list of supercomputers ranks Sooner 91st and reports that it achieved 28 teraflops at 83 per cent efficiency.
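The figures above can be sanity-checked with some back-of-envelope arithmetic. The sketch below is illustrative only: the 16-down/8-up port split per edge switch is an assumption (the usual layout for a 2:1 oversubscribed fat tree built from 24-port switches), not a detail given in the article, and the peak figure is simply implied by the stated sustained performance and efficiency.

```python
# Back-of-envelope check of Sooner's stated topology and performance figures.
# ASSUMPTION: each 24-port edge switch dedicates 16 ports to nodes (downlinks)
# and 8 ports to the core (uplinks) -- the common split for a 2:1 fat tree.

NODES = 534            # compute nodes (from the article)
EDGE_SWITCHES = 37     # SilverStorm 9024 edge switches (from the article)
DOWN_PORTS, UP_PORTS = 16, 8   # assumed per-switch port split

# Oversubscription ratio: downlink bandwidth vs. uplink bandwidth per switch.
oversubscription = DOWN_PORTS / UP_PORTS

# Total node-facing ports must cover all 534 nodes.
node_ports = EDGE_SWITCHES * DOWN_PORTS

# Peak performance implied by 28 TFLOPS sustained at 83% efficiency.
sustained_tflops = 28.0
efficiency = 0.83
peak_tflops = sustained_tflops / efficiency

print(f"oversubscription: {oversubscription:.0f}:1")
print(f"node-facing ports: {node_ports} (enough for {NODES} nodes)")
print(f"implied peak: {peak_tflops:.1f} TFLOPS")
```

With the assumed split, 37 edge switches expose 592 node-facing ports, comfortably accommodating the 534 nodes, and the stated sustained performance implies a theoretical peak of roughly 33.7 teraflops.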