
Supercomputing essential for 'cyberscholarship' future


The marriage of high-performance computing with digital libraries will increase the scale of research and scientific discovery, according to a Cornell professor. Writing in the Winter 2008 issue of the Journal of Electronic Publishing, William Arms, professor of computer science at Cornell University, said: ‘High-performance computing can bring together vast quantities of material (datasets, manuscripts, reports, etc) that might never make their way into a traditional library.’

Arms added: ‘A scholar reads only a few hundreds of documents; a supercomputer can analyse millions.’

While noting that a person reads with rich understanding whereas a computer works at a very superficial level, Arms argues that profound new research will be made possible by the simple analysis of huge amounts of information. ‘Computer programs can identify latent patterns of information or relationships that will never be found by human searching or browsing,’ he added, and researchers will want computer programs ‘to act as their agents’, searching billions of items based on intelligent guesses.

Arms believes that high-performance computing will be essential in ushering in the age of ‘cyberscholarship’, when much of the content of scholarly fields will take the form of digital data collections that are automatically analysed by computer programs.

‘Perhaps the university library will cease to be the largest building on campus and become the largest computing centre,’ Arms said.