Why big data means big bucks

In the second of his surveys of big data, Robert Roe finds more applications of high-performance computing to the financial services industry.

Big data is receiving attention as never before. This month, while the British Government announced a £42 million investment in a national ‘Big Data’ centre, private companies have also been looking at applications – mainly in the financial services sector.

SAS, the US business analytics software house, is to set up a research and development centre in Scotland focused on combating fraud and financial crime; two UK-based start-ups are vying for funding at the Emerging Companies Summit at Nvidia’s GPU Tech Conference, again for applications in financial services; and Maxeler Technologies took advantage of a visit to CeBIT by UK Prime Minister David Cameron and German Chancellor Angela Merkel to explain its dataflow software and the spatial programming initiative OpenSPL.

The SAS research and development centre is aimed at developing solutions to tackle fraud and bolster security. According to a report on global fraud published by the Economist Intelligence Unit in 2013, 70 per cent of 901 companies surveyed had reported suffering from at least one type of fraud in the previous year, up from 61 per cent in the previous survey.

The new centre, based in Glasgow, Scotland, adds 94 advanced analytics specialists to the established 126-member team. It also takes advantage of Government funding: Scottish Development International (SDI) announced a grant of £1.29 million in 2013.

The use of analytics has become critical across industries such as financial services, public security, retail, telecommunications, and manufacturing, supporting better decision-making that improves business operations, performance, and innovation, as well as preventing fraud and mitigating risk.

Mikael Hagstrom, SAS executive vice president, said: ‘SAS is proud to be building on its existing operations in Scotland. We initially established a global research and development team to create business applications to help modernise law enforcement, improve public safety, and enhance national security. The initial investments allowed SAS to see the real Scottish potential with easy access to Europe, the excellent pool of talent from universities, and its strong culture of innovation.’ 

Over the past 15 years, SAS has invested about £90 million in providing 80 universities across the country with access to its solutions. It has also set up a Student Academy programme to enable educational institutions to train students in big data skills.

Scotland’s First Minister, Alex Salmond, said: ‘The new facility will position Scotland as an international centre of excellence for developing solutions to tackle fraud and security and will create a substantial number of highly-skilled, high-value jobs.’

Further benefit for the financial services industry could come from Nvidia’s GPU Tech Conference this month. Part of the event is a one-day Emerging Companies Summit, which hosts two competitions for more than $500,000 in cash and prizes from sponsors including Microsoft, Cooley LLP, and Nvidia. Among the dozen young companies competing in the ‘Early Stage Challenge’ for $100,000 in cash are two UK companies whose businesses are aimed at big data and financial institutions.

Founded in 2006, Global Valuation offers its flagship product, GVL Esther, as a general-purpose valuation and simulation engine for large OTC (over-the-counter) portfolios. It is a combined software and hardware solution, capable of running large models of more than 100,000 transactions.

The company claims that a handful of servers in a compact cabinet, built on Supermicro hardware and accelerated by Nvidia K10 GPUs, can process a workload 10,000 times greater than that achievable by a large grid-computing solution.

The network bottleneck of traditional grids is bypassed, because data no longer has to travel across external cables; nor does data traffic within the board impede performance, as the mathematical algorithms are carefully tuned to the hardware.

The other UK-based company, Brytlyt, analyses big data using what it describes as fundamentally new algorithms. The system is designed to allow database operations, such as JOIN and SORT, to be processed in massively parallel, scalable environments without loss of performance. Brytlyt’s software offloads these operations to general-purpose graphics processing units (GPGPUs); GPGPU computing uses a GPU alongside a CPU to accelerate general-purpose scientific and engineering applications.
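
As a rough illustration of the general idea – an illustration only, not Brytlyt’s implementation – the sketch below sorts a hypothetical column of trade values across all available CPU cores using C++17 parallel algorithms; on a GPGPU platform the same primitive would typically be dispatched to thousands of GPU threads through a framework such as CUDA or OpenCL.

    // Hypothetical sketch of a data-parallel SORT primitive over one column.
    // Illustrative only: a real GPGPU database would run this on the GPU.
    #include <algorithm>
    #include <execution>
    #include <vector>
    #include <cstdio>

    int main() {
        std::vector<double> trade_values = {105.2, 99.8, 101.4, 98.7, 110.3};

        // Sort the column using every available core; the same primitive,
        // offloaded to a GPU, would be spread across thousands of threads.
        std::sort(std::execution::par_unseq,
                  trade_values.begin(), trade_values.end());

        for (double v : trade_values) std::printf("%.1f\n", v);
        return 0;
    }

The appeal of such primitives is that operations like SORT and JOIN decompose naturally into many independent pieces of work – exactly the kind of workload that a GPU’s thousands of lightweight threads handle well.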

According to the company, the cluster topology allows the platform to scale horizontally: new hardware, additional storage, compute, or ingestion engines can be added to the system with no downtime.

UK-based Maxeler Technologies took the opportunity to press the message about the importance of big data at the highest political levels when British Prime Minister David Cameron and German Chancellor Angela Merkel visited its stand at CeBIT this month. Oskar Mencer, Maxeler CEO, spoke to the two leaders about the design of the company’s dataflow engines and their use in the finance industry, where stock exchanges and banks employ the engines to accelerate risk analytics in real time.

The dataflow computing technique focuses on optimising the movement of data in an application and uses massive parallelism across thousands of tiny ‘dataflow cores’ to provide what the company claims is an order-of-magnitude benefit in performance, space, and power consumption. Data is forwarded directly from one dataflow core to another as results are needed, and is not written to off-chip memory until the chain of processing is complete.
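
As a simple software analogy of that principle – an analogy only, since Maxeler’s engines implement it in hardware – the sketch below pushes each value through a whole chain of arithmetic stages before anything is written back to memory, rather than storing the result of every stage in an intermediate array.

    // Hypothetical analogy of dataflow chaining: each element flows through
    // all three stages before the result is written out, so no intermediate
    // arrays are materialised in (off-chip) memory.
    #include <vector>
    #include <cmath>
    #include <cstdio>

    int main() {
        std::vector<double> prices = {100.0, 101.5, 99.2, 103.7};
        std::vector<double> out(prices.size());

        for (std::size_t i = 0; i < prices.size(); ++i) {
            double x = prices[i] * 0.01;   // stage 1: rescale
            x = std::log(x);               // stage 2: log transform
            out[i] = x * x;                // stage 3: square, written out last
        }

        for (double v : out) std::printf("%f\n", v);
        return 0;
    }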

Dr Mencer said: ‘To have two world leaders visit our company stand and appreciate the significance and opportunity that spatial computing presents was a great honour.’ In addition, Maxeler’s MaxCompiler addresses latency issues associated with some of the more widely used computational languages, such as Matlab or Maple. Real-time analytics require very low latency if they are to be effective. This quest for the lowest latency is a constant battle for financial institutions where a competitive edge can mean huge profits.  

In the article ‘Why use modelling software for finance?’ on page 24 of the April/May 2014 issue of Scientific Computing World, Samir Khan, Senior Application Engineer at Maplesoft, explained that latency is a major concern for financial institutions conducting real-time analysis. He pointed out that financial organisations generally program straight in C++ or C#, as this allows engineers to get very close to the metal, programming right down to the memory registers, rather than writing algorithms in computational languages such as Maple or Matlab.
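
A minimal sketch of what that ‘close to the metal’ style can look like – with hypothetical tick values chosen purely for illustration – is an allocation-free exponential moving average computed in a tight loop, one multiply-add per price update:

    // Hypothetical low-latency style: a fixed-size, allocation-free
    // exponential moving average over a small price feed. The hot path is a
    // single multiply-add per tick, with no dynamic memory allocation or
    // interpreter overhead of the kind higher-level languages can introduce.
    #include <array>
    #include <cstdio>

    int main() {
        const std::array<double, 6> ticks = {100.0, 100.4, 99.9,
                                             100.8, 101.1, 100.6};
        const double alpha = 0.2;   // smoothing factor (illustrative value)

        double ema = ticks[0];
        for (std::size_t i = 1; i < ticks.size(); ++i) {
            ema = alpha * ticks[i] + (1.0 - alpha) * ema;
        }
        std::printf("EMA after last tick: %.3f\n", ema);
        return 0;
    }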
