
Laboratory informatics systems are fuelling efficiency


The petrochemicals industry is finding that laboratory informatics can save money and improve product quality, as Tom Wilkie discovers

Unwittingly and unwillingly but very publicly, British motorists provided an object lesson in the importance of quality control in the petrochemicals industry almost exactly six years ago. Newspapers and television reported mysterious engine failures in up to 10,000 cars in the southern UK, requiring expensive engine repairs, all the result of damage caused by silicon-contaminated unleaded petrol sold by some supermarket petrol stations.

Out of necessity, the petrochemicals business is global in its reach, so its quality control laboratories lie in different areas of the world and often use differing languages. Laboratory informatics needs to feed data and reports into other software, often massive, company-wide management information and control systems. The informatics reports are mission-critical – mistakes are very costly, but so too is any delay in getting results out to plant operations that use the data to tune the manufacturing units to maximise safety, product grade, and yield.

In some ways, then, the demands on a laboratory informatics system in the petrochemicals sector appear similar to those of the pharmaceutical industry – globalised and mission-critical. But there are important differences: petrochemicals is not a ‘regulated’ industry; and, in many instances, its production processes are continuous rather than batch-oriented. It is therefore possible to use many differing systems, because the laboratory informatics systems do not need to be validated to standards set by an external body such as the US Food and Drug Administration. Cost savings and the desire for an ‘integrated’ software system may incline some companies towards an ERP-based solution, but such software tends to be batch-oriented. A dedicated LIMS may therefore be better suited to the huge volume of samples being taken; and, where instrument integration is possible, there are huge savings to be made in eliminating manual data entry.

For Yves Dupont, senior manager for oil and gas at informatics consultancy LabAnswer, a dedicated LIMS will not only reduce costs but increase revenues ‘because you can provide data that is more real-time to the manufacturing process, so they can take action quicker to increase plant efficiency or decrease variance and thus produce more, higher-quality products.’ In contrast, even though enterprise resource planning software such as SAP has a quality-control module built in, he believes that the manual data entry it requires is costly in itself and leads to errors, which in turn cost more as they have to be traced and corrected. ‘It’s easier to get to the data with a LIMS than to customise data-gathering from SAP,’ he said.

The challenges of continuous processes

Most companies, he continued, are going down the route of building in quality at every stage of the process. This means that between 50 and 90 per cent of the samples going to the analytical laboratory are from in-process testing, rather than raw materials or output. ‘That tells you a lot about the focus of the company,’ he said. He also believes it is appropriate to the continuous nature of the process: ‘You don’t have a batch so it’s difficult to do finished-product testing, but you can tap the pipeline.’ In contrast, he feels that for most ERP systems and their quality control modules, ‘the basic transaction is creating a batch’ even though with continuous processes, you never finish the batch.

‘Generally speaking, what we see because of history is a LIMS system at each site, potentially with local reports or workflows, rather than a global centralised LIMS’ serving the company as a whole, he continued. ‘In the pharma world, they would use a more centralised approach, partly because of the validation costs associated with each site having its own instance, customisations, and processes.’ In the regulated environment of the pharmaceutical industry, it is usually cheaper to bring in a single, pre-validated system for the entire enterprise, across all labs and countries, than to pay separately for each individual validation.

In petrochemicals, by contrast, each site will adapt the LIMS to meet the requirements of its own workflows. Mergers and acquisitions have also led to a proliferation of LIMS from different vendors at different sites, he said. ‘It’s only when there’s a corporate upgrade that you’d see a platform rationalisation and simplification project.’

Efficiency and product quality

The payoff from investing in a LIMS, in the form of improved efficiency and product quality, is just as evident in northern Europe, in the view of Adam Wahlund, marketing manager for Bytewize. Over the past decade, he said, ‘we have experienced an increased need for LIMS in petrochemical laboratories. I think companies have learned how much time and money they can save by regular analysis. Thanks to a LIMS, the laboratory work is more efficient and accurate. If, for example, there is water in insulating oil, the quality of the oil is decreased. Gas in transformer oil means that the transformer isn’t working properly. By regularly taking samples and sending them to the laboratory, plants can decrease the wear on machines and also the oil costs. Ultimately, companies avoid disruption in production, which improves cost-effectiveness.’

Bytewize has been supplying informatics software to the petrochemicals industry, predominantly in the Nordic countries, since 1999. Its first such customer, VP Diagnose, could not find a system that fitted its needs, because it had to store a great deal of information about the source of each sample (in its case, the transformer from which the oil sample was taken). This led Bytewize to customise O3Lims for the specific needs of a petrochemical laboratory.

Direct input from instruments and tight integration with existing company software are common themes here too: ‘When we get a new customer, it is very likely that it already has other software installed, something that increases the demand for a flexible LIMS that is easy to integrate. Connecting the LIMS to the instrument software can save a lot of time as you decrease the need for manual handling; results and other data can be imported automatically.’
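Instrument integration of the kind Wahlund describes usually means parsing a structured export from the instrument software and mapping it onto LIMS records. As a minimal sketch, assuming a hypothetical CSV export (the column names here are invented for illustration, not any vendor's actual format):

```python
import csv
from dataclasses import dataclass

# Hypothetical record type for a parsed instrument result.
@dataclass
class Result:
    sample_id: str
    analyte: str
    value: float
    unit: str

def parse_instrument_export(lines):
    """Parse CSV lines exported by instrument software into Result records.

    Assumed columns: sample_id, analyte, value, unit.
    """
    return [
        Result(row["sample_id"], row["analyte"], float(row["value"]), row["unit"])
        for row in csv.DictReader(lines)
    ]
```

In practice the parsed records would then be posted to the LIMS through whatever import interface it exposes; the point is simply that no human retypes the numbers.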

Wahlund said that even replacing an older system puts pressure on the flexibility of the LIMS: ‘Many laboratories want to import historical data into the new system and instead of typing in data manually from maybe 10 years back, we write a script and transfer it automatically.’
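A migration script of the sort Wahlund mentions typically maps legacy field names, date formats, and value encodings onto the new system's schema, setting aside rows that fail validation for manual review. A hedged sketch, with an invented legacy layout:

```python
from datetime import datetime

def migrate_legacy_rows(rows):
    """Map rows from a hypothetical legacy export to a new LIMS schema.

    Legacy dates like '03/05/1999' become ISO 8601, results are normalised
    to floats, and rows failing validation are collected for human review.
    """
    migrated, rejected = [], []
    for row in rows:
        try:
            migrated.append({
                "sample_id": row["SampleNo"].strip(),
                "analysed_on": datetime.strptime(row["Date"], "%d/%m/%Y").date().isoformat(),
                "result": float(row["Result"]),
            })
        except (KeyError, ValueError):
            rejected.append(row)
    return migrated, rejected
```

Keeping the rejects rather than silently dropping them matters when the data may be a decade old and the original paper records are long gone.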

The same themes – integration with pre-existing software, immediate access to data, and the efficiencies that come from using a LIMS – are evident at Ras Laffan Industrial City in Qatar. Here, the world’s largest gas-to-liquids facility, Pearl GTL, was established by Shell and Qatar Petroleum in 2006. It currently processes 1.6 billion cubic feet of wellhead gas each day, using Shell’s Middle Distillate Synthesis process to convert the gas into fuels, lubricants and other products.

According to Ajith Kumar, senior business analyst for Qatar Shell GTL: ‘With billions of investor dollars and tens of thousands of jobs at stake, data management was a major priority. We needed condensed, accurate information at our fingertips at all times.’ For its testing laboratories, Shell opted for a Thermo Scientific SampleManager LIMS. At Pearl GTL, this LIMS is integrated with an operations management system (known as OTTER), process historian (PI), the oil movement and batch tracking system, laboratory instruments and other production systems. Instead of sending test results manually to operations, technologists and process engineers, at Pearl GTL results become available to all relevant parties within the PI system as soon as they are authorised in SampleManager.
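The publish-on-authorise flow described here – downstream systems seeing a result the moment it is authorised, rather than waiting for a manual hand-off – can be illustrated with a toy observer pattern. All names below are hypothetical; this is not the SampleManager or PI API:

```python
class ResultBus:
    """Toy publish-on-authorise flow: authorising a result pushes it
    straight to every registered downstream consumer (stand-ins here
    for a process historian or operations system)."""

    def __init__(self):
        self._subscribers = []

    def subscribe(self, callback):
        """Register a downstream system's callback."""
        self._subscribers.append(callback)

    def authorise(self, result):
        """Mark a result authorised and notify all subscribers at once."""
        result = {**result, "status": "authorised"}
        for notify in self._subscribers:
            notify(result)
        return result
```

The design choice is that authorisation, not laboratory completion, is the trigger: nothing leaves the LIMS until a qualified person has signed it off, but after that there is no further delay.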

Colin Thurston, director of product strategy, process industries, at Thermo Fisher Scientific, cites an example of the benefits of this integration: when panel operators need to move oil to new tanks in preparation for shipping, they do not have to wait to be notified of test results, minimising demurrage charges for loading delays that can cost as much as $35,000 per day. ‘Since Pearl GTL opened, the facility has incurred no demurrage charges, an outstanding feat for an operation so large,’ he said.

But there are benefits from integration in the other direction – with the instrumentation, Thurston continued. Sample points in the field are marked with radio frequency identification tags so that when field staff perform sample rounds: ‘A handheld computer guides them to each sample point and then automatically records the required information. The data are then instantly transferred to SampleManager from the field, saving Pearl GTL an estimated 2,400 man-hours a year.’ Mansoor Al-Shamri, laboratory manager for Qatar Shell GTL, stressed the benefits: ‘Field operators can do their jobs faster and also more accurately, since they’re not recording readings by hand.’
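The guided sample round can be sketched as a lookup from scanned tag to sample point, with the handheld refusing to log a visit until the required readings are captured. A toy illustration, with invented tags and tests:

```python
from datetime import datetime, timezone

# Hypothetical route definition: RFID tag id -> sample point metadata.
ROUTE = {
    "TAG-001": {"point": "Crude feed line", "tests": ["water", "density"]},
    "TAG-002": {"point": "Distillate outlet", "tests": ["flash_point"]},
}

def record_scan(tag_id, readings, log):
    """Validate a scan against the route and log it with a timestamp.

    Raises KeyError for an unknown tag and ValueError if any required
    reading for that sample point is missing.
    """
    point = ROUTE.get(tag_id)
    if point is None:
        raise KeyError(f"Unknown sample point tag: {tag_id}")
    missing = [t for t in point["tests"] if t not in readings]
    if missing:
        raise ValueError(f"Missing required readings: {missing}")
    log.append({
        "tag": tag_id,
        "point": point["point"],
        "readings": readings,
        "recorded_at": datetime.now(timezone.utc).isoformat(),
    })
```

The accuracy gain Al-Shamri describes comes from exactly this kind of validation at the point of capture: a mistyped or skipped reading is caught in the field, not discovered later in the lab.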

Lessons from life sciences?

It is not only in process quality control but also in research and technology development that petrochemical companies need informatics software, both to manage the huge amounts of data they generate and to produce knowledge that is useful for scientific and business decisions. These other applications face similar problems: accommodating data generated by legacy systems, crossing geographical and linguistic boundaries, and integrating with existing software.

For Shikha O’Brien, VP business development USA for Dotmatics, the aim is a fully integrated system that can be accessed piecemeal in order to enable scientists to collaborate. ‘Irrespective of how users have captured their data, scientists should have access to it in a format that makes sense to them,’ she said.

Both she and Glyn Williams, VP of product delivery at IDBS, see parallels with the pharmaceutical and life sciences industries, more so perhaps than is evident in the quality-control area. ‘Today, collaboration is a necessity, not a luxury in the life sciences,’ O’Brien said. ‘People access and make decisions on data across multiple departments, and we see the same model now being applied to petrochemicals.’ Williams added: ‘We do see a lot of commonality. There may be different emphases, but research is research.’

Users do not need a complicated system, O’Brien continued. Dotmatics offers an off-the-shelf, web-based ‘dashboard’ that is data-agnostic: it connects with disparate data sources, then retrieves and presents the information in a format that the end-user wants. The petrochemicals industry is a challenge, she went on, because it has a lot of data, often held on legacy or in-house systems dating from as far back as the 1980s, and it is only recently, in her view, that the industry has started looking at the life sciences model to see how it can bring in a proper informatics solution to capture and retrieve data.
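A data-agnostic dashboard of the kind O’Brien describes amounts to putting a common query interface in front of each disparate source, so the presentation layer never needs to know where the data lives. A minimal sketch, with hypothetical source classes standing in for legacy and in-house systems:

```python
class ListSource:
    """Stand-in for one back-end source (a legacy database, a file
    store, an in-house system); each source exposes the same fetch()."""

    def __init__(self, rows):
        self.rows = rows

    def fetch(self, sample_id):
        return [r for r in self.rows if r["sample_id"] == sample_id]

class Dashboard:
    """Aggregates any number of sources behind one query interface."""

    def __init__(self, sources):
        self.sources = sources

    def view(self, sample_id):
        results = []
        for source in self.sources:
            results.extend(source.fetch(sample_id))
        return results
```

Adding a new legacy system then means writing one adapter with a `fetch()` method, not reworking the dashboard itself.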

For Glyn Williams, one very big growth area, where he sees parallels with the life sciences, is in biofuels. There is a similarity to early-stage pharma, he said, where the companies need to protect their intellectual property (IP). In the case of biofuels, the IP may be the process rather than the product itself but the challenge to the informatics system is the same – ensuring that everything is documented and recorded in a way that will stand up in patent litigation, if need be. IP protection is one of the major things that electronic lab notebooks do, he said, which is why the biofuels companies are interested in such solutions.
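The record-keeping requirement Williams describes – entries that will stand up in patent litigation – is usually met with timestamped, tamper-evident audit trails. The toy hash chain below only illustrates the idea of tamper evidence; real electronic lab notebooks use signed audit trails, witnessing, and certified timestamps:

```python
import hashlib
import json
from datetime import datetime, timezone

GENESIS = "0" * 64  # placeholder hash before the first entry

def add_entry(chain, author, text):
    """Append a notebook entry whose hash covers its content plus the
    previous entry's hash, so later alteration breaks the chain."""
    entry = {
        "author": author,
        "text": text,
        "recorded_at": datetime.now(timezone.utc).isoformat(),
        "prev_hash": chain[-1]["hash"] if chain else GENESIS,
    }
    entry["hash"] = hashlib.sha256(
        json.dumps(entry, sort_keys=True).encode()
    ).hexdigest()
    chain.append(entry)
    return entry

def verify(chain):
    """Recompute every hash; any edited or reordered entry is detected."""
    prev = GENESIS
    for entry in chain:
        body = {k: v for k, v in entry.items() if k != "hash"}
        expected = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()
        ).hexdigest()
        if entry["prev_hash"] != prev or entry["hash"] != expected:
            return False
        prev = entry["hash"]
    return True
```

The point for a small biofuels company is that the record of what was done, and when, can be shown not to have been rewritten after the fact.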

The structure of the biofuels industry mirrors that of biotechnology, with lots of small companies carrying out early-stage work and hoping to sell on to the big oil majors (or, indeed, to sell the entire company and its intellectual property portfolio). Their need is to have ‘systems that are flexible and track the decision-making processes, what they have done and the results, to see if they are successful, and then move on to the stage of “can you scale it up?”.’

Although the emphasis is often on novel procedures, he sees the whole enterprise as very similar to pharmaceuticals, where once a compound is developed the challenge is to scale that up efficiently and cost effectively.