Up with quality and out with paper

Picture for a moment, if you will, a modern oil and gas plant in India – a shining example of the rapid economic development that characterises the four BRIC economies (Brazil, Russia, India, and China). Selling its products on the international market, the plant has to test each batch before export to certify that it conforms to the specifications and standards that the customer expects.

So how did the analytical laboratory – where this crucial testing and certification is carried out – communicate the certification to the dockside before shipping? Until just a few years ago, the data transmission technology was a man on a bicycle.

Picture it, if you will – but not during the monsoon season.

It is, admits Kim Shah, director of marketing and new business development for the informatics division of Thermo Fisher, a particularly egregious example. But the failure of industrial plants to integrate their laboratory operations with the rest of the company’s processes is commonplace around the world – in developed economies as much as in BRIC economies or in developing nations.

Thermo decided to introduce its ‘Connects’ programme some four years ago specifically to address the issue.

Connecting up all the elements to make a single, integrated process is critical, too, in the view of Paul Denny-Gouldson, vice president of strategic solutions at IDBS. ‘We’re seeing a massive applicability in the oil and gas industry, where some of their processes are continuous,’ he said. Interestingly, this application to oil and gas has developed out of informatics’ role in a very different area: implementing ‘quality by design’ (QbD), especially in biological processes. Biologics are growing in importance for the pharmaceutical industry, but the production of large-molecule biologics is usually a continuous process rather than a batch one, and so IDBS developed its Bioprocess Execution System (BPES) with this in mind. Now it is seeing that the experience gained there can be applied to other continuous processes, as the example of oil and gas demonstrates.

Connecting the island

According to Shah: ‘Our sense was that, in many companies, the laboratory was treated as an island by itself where testing was done; at some velocity, information would go out from the lab to other parts of the organisation; and then some decisions would be made as to what to do with the material.’ So the company set out to convince its customers that connecting all this information in an integrated fashion would reduce errors due to manual transcription and dramatically speed up the flow of that information.

He explained: ‘We launched Integration Manager to provide a hub, or a bus, in the middle that could connect anything to anything: all kinds of instruments at one end – high-end intelligent instruments like mass spec or chromatographs, but even pH meters and balances – and at the other side it provides connectivity to an SAP or manufacturing system.’
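Integration Manager itself is proprietary, but the ‘hub, or a bus’ pattern Shah describes can be sketched in a few lines of Python. Everything below – the class names, the Reading record, the print statements standing in for a LIMS and an ERP – is invented for illustration and is not the product’s API.

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Callable, List

@dataclass
class Reading:
    """One normalised result: whatever the source (pH meter, balance,
    chromatograph), it is reduced to a common record before routing."""
    instrument: str
    analyte: str
    value: float
    unit: str
    timestamp: datetime

class IntegrationHub:
    """Minimal 'anything to anything' bus: instruments publish,
    downstream systems (a LIMS, an ERP) subscribe."""
    def __init__(self) -> None:
        self._subscribers: List[Callable[[Reading], None]] = []

    def subscribe(self, handler: Callable[[Reading], None]) -> None:
        self._subscribers.append(handler)

    def publish(self, reading: Reading) -> None:
        # Fan the reading out to every connected system.
        for handler in self._subscribers:
            handler(reading)

hub = IntegrationHub()
hub.subscribe(lambda r: print(f"LIMS <- {r.instrument}: {r.analyte} = {r.value} {r.unit}"))
hub.subscribe(lambda r: print(f"ERP  <- quality update: {r.analyte} = {r.value} {r.unit}"))

hub.publish(Reading("pH-meter-03", "pH", 6.9, "pH units", datetime.now(timezone.utc)))
```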

In a continuous rather than batch production process, especially in biologics production, the monitoring of the process parameters is absolutely critical, according to IDBS’ Paul Denny-Gouldson. ‘These need to be used pretty much in real-time to check the system is running as it should do. We have approached this as a holistic problem and have sought to join up what are, in many cases, distinct parts of the business – so you do not end up with this age-old problem of silo data and a lack of understanding of what’s gone before. This whole concept of connecting the downstream guys with the upstream guys is really, really important. What we’ve done is provide frameworks that support the data capture, the intellectual property capture, and the actual process development concept that goes all the way through, and connects all these departments up.’
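A minimal sketch of the kind of real-time check Denny-Gouldson is describing: each reading is compared against its specification window the moment it arrives, rather than after the run. The parameter names and limits are invented for the example.

```python
# Hypothetical specification limits for a continuous process;
# the parameter names and ranges are invented for illustration.
SPEC_LIMITS = {
    "temperature_C": (35.0, 38.0),
    "pH": (6.8, 7.2),
    "dissolved_O2_pct": (30.0, 60.0),
}

def within_spec(parameter: str, value: float) -> bool:
    """Return True if the reading falls inside its specification window."""
    low, high = SPEC_LIMITS[parameter]
    return low <= value <= high

# Each reading is checked as it arrives, rather than being
# transcribed to paper and reviewed after the run has finished.
for parameter, value in [("temperature_C", 38.6), ("pH", 7.0)]:
    if not within_spec(parameter, value):
        print(f"DEVIATION: {parameter} = {value}, spec {SPEC_LIMITS[parameter]}")
```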

The paperless lab and quality by design

Two concepts permeate this drive to integration: one is the paperless laboratory and the other is quality by design.

According to Shah, the concept of the paperless lab has been enthusiastically adopted by the pharmaceutical industry in Europe. But, he warned: ‘The reality is that paperless lab is not one thing or one solution; it is really a goal that the organisation has to try to achieve.’ In many ways, he continued, paper is just a stand-in for a manual process: ‘In other words, if paper is involved, then a human being is involved. Then the chances of errors creep in. Removing paper is really removing the human being from that process step, and improving the quality and velocity of your data.’

For Denny-Gouldson, attention to process is also critical: knowing what pieces of data have to be transferred is one element, but the second is understanding and re-engineering the process by which that data moves. ‘In so many cases, we have seen paper systems being used to drive these environments and there is no knowledge of what is going on. We have been able to map those processes and the data requirements into a holistic environment.’

It’s not just paper: ‘Excel is a data graveyard,’ he remarked. But the point of integrating the analytics and having real-time reporting of the analyses is that it allows the production plant to ‘build up a knowledge base: one of the questions in biologics is “do we understand our process to the point that, when we see a deviation – e.g. the temperature goes too high – do we know what that’s going to do downstream?” They can apply their process knowledge more effectively to get quality by design – you design in the quality because you know so much about the system you are running. Once in place, our system allows people to optimise their process much more easily than in a paper or Excel-based workflow.’
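The knowledge base he describes can be pictured as a queryable history of past deviations and their observed downstream effects. A sketch of that idea, with every record and name invented for illustration:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class DeviationRecord:
    parameter: str          # what drifted, e.g. "temperature_C"
    excursion: float        # how far it went from the target
    downstream_effect: str  # what it did to the product later

# A process knowledge base: past deviations and their outcomes.
knowledge_base = [
    DeviationRecord("temperature_C", +1.5, "purity dropped 2%"),
    DeviationRecord("temperature_C", +3.0, "batch failed viral-load spec"),
    DeviationRecord("pH", -0.3, "no measurable downstream effect"),
]

def what_happens(parameter: str, excursion: float) -> List[str]:
    """Answer 'we see a deviation - do we know what that's going to do
    downstream?' by returning the recorded effects of past deviations
    in this parameter that were no larger than the one observed."""
    return [rec.downstream_effect for rec in knowledge_base
            if rec.parameter == parameter and abs(rec.excursion) <= abs(excursion)]

print(what_happens("temperature_C", 2.0))
```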

But such a redesign does not mean that a regulated workflow must be performed in a different way and then revalidated. According to Trish Meek, director of product strategy for the informatics division of Thermo Fisher, all companies want to preserve, as much as possible, the investments they have made in their existing technologies. The points at which the different elements of the process are connected together will depend on the software systems already in place as well as on the process itself. ‘Our tools are flexible – people can tailor to make the appropriate connection points for their process. You’re not looking at a one-size-fits-all system,’ she said.

She stressed: ‘It is not realistic to say “you need to throw out this or that system”. You have to look at the tools you have already implemented and invested in, and how do you get the best out of them. What is the process today, and how can we optimise it given those systems and constraints?’ The business value lies in making those connections between systems. She echoed Denny-Gouldson’s point: if quality data is getting to the process management system in real time, ‘it can change the process conditions by which they are manufacturing the product’. If there is a deviation from the standard conditions: ‘They might want to change the temperature and pressure in the manufacturing process to address the quality data and get the process back under control – if you do not get the quality data in real-time, then you are getting that data too late to impact change in your business.’

She continued: ‘It’s an easy vision for upper management. If the quality data is connecting to my ERP then I know immediately if a product is out of specification and I can stop product going out the door or I can, hopefully, get my process under control in time, so that I do not even finish the process in that way but change the parameters and save the product.’
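The loop Meek describes – quality data arriving in real time and triggering a change to process conditions – is, at its simplest, a feedback controller. A toy sketch follows, with a simple proportional correction standing in for whatever control scheme a real plant would use; every name and number is invented.

```python
def corrective_action(measured: float, setpoint: float, gain: float = 0.5) -> float:
    """Proportional correction: adjust the manipulated variable in
    proportion to how far the measurement has drifted from target."""
    return gain * (setpoint - measured)

# A quality reading arrives in real time; a high temperature
# triggers an immediate adjustment rather than a post-hoc report.
setpoint_C = 36.5
measured_C = 38.2
adjustment = corrective_action(measured_C, setpoint_C)
print(f"Deviation {measured_C - setpoint_C:+.1f} C -> adjust heater input by {adjustment:+.1f} C")
```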

And, according to Shah, a seamless flow of data allows for real-time reporting, via a dashboard, to senior management in an organisation. ‘If you do all the other things, one of the benefits you get is the ability to get flexible reports in real time of what data is important to you – at your fingertips.’ The ability to present information on a dashboard in this way is a feature of Integration Manager.

There is one further driver towards the paperless laboratory, Shah pointed out, and it is perhaps an unexpected one: ‘The Cloud is now becoming a big factor in people’s thinking and planning. It’s obvious that you are not going to be able to take things to the cloud on pieces of paper. That is going to make organisations keep all their data electronic and connected.’

Changing the laboratory

One aspect of integration that is sometimes overlooked amid the discussion of getting the process side of operations connected up, Meek pointed out, is integration within the laboratory itself: ‘One of the things people have been surprised by is in-laboratory integration – driving for real laboratory process integration.’ She spoke of how she had visited laboratories where something as simple as the balance was not integrated, but this proved to be ‘a huge bottleneck in the process and was really slowing down the upload of that data into the LIMS, and thereby that data going into the process and being actionable within the organisation.’

In the pharmaceutical industry, quality by design has been encouraged by the US Food and Drug Administration but, she said: ‘Those kinds of principles already existed in the other industries that we deal with. In-line and laboratory data goes hand in hand to the understanding of your final product and to making sure everything is running smoothly in the process.’

Nonetheless, according to Denny-Gouldson, the roles of the analytical laboratory and of the analyst are changing. Rather than being a service where samples are created, sent off, and then checked into the laboratory, nowadays: ‘They are intrinsically integrated into the process. In real time, as much as possible.’ And in an echo of Meek’s point about improving processes within the laboratory, he said that IDBS had been witnessing the increasing use of robotics and automation in its customers’ laboratories. For Denny-Gouldson, too, the aim is, as much as possible, to work with the systems that customers already have in place. ‘We don’t say to people get rid of everything and start new. With BPES, we are putting a layer over the top of those current investments and infrastructures like LIMS and the analytical data management infrastructure. The bit we are adding is the data management around the process. We are changing the way they can access that information and the way in which they collect that and optimise their process.’

The IT is the easy part, he continued; change is where it gets difficult. ‘What we provide is that foundation layer that allows organisations to develop in steps. You cannot do this as a big bang; it has to be iterative so the organisation can manage the change effectively.’

Flexibility

One outcome of integration is greater flexibility and less rigid standard operating procedures, he said. Managers have more latitude to react to deviations from the specified process conditions. Quality by design is the goal of all these organisations, whether big pharma companies themselves or contract manufacturers.

Because the biologics environment is so much more complicated than small-molecule manufacturing, according to Denny-Gouldson the BPES allows managers to use a risk-based approach: ‘It collects much more data and thus allows them to go back and do that sort of analysis where they can ask: “Do we know what happens and can we let the system flex?” rather than saying “No, we’ve got to throw it all away and start again”.’

IDBS has found that the BPES, developed largely in response to the pharmaceutical industry’s move to biologics rather than small-molecule drugs, and to its interest in quality by design, is also directly applicable in one significant area of the petrochemicals industry: biofuels. Biofuels, of course, use the same sorts of biological systems – fermenters and the like – to produce bioethanol, so with an understanding of biologics already gained, IDBS can apply its technology very quickly.

But because so many of the processes in the wider oil and gas industry are continuous, according to Denny-Gouldson the applicability of the principles developed with BPES ranges far beyond biofuels: ‘We’re now working with a number of process chemical, and oil and gas, customers and providing a very similar approach – where you have to understand your process; you have to understand your data; and how you get that data.

‘It’s not about running the plant and getting analytical data – you’ve got to optimise that process. They have really latched on to that concept with us.’

It is the underlying approach that is applicable, in other words. And the effect can be massive.

Given the massive volumes that flow through oil and gas plants, Denny-Gouldson remarked, even a one per cent improvement resulting from optimising the process ‘can translate into hundreds of millions of dollars over a year’. For illustration: at a plant whose annual throughput is worth $20 billion, a one per cent gain amounts to $200 million.



Biologics – which include large molecules and antibody-drug conjugates – are developed in a similar way to small molecules, but the production mechanism relies on living organisms. ‘You have to express the protein using a biological system,’ IDBS’ Denny-Gouldson said.

A decade ago, only five per cent of medicines were biologics, but predictions now are that up to 80 per cent of the world’s drugs will one day be biologics of some sort.

There are four stages in developing the production process and they all need to be optimised together. First, molecular and cell biology have to be deployed by the cell-line development group to optimise the cell line and the production of the material using that biological system. ‘There is a lot of structured data that needs to be captured,’ Denny-Gouldson remarked – a need not only to track what has been put into that biological system but also how that is to be scaled up.

‘The next group says “I need to run that on a very large scale”. A lot of things that the cell biology guys do is optimise the construct to work at different scales.’

The next stage is to optimise the scale up to 1,000-litre or 10,000-litre fermenters. To do this effectively requires real-time data capture from the fermenter.

‘Not just one or two variables,’ Denny-Gouldson continued. ‘They’re capturing hundreds of different measurements in very small time spaces. Maybe every second, they take a reading from 300 or 400 different probes and they are measuring the pH of the media, the adhesion of the cells, the temperature, the nutrients and micronutrients in the media, and looking at how the production and the purification of the product is affected by all these different parameters.’
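At those data rates, the capture layer is essentially a high-frequency time-series writer: one sweep across every probe, once a second. A minimal sketch, with three invented probes standing in for the 300 or 400 real channels he describes:

```python
import time
from collections import defaultdict
from typing import DefaultDict, List, Tuple

# Three invented probes standing in for the 300-400 channels
# described above (pH, temperature, nutrients, and so on).
PROBES = {"pH": 7.0, "temperature_C": 36.8, "glucose_g_per_L": 4.2}

history: DefaultDict[str, List[Tuple[float, float]]] = defaultdict(list)

def sample_once(now: float) -> None:
    """One sampling sweep: read every probe and append a
    timestamped value to that probe's time series."""
    for probe, value in PROBES.items():
        history[probe].append((now, value))

# One sweep per second, as in the example above.
for _ in range(3):
    sample_once(time.time())
    time.sleep(1.0)

print({probe: len(series) for probe, series in history.items()})
```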

Because biologics production is a continuous process rather than being done in batches, ‘The monitoring of these things is absolutely critical and the understanding of what changes affect your downstream elements, such as purity and viral loads, is very important.’

The next step, he explained, ‘is when you pull in your purification guys. Howsoever the material is produced, I need to be able to take that and purify it until it is safe to use.

‘You have to measure all sorts of elements, and the fact that it is continuous production makes that complicated. The final part, that runs across all of this, is analytical sciences – the ability to test samples and get results and understand the system from an analytical point of view.

‘These need to be developed and used pretty much in real time, to check the system is running as it should do.’
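Taken together, the four stages form a pipeline in which each group’s output feeds the next, with analytical sciences observing every hand-off. A structural sketch follows; the function bodies are placeholders, and only the shape comes from the description above.

```python
# The four stages described above, modelled as a chain of steps.
# Each function body is a stand-in; only the structure - each stage
# consuming the previous stage's output, with analytics observing
# every hand-off - is taken from the description above.

def cell_line_development(target_protein: str) -> dict:
    return {"construct": f"cell line expressing {target_protein}"}

def scale_up(construct: dict) -> dict:
    return {**construct, "fermenter_L": 10_000}

def purification(bulk: dict) -> dict:
    return {**bulk, "purity_pct": 99.5}

def analytical_qc(stage_name: str, material: dict) -> None:
    # Analytical sciences runs across all of the stages.
    print(f"QC after {stage_name}: {material}")

material = cell_line_development("monoclonal antibody X")
analytical_qc("cell-line development", material)
material = scale_up(material)
analytical_qc("scale-up", material)
material = purification(material)
analytical_qc("purification", material)
```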
