
Out with the old and out with the internal


Outsourcing, end-of-life LIMS, and disruptive technologies such as the cloud, are driving changes to the laboratory informatics landscape, as Tom Wilkie discovered at the Paperless Lab Academy

The laboratory informatics industry is facing unprecedented change, both because the pharmaceutical industry is outsourcing more and more of its activities, and also because many large deployments of laboratory information management systems (LIMS) are nearing the end of their lives but will not be replaced like-for-like.

According to Patrick Pijanowski, pharma and life-sciences partner at US-based consultancy LabAnswer, pharmaceutical companies and other informatics users are not willing to invest in a new LIMS simply to get an incremental upgrade. What they want is ‘transformational change’, he told the Paperless Lab Academy (PLA) in Barcelona, Spain, on 14 April. What is happening is ‘different from what we have seen before. It’s different from what I’ve seen in 20 years in the business,’ he continued.

Not quite so fully articulated during the sessions at the PLA, but forming a pervasive undercurrent nonetheless, was the issue of disruptive technologies – specifically the cloud and, to a lesser extent, big data. The two tended to be bracketed together as if the cloud was seen, albeit implicitly, as the only technology capable of dealing with the challenges presented by big data.

Concerns over data security and the protection of intellectual property and personal information continued to make some delegates uneasy about jumping immediately into the cloud. But, as the PLA progressed, it became clear that, in a world of externalised pharma companies, current technologies and processes are not performing well either.

In contrast, although mobile devices such as tablets and PDAs featured frequently, these appeared to be regarded not as disruptive but rather simply as continuing business more or less as usual.

Pharma outsourcing will force standards on informatics

The implications for informatics of the ‘externalisation’ of the pharmaceutical industry dominated the event. In his keynote address, Pijanowski told the meeting that ‘today, the pharmaceutical industry is outsourcing virtually every function,’ with the result that data integrity and system security were major concerns.

According to Rachel Uphill, pharmaceutical companies are outsourcing not just to contract research organisations (CROs) and contract manufacturing organisations but also to academics, making the need for common data formats and standards all the greater. Uphill is business consultant, early IP and data strategy, to the pharmaceutical giant GSK and is also an advisor to the Allotrope Foundation, which was set up by the pharmaceutical industry some three years ago to build a framework for open data standards.

In the absence of standards the situation today, she said, was one of incomplete, incompatible software; no standard data formats; and inconsistent metadata. It is hard for researchers to mine data for useful information, she pointed out, because the metadata is stored elsewhere and is often captured incompletely, inaccurately, or sometimes not at all due to free-text manual entry.

Allotrope’s goals, she said, were to create re-useable software components; an open document standard; and an open metadata repository. The outcome will be a software toolkit to allow developers to embed a set of standards for analytical data in software utilised throughout the whole of analytical chemistry on different instruments from different vendors.

The project has ambitious deadlines, with the first public release expected in 2016, only three years since software development started and only four years since the foundation was set up. The hope is that eventually the ADF – the Allotrope Data Format – will be embedded in instrument and informatics vendors’ software. But as a temporary expedient, to promote the usefulness of ADF before 2016, converters are being developed to transform data to ADF as a separate step.
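The converter idea can be pictured with a minimal sketch: vendor output arrives in an instrument-specific layout and, as a separate step, is wrapped into a standardised container that keeps data and metadata together. The CSV layout, field names, and container schema below are invented for illustration; none of them come from the real ADF specification.

```python
import csv
import io

def convert_to_standard_format(vendor_csv, instrument_id, operator):
    """Wrap instrument-specific CSV output in a standardised container.

    The container keeps the measurements and their metadata in one
    record, so the metadata cannot be lost in transfer. All field
    names here are hypothetical, not the actual ADF schema.
    """
    reader = csv.DictReader(io.StringIO(vendor_csv))
    measurements = [
        {"wavelength_nm": float(row["wavelength"]),
         "absorbance": float(row["absorbance"])}
        for row in reader
    ]
    return {
        "metadata": {
            "instrument": instrument_id,
            "operator": operator,
            "schema": "example-standard-v0",  # invented schema label
        },
        "measurements": measurements,
    }

# Example: output from a hypothetical UV-Vis instrument.
raw = "wavelength,absorbance\n400,0.12\n500,0.34\n"
record = convert_to_standard_format(raw, "UV-01", "j.doe")
```

The point of running conversion as a separate step, rather than waiting for vendors to emit the standard natively, is that existing instruments can feed standardised data immediately.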

The progress is all the more remarkable since the Allotrope Foundation has to move carefully so as to conform at all times with US anti-trust legislation and avoid any suspicion that the pharmaceutical companies might be coming together in a cartel. Consequently, the secretariat for this highly technical exercise is provided by a team of Washington lawyers, while the technical work is being carried out by the German software company Osthus.

Replacing old LIMS – transformational not incremental change

The pharmaceutical industry’s growing interest in data analytics discussed by Rachel Uphill had also been highlighted by Patrick Pijanowski in his keynote address as one of the reasons that change is happening in the informatics industry now. But a second driver for change, he continued, is that existing mature laboratory informatics systems are being replaced because they have come to the end of their lives or are no longer supported by their vendors.

‘LIMS is a ubiquitous common-core data platform and represents a huge investment when it has to be replaced,’ he said. Companies will be unwilling to spend the money just to get an incremental upgrade, he believes; instead they will go for transformational, holistic change.

Pijanowski’s thesis was exemplified in the talk that immediately followed his presentation, when Christian Wolf described the FELD project (Future Environment Laboratory Domain) within Bayer Pharma. Wolf, who is head of IT Systems at Bayer’s Global Chemical and Pharmaceutical Development, explained that three of the major laboratory systems that the company had been using were no longer supported by the vendors. Indeed, one vendor was no longer in business.

Faced with this situation, ‘we did not want to replace but to have more time for science and spend less time on documentation,’ he said. They also wanted to use common IT platforms across three departments, which were geographically separate from each other.

But in another theme common to many presentations to the PLA, Wolf said that they wanted to harmonise laboratory processes as well as the informatics systems. Other objectives were to eliminate data redundancies and fragmentation, to have self-documenting processes, to connect up all the instruments in the laboratories, and to optimise data reporting.

The new system is expected to go live towards the end of this year, with roll-out to the analytical laboratory early in 2016. The optimistic scenario was, he said, that the resulting efficiencies would recoup the cost of the installation within two years. When pressed, he conceded that ‘there are no official pessimistic scenarios’.

Dead data and lost knowledge

Although the Bayer Pharma project was intended to harmonise working practices across three disparate laboratories, it was not an outsourcing project – all three sites were part of Bayer. The risks of outsourcing were highlighted by Ryan Sasaki of ACD/Labs, whose talk to the Paperless Lab Academy picked up and further developed Pijanowski’s points about data analytics and data integrity in an externalised world. ‘Externalisation leads to lost knowledge,’ he warned. The CROs to which the pharma companies outsource operations build up their own expertise and understanding of laboratory processes, and thus create knowledge that is not captured by the pharma company itself.

Sasaki pointed out that there is a difference between small molecule and biologics in terms of where the intellectual property (IP) resides. It is usually the case that the IP for small molecules lies in the chemical structure – which the pharma company can capture for itself. However, often the IP for biologics lies in the process, not the product. Investing in laboratory informatics software will therefore pay off relatively early in the case of small molecules – in the discovery or development lab – but it will be much later in the case of biologics.

The difficulty in capturing such knowledge is all the greater because currently around 60 per cent of the data exchange between a CRO and the sponsoring company is being done by email and PDFs. ACD/Labs regards this as ‘dead data’, he told the meeting: ‘If the only way you’re receiving data from your CRO is via PDF, then you’re losing a lot of knowledge.’
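The ‘dead data’ problem can be made concrete with a small sketch. A result embedded in a PDF or email body arrives as free text that a human can read but software cannot reliably query, whereas a structured record keeps every value machine-readable. The record layout below is invented purely for illustration:

```python
import json

# A result as it might arrive in a PDF or email body: readable by a
# human, but software must re-parse (or a scientist must re-type) it
# before it can be loaded into a LIMS or mined for information.
pdf_style = "Sample B-0042: purity 99.2% by HPLC, analyst J. Doe"

# The same result as a structured record: each field is directly
# addressable, so data and metadata survive the transfer together.
structured = json.dumps({
    "sample_id": "B-0042",
    "assay": "HPLC",
    "purity_pct": 99.2,
    "analyst": "J. Doe",
})

record = json.loads(structured)
purity = record["purity_pct"]  # no parsing of free text required
```

In the free-text form, a change as small as rewording the sentence breaks any script that tries to extract the purity value; in the structured form, the value is simply a field.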

The inadequacies of such a method for transferring scientific information also increase the risk that the ‘proof of identity’ of a sample might be lost in the transfer of materials between the contractor and its client – another way in which knowledge can escape the pharma company itself. As an example of just how serious the consequences can be, Sasaki cited Bosutinib, a tyrosine kinase inhibitor that has received approval for the treatment of adult patients with some forms of chronic myelogenous leukemia (CML). As a selective kinase inhibitor, the compound is also the subject of much wider medical and basic research. Some three years ago, much of that research (including findings published in the scientific literature) was invalidated and had to be redone because researchers had unwittingly been working with an isomer of Bosutinib rather than the genuine compound. The material had been synthesised by contractors, who had inadvertently produced a biologically inactive isomer, and the error had gone undetected by many of their customers.

To combat some of these issues, he offered the ACD/Spectrus platform as one route to a ‘universal data language’ that could avoid the need to deal with ‘dead data’ in an externalised world and instead move to active data and knowledge generation. ACD/Labs was, he said later, a partner in the Allotrope initiative and saw the work of Allotrope not only as complementary to ACD/Labs’ own line but also, by helping to promote the concept of more usable data and metadata, as bringing about a wider realisation of the advantages of moving away from the dead-data syndrome.

Disruptive technologies

The disparate threads of outsourcing, maintaining in-house knowledge and quality control, and disruptive technologies – both the cloud and big data – were brought together neatly in the presentation from Nicolas Goffard. He is bioinformatics platform manager for the French start-up biotechnology company Enterome Bioscience, which was spun out of Inria, the French public research agency dedicated to computational sciences, in 2012.

Enterome Bioscience is looking for bacterial genes specific to disease and is sorting through human faeces to find them. The aim is to improve treatment of chronic metabolic, gastrointestinal, and autoimmune diseases. Although patients may have a genetic susceptibility, the conditions are also triggered by environmental factors including an imbalance in the intestinal bacterial ecosystem.

A start-up, Enterome has to outsource its sample collection and analysis, and has to send data out for biostatistics and bioinformatics analysis. According to Goffard, however, it is still vital, under whatever ‘virtualisation’ model might be adopted, to keep quality standards high even despite the pressures of trying to achieve results rapidly. Enterome reached for a cloud solution in the form of Core Informatics’ LIMS to manage the data collection, the data generation, and the results of the data analysis. Enterome outsources the process of collecting the stool samples and then processing them to extract material suitable for analysis; the DNA extraction and sequencing is the next step; and the final, added-value step is the application of bioinformatics and biostatistics methods to analyse the data. It opted for the Core LIMS web-based product, together with Pipeline Pilot, he said, because the system had to be highly configurable to support new workflows and new partners. ‘We didn’t have any trouble using the cloud’, Goffard said. ‘It’s secure.’

The Paperless Lab Academy continues to be possibly the most interesting and thought-provoking event dealing with laboratory informatics. By providing a synoptic overview of the informatics landscape and identifying trends and developments for the future, this year’s meeting of the PLA if anything surpassed the event in Amsterdam last year.

About the author

Dr Tom Wilkie is the editor for Scientific Computing World. 

You can contact him at tom.wilkie@europascience.com
