
Creating a 'right first time' environment

Steve Bolton and Robert Packer comment on why laboratory IT systems should not exist as an island within food and drink organisations

Despite the global economic downturn, the food and drink industry has continued to grow in recent years, fuelled in part by continued urbanisation in India and China. The increase in competition across the globe, combined with economic pressures and the need for operational efficiency, can act as both motivation and opportunity for fraud and misrepresentation at every step in the supply chain. In the US alone, cases of food fraud and adulteration were up 60 per cent last year, according to the US Pharmacopeial Convention. The short-cuts range from adulterating products with cheap but dangerous compounds, as we saw in the 2008 melamine crisis in China, to more subtle efforts like false labelling of organic products or country of origin.

The result is an expanding role for data collection, analysis and management, driven from two sides. Food processors are working hard to protect a global business worth some $20 trillion a year by increasing the amount of testing and screening of incoming ingredients. And governments have reacted in some of the world’s largest markets, notably in the United States, where the Food Safety Modernization Act (FSMA) will require companies, among other steps, to keep electronic records of the testing history of imported ingredients. That effectively extends the reach of FSMA rules on data acquisition and archiving beyond US borders to any producers or suppliers who deliver into the US market.

The focal point of this critical and growing enterprise activity is the QA/QC laboratory. This is where informatics systems are increasingly being integrated with instruments and processes, enabling suppliers and manufacturers to maximise operational excellence and efficiency while adapting to the rigours of an increased testing regimen.

In an industry such as food and drink, where profit margins are tight and competition is fierce, a ‘right first time’ environment can deliver significant cost reductions and increase sample throughput without affecting the quality of data. To create that environment, bench-level activities must be controlled and automated to eliminate guesswork and prevent errors. Standard operating procedures (SOPs) must be followed for every method, every instrument and every lab supply, and routine tasks must be automated to reduce the analysts’ workload and the opportunities for error. With those conditions in place, rework is reduced, leading to an increase in sample throughput. Automated processes that are integrated into a laboratory information management system (LIMS) can reduce errors, speed the review of data, and provide a simple framework for verification and review.
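To make that idea concrete, the snippet below is a minimal, illustrative sketch of the kind of automated bench-level check a LIMS-integrated workflow might apply before a result is committed. The specification limits, field names and pass/fail/review logic are assumptions made for illustration only, not a description of any particular LIMS.

```python
# Illustrative sketch: automated range check of a bench result before it is
# committed to a LIMS. Limits, field names and the record layout are
# hypothetical; a real LIMS would supply these from its own method/SOP data.

from dataclasses import dataclass

@dataclass
class SpecLimit:
    analyte: str
    low: float       # lower acceptance limit for the method
    high: float      # upper acceptance limit for the method
    units: str

def check_result(analyte: str, value: float, limits: dict[str, SpecLimit]) -> str:
    """Return 'pass', 'fail' or 'review' so nothing reaches the LIMS unchecked."""
    spec = limits.get(analyte)
    if spec is None:
        return "review"          # no SOP limit on file: route to an analyst
    if spec.low <= value <= spec.high:
        return "pass"
    return "fail"

limits = {"melamine": SpecLimit("melamine", 0.0, 2.5, "mg/kg")}  # assumed limit
print(check_result("melamine", 1.8, limits))   # pass
print(check_result("melamine", 4.2, limits))   # fail -> blocked, flagged for review
```

In practice the limits and the routing rules would be drawn from the SOP and method definitions held in the LIMS itself rather than hard-coded as here.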

Increased quantities of data mean that easy transfer, storage and analysis of data from various sources are essential. When this is implemented correctly, the analysis of multiple data sets can yield more information than any one set alone, leading to increased protection against food fraud without sacrificing productivity in the lab. Given the sheer amount of testing and test data being generated, manual, paper-based processes must be replaced in order to automate and control data capture and test execution at the bench level. Data storage records must be centralised and controlled, while still allowing immediate access to the right analyses for validation of results.

Efficient integration of QA/QC testing with manufacturing and product release processes is essential. Laboratory IT systems should not exist as an island within the organisation, but should instead ensure rapid, seamless transfer of data and information to wider enterprise systems, such as SAP. Informatics integration across the lab is needed for a variety of reasons. For one, the data generated by an instrument is often not the final result: a wide variety of manipulations may need to be applied to the instrument data before it is ready for the lab’s information management system. For example, an instrument result may need to be corrected for sample weight before being reported. The integration also needs to be flexible, because laboratories use their instruments in many different ways, and different configurations of the same instrument can generate data in different formats. The good news is that effective integration does not require a wholesale replacement of existing IT systems and instruments.
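As a hedged illustration of the sample-weight correction mentioned above, the sketch below shows one plausible form such a transformation might take between instrument and LIMS. The function name, units and example figures are assumptions for illustration, not a vendor API or a prescribed method.

```python
# Minimal sketch of the kind of transformation mentioned above: correcting a raw
# instrument reading for sample weight and dilution before it is passed to the
# LIMS. Names, units and values are illustrative only.

def weight_corrected_result(raw_conc_mg_per_l: float,
                            dilution_volume_ml: float,
                            sample_weight_g: float,
                            dilution_factor: float = 1.0) -> float:
    """Convert an instrument concentration (mg/L in the prepared solution)
    to a result per unit of original sample (mg/kg)."""
    mg_in_solution = raw_conc_mg_per_l * (dilution_volume_ml / 1000.0) * dilution_factor
    return mg_in_solution / (sample_weight_g / 1000.0)   # mg per kg of sample

# Example: 0.52 mg/L measured in 50 mL of digest prepared from 2.0 g of sample
print(round(weight_corrected_result(0.52, 50.0, 2.0), 2))   # 13.0 mg/kg
```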

Instrument interfacing is not free, but that does not mean it is prohibitively costly. Studies have shown that for instruments which are used routinely, or for instruments such as ICP, AA and chromatography systems that produce large amounts of data per sample, the return on investment (ROI) can be realised in less than a year. The cost of professionally installed interfacing will typically amount to less than 10 per cent of the total LIMS cost – a small price to pay to protect the integrity of the data.
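To show the arithmetic behind a payback claim of this kind, here is a small worked example. Every figure in it is an assumption chosen purely to illustrate the calculation; none is taken from the article or from any study.

```python
# Hedged illustration of payback arithmetic for instrument interfacing.
# All figures below are assumptions for illustration only.

samples_per_day = 200          # assumed routine workload on one instrument
minutes_saved_per_sample = 2   # assumed manual transcription + review time avoided
working_days_per_year = 250
analyst_cost_per_hour = 40.0   # assumed fully loaded labour cost, in dollars

annual_saving = (samples_per_day * minutes_saved_per_sample / 60.0
                 * working_days_per_year * analyst_cost_per_hour)

interfacing_cost = 25_000.0    # assumed cost of a professionally installed interface

payback_years = interfacing_cost / annual_saving
print(f"Annual saving: ${annual_saving:,.0f}, payback: {payback_years:.2f} years")
# With these assumptions: roughly $66,667 per year, payback in under five months.
```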

The cost of using incorrect data is rarely considered when looking at the ROI of interfacing. Without a proper interface, all of the data entered into the laboratory information management system is suspect, because manual transcription introduces typographical errors. Time must be spent reviewing that data, and even with the best review processes in place, incorrect values can go undetected, compromising the quality of the final product and placing the brand at risk. The real benefits of integrating IT systems and instruments are improvements in operational efficiency and the generation of more reliable data. And reliable data is increasingly valuable in a world of mounting safety requirements surrounding food QA/QC.

Steve Bolton is marketing program manager, Informatics, at PerkinElmer. Robert Packer is food solutions development leader at PerkinElmer.
