
Food safety first

Whenever my fiancé and I cook for our friends, we make sure that the vast majority of dishes don’t include any beef products. The reason is that one of our friends, Michelle, doesn’t eat beef – and hasn’t done so since the 1990s, when the UK was hit by the BSE crisis. In 2008, as a precautionary measure following news that a California-based abattoir had failed to prevent sick animals from entering the food chain, the US Department of Agriculture (USDA) implemented the largest meat recall in US history – 143 million pounds of beef. That same year, Chinese authorities were faced with the discovery that milk and infant formula had been contaminated with melamine. As in the case of my friend, incidents like these can have a profound effect on consumer confidence.

Media attention does little to quell fears, and the food and drink industry is coming under increasing scrutiny as individuals and organisations look for reassurance through the tightening of safety standards. As a result, companies are becoming subject to regulations similar to those applied to pharmaceuticals. One example, representing the first major revision of the US food safety system in more than 70 years, is the FDA Food Safety Modernization Act (FSMA). Signed into law on 4 January 2011 by US President Obama, it essentially shifted the focus to preventing contamination and safety issues, rather than reacting to problems after they have occurred. The need to comply with these stringent regulations – which now require demonstration of full traceability from farm through to manufacture and distribution – and with those in Europe and other parts of the world, is driving the demand for informatics solutions.

Jay Ross, senior product manager at Starlims, agrees that, from a laboratory perspective, the increase in compliance needs, coupled with health scares such as those highlighted earlier, has led agencies around the world to look far more closely at how food and drink is tested. The obligation for companies to demonstrate traceability throughout the entire production process is, of course, critical: should a contaminant be detected, the manufacturer needs to be able to isolate the affected batch quickly and effectively, establish where it originated and which materials were involved, and issue a recall.
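To picture the kind of batch-genealogy query a recall of this sort depends on, the following is a minimal Python sketch rather than any vendor's actual implementation. It assumes lot relationships are held in a simple parent-to-child mapping (the GENEALOGY dictionary, affected_lots function and lot names are all hypothetical); a production LIMS would hold this in a database and record far more detail about each lot.

```python
from collections import deque

# Hypothetical genealogy: each lot maps to the lots it was used to produce.
# In a real LIMS this relationship would live in a database, not a dict.
GENEALOGY = {
    "RAW-MILK-0412": ["BATCH-CHOC-115", "BATCH-CHOC-116"],
    "BATCH-CHOC-115": ["FG-BAR-2001", "FG-BAR-2002"],
    "BATCH-CHOC-116": ["FG-BAR-2003"],
}

def affected_lots(contaminated_lot: str) -> set[str]:
    """Walk the genealogy forward to find every lot that used the
    contaminated material, directly or indirectly."""
    seen, queue = set(), deque([contaminated_lot])
    while queue:
        lot = queue.popleft()
        for child in GENEALOGY.get(lot, []):
            if child not in seen:
                seen.add(child)
                queue.append(child)
    return seen

if __name__ == "__main__":
    # Everything downstream of the suspect raw-material lot is in recall scope.
    print(sorted(affected_lots("RAW-MILK-0412")))
```

A forward walk like this answers 'which finished goods used the suspect material'; running the same idea in reverse answers 'which raw materials went into this finished batch'.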

According to Ross, the increase in regulations has also led to laboratories carrying out much more sophisticated testing and tracking samples to a GLP (good laboratory practice) level of rigour. As a result, the amount of microbiology work being done has increased significantly. ‘That type of work used to be very basic, where labs were simply testing for a specific type of bacteria,’ he explains, ‘but they now have to do more exploratory work to look for a variety of bacteria. This increases the level of complexity.’

In the field

The complexity of the tests, combined with the traceability demands, presents a challenge, as many companies have to be able to gather results from tests carried out remotely. Colin Thurston, director of product strategy at Thermo Fisher Scientific, offers an example: a number of grain and cereal producers require farmers to run initial tests on the material before they will buy the crops. An obvious downside is that non-technical people are being expected to carry out lab-type analysis.

Another issue is ensuring that the infrastructure is robust enough to deal with problems such as a farm having no broadband access. ‘The question then becomes how to get the test result data up to the food producers’ information network,’ says Thurston. To deal with both of these difficulties, Thermo Fisher Scientific has been showcasing the ability to collect data via handheld devices such as smartphones and tablets. In addition to their data networking capabilities, ‘these devices are becoming more powerful in that they have the ability to not only collect the data, but add value to it such as date and time stamps and location,’ he adds. Controls and verifications of where that information is coming from can also be added.

The company’s Merlin platform runs on iPhones and iPads and enables users to download information from the Laboratory Information Management System (LIMS), such as a series of samples that need to be collected, or to capture data by scanning barcodes. The dictation capabilities of those devices can also be used to make notes about the samples being collected. One of the most important considerations – especially in the light of the traceability requirements – is emphasised by Thurston, who states that ‘there’s a completely closed loop between the information being collected in the field and the lab data that’s generated when those samples are tested in the factory.’
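As a rough illustration of the kind of field record such a closed loop relies on, here is a short Python sketch. The field names, the FieldSample class and the local 'outbox' queue are illustrative assumptions, not the Merlin or LIMS schema; the point is simply that each capture is stamped with time, location and the collector's identity, and queued for upload once connectivity returns.

```python
from dataclasses import dataclass, field, asdict
from datetime import datetime, timezone
import json

@dataclass
class FieldSample:
    """A single field-collected sample record (field names are illustrative)."""
    barcode: str        # scanned sample ID
    collected_by: str   # user identity, so provenance can be verified later
    latitude: float
    longitude: float
    notes: str = ""
    collected_at: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )

def queue_for_upload(sample: FieldSample, outbox: list[str]) -> None:
    """Serialise the record locally so it can be sent to the LIMS
    once the device regains network connectivity."""
    outbox.append(json.dumps(asdict(sample)))

outbox: list[str] = []
queue_for_upload(
    FieldSample(barcode="GRN-000127", collected_by="farmer.jones",
                latitude=52.95, longitude=-1.15, notes="moisture check"),
    outbox,
)
```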

He continues by saying that the way regulations are unfolding concerns not just the material – i.e. the food and drink products being created – but also ensuring that only the right people are able to interact with certain systems. Essentially, the requirements are intended to ensure that anyone who is analysing the data in the lab has been trained in the work they’re doing. ‘Many food and drink companies are audited on a regular basis and so have to be able to demonstrate historical records showing that at a particular time a specific user was entitled to carry out a certain test, and that test was passed before the product was approved for shipping,’ Thurston explains. ‘There’s an entire set of metadata that goes alongside the results the lab generates.’
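The audit question Thurston describes, namely whether a given analyst was qualified to run a given test at the time the result was generated, can be pictured with a minimal sketch. The TRAINING table, the was_authorised function, the analyst names and the date ranges below are hypothetical; a real system would query historical training records rather than an in-memory dictionary.

```python
from datetime import date

# Hypothetical training records: analyst -> test -> (qualified_from, qualified_to)
TRAINING = {
    "a.smith": {"salmonella-screen": (date(2011, 3, 1), date(2013, 3, 1))},
}

def was_authorised(analyst: str, test: str, run_on: date) -> bool:
    """Answer the auditor's question: was this analyst qualified to run
    this test on the date the result was generated?"""
    window = TRAINING.get(analyst, {}).get(test)
    return window is not None and window[0] <= run_on <= window[1]

assert was_authorised("a.smith", "salmonella-screen", date(2012, 6, 15))
assert not was_authorised("a.smith", "salmonella-screen", date(2014, 1, 5))
```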

According to Thurston, the increase in distributed systems, whereby some information may have originated on the farm that grew the grain, means that the LIMS needs to cope with data being generated both inside and outside the lab. Add to this the sheer volume of data, and many companies are being pushed towards a cloud-based implementation. That trend is, he explains, partly due to the fact that the data itself is becoming far more distributed – there is no longer one lab providing all the results for a particular box of breakfast cereal, for example.

He expects the private cloud paradigm to continue to grow, despite the fact that the food and drink industry is often quite conservative in its use of technology. He attributes this growth to companies seeing the benefits of remote outposts being able to use the same data infrastructure as internal operations. ‘When companies consider the infrastructure costs,’ Thurston continues, ‘being able to work on a software-as-a-service model in terms of licensing will also rise in popularity.’ He comments, however, that this is less important to global manufacturers, such as Coca-Cola, that don’t have cyclical requirements based on harvest times but instead produce products constantly. Thurston also warns that while many of Thermo’s customers are currently investigating cloud-based solutions, ‘I’m not sure they’re actually doing it wholeheartedly.’

Challenging times

The hesitation to embrace new technology is not a new issue, but regulatory requirements are creating situations where compliance would not be possible without an appropriate informatics solution in place. One example, addressing the overall competence of labs in Europe, is ISO 17025. This, Thurston explains, sets out a number of different areas that must be covered by the lab, such as complaints management, control of non-conforming tests, training and authorisation records, and validation. As it is a legal requirement for European companies and for any company exporting food to Europe, it has wide-reaching implications given the globalisation occurring within the market.

This globalisation has ‘fundamentally changed the requirements and led to micro-segments,’ comments Michael Doyle, director of product marketing and principal scientist at Accelrys. ‘To support that, companies can’t distribute what they’re doing; there has to be some commonality in order to manage diverse goals, sources of material, innovation and production in a more holistic manner.’

Many companies are having to re-evaluate both their infrastructure and their ability to adapt to changing requirements as a result of another big trend in recent years: consolidation. Starlims’ Jay Ross comments that as the big food and drink conglomerates buy up smaller producers, they end up with multiple information systems. ‘A typical challenge for any industry in a consolidating phase is that the organisation suddenly has different cultures that have unique ways of doing things, and so managing the expectations of end users is tough,’ he says.

When assessing the gaps within an infrastructure, and engaging a vendor to come in and implement a LIMS, many companies believe that a chosen standard can be imposed from the top down. However, as Ross explains: ‘The impact of a top-down approach is often underestimated, especially when looking at global organisations where the differences go beyond business culture and into regional and ethnic differences.’ He adds that while many people can agree that good project management is the answer, the term is often not truly understood – the key lies in effectively managing expectations by communicating regularly.

This sentiment is echoed by John Gabathuler, director, Industrial & Environmental, at LabWare, who comments that as a result of the globalisation of the market, the drive towards consolidation, and advances in technologies and processes, organisations are having to constantly evolve and adapt to deliver business growth and competitiveness. ‘The informatics system should facilitate that process rather than become a barrier to growth,’ he says. The LabWare Enterprise Laboratory Platform addresses business, compliance and technology requirements within one fully-integrated laboratory platform, delivered on an enterprise scale.

‘By providing a single fully-integrated suite consisting of LabWare LIMS and LabWare ELN (Electronic Laboratory Notebook), the ability to analyse, capture, check, trend and report with full traceability and automation at each step greatly facilitates adherence to national and international regulations,’ Gabathuler explains. He adds that the LabWare solution provides ‘the ability to manage and store complete supply chain testing, including full genealogy from raw materials through to the finished product, providing the highest levels of visibility and assurance required.’

Labware is not the only vendor taking an integrated approach and as Thermo Fisher Scientific’s Colin Thurston suggests, ‘Laboratory informatics solutions in the food and drink industry need to be connected from the manufacturing line, through the quality control process in the lab, to the enterprise data systems that contain information about batch approvals, shipments and ordering.’ Thermo’s integration platform, Integration Manager, uses a common architecture to connect everything from the instrumentation to the data coming from the manufacturing line. This allows users to connect directly to online analysers and compare that data to the information coming out of the lab.

Thurston explains that the same architecture enables the exchange of data to ERP (Enterprise Resource Planning) software and manufacturing information systems. ‘Because we use a common platform for data interchanges,’ he says, ‘it’s straightforward to connect our solutions to third-party technology – a key advantage given that, for the most part, there is no common standard across different manufacturers. Some people use Oracle ERP, some have SAP and others will have manufacturing execution systems from Honeywell, Yokogawa or AspenTech, for example.’ He adds that there is a real heterogeneous landscape in terms of the enterprise systems companies need to connect to and so having a common platform to do all that connectivity makes things much easier.
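A common-platform approach of this kind can be pictured with a small adapter sketch. The class names, method signature and print statements below are purely illustrative assumptions, not Integration Manager's API; the idea is simply that the lab-side code talks to one interface while each enterprise system gets its own translation layer.

```python
from abc import ABC, abstractmethod

class ErpAdapter(ABC):
    """Common contract the lab-side code talks to; each vendor-specific
    ERP or MES sits behind its own adapter."""
    @abstractmethod
    def post_batch_result(self, batch_id: str, passed: bool) -> None: ...

class SapAdapter(ErpAdapter):
    def post_batch_result(self, batch_id: str, passed: bool) -> None:
        # Placeholder: translate to the SAP-specific message format here.
        print(f"[SAP] usage decision for {batch_id}: {'accept' if passed else 'reject'}")

class OracleAdapter(ErpAdapter):
    def post_batch_result(self, batch_id: str, passed: bool) -> None:
        # Placeholder: translate to the Oracle-specific message format here.
        print(f"[Oracle] lot status for {batch_id}: {'APPROVED' if passed else 'HELD'}")

def release_batch(adapter: ErpAdapter, batch_id: str, passed: bool) -> None:
    # The lab-side call never changes, whichever enterprise system sits behind it.
    adapter.post_batch_result(batch_id, passed)

release_batch(SapAdapter(), "FG-BAR-2001", passed=True)
release_batch(OracleAdapter(), "FG-BAR-2003", passed=False)
```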

PerkinElmer’s Dale Seabrooke agrees that there are many good point solutions on the market; however, they are limited if they are not communicating with one another. ‘Integration of those informatics products is key,’ he says, but adds that as systems integration is done, users inevitably find various gaps, such as tracking asset utilisation. There are solutions that can fill those gaps. PerkinElmer’s Ensemble for QA/QC, for example, is a portfolio of products that rests on a common platform. According to Seabrooke, this ensures that they work seamlessly together and with other systems. He notes that the areas of the portfolio customers are most interested in are the LES (Laboratory Execution System) and LIMS solutions, as well as the company’s robust inventory system.

Taking a different line, Starlims offers one of the few centralised, web-based solutions in the food and drink market. The company’s Jay Ross explains that while the solution is web-based, it feels like a normal application that works in a way users are familiar with. ‘First and foremost,’ he says, ‘the benefit of choosing a web-based product is that it dramatically reduces operating costs. There is also no deployment cost, which means that companies don’t need to engage their IT departments to get the system installed – users simply open the browser in order to access it.’

From paper to programs

Taking things from a business perspective, PerkinElmer has just released Asset Genius, which is designed to gather utilisation data from instruments rather than analytical results. It details how much an instrument has been used and by whom, so that management can understand how their instruments are being used. Seabrooke comments that this will not only affect preventative maintenance costs, but also the justification for additional equipment or the redistribution of existing equipment. ‘What’s important,’ he says, ‘is that companies take the best practices they have and apply them to their electronic solutions.’
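To give a rough sense of the kind of figure such a tool reports, the sketch below sums instrument hours from a hypothetical usage log; the USAGE_LOG format, instrument names and utilisation_hours function are assumptions for illustration, not Asset Genius data or its API.

```python
from datetime import datetime

# Hypothetical usage log entries: (instrument, start, end, user)
USAGE_LOG = [
    ("hplc-02", datetime(2012, 6, 1, 9, 0), datetime(2012, 6, 1, 12, 0), "a.smith"),
    ("hplc-02", datetime(2012, 6, 1, 14, 0), datetime(2012, 6, 1, 15, 30), "b.jones"),
]

def utilisation_hours(instrument: str) -> float:
    """Total hours an instrument was in use, summed from its usage log."""
    return sum(
        (end - start).total_seconds() / 3600
        for name, start, end, _user in USAGE_LOG
        if name == instrument
    )

print(f"hplc-02 used for {utilisation_hours('hplc-02'):.1f} hours")
```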

Another driver towards efficiency and cost reduction is paper replacement. ‘If people are using paper for routine analysis, either in a log book when tracking the calibration status of an instrument, or a work sheet when doing sample preparation, and then typing that into a LIMS or other system, it will inherently have errors,’ Seabrooke suggests. ‘It also doesn’t allow them to utilise the data in other systems.’ The PerkinElmer Laboratory Execution System (LES) links a paper-replacement tool with the ability to pull data from the LIMS, update a calibration system and check against training records.
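The value of those cross-checks over a paper worksheet can be sketched as a simple gate at the point a result is recorded. The CALIBRATION_DUE and QUALIFIED tables and the execute_step function below are illustrative stand-ins for the LIMS and calibration-system lookups described above, not PerkinElmer's implementation.

```python
from datetime import date

# Illustrative status tables; in practice these checks would query the LIMS
# and calibration system rather than in-memory structures.
CALIBRATION_DUE = {"balance-07": date(2012, 9, 30)}
QUALIFIED = {("a.smith", "moisture-assay")}

def execute_step(analyst: str, method: str, instrument: str,
                 value: float, today: date) -> dict:
    """Gate a worksheet step the way a paper form cannot: refuse to record
    a result if the instrument is out of calibration or the analyst is
    not trained on the method."""
    if CALIBRATION_DUE.get(instrument, date.min) < today:
        raise RuntimeError(f"{instrument} calibration has expired")
    if (analyst, method) not in QUALIFIED:
        raise RuntimeError(f"{analyst} is not qualified for {method}")
    return {"method": method, "instrument": instrument,
            "analyst": analyst, "result": value, "recorded_on": str(today)}

record = execute_step("a.smith", "moisture-assay", "balance-07",
                      value=4.2, today=date(2012, 6, 1))
print(record)
```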

Seabrooke warns that ‘historically, paper replacement has been, to a large degree, only done in the pharmaceutical industry, where it’s been a very lengthy and expensive project.’ With this in mind, he says that the reluctance within the food and drink market to adopt a technology that’s seen as a very significant and time-consuming investment is more than understandable. In answer to this, he says: ‘We have spent a lot of time developing a tool that breaks that paradigm by getting companies to take the paper they have, turn it into a PDF and use rapid e-form development and implementation (REDI) and drag controls over, effectively preparing electronic forms in hours rather than days.’

These time savings have become more critical in the US since patent law changed from first-to-invent to first-to-file. According to Accelrys’ Michael Doyle, this change is partly behind the growing need for robust document production processes, as it no longer matters who made a discovery; it’s now about who files the paperwork first. A further benefit of moving away from paper notebooks is highlighted by Doyle, who says: ‘Experiments are expensive and time consuming and in many cases companies find themselves reproducing them, as they find it difficult to access paper sources if notebooks are used.’ He concludes by adding that a move to electronic formats enables companies to reach in, draw trends, and search and mine data effectively.


