Managing change in the laboratory to deliver more value

Technical innovation is inescapable and, Peter Boogaard argues, will force us to re-think our laboratory processes to increase their value to the enterprise

In our scientific software community, there is a fundamental mismatch between what we can achieve from a technology perspective (the 'nice to have') and what we should do from an overall business perspective (the 'must have'). During the past decade, major new technologies and software capabilities have been introduced in both the consumer and professional markets. Some of these can increase efficiency and quality, or reduce operational costs.

In many cases, however, these new developments are a prompt to rethink, urgently and fundamentally, whether and how we should apply them in our day-to-day operations. Many research, manufacturing, and regulatory procedures have remained unchanged for years and urgently need to be revisited. Integrating legacy, silo-based departments is becoming a priority on many boardroom agendas.

But simplification is easier said than done. Underlying processes are becoming more complex every day.

Where to start? How to prioritise? Which key performance indicators to define? What lessons can be learned from other industries? Do we know the risks involved when embarking on change management?

There almost seems to be a fatigue with respect to corporate change. In our personal lives, the barrier to adopting new technologies and processes is significantly lower. We readily accept new ways of working, for example when we transfer money through an electronic banking application. We are not afraid of using the cloud when we check in for our next flight. We post millions of messages on social media and chat services without any great anxiety.

Yet we are geniuses at finding excuses not to use the same technologies and processes in our daily corporate work. Even when the gains from these new ways of working might be significant, the tendency is to pass them by. In this article, I want to challenge the reader to identify areas where these new technologies and processes can improve our operations from both the scientific and the economic perspective.


We always have done it this way…

The discussion about paper versus paperless is almost as old as commercial computing. In 1975, just after the introduction of the Scelbi (SCientific, ELectronic and BIological), one of the first personal computers, Business Week predicted that computer records would soon completely replace paper [1].

In fact, it took more than 25 years before paperless operations became accepted and successfully adopted. Yet consumer GPS technology was introduced fewer than 10 years ago, and Google Earth just five years ago. Why are the adoption curves for these new technologies so different? And, more importantly, what can we learn from our personal lives and apply to our professional lives?

Let’s take a step back and answer some of the basics first, to understand what we want to achieve with our scientific processes. Some of these processes have become a matter of habit, deeply embedded in our daily routines.

Adopting a client-focused (requester) mind-set may help us. For example, a patent lawyer is not interested in all the scientific details of how you made a new discovery. He needs simple proof of evidence, and confirmation that standard operating procedures were followed. A detailed document should be available internally, in case additional evidence is needed. When a marketer is exploring competitors’ products, he is unlikely to be interested in the absolute value of a chemical concentration or a physical attribute; he needs to understand it only relative to his own products. In my observation of the market, the language we speak is not always in line with the expectations of our lab requesters.

Right, first time

As with any language, effective communication relies upon context. Metadata that enables one to understand whether one is comparing apples with apples, or apples with oranges, is critically important.

Across all sectors, within every R&D process, data is generated in ways that carry vital contextual information for all consumers of that data. This context could be as simple as a temperature variation, or as complex as a genome, but capturing instrument settings, sample preparation methods, analysis parameters, and observational data is essential to R&D. Even a ‘simple’ measurement is not meaningful unless the experimental and analytical conditions are specified, and any subjective observations and conclusions are associated with the data.

Overall there are three basic principles to optimising the integration of data:

1. Capture the data at the point of origin to eliminate human error and to reduce system complexity;

2. Simplify and implement self-documenting processes to eliminate transcription errors and avoid unnecessary retyping of data. In a recent survey, 32 per cent of respondents said that data integration in a paperless laboratory would eliminate manual entries and data transfer; and

3. Ensure that metadata is captured in a structured way. Raw data represents a set of unstructured data points; a data file without context or metadata is meaningless.

Adopting automated, self-documenting data capture at source increases the value of scientific data. Re-using experimental data in other processes requires accurate metadata and context data. Systematically tagging objects with metadata will make searching significantly more effective.
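To make the idea of tagging concrete, here is a minimal sketch in Python. The `DataObject` class, the field names, and the example values are all invented for illustration; the point is simply that a result paired with structured metadata can be filtered and found again, whereas a bare payload cannot.

```python
from dataclasses import dataclass, field
from typing import Any

@dataclass
class DataObject:
    """A raw result paired with the metadata that gives it context."""
    payload: Any                                    # e.g. a chromatogram or weight reading
    metadata: dict = field(default_factory=dict)    # structured tags captured at source

def find(objects, **criteria):
    """Return every object whose metadata matches all given key/value pairs."""
    return [o for o in objects
            if all(o.metadata.get(k) == v for k, v in criteria.items())]

# Two hypothetical measurements, tagged at the point of capture
store = [
    DataObject(payload=[0.12, 0.15], metadata={"instrument": "HPLC-01", "analyst": "jdoe"}),
    DataObject(payload=[0.98],       metadata={"instrument": "BAL-07",  "analyst": "jdoe"}),
]

hits = find(store, instrument="HPLC-01")
```

Because the tags are structured rather than buried in free text, queries like 'everything from this instrument' or 'everything by this analyst' become one-liners.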


This is already happening in the consumer electronics industry. For example, when capturing a photograph or video, a smartphone will by default systematically add metadata to the object: GPS location, weather conditions, sometimes even personal data such as the user’s heart rate.

The scientific community should expect similar developments. Modern balances may automatically include temperature and humidity when recording and transmitting a weight. Balance and titration manufacturers are adding value to their instruments by implementing approved, pre-validated methods in their firmware. Chromatography data systems (CDS) can add instrument parameters to raw data files, such as oven temperature, mobile-phase pressure, and data-collection frequency. Modern systems also report run-time deviations from instrument settings.
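A self-documenting capture of this kind could look like the following sketch. Everything here is hypothetical: the function name, the record layout, and the 18 to 25 degree temperature window are assumptions for illustration, not any vendor's actual interface. The sketch records a weight together with its environmental context and flags a run-time deviation automatically.

```python
from datetime import datetime, timezone

def record_weight(grams, temperature_c, humidity_pct, temp_limits=(18.0, 25.0)):
    """Package a balance reading with its environmental context,
    flagging any deviation from the assumed temperature limits."""
    low, high = temp_limits
    return {
        "weight_g": grams,
        "captured_at": datetime.now(timezone.utc).isoformat(),
        "temperature_c": temperature_c,
        "humidity_pct": humidity_pct,
        "deviation": not (low <= temperature_c <= high),
    }

reading = record_weight(10.0132, temperature_c=26.1, humidity_pct=41.0)
# reading["deviation"] is True: 26.1 °C falls outside the assumed 18–25 °C window
```

The deviation flag is computed at the point of capture, so no one has to remember to check the lab logbook afterwards; the context travels with the data.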

It may sound a small step, but all this will significantly reduce variability and validation effort in laboratory and manufacturing operations. Scientists should expect ‘more for less’: fewer points of failure during operation, less customisation of software, and automated, consistent documentation that reduces validation effort.

Traceability increases data integrity

A non-invasive, end-to-end strategy creates full lifecycle traceability from R&D to manufacturing operations. Agile processes will speed up product releases, decrease engineering change cycle times, and significantly increase asset and resource utilisation.

Automatically correlating real-time manufacturing data with process and product specifications decreases cycle times. Process analytical technology (PAT) is expected to grow significantly in the next decade. Over time, in-line, at-line, and on-line analysis will substitute for off-line (batch-oriented) manufacturing processes. International regulatory and industry bodies such as the ICH, FDA, and ISPE are evaluating these new processes intensively and developing new best-practice workflows.

These processes will change how QA/QC laboratories operate over the next decade. The QC function is expected to shrink, while the laboratory changes into a knowledge centre supporting the new business processes. ERP and PLM applications will integrate the QC laboratory more cost-effectively into the value chain. Electronic laboratory notebooks (ELN) and laboratory execution systems (LES) will play a crucial role in increasing end-user adoption to support knowledge management.


Managing the change

The process of change management starts at the source. New technologies will have a significant impact on how operating budgets are defined in the next decade. The days when software was purchased as a capital investment (Capex) are giving way to a ‘pay-as-you-go’ model (Opex).

In the traditional enterprise software market, customer relationship management (CRM) applications such as Salesforce.com started this business model. Popular applications such as Photoshop and Office 365 are following rapidly. Scientific software can be expected to follow this model in the years to come.

Software suppliers are themselves under pressure to maintain their maintenance, support, and licence revenues. Community collaboration and social networking are changing the value of traditional vendor helpdesks. Have you ever tried typing a question you would like to ask your helpdesk into Google? What was the response time? Initiatives such as the Allotrope Foundation address the need for sustainable, interchangeable data standards, making users less dependent on proprietary scientific data formats [2].

Peter Boogaard is an independent laboratory informatics consultant and founder of Industrial Lab Automation, which provides services to address harmonisation, integration, and consolidation of business processes in development and manufacturing. Industrial Lab Automation organises the Paperless Lab Academy, for which Scientific Computing World is media sponsor. Taking place in Barcelona, the 2015 event focuses on how to manage the change process in R&D and QA/QC laboratories in the pharmaceutical, biotechnology, consumer goods, and chemical industries. The interactive congress offers actionable insights on how to adopt a new mind-set in daily work. The academy includes 16 hands-on workshops, many practical presentations, a networking reception, a conference dinner, and live demonstrations from more than 20 leading vendors. www.paperlesslabacademy.com

References

1. Capterra, ‘The History of Software’, http://www.capterra.com/history-of-software

2. Peter Boogaard, ‘Joining Up The Laboratory’, Scientific Computing World, 2014