A look to the future

What are the biggest factors driving change in today’s laboratories?

Jacqueline Barberena, global marketing and product director at Abbott Informatics: ‘The biggest factors driving change in today’s laboratories can be summarised as: expected year-on-year increases in performance and productivity; more stringent data integrity and compliance requirements; end-to-end integration of workflow processes; and the ability to track sample QA/QC requirements. Additionally, newer technologies, such as mobile capabilities, apps and cloud deployments, are becoming more widespread as lab users demand the ability to access secure data any time, anywhere.

‘More than ever, as data grows exponentially, organisations need solutions that not only manage these growing volumes of data but also go beyond data management. This can be accomplished through advanced analytics, which will enable them to make business-critical decisions in a timely fashion, despite the enormous amounts of data and information flowing throughout the organisation. Laboratory informatics solutions are changing to support customers facing these challenges.’

Andrew Anderson, vice president for innovation and informatics strategy at ACD Labs:

‘The driving factors for change in laboratory workflows and software are the transition from document-driven knowledge management to a data-driven paradigm, and the ability to connect any data source (from design to planning, to execution, to analysis, to decisions/conclusions). Another key change is the introduction of ‘virtual collaboration’: the ability for scientists to effectively communicate and collaborate with colleagues in other parts of the world.

‘Historically, a human would ‘associate’ data with the appropriate data repository. For a data-driven decision-making paradigm to be effective, data sources, repositories and decision-support interfaces must be integrated without human intervention. Furthermore, virtual collaboration also requires the digitisation of data in a way that allows it to be manipulated and reviewed, ideally, as though it were collected “on premises”. While we are still seeing these transitions take effect, informatics workloads are currently increasing to support such integrations and data-sharing technologies – with the envisioned payoff being less human work over time.

‘We believe that any application software must afford data integration directly (or as much as possible). While there are many applications that integrate some data sources, many data types are relegated to abstracted representations. This is especially pertinent for analytical characterisation data, which is at the heart of critical decisions made in research and development on a daily basis. LIMS and ELN, in particular, often refer to analytical experiments, but the rich data that is acquired to support decisions is often abstracted to either pictorial representations or alphanumeric summaries (e.g., area percent = X, identity was confirmed by NMR, etc.). Future systems, including LIMS and ELNs, must afford more functional integration components to provide direct access to the analytical data that can be reviewed and re-examined easily – either conforming to relevant data standards when possible, or able to easily support custom integration.’
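To make Anderson’s point concrete, here is a minimal sketch of what ‘direct access’ to analytical data could look like in practice: a result record that keeps both the abstracted summary and a pointer to the full instrument dataset, so the underlying data can always be re-examined. The class, field names and file path below are hypothetical illustrations, not a real LIMS or ELN schema.

```python
from dataclasses import dataclass
from pathlib import Path

@dataclass
class AnalyticalResult:
    """One analytical experiment attached to an ELN/LIMS record."""
    technique: str      # e.g. "NMR", "LC-MS"
    summary: str        # the abstracted representation, e.g. "identity confirmed by NMR"
    raw_data: Path      # pointer to the full instrument data file
    data_format: str    # e.g. "JCAMP-DX", "mzML" (relevant data standards)

    def load_raw(self) -> bytes:
        """Fetch the full dataset so it can be reviewed and re-examined,
        rather than relying on the summary alone."""
        return self.raw_data.read_bytes()

# The record carries the decision-supporting summary *and* direct access
# to the data behind it (path is made up for illustration).
result = AnalyticalResult(
    technique="NMR",
    summary="identity confirmed by NMR",
    raw_data=Path("/data/nmr/exp_0421.jdx"),
    data_format="JCAMP-DX",
)
```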

Daniela Jansen, director of product marketing at Dassault Systèmes:

‘Software and hardware vendors are moving towards a platform offering, but in most cases it is a closed, proprietary platform. Platforms can provide data continuity and traceability, and support decision-making across the product lifecycle. A platform also allows organisations to move away from point-to-point integrations that are cumbersome and expensive to maintain.

‘Platform-based systems allow for a substantially different architectural set-up: small, dedicated, agile applications can be plugged into the platform in a modular fashion, while services such as reporting or instrument interfacing can be shared by the different applications. This approach avoids overlap and redundancy of capabilities, improving user experience and work efficiency. It also lowers the cost of ownership.
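As a rough illustration of this architecture, the sketch below shows modular applications registering with a platform and consuming a shared reporting service instead of each shipping its own. All class and service names are invented for the example; no specific vendor platform is implied.

```python
class ReportingService:
    """Shared service: every plugged-in application reuses it
    instead of duplicating its own reporting code."""
    def report(self, title: str, rows: list[dict]) -> str:
        lines = [title] + [", ".join(f"{k}={v}" for k, v in r.items()) for r in rows]
        return "\n".join(lines)

class Platform:
    """Minimal platform: applications plug in as modules and reach
    common services through the platform, not point-to-point links."""
    def __init__(self):
        self.services = {"reporting": ReportingService()}
        self.apps = {}

    def register(self, name, app_factory):
        # Each app receives the platform so it can use the shared services.
        self.apps[name] = app_factory(self)

class StabilityApp:
    """A small, dedicated application plugged into the platform."""
    def __init__(self, platform):
        self.reporting = platform.services["reporting"]

    def run(self):
        return self.reporting.report("Stability study", [{"sample": "S-01", "result": "pass"}])

platform = Platform()
platform.register("stability", StabilityApp)
print(platform.apps["stability"].run())
```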

‘The increased adoption of cloud technology is based on both the natural progression from legacy systems and the need to transform laboratory operations. Many legacy systems are about to run out of support, so organisations are looking for replacements. Thanks to technological advancements, firms have the opportunity not only to move to a newer system, but also to take a new approach to their laboratory informatics landscape.

‘They can move away from rigid, monolithic systems that often come with duplicated capabilities, to a platform-based approach with modular applications and common services, allowing them to transform the way labs work today. And they can leverage the cloud technology many of the new solutions offer, allowing them to adapt the scale, cost and capabilities of their deployment to their current needs. In times of ongoing mergers, acquisitions and divestments, this is a compelling value proposition.’

How will the development of new technology, such as AI, deep learning or the IoT, impact the development of lab software?

Barberena, Abbott Informatics:

‘There is no doubt that technology is impacting, and will continue to impact, the way labs operate. A few technologies are game changers that will shape labs in the near future.

‘Machine-learning solutions – over the next few years, every app, application and service is highly likely to incorporate AI at some level. In the LIMS sector, this has already been applied to add business value to organisations, in the form of advanced analytics and advanced user experiences. Big data analytics can help organisations innovate and run their businesses in a much more optimised fashion. Being able to see and understand the lab’s data is crucial for organisations to make better, more informed decisions.
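One minimal sketch of what such analytics could look like, assuming historical lab results are available as a numeric table: an anomaly detector trained on past measurements flags incoming results that look unlike the historical population. The column meanings and values are invented for illustration.

```python
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)
# Stand-in for historical QC measurements (e.g. assay %, impurity %).
history = rng.normal(loc=[99.5, 0.2], scale=[0.3, 0.05], size=(500, 2))

model = IsolationForest(random_state=0).fit(history)

# Flag incoming results that deviate from the historical population.
incoming = np.array([[99.4, 0.21],   # typical result
                     [97.8, 0.60]])  # suspicious result
print(model.predict(incoming))       # 1 = normal, -1 = anomaly
```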

‘Speech recognition – one of the strategic technology trends identified in Gartner’s 2018 report is the so-called conversational platform, which “will drive a paradigm shift in which the burden of translating intent shifts from user to computer”.

‘Augmented reality – this is a technology that can amplify human performance and experience. In laboratories, AR coupled with holographic technology can add an entirely new dimension, helping to solve real problems and increase efficiency by displaying, for example, relevant holographic information on top of the lab’s physical layout.

‘A few other examples of usage: imagine you wear protective glasses when executing a test method; if those were smart glasses, they could display the method SOP as a hologram, to help you follow step-by-step instructions. If you wear protective gloves, which make it awkward to use a laptop or touch-screen device, how about having the option to use gestures and voice commands to interact with the LIMS software, via AR holographic screens? AR is here to stay. Applications are virtually limitless.

‘Blockchain – this will certainly continue to impact the development of lab software. When managing data from concept to consumer, as many labs do, multiple parties need access to the same sources of data at the same time. Having access to review, confirm and edit this data simultaneously, via blockchain, will continue to enhance labs’ capabilities.

‘Cloud – there is a strong trend towards the externalisation of IT and R&D work. Naturally, cloud deployment supports this trend and, according to Gartner, by 2020, 80 per cent of laboratory software solutions will operate in the cloud. This is a great point in time to influence an organisation’s cloud strategy.

‘Mobile – in three to four years, operating in the lab without mobile devices will be considered ‘unheard of’, just as running a business without email is inconceivable in today’s world. More than ever, users spend less time in the office and expect to be able to work anytime, anywhere, with access to all their work resources. A LIMS with mobile-capable features can provide users access to the lab from virtually anywhere.

‘UX – expectations of software are now very high, driven by consumerisation. Organisations should stick with products that invest in UX, as it will help reduce support and training costs.

‘Lean – this is a growing trend that can bring laboratory optimisation and performance to a whole new level. It helps identify waste (those additional activities we do every day that do not contribute to the final outcome).

‘In today’s high-capacity laboratories, the number of tasks ongoing at any one time and the amount of data being generated daily have swelled exponentially. Additionally, as businesses expand globally, organisations need, now more than ever, laboratory software that is capable of managing large amounts of data in real time, and that is able to integrate with numerous, disparate systems and instruments.’

Anderson, ACD Labs:

‘From our perspective, the promise of these technologies is well established – from both a productivity and innovation perspective. We believe that two sustained efforts must be undertaken to fully realise the benefits of such emerging technology:

‘First, increased effort in connecting all data sources to, broadly speaking, prediction interfaces: currently, AI/ML/DL can be limited by the training data types available. Second, infrastructure development: connecting high-volume data sources to prediction interfaces will require significant investment.
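A rough sketch of the pattern Anderson describes might look like the following: each connected data source feeds records into a common training pool behind a single ‘prediction interface’. The protocol and source names are hypothetical, invented purely to illustrate the idea.

```python
from typing import Iterable, Protocol

class DataSource(Protocol):
    """Anything that can stream training records into a prediction interface."""
    def records(self) -> Iterable[dict]: ...

class InMemorySource:
    """Stand-in for an ELN, LIMS or instrument data store."""
    def __init__(self, rows: list[dict]):
        self.rows = rows
    def records(self) -> Iterable[dict]:
        return iter(self.rows)

def pool_training_data(sources: list[DataSource]) -> list[dict]:
    """The more sources are connected, the richer the training set
    available to the model behind the prediction interface."""
    return [record for source in sources for record in source.records()]

eln = InMemorySource([{"yield": 0.82, "temp_c": 60}])
lims = InMemorySource([{"yield": 0.74, "temp_c": 45}])
print(pool_training_data([eln, lims]))
```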

‘We envision a future where data, application functionality, and prediction capability are available on demand, in whatever interface a scientist chooses. Moreover, if this future is realised, there are two benefits.

‘We believe that the amount of effort scientists spend preparing summaries of their work will be dramatically reduced. The time spent shuttling between applications will also be dramatically reduced.

‘The effort to design, plan, execute, analyse, and then summarise experiments is currently supported by a large number of monolithic applications.

‘In the case of synthesis laboratories, a scientist uses separate synthesis design, inventory, instrument control, and decision support software interfaces, all in one day. The future of laboratory software, as described above, reduces the number of interfaces substantially.’

Jansen, Dassault Systèmes:

‘In conversations with our customers, we have identified time-to-market as the ultimate driver for change. Personalised health and the desire for more precise therapies are changing the way therapies are developed. Knowledge capitalisation is fundamental to leveraging new and existing knowledge in the lab, and the move in next-generation manufacturing from large batches towards continuous manufacturing has a deep impact on the analytical instruments and methods used, as well as on the related data analytics. And total quality efforts are attempting to make compliance and quality an asset, instead of a cost.

‘Laboratory informatics systems need to allow users to work in a more efficient and cost-effective way while remaining compliant, but they also need to provide the flexibility to adapt to completely new ways of working. They need to be able to deliver contextualised data in real time for faster decision-making.

‘Machine-learning technology itself doesn’t need to be adapted; it just needs to be used in the right way. It is more about identifying the right data and providing the data in the right format to be leveraged. Data needs to be standardised and contextualised for meaningful outcomes. Dassault Systèmes provides the tools today and, to help our customers leverage their data, we are actively engaged in the data standardisation projects of consortia such as Allotrope and the Pistoia Alliance.’
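To illustrate what ‘standardised and contextualised’ can mean in practice, here is a minimal sketch: a raw instrument export is normalised (units, field names) and wrapped in contextual metadata before it is used for analytics. The schema below is purely illustrative, not an actual Allotrope or Pistoia Alliance format.

```python
from datetime import datetime, timezone

# As exported by a (hypothetical) instrument: cryptic fields, locale-specific
# decimal separator, no context about sample or method.
raw = {"VAL": "99,6", "UNIT": "pct", "INSTR": "HPLC-07"}

def standardise(raw: dict, sample_id: str, method: str) -> dict:
    value = float(raw["VAL"].replace(",", "."))   # normalise decimal separator
    return {
        "result": {"value": value, "unit": "%"},  # canonical unit label
        "context": {                              # contextual metadata
            "sample_id": sample_id,
            "method": method,
            "instrument": raw["INSTR"],
            "recorded_at": datetime.now(timezone.utc).isoformat(),
        },
    }

print(standardise(raw, sample_id="S-2024-001", method="assay-v3"))
```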

How might the tools or workloads a typical user might deal with change over time?

Barberena, Abbott Informatics:

‘In today’s competitive landscape, organisations need to take a holistic approach to their lab processes and identify areas of waste.

‘The future of laboratory software will evolve into more fully integrated solutions, from concept to consumer, as well as customer portals in which the systems, lab techs and clients are more tightly integrated, with the end goal of entering data once and enabling it to flow through all of the business processes, through to either the end-user or client.

‘Most organisations choose the ‘best-of-breed’ solution for a given function, for example: the best stability study module, best environmental monitoring module, best inventory manager, equipment manager, spec manager or experiment notebook. What these organisations don’t realise is that they need to develop interfaces to these solutions, which are costly not only to build but also to maintain. It is also challenging to work with multiple vendors, because when an issue arises, vendor one will point to vendor two and vice versa, and the ‘user’ is stuck in the middle. Instead, organisations should seek a bundled solution that covers the large majority of the business needs.

‘Most vendors and organisations are still focused on entering data into the system and managing it there. The focus should switch to insight management: customers should be able to use the data to gain better insights and run the organisation smarter.’

Jansen, Dassault Systèmes:

‘Systems consolidation and convergence will provide laboratory users with a new experience and transformation in efficiency. Traditional systems will start being replaced by more agile, user-friendly cloud-based lab informatics applications that are based on a holistic open-platform approach.

‘Organisations will start leveraging the Internet of Things (IoT) in the laboratory. This will deliver more data and more insight, as well as more reliable data of higher quality and integrity.

‘IoT will allow users to work more efficiently in the lab, as it will remove many time-consuming, non-value-adding steps from workflows; the ‘things’ are not limited to lab equipment, but can also include wearable devices, smart glasses, biometric bracelets, motion sensors, location beacons and so on. At the same time, it will improve data integrity and quality, as data transfer from and to the devices is automated. And, through the introduction of more sensors, labs will be able to generate more data, faster, which can improve and accelerate decision-making.’
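A minimal sketch of the automated device-to-informatics transfer Jansen describes, using MQTT (a common IoT messaging protocol) via the paho-mqtt client library: an instrument, or an adapter sitting next to it, publishes each reading so no manual transcription is needed. The broker address, topic and payload fields are assumptions for the example; a subscriber on the informatics side would ingest these messages.

```python
import json
import paho.mqtt.publish as publish

# One reading from a (hypothetical) connected balance.
reading = {"device": "balance-03", "weight_g": 12.0031, "sample_id": "S-2024-001"}

# Publish the reading to the lab's message broker; removing the manual
# transcription step is where data integrity typically improves.
publish.single(
    topic="lab/balance-03/readings",       # hypothetical topic scheme
    payload=json.dumps(reading),
    hostname="broker.lab.local",           # hypothetical broker address
)
```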


