Laboratory technology changes
What do you think will be the biggest change in the laboratory?
Richard Milne VP and general manager of digital science at Thermo Fisher:
One of the things that 2020 has done is create a tipping point in the laboratory. People will find they need to reconfigure lab space and the layout of labs. But we are also seeing higher demand to make the lab accessible from outside.
How can people work from outside the facility? How can people access data when they are not inside the laboratory? How can they collaborate when they cannot travel as easily as they did? How can they control their experiments and run their lab without being inside it?
A lot of those drivers that we are seeing in all aspects of society at the moment are going to continue to persevere. We are finding that there is a very significant increase in people wanting to see technology shoulder some of the burden and assist in the changing parameters of the workspace. We are seeing it in every industry and every walk of life, but it is also being clearly illustrated in the laboratory.
We are doing a lot of work around cloud-based computing platforms, collaboration environments, data movement and data storage. There is a lot of work going on at the moment around connected instruments and how data can flow more easily.
Stephen Hayward, Biovia technical marketing manager:
The transformation of the user experience through advanced techniques. For example, voice recognition, where a use case would be dictating observations and results, with scientific awareness built into the recording process.
The transfer to cloud-based systems, driven by corporate IT policy changes – this enables more remote access, which is becoming critical in times of a pandemic.
AI – truly leveraging all existing knowledge from the lab to better guide future work. Finally, augmented reality technology, which can transform lab process execution by visualising additional contextual information, or alerts about sample status and pending tasks.
Many lab technicians are considered ‘essential workers’, so they have continued working during the pandemic. But the way of working has changed. Teams working in the lab are now split into smaller groups that work in shifts to minimise contact while covering the workload. This makes efficient and flexible laboratory scheduling critical.
Additionally, all tasks that are not related to physical activities in the lab are now performed remotely. Therefore, it is important to be able to work with experimental data away from the lab while retaining contextual data for decision making, which is typically supported by cloud solutions.
How has Covid-19 changed how the scientific ecosystem works together?
Arvind Kothandaraman, general manager diagnostics, PerkinElmer:
A major takeaway from Covid-19 has been that every second matters when it comes to a response. In order to be more nimble and agile, labs require tools with high levels of sensitivity and reliability to detect disease, develop therapeutics and discover preventive measures before a surge has the opportunity to begin. Early detection and diagnostics are vital for labs, as screening becomes the new normal.
We will also see a shift towards molecular testing and surveillance in general over the next year or two. Covid-19 necessitated this shift, and labs have realised that they must be equipped with life science and diagnostic tools to better manage the spread of infectious diseases now and in the future. While we hope to never experience a pandemic of this magnitude again, it is in our best interest for labs to proactively conduct surveillance to better manage the potential risk.
Collaboration among scientists is the backbone of labs. The fight against Covid-19 has been prioritised across the globe, and this has accelerated how all organisations work in a united effort to ultimately serve the public. In that sense, pharmaceutical and biotech, which are conventionally considered competitors, have joined together to work towards the same goal.
Information sharing will help ensure the abundance of testing kits and therapeutics for everyone and everywhere. The collaboration has been unprecedented, and we’ll see this approach continue in many ways moving forward.
Dr Barry Bunin, CEO of Collaborative Drug Discovery (CDD):
While the concept of remote working is nothing new, the Covid-19 pandemic has created a new reality where many scientists are forced to spend less time in the lab and instead work from alternate locations. One of the main challenges that comes with this is data availability – do you have access to your data outside of the lab? How do you share data with colleagues and collaborate when everyone is physically separated? And if you do share your data, how do you make sure there is adequate access control to prevent unauthorised access? These are important considerations, in particular in deadline-driven projects where achieving specific research milestones is critical for the success of the organisation.
At CDD, we have been enabling scientific collaborations for the last 16 years through our CDD Vault-hosted informatics platform. Anyone with access privilege for a project can manage and analyse data from any web browser anywhere in the world, and the whole team can work together in real time, even when separated by physical distance. The value that our solution brings has never been greater in today’s world of virtual companies and distributed research teams.
In fact, we have written a white paper on the subject of remote data access, and interested readers can find it on the Scientific Computing World website.
What are the biggest challenges that lab users face?
Stephen Hayward, Biovia:
Higher project throughput with the same number of staff requires efficient tools for handling data, observations and result analysis. Distributed labs mean that all data must be available to all team members in real time.
Although most labs have electronic solutions in place, they are very rarely integrated, and users are forced to transfer data and results manually between systems, which is time-consuming and error-prone. Only flexible, integrated solutions across sites and collaborating partners can fulfil such requirements.
No competitive lab can work without digital support for process automation, data capture, sample management, data sharing and analysis. Most laboratories in the life sciences space have already implemented digital solutions for compliance and efficiency reasons.
We are observing that laboratories in other process industries, as well as discrete industries, have started to implement electronic tools too, but are still in a less digitised and connected state. We expect that this will continue to increase, as the benefits of efficiency, better innovation, decision making and faster time-to-market are too compelling.
Disconnected systems, rigid applications and a lack of sufficient support from corporate IT in labs are making it difficult for organisations to deploy new technology. Only a transformative approach will enable labs to move to a truly digital lab that also allows them to leverage new technologies. This transformative approach is typically supported by scientifically aware collaborative business platforms.
Oscar Kox, CEO at iVention:
We started with cloud from day one, and we said ‘we want automatic upgrades’ – we do not do custom software. And if we do some custom software, it will be taken up into the core product, so that we can still do the automatic upgrades.
We have now developed automated OQ scripts [IQ/OQ Validation Protocol] and a system to enable the automated execution of these scripts. When you release software in an agile way, every two weeks, it is impossible to have everyone testing the software manually.
For other providers, it can take months to upgrade, and even longer to complete the validation. We do not want to see upgrades taking longer than days. We will validate, put the OQ scripts in a document and even add some screenshots. The system will try to log in using Chrome or Edge, and this is how we do it. It generates a complete report.
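The automated OQ process described above – a set of scripted checks executed automatically, with a consolidated report generated at the end – can be sketched in a few lines. This is an illustrative sketch only, not iVention's actual system: the check names, the stand-in logic inside them and the report format are all hypothetical.

```python
# Illustrative sketch of an automated OQ (Operational Qualification) runner:
# each "script" is a small check returning pass/fail plus detail, and the
# runner executes every check and compiles a report. Hypothetical checks,
# not iVention's actual implementation.
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Callable, List


@dataclass
class OQResult:
    name: str
    passed: bool
    detail: str


def check_login_page_reachable() -> OQResult:
    # Hypothetical check: a real system might drive Chrome or Edge via a
    # browser-automation tool and attempt an actual login here.
    return OQResult("login page reachable", True, "HTTP 200 from /login")


def check_sample_record_roundtrip() -> OQResult:
    # Hypothetical check: create a sample record and read it back.
    created = {"id": "S-001", "status": "registered"}
    fetched = dict(created)  # stand-in for a real API round trip
    return OQResult("sample record round trip", fetched == created,
                    f"fetched={fetched}")


def run_oq(checks: List[Callable[[], OQResult]]) -> str:
    """Execute all OQ checks and return a plain-text report."""
    results = [check() for check in checks]
    stamp = datetime.now(timezone.utc).isoformat(timespec="seconds")
    lines = [f"OQ report generated {stamp}"]
    for r in results:
        lines.append(f"[{'PASS' if r.passed else 'FAIL'}] {r.name}: {r.detail}")
    lines.append(f"{sum(r.passed for r in results)}/{len(results)} checks passed")
    return "\n".join(lines)


report = run_oq([check_login_page_reachable, check_sample_record_roundtrip])
print(report)
```

Because each check is just a callable, a two-weekly release can rerun the full suite unattended and archive the generated report – with screenshots attached separately – as the validation record.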
We are still doing really well in implementations because, as we are in the cloud, we can work with people using Teams meetings to do configurations together. From an implementation standpoint it is massive, because some people who cannot work right now, as they cannot go to the office, can test software – when normally they would do that alongside their day jobs.
The second thing is that, if you look at old-fashioned implementations from the more conventional providers, you need a VPN connection, and then there are still Excel files stored on your PC in the office. This means you cannot get to all of your data. Old-fashioned implementations, with the client on a PC in the office, mean that you cannot access that data when you are working from home.
What role do automation and integration technologies play in overcoming laboratory challenges?
Richard Milne, Thermo Fisher:
We are looking increasingly at how you can connect devices to look at their operating parameters and confirm that they are online and within range.
We want to be able to do that from a mobile phone at a safe distance to ensure that your experiments are running, regardless of where you may be.
We are doing a lot of work with our laboratory automation team at the moment. One of our most recent and most impactful releases was the Amplitude solution, the high-volume Covid-19 testing system that is being used across the world at the moment.
There is significant integration between our laboratory automation and our digital science teams to make sure that [Amplitude] is working with minimal touch and minimal operator interaction.
Trish Meek, director of marketing at Thermo Fisher:
One of the key things we are seeing when we talk about connectivity and lab automation and ensuring that organisations are leveraging their data effectively, is the value of the scientist.
For years, scientists were filling in the gaps: where automation was lacking, they would do that work themselves. I think there is a recognition from organisations that scientists are their greatest asset, and any way they can automate and integrate data and workflows means scientists can be more effective and focus on the science itself.
When we talk about integration with our customers, there are a few different pieces. There are the partnerships between organisations and the integration with outsourcing, so we work with our customers to facilitate integration with the Contract Research Organisations (CROs) and Contract Development and Manufacturing Organisations (CDMOs) they work with, as well as within their own organisation.
Our pharma services group is implementing our capabilities to manage its [customer] CDMO operations across 25 sites. This has enabled us to partner with them as a customer, but also as a part of Thermo Fisher, to find the ideal state as they talk about their scientific ecosystem and how they take it forward from where they are today.
Richard Milne, Thermo Fisher:
We recognise that a lot of organisations have legacy investments in software applications, and many of those can be deeply embedded in their processes. Rather than looking at this as a revolution where you disrupt the existing situation, we are looking at it more as: how do you integrate into that, and therefore protect the investments people have made in other software tools, while providing integration across those, so people get an easier customer experience and also the value of integration across the different toolsets?
I don’t think Covid-19 is causing it. This is just my perspective, based on my own conversations with customers, but my feeling is that Covid-19 is the catalyst for this change. It would have happened anyway, but more slowly and more sporadically.