FEATURE

Dealing with data

Informatics experts share their experiences of implementing new technologies and managing change in the modern laboratory

Mark Gonzalez, technical director at Labware


What technologies are requested by laboratory users?

Mark highlighted that there are clear divisions between the two primary groups of existing customers and potential users.

‘In terms of technology, the question that existing users ask about most often is mobile. That is not to say that they have a clear plan on how to use the technology, but they have smartphones and tablets in their personal lives and they want to know what we could do using that layer,’ said Gonzalez. ‘That is the technical question that we get the most from our existing users. They don’t tend to ask about cloud because they have a running system. The IT department might be interested in moving to the cloud, but since the system is already running, they are not likely to want to change that in the short term,’ Gonzalez added.

Gonzalez noted that mobile technology as a solution for laboratory users ‘is a solution that needs to solve real-life problems’.

‘What we want to do is solve the right problems we don’t want to just throw out a bunch of technology that doesn’t really solve anything of any business value.’

One example that he noted was the ability to use mobile devices in untethered mode. This could allow users to perform actions such as entering data without a continuous connection to the LIMS server. Once the connection is re-established, the data can be automatically sent to the LIMS. ‘One value of mobile technology is that people could work remotely to collect data, even if they don’t have a connection to the LIMS server,’ concluded Gonzalez.
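The untethered workflow Gonzalez describes amounts to a store-and-forward pattern: capture locally, then flush to the server when connectivity returns. The sketch below is a minimal, hypothetical illustration of that idea, not LabWare’s actual API; the `OfflineQueue` class and the `send` callback are assumptions for the example.

```python
from collections import deque


class OfflineQueue:
    """Buffers measurements locally while the LIMS server is unreachable."""

    def __init__(self):
        self._pending = deque()

    def record(self, sample_id, value):
        # Store the reading locally regardless of connectivity.
        self._pending.append({"sample_id": sample_id, "value": value})

    def sync(self, send):
        """Flush queued records via `send`; requeue on failure.

        `send` is any callable that transmits one record and raises
        ConnectionError if the link is down. Returns the number sent.
        """
        sent = 0
        while self._pending:
            record = self._pending.popleft()
            try:
                send(record)
                sent += 1
            except ConnectionError:
                # Connection dropped again: put the record back and stop,
                # preserving the original capture order.
                self._pending.appendleft(record)
                break
        return sent
```

A field technician would call `record()` throughout the day and `sync()` whenever the device reconnects; nothing is lost if `sync()` fails midway, because unsent records stay queued in order.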

Renaud Acker, chief operating officer at AgiLab

What are the main challenges that your users face when deploying digital informatics technology?

Renaud Acker explains that, for many of AgiLab’s customers, ‘change control’ is the main challenge: ‘Processes have changed by using a new generation of software. Users must be trained, standard operating procedures (SOPs) must be adapted, data handling and traceability must be managed in a different way.

‘This means that lab software should be user-friendly for daily use. Screens must be clear with adapted vocabulary,’ stated Acker. ‘However, it must also be adapted to laboratory processes and objectives in order to increase efficiency and productivity and finally it must facilitate collaboration between scientists. 

‘The main issue is to handle data, not only to store it but also to be able to use that data effectively,’ stressed Acker. ‘Laboratories are producing and accumulating more and more data from experiments, analysis, bibliography and other areas. For instance, one screening campaign could generate hundreds of thousands of results, and a query on a citation source like PubMed can return thousands of references.

‘The challenge is to centralise data, to manage and gather information, to generate knowledge from data – and to keep track of what has been done, how it has been done, if it has worked or not. Big data technologies will be very useful to annotate, explore and exploit the whole set of data generated in labs and gathered from external public sources.

While there are clear benefits to using the latest software, cost of investment can be a big issue that prevents companies from replacing legacy infrastructure – but it is not the only reason, as Acker explains. 

‘There are at least two main reasons why labs don’t move easily to new lab software. First, many companies and labs have spent fortunes on their first generation of lab software. Second, they have customised these products with considerable effort and money, so they aren’t eager to move. When a lab needs to be compliant with GMP, GLP and so on, it has many other points to manage: change control, system validation, certification and audits.’

All of these aspects can make a move more challenging, but ultimately choosing not to upgrade impacts agility – and the speed and quality of further laboratory operations.

Another aspect that AgiLab was keen to stress was that cloud deployments are increasingly seen as a good choice for many laboratories. However, the move to cloud-based informatics requires users to change their mindset as they move from silos of data to a more fluid model of shared data sets and collaboration.

‘Labs are still working in silos,’ added Acker. ‘New R&D processes should break this logic in order not only to exchange data but mostly to anticipate issues by gathering scientists working on a project. Collaboration is essential for R&D project success. Cloud applications could help to exchange data and ideas between labs in different locations, and between industrial partners and academics.’

Acker concluded that cloud-based laboratory informatics is growing due to a number of factors, including robust security, the option to host and manage services off-premise, and cloud subscription models that can reduce initial investment and running costs.

Oscar Kox, business development manager at iVention

How important are digital technologies to the modern laboratory? 

‘There is a lot of innovation available in the market but I don’t think many labs are picking it up as early adopters,’ said Kox. 

‘People should ask themselves how important it is to adopt new technologies – to innovate in the lab. Having worked in this industry for more than 20 years, I can say that of course it is important. You want to see new technology getting into the laboratory because you want to reduce FTEs, increase throughput or improve quality.’

Kox gave an example of a large implementation that iVention is managing in Europe, consolidating as many as seven individual installations, each with its own custom software and additional software connected to it.

‘They cannot upgrade everything all at once,’ he said. 

The presence of custom software in each implementation means that each installation is essentially a new piece of software.

‘Now if you compare this to the capabilities of a web-based system you can rollout to all of those sites without custom software – there is a big benefit,’ said Kox. 

‘For people who are now looking for a new LIMS or ELN, the decision they make now will affect them for the coming five to 10 years, because that is the scale of the investment you are looking at.’

Kox stressed customers should ask themselves: will this big conventional LIMS vendor help me to innovate? ‘That is where the gap comes in. There’s a lot of innovation out there but can I adopt it right now, because of the systems I have in place?’

He explained that iVention has installed systems across very large organisations. He gave an example of a pharma client that wanted to roll out a system for 300 users across seven countries in eight months. Kox also mentioned that this solution was hosted for the client by iVention.

‘I don’t think there are many of those rollouts completed successfully with a conventional LIMS system,’ said Kox.

‘They are a big company with their own IT department and we are hosting it for them because we have all the technology in place to automate everything, so all the upgrades can be done automatically.’ He explained the success of this rollout has meant this company is now using iVention as a strategic partner for much larger rollouts in the future.

Kox said: ‘I have seen organisations with very old software, which can be costly and time consuming to maintain and upgrade. Some IT directors would say the upgrade would cost more than the original installation, so they either try and run for a few more years or select a new system.’

He said one of the main challenges when dealing with legacy LIMS or ELN systems is a lack of maintenance and upgradability: ‘The biggest thing I see is customers paying maintenance and they cannot upgrade. Support cannot help them because they have an old version and in many cases this support money is wasted because the system is too old to be properly supported.

‘I would strongly recommend firms look at their maintenance contracts and ask themselves “what are we getting back from it?”’
