Screener supports outsourced pharma research

Genedata, which provides software for drug discovery and life science research, is the latest software provider to announce that its product can help large pharma companies outsource research to, and share data and results with, Contract Research Organizations (CROs).

The implications of outsourcing in the pharma industry were a hot topic at the recent Paperless Lab Academy (PLA) in Barcelona. As reported in Scientific Computing World, the ‘externalisation’ of the pharmaceutical industry dominated proceedings at the PLA. In his keynote address, Patrick Pijanowski of Lab Answer told the meeting that ‘today, the pharmaceutical industry is outsourcing virtually every function,’ with the result that data integrity and system security were major concerns. Others voiced concerns that pharma companies could lose core competencies to their contractors, resulting in lost knowledge and ‘dead’ data.

Roche has adopted Genedata Screener across its operations globally to reduce costs for R&D. According to the software company, it provides a framework for systematic import, processing, and end-result propagation for all types of screening experiments, making it ideal for different collaboration models among pharmaceutical companies and CROs.

The issue of disruptive technologies – specifically the cloud and to a lesser extent, big data – was also addressed at the PLA. Some delegates were clearly uneasy about hitching themselves to the cloud due to concerns over data security and the protection both of intellectual property and of personal information. But, as the PLA progressed, it became clear that, in a world of externalised pharma companies, the lack of standardisation of data formats means that even with current technologies companies are losing knowledge and are unable to extract the maximum value from their data.

Genedata is promoting both web-based and cloud-compatible versions of its Screener software, so that it can be installed at a CRO site, on-site, or in the cloud. The software can be set up to reflect the company’s own business logic, allowing a pharma company to maintain consistent workflows, with its company-specific business rules for results, regardless of which CRO performs the analysis.

However, successful collaborative research with external partners requires all parties to have insight into the data produced by an experiment, the data analysis methods applied, and the rationale for decision-making (e.g. specific artifact corrections, or decisions on hit-calling and compound progression). Having both in-house and external researchers use the same methodology and standardised processing of results makes such collaboration easier.

