NEWS

Dassault brings PLM to the life sciences

Dassault Systèmes has launched three new products that expand the scope of its product lifecycle management (PLM) technology into the life sciences and biotechnology markets, with the intention of supporting research and development in those fields.

Dassault Systèmes’ industry solution experiences for pharmaceutical and biotechnology companies introduce a single digital environment to streamline the scientific and operational processes involved in developing new drugs. They allow for biological and chemical modelling and simulation, open collaborative discovery and research, unified laboratory management, efficient production processes and integrated quality and regulatory management.

Dassault has identified these areas as critical capabilities for developing advanced pipelines and bringing effective patient therapies to market faster and at lower costs.

The new products are ‘ONE Lab’, ‘Designed to Cure’ and ‘Made to Cure for Biopharma’. They complement the company’s existing ‘License to Cure for Biopharma’ and extend its PLM technology into the life sciences and biotechnology. The aim is to provide digital continuity to help companies accelerate and improve therapeutics discovery, development, approval, production and patient adoption.

‘ONE Lab’ integrates people, resources, processes, data, analysis and documentation so that laboratories can leverage knowledge and collaborate more efficiently on researching, developing and testing products.

‘Made to Cure for Biopharma’ leverages process and quality data and knowledge across multiple organisations and geographies, optimising processes and products and reducing process development times and technology transfer costs.

‘Designed to Cure’ uses collaboration, common knowledge, predictive analytics and virtual design and simulation to model and identify higher quality candidates earlier in the process.

‘License to Cure for Biopharma’, first launched in 2014, ensures regulatory compliance and high-quality process management.

The life sciences industry faces cost pressures and long cycle times that leave little room for innovation. Research and development expenditures exceed those of the aerospace and automotive industries, yet the success rate of new drug development from initial concept is less than one per cent. This risk is compounded by patent expiration, global competition, strict regulatory requirements, and growing volumes of scientific data and operations, both within the enterprise and with outsourced partners working in silos.

These new products give pharmaceutical and biotechnology companies access to an end-to-end, holistic approach to digitally transform the complex processes behind the design, development and production of novel therapeutics.
