AI 'has potential to revolutionise life sciences' – Pistoia Alliance

Some 44 per cent of life science professionals are using or experimenting with AI and deep learning, while 94 per cent expect an increase in use of machine learning within two years.

These are findings from a survey carried out by the Pistoia Alliance, a global, not-for-profit alliance that works to lower barriers to innovation in life sciences R&D.

The organisation surveyed 374 life science professionals on AI, machine learning (ML) and natural language processing (NLP). The survey found interest in these technologies is high, with almost half (44 per cent) of respondents already using or experimenting with AI. However, a number of hurdles to their widespread application were also identified – with a lack of technical expertise the most cited barrier for AI (30 per cent) and for ML/NLP (28 per cent).

The Pistoia Alliance believes collaboration between stakeholders is essential to overcoming these barriers, leading to 'augmented' AI that works alongside humans for positive outcomes. In particular, given how crucial data is to building AI algorithms that reveal meaningful insights, collaboration over data standards, benchmark sets and data access will be essential.

The survey found that beyond a lack of in-house technical expertise, issues around data are a particular stumbling block to AI projects.

Specifically, respondents stated that access to data (24 per cent) and data quality (26 per cent) were two of the biggest barriers to AI projects within their organisation. These same issues – access to data (26 per cent) and data quality (19 per cent) – were cited again when respondents were asked about obstacles to ML and NLP projects. Life sciences and pharmaceutical R&D currently generates huge volumes of data, supplemented by growing data sets collected from health devices and sensors connected to the Internet of Things (IoT). However, access to this data, and the formats in which it is stored, vary widely. High-quality data is fundamental to ensuring AI gives accurate and reliable outputs; this is a significant finding, and one the industry will need to overcome if AI is to assist researchers.

'AI has the potential to revolutionise life sciences and healthcare – all the way from early preclinical drug discovery to selecting precision treatments for individual patients,' said Steve Arlington, president of The Pistoia Alliance. 'Our survey data shows that while life science professionals are already exploring how AI, ML and NLP can be used – there are clear gaps in the knowledge, data and skills that will enable more pharma and biotech companies to achieve tangible results from AI.

'Impediments to success, such as a lack of industry-wide standards for data format, will need to be addressed, if the potential of AI and ML is to be realised. We urge those in the pharmaceutical, biotechnology and technology industries to explore ways in which they can collaborate now, to find answers to common problems of the future.'

When asked about the use of AI and associated technologies, the results revealed that applications are varied. The largest share (46 per cent) of AI projects currently takes place in early discovery or preclinical research phases; NLP is also employed by just under a third (30 per cent) of respondents during early phase research.

Other applications of AI were given as development and clinical (15 per cent), and imaging analysis (8 per cent). More than a fifth (23 per cent) of respondents are using ML for target prediction and repositioning, followed by biomarker discovery (13 per cent) and patient stratification (5 per cent). However, adoption is not universal: a notable number of respondents are not using AI (11 per cent), NLP (27 per cent) or ML (30 per cent) at all.

Moreover, 8 per cent of respondents admitted they knew ‘next to nothing’ about AI and deep learning, highlighting the need for greater education and knowledge sharing.

