ARM’s latest chip targets AI

ARM has taken a step into the artificial intelligence market with the announcement of DynamIQ, a new micro-architecture designed specifically for artificial intelligence (AI).

ARM states that the new architecture enables higher performance ceilings than ever before without compromising the company's trademark energy efficiency.

In a recent blog post, ARM’s GM of Compute Products Group, Nandan Nayampally, stated: ‘DynamIQ technology is a monumental shift in multi-core microarchitecture for the industry and the foundation for future ARM Cortex-A processors. The flexibility and versatility of DynamIQ will redefine the multi-core experience across a greater range of devices from edge to cloud across a secure, common platform.’

ARM says the new micro-architecture will directly address artificial intelligence applications through: dedicated processor instructions for machine learning (ML) and AI; increased multi-core flexibility, allowing SoC designers to scale up to eight cores within a single cluster; and safer autonomous systems, with faster responsiveness for advanced driver assistance systems (ADAS) and safety capabilities that enable systems compliant with Automotive Safety Integrity Level (ASIL) D – the highest integrity level in the ASIL classification.

ARM is joining the race for computing dominance in an increasingly competitive market of processing technologies optimised specifically for AI applications. While true artificial intelligence is still many years away, precursor technologies such as machine learning and natural language processing are already growing rapidly. Chip makers are responding by investing heavily in computing technologies that can support these emerging applications.

In November 2016, market research firm MarketsandMarkets published a report on the artificial intelligence industry, forecasting that the market will grow at a compound annual growth rate (CAGR) of 62.9 per cent from 2016 to 2022, reaching more than $16 billion by 2022.
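A CAGR forecast of this kind is simple compound-interest arithmetic: a market growing at rate r for n years scales by (1 + r)^n. The sketch below shows that calculation and works backwards to the 2016 market size the forecast implies; only the 62.9 per cent rate and the $16 billion 2022 figure come from the report – the derived base value is an illustration, not a figure from the source.

```python
def project(value, cagr, years):
    """Project a value forward at a constant compound annual growth rate."""
    return value * (1 + cagr) ** years

def implied_base(final_value, cagr, years):
    """Work backwards to the starting value implied by a CAGR forecast."""
    return final_value / (1 + cagr) ** years

# Report figures: 62.9% CAGR over 2016-2022 (six growth years), >$16bn by 2022.
base_2016 = implied_base(16.0, 0.629, 6)   # implied 2016 market size, in $bn
print(f"implied 2016 base: ${base_2016:.2f}bn")
print(f"check, projected to 2022: ${project(base_2016, 0.629, 6):.1f}bn")
```

Running this suggests a 2016 base of under $1 billion, which is what makes a near-twentyfold expansion in six years consistent with a 62.9 per cent annual rate.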

‘Naturally we see it as our responsibility to address the industry demand for ubiquitous AI, autonomous systems and accelerating the integration of virtual worlds toward a mixed reality experience. To address these needs, we are previewing our new ARM DynamIQ technology to enable our partners with higher performance ceilings than ever before without compromising efficiency,’ said Nayampally.

Nayampally continued: ‘DynamIQ technology will be pervasive in our cars, our homes, and of course our smartphones as well as countless other connected devices where machine learning is applied to the zettabytes of data they generate – both within the cloud and at the device level – advancing AI for a more natural and intuitive user experience.’

