
From musicianship to parallel programming models

EPCC's director of research Michele Weiland describes the more than 70 projects running at the centre and laments the UK's exclusion from the EuroHPC programme

Please tell us a little about your background and qualifications...

I grew up in Luxembourg and, after many years of studying music at the Conservatoire and wanting to become a professional musician, I decided that lifestyle was perhaps not for me after all, and that I wanted to opt for a 'safer' career instead.

Having always enjoyed maths, science and computing, I moved to Germany to study medical informatics at the University of Heidelberg, and then on to Edinburgh to do computing at Edinburgh Napier University. This was followed by a PhD at the University of Edinburgh, where I worked on the probabilistic modelling of musical structures. It was during my PhD that I was first exposed to the idea of 'performance' in computing – some of the models I was running took several days to complete and I began to wonder if this could be improved. Towards the end of my PhD, EPCC were advertising for entry-level software developers. Staying in Edinburgh and at the university appealed to me, so I applied, got the job, and I have been at EPCC ever since. I am now a senior research fellow, and since November last year I have also been EPCC's director of research.

How does your HPC centre/research centre use computing for research?

EPCC is probably best known in the world of HPC for running the UK's National HPC services, as well as for our teaching (we have run an MSc programme in HPC for the past 20 years) and training. But we do much more than that: we work with industry to help them use HPC systems, we do software development with a focus on high performance, and in recent years we have moved into the data analytics space.

In addition to all this, we do research into the fundamentals of HPC (such as the interoperability of parallel programming models, load-balancing techniques, domain-specific languages or energy efficiency); how best to exploit novel hardware such as Intel's DCPMM persistent memory, the Cerebras CS-1 wafer-scale chip, or the latest generation of FPGAs for high-performance and data-intensive computing; and evaluating and applying performance optimisation techniques to scientific software to enable massively parallel and efficient computation, for example for digital twin simulations.

Can you outline the type(s) of projects undertaken at your facility, and how researchers access computing resources?

At EPCC we are lucky that we host a wide range of systems for HPC and data intensive research – it means our own staff generally have local access to any of the resources they need for their work, be it use of the resources or research into the resources themselves. External (non-EPCC) researchers can access the computing resources we host in multiple ways: they might be awarded compute cycles as part of grants (this is mainly for the national services), they might qualify for free access (depending on the planned use or where they are based), or they can pay for compute time, including dedicated support. There are many routes – anybody who would like access and is not sure how to go about it should get in touch.

EPCC is an entirely project-driven organisation – right now, we have more than 70 different projects running, from short software development and consultancy contracts with industry, to multi-year and international research collaborations. As part of this, we have a long history of being involved in European projects and many of our proudest achievements are linked to collaborations with partners in Europe (such as the Fortissimo and NEXTGenIO projects). Sadly, the fact that the UK cannot take part in the EuroHPC programme means that this is no longer an option for us. It is a great shame to lose the opportunity to work with long-term collaborators in this way, not just for EPCC but for the whole of the UK HPC community.

What are the computing trends you see happening in your area of research?

The current trends in HPC are driven by two main considerations: the need for ever greater performance and power density; and the convergence of 'traditional' HPC with data-driven use cases that employ AI and ML techniques. The performance and energy requirements lead to heterogeneous system designs, as well as an increasing interest in hardware specialisation. The arrival of data-intensive workloads on HPC systems has also had an impact on system design (for example, offering high-throughput as well as high-performance file systems), but it has mainly influenced the software stack: technologies such as containers are now commonplace, and programming languages such as Python are widely used for numerical applications. Users increasingly demand better usability and more flexibility, and these demands have to be balanced with maintaining efficiency and the highest possible performance, security and reliability.

What are the main computing challenges you face in your research?

For us, the computing challenge is the research: how do we use new (and existing) hardware efficiently to enable scientific discovery, how do we ensure we keep abreast of the rapid changes in the hardware landscape, and how do we support others to do the same? That is what, for me at least, keeps working in HPC exciting every day and I welcome the challenges these changes bring.

A computing challenge that I welcome less, however, is related to the growing gap between society's day-to-day reliance on computing and its fundamental understanding of how computing works. Supercomputing and data analytics are skills that require specialist training and experience, but this is often not recognised (and this is probably worse in the data space than in HPC). I agree that using HPC systems should be accessible to and achievable by all with the right training, but the mindset that it should always be simple is dangerous, and it creates false expectations that HPC centres and researchers have to work hard to correct. Sometimes computing is difficult.

How could you further increase the speed or efficiency of your research in the future?

Our research is predominantly on the technologies that underpin HPC and data-intensive computing. The single most important factor in making it more efficient is collaboration. HPC is such a complex and fast-moving field that, in order to stay ahead, you need to work and engage with others – be they end-users, researchers or HPC centres.

Finally, do you have any fascinating hobbies, facts or pastimes you'd like to admit to?

I like spending as much time as I can outdoors, if and when the Scottish weather cooperates (which is more often than people might think!). I enjoy pottering in the garden, or at least I try to keep the weeds and rabbits at bay with moderate success. I also like (purely recreational) cycling and running. During the winter, cycling might be swapped for skiing – I love snow and winter sports. When I'm not outside, I'm probably reading, playing computer games or watching TV – not necessarily in that order. I don't think any of this qualifies as 'fascinating', but it keeps me sane and relaxed.

Interview by Tim Gillett

 
