Unlocking Cosmology

 

Space, the final frontier. James T. Kirk may have to alter this statement, as simulation software is helping astronomers probe the most distant reaches of our universe and propose fascinating new theories.

Why is the universe filled with galaxies and stars, instead of the structureless form it had shortly after the Big Bang? Why do galaxies die? How do you build a radio telescope to probe the outer reaches of the universe? And how can we make such radio telescopes as accurate as possible?

Astronomers are developing increasingly innovative simulation tools and methods to answer such fundamental questions and to work around the limitations of our current observational capabilities.

Investigating dying galaxies

In the universe, some galaxies actively form stars, while others have close to no star formation activity. The first group is considered to be alive, the second dead. This strong bimodality between the two populations means that the mechanism leading to a galaxy's death must be abrupt. However, it is not yet clear what causes this death.

Researchers at the astrophysics department of the French Alternative Energies and Atomic Energy Commission (CEA) are studying star-forming galaxies at the epoch when cosmic star formation activity was at its peak. Real-world observations cannot be used to investigate this phenomenon directly, as current telescope resolutions are not high enough to reveal the details of galactic interiors.

Dr. Orianne Roos, researcher at the astrophysics department of the CEA, explained: ‘I instead use numerical simulations, in order to model the physical mechanisms at play in their evolution, including star formation activity, galactic outflow driving and supermassive black hole physics.’

The goal of this research is to see whether or not supermassive black holes (very energetic objects found at the centre of some galaxies) can kill their host galaxies, either by quickly removing all the gas content through strong, mass-loaded galactic outflows that quench the host, or by delaying or suppressing in-place star formation by heating the gas very efficiently. The latter hypothesis also provides a link between star formation and the energy released by supermassive black holes.

The issue is that such simulations need to be compared with observational data, but those observations are not resolved well enough to see galactic interiors. Instead, a statistical comparison is made across many low-resolution observations of such galaxies. The resolved large-scale behaviour is derived from these observations and compared with the large-scale behaviour in the simulations. If the two agree, the team can deduce that its numerical model of the internal physics is consistent with observations. Roos added: ‘Numerical simulations are test-beds for current theories and only the careful comparison with observations can provide insight on whether a theory holds or not.’

The team studied three galaxy masses, which span the whole range of masses in typical star-forming galaxies of the early universe, and three feedback configurations. Feedback is how a supermassive black hole (SMBH) or a star affects the host galaxy that ‘feeds’ it with gas. Typically, active SMBHs release energy in the centre of their host, driving high-velocity winds that expel gas, and heating or ionising their surroundings.

Stellar feedback is mainly composed of winds from young stars and supernova explosions. To disentangle the effects of SMBH feedback and stellar feedback, the researchers used three configurations: SMBH feedback only, stellar feedback only, and both combined. With different resolutions (from 12 parsecs down to 1.5 parsecs), this makes a total of 24 simulations, which together consumed 11 million core hours.

A parsec is an astronomical unit of length used to measure large distances to objects outside of our solar system. One parsec is equal to about 3.26 light years.
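As a quick check (not given in the article), that conversion follows from the parsec's definition as the distance at which one astronomical unit (AU) subtends an angle of one arcsecond:

```latex
% Worked check of the parsec-to-light-year conversion quoted above.
\[
  1\,\mathrm{pc} \;=\; \frac{1\,\mathrm{AU}}{\tan(1'')}
  \;\approx\; \frac{1.496\times10^{11}\,\mathrm{m}}{4.848\times10^{-6}}
  \;\approx\; 3.086\times10^{16}\,\mathrm{m}
  \;\approx\; 3.26\ \text{light years}.
\]
```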

Active SMBHs have long been accused of killing the galaxies in which they live because of the huge amount of energy they release into them. However, the researchers found that moderately active SMBHs are not able to rip the gas out of their host, nor are they able to suppress or delay in-place star formation, even when the interplay between supermassive black hole physics and stellar feedback physics is accounted for.

They are, therefore, not galactic quenchers. Roos explained: ‘What is even more surprising is that the amount of gas swept out of the host by both the active supermassive black hole and the stars is lower at higher mass, whereas black holes are more massive and more powerful in them.’

‘Therefore, even though the energy is there, the coupling with interstellar matter is so inefficient that the gas of the galaxy is not abruptly ripped out (it is to some extent, but not enough to kill the host suddenly). Further investigation is thus needed in order to explain why and how galaxies suddenly die,’ Roos added.

Our universe’s structure

Why is the universe structured in the way it is? Stars, quasars, galaxies and clusters of galaxies all exist in our current universe, but we cannot yet fully explain how this structured universe emerged from the smooth matter distribution of the early universe.

If we can understand the cosmic reionisation process, which refers to a period in the early universe where predominantly neutral intergalactic material was ionised by the first luminous sources, then we could start to understand how structures such as stars and galaxies formed, and why our structured universe exists.

Researchers from the University of Sussex are using simulations to model the cosmic reionisation process over a huge range of length scales. Dr Ilian Iliev, reader in astronomy at the University of Sussex, said: ‘Our simulations are aimed at understanding this process, which is inherently multi-scale, from the small scales at which stars form (parsecs) up to very large, cosmological ones (hundreds of megaparsecs or more).’

This research does not just need to span such a range of scales, but must also model multiple physical processes, including photoionisation and radiative transfer, gas heating and cooling, gas flows, gravity and many others. Iliev added: ‘The radiative transfer simulations are very computationally expensive, so require very large parallel machines to run efficiently.’

The researchers are now expanding their work to incorporate more processes, as Iliev explained: ‘We are currently working on new versions of our radiative transfer code, which will incorporate not just ionising radiation (as now), but also Lyman-Werner radiative transfer and X-rays. We are also improving our galaxy (sub-grid) modelling and including new physical processes for more detailed and realistic modelling.’

The on-site processing facility for the data captured by the LOFAR array.

SKA simulations

Finding answers to astronomical phenomena like cosmic reionisation in the real world involves the use of radio astronomy. Radio telescopes are large structures spanning hundreds of metres, usually comprising dozens of antennas arranged in an array to capture long-wavelength radio waves and make observations.

The SKA (Square Kilometre Array) facility will be the world’s largest radio telescope. Once it has been built, SKA will be 50 times more sensitive and 10,000 times faster at mapping the sky than the world’s total current radio astronomy facilities. It is also hoped SKA can be used to answer fundamental astronomical questions, such as: how did the universe form and evolve? Does extraterrestrial life exist? What is dark matter and dark energy?

The prospects and possibilities of SKA are monumental, and the radio telescope is relying on simulation software to analyse its structure and optimise its design before construction begins in 2018.

Researchers from Stellenbosch University are using simulations to analyse the electromagnetic systems of SKA, to mitigate radio frequency interference (RFI) between adjacent antennas and other systems, and to carry out the structural analysis of the telescope.

Dr. Danie Ludick, electrical and electronic engineer at Stellenbosch University, said: ‘The research includes multi-physics simulation; for example, a coupled mechanical-electromagnetic analysis of quantifying environmental effects of sensitive equipment (such as a radio telescope) in the harsh desert environments where these structures are typically used. I am also looking at accelerated simulation methods for the fast, yet accurate simulations of antenna arrays.’

The team is using Altair's HyperWorks suite for its analyses, including OptiStruct for the structural analysis of SKA and the computational electromagnetics software FEKO for the electromagnetic analysis.

Simulation is the only viable option for studying the interference characteristics of SKA in detail, but this demands considerable computational power, owing to the extensive verification of the computational electromagnetic (CEM) model and the electrical size of the structure. FEKO features a parallel method of moments (MoM) solver to speed up the simulations, allowing them to be completed in days at the Centre for High Performance Computing in Cape Town. The simulations sometimes produce surprising results, as Ludick said: ‘For the structural analysis, it was interesting to see the impact of the backing structure for a large reflector dish, specifically when it causes the main reflector to deform slightly.’

Simulation results are also compared with measurements from the MeerKAT radio telescope to validate the FEKO model and enable the rigorous RFI studies required to optimise the design, layout, shielding and bonding recommendations that mitigate the interference between these extremely sensitive antennas and systems.

New solvers are continually being added or improved to address the challenges this massive project poses. Ludick added: ‘We are putting a lot of effort into streamlining the coupled mechanical electromagnetic analysis of large dish reflector antennas, specifically to automate the process – which up to now is a bit manual. We are also looking at improving our CEM analysis using domain decomposition methods where possible.’

Antennas of the front-end receiver system, part of the LOFAR array.

Turning the sky on its head

For radio telescope projects like SKA to thrive, we also need to improve our understanding of our own planet and its ionosphere, which is a layer of the Earth’s atmosphere that contains a high concentration of ions and free electrons, and is able to reflect radio waves. The ionosphere is very problematic for radio telescopes that work at low frequencies (< 1 GHz) and, for those telescopes working at frequencies below about 300 MHz, it is probably the single biggest limitation to accurate instrumental calibration.

An innovative new methodology for assessing the effect of space weather on radio signals in the ionosphere, called IONONEST, has been devised by researchers from the University of Manchester’s Jodrell Bank Centre for Astrophysics. Dr Anna Scaife, head of the Interferometry Centre of Excellence at The University of Manchester, told Scientific Computing World: ‘When we started the IONONEST project, our thinking was: “if our measurements are so sensitively affected by changes in the ionosphere, why don’t we turn things around and try to use the telescope to make measurements of the ionosphere instead?”’

The researchers decided to focus on frequency-dependent absorption of radio waves as Scaife explained: ‘This could provide information on the height profile of the electron density through the lower ionosphere. Information on this height dependence is often lost by ionospheric probes, which predominantly focus on integrated electron density measurements. The absorption technique we adopted is also sensitive to the lowest part of the ionosphere, the D-region, where again many other techniques are not.’

However, recovering height profiles from frequency dependent absorption measurements is an ‘inverse problem’, where the measurement itself doesn’t provide the quantity of interest directly, but needs to be inverted. ‘There are different ways of tackling this kind of problem and we adopted a numerical approach, iteratively simulating our measurements from a model and comparing/fitting them to the data,’ according to Scaife.

One of the strengths of this approach is that the fitted parameters of the model can then be used to track the time-dependent behaviour of the ionosphere, as Scaife added: ‘Characterisation such as this is key to understanding how different space weather events impact on the ionosphere and, consequently, how to predict and respond to them.’

The IONONEST methodology uses a Bayesian approach, in which the ionospheric electron density profile as a function of height, generated from an input model, and the instrument’s response to that profile are iteratively simulated. The simulated response is compared with the measured data, the parameters of the model are updated, and the process is repeated to improve the fit.

Scaife explained: ‘We do not measure absolute accuracy, but rather the degree of similarity between our measurements and our simulations. We use this to characterise the allowable range in our recovered parameters. There are lots of ways to do this. In the IONONEST code, we make an assumption of uncorrelated Gaussian noise on our measurements and use this to construct a likelihood function that tells us how well the measured data fit the model we are assuming. We use this likelihood, combined with prior information on our model parameters, to calculate the posterior probability – that is, how well the model fits the data.

‘Finally we calculate a quantity known as the Bayesian evidence, which is the probability of the input model itself. We use the evidence value to make a relative assessment of different underlying input models, and decide which one is most likely to provide a true description of the data.’
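To make the approach Scaife describes more concrete, the sketch below shows a minimal Bayesian model comparison of the kind outlined above: a likelihood built on the assumption of uncorrelated Gaussian noise, uniform priors on the model parameters, and an evidence estimate used to rank two candidate models. The forward model (a simple power-law absorption curve), the parameter names and the brute-force Monte Carlo evidence integral are all illustrative assumptions; this is not the IONONEST code itself, which fits real ionospheric electron density profiles with a more sophisticated sampler.

```python
import numpy as np

# Toy forward model: frequency-dependent absorption following a power law.
# This is an illustrative stand-in, not the physical model used by IONONEST.
def forward_model(freqs_mhz, amplitude, alpha, nu0=50.0):
    return amplitude * (freqs_mhz / nu0) ** (-alpha)

# Bayesian evidence Z = integral of likelihood x prior over the parameters,
# estimated here by drawing samples from uniform priors (brute-force Monte
# Carlo; adequate for a two-parameter toy problem, not for real applications).
def log_evidence(data, sigma, freqs_mhz, amp_bounds, alpha_bounds,
                 n_samples=20_000, seed=0):
    rng = np.random.default_rng(seed)
    amps = rng.uniform(*amp_bounds, size=(n_samples, 1))
    alphas = rng.uniform(*alpha_bounds, size=(n_samples, 1))
    models = forward_model(freqs_mhz, amps, alphas)        # (n_samples, n_freq)
    resid = data - models
    # Log-likelihood assuming uncorrelated Gaussian noise on each measurement,
    # as described in the article.
    log_l = -0.5 * np.sum((resid / sigma) ** 2
                          + np.log(2.0 * np.pi * sigma ** 2), axis=1)
    m = log_l.max()                                         # log-mean-exp trick
    return m + np.log(np.mean(np.exp(log_l - m)))

# Synthetic 'measurements' for demonstration only.
freqs = np.linspace(20.0, 80.0, 30)                         # MHz
sigma = 0.05
rng = np.random.default_rng(42)
data = (forward_model(freqs, amplitude=2.0, alpha=1.5)
        + rng.normal(0.0, sigma, freqs.size))

# Compare two candidate models via their evidence: a power-law absorption
# model versus one that is (almost exactly) flat in frequency.
log_z_powerlaw = log_evidence(data, sigma, freqs, (0.1, 5.0), (0.5, 3.0))
log_z_flat = log_evidence(data, sigma, freqs, (0.1, 5.0), (0.0, 1e-6))
print(f"log-evidence, power-law model: {log_z_powerlaw:.1f}")
print(f"log-evidence, flat model:      {log_z_flat:.1f}")
```

In this toy comparison, the model with the larger log-evidence is preferred, mirroring how the evidence value is used to decide which underlying input model is most likely to provide a true description of the data.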

Visualisations of the large-scale outflows and small-scale star-forming clumps.

The team now hopes to compare more detailed and physically motivated models of the ionospheric electron density, in order to further its understanding of this complex region. ‘Expanding these models to include not only height information, but also wider spatial volumes of the ionosphere, would be an excellent path forward. To do this would require measurements from an array of telescopes; our ambition in this area is to see this realised using the European LOFAR array and, potentially in the future, the SKA1-LOW telescope,’ Scaife added.

As this last statement indicates, the fields of observational and theoretical astronomy are interlinked. As one field improves, it opens the door for new observations or theories to feed the other field. And so the cycle of research continues until, eventually, the secrets of our universe may be unlocked.


