An image of a galaxy in space

Project awarded funding to investigate how astronomy can be communicated and studied by ‘listening’ to data

27 June 2023

4 min read

A new project to explore the Universe using sound has been awarded over £500k funding.

The grant will fund a team of researchers at the University of Portsmouth to create new approaches to sonification, investigating how astronomy can be communicated and studied by ‘listening’ to data.

‘Sonification’ is a term for presenting data using non-verbal sound; it is far less common than data visualisation techniques.

The project will build on the development of a free, open-source code, STRAUSS, which gives researchers the tools they need to listen to their own data by translating it into frequencies that can be heard by the human ear.

It is funded by an Early Stage Research & Development Grant from the UK’s Science and Technology Facilities Council. 

Using STRAUSS to sonify the stars

Using STRAUSS to sonify the stars appearing at night, where the note and volume tell us about the colour and brightness of each star. The full 360° video, which can be played using a VR headset, also uses spatial audio to map the positions of stars in the sky.
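As a rough illustration of the idea (a minimal sketch in Python, not the STRAUSS code itself, and using an invented three-star catalogue), this kind of mapping can be written in a few lines: a star's colour sets the pitch, its brightness sets the volume, and its position on the sky sets the stereo pan.

    import numpy as np
    from scipy.io import wavfile

    # Hypothetical catalogue: (colour index, apparent magnitude, azimuth in degrees)
    stars = [(-0.2, 1.5, 40.0), (0.6, 3.0, 120.0), (1.4, 4.5, 250.0)]

    RATE = 44100       # audio samples per second
    NOTE_LEN = 0.5     # seconds of sound per star
    samples_per_note = int(RATE * NOTE_LEN)
    track = np.zeros((samples_per_note * len(stars), 2))   # stereo buffer

    for i, (colour, mag, azimuth) in enumerate(stars):
        # Bluer stars (smaller colour index) get higher notes, mapped onto ~220-880 Hz
        freq = 880.0 - 660.0 * np.clip((colour + 0.5) / 2.0, 0.0, 1.0)
        # Brighter stars (smaller magnitude) are louder
        amp = 10.0 ** (-0.4 * mag)
        # A crude stand-in for spatial audio: pan between left and right by azimuth
        pan = (np.sin(np.radians(azimuth)) + 1.0) / 2.0

        t = np.arange(samples_per_note) / RATE
        tone = amp * np.sin(2.0 * np.pi * freq * t)
        start = i * samples_per_note
        track[start:start + samples_per_note, 0] += tone * (1.0 - pan)  # left channel
        track[start:start + samples_per_note, 1] += tone * pan          # right channel

    # Normalise and write a stereo WAV file
    track /= np.abs(track).max()
    wavfile.write("night_sky.wav", RATE, (track * 32767).astype(np.int16))

The full planetarium soundscape is of course far more sophisticated, but the principle is the same: properties of the data drive the pitch, loudness and apparent position of each sound.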

 

Dr James Trayford, from the University of Portsmouth’s Institute of Cosmology and Gravitation, wrote the code for STRAUSS. He said: “We’re excited to win this grant. By popularising listening to data, we hope to open astronomy to more people.

“Astronomy in particular has a strong visual bias. We are used to the stunning images of planets, stars and galaxies provided by telescopes, collecting the light arriving to us through the silent vacuum of space.

“By representing this data through sound, we hope to gain new perspectives, both for those already familiar with the data and for those experiencing it for the first time.”

The project addresses a broad range of audiences and uses, from classroom education and public engagement to research tools for professional astronomers.

Dr Trayford believes sonification has potential not just as a stand-in for visuals, but also as a way to convey information that may be hidden by more traditional approaches.

He said: “While listening to data may be a new concept for many people, we are used to sound conveying information beyond language, from simple notifications on our phones, to music expressing complex thoughts and feelings. By drawing from and building on these techniques to convey data, we hope to understand it in new ways. Sight and hearing are good at picking up different things, and using both could lead to new discoveries.”

Dr Trayford and his team have been working on two new papers. The first, to be presented at the International Conference on Auditory Display 2023 in Sweden, will introduce the STRAUSS code to the scientific community, showcasing its diverse applications, from sonifying scientific data for analysis to making immersive soundscapes of the night sky for those who can’t experience it visually.

A second paper uses the STRAUSS code to explore new forms of data coming from telescopes such as JWST. In particular, it tests a new technique, converting the frequencies of light directly into audible frequencies, allowing scientists to hear specific resonances of different chemical elements and their motions, beyond what can be seen in an image alone.
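As a back-of-the-envelope illustration of that idea (a sketch of the general principle rather than the method used in the paper, with an assumed scale factor), light frequencies of a few hundred terahertz can be compressed into the audible band by a simple rescaling, so that each emission line lands at its own pitch and the Doppler shift of moving gas is heard as a change in that pitch:

    # Illustrative only: scale light frequencies into the audible range so that
    # each emission line of a chemical element lands at a characteristic pitch.
    C = 299_792_458.0      # speed of light, m/s
    SCALE = 1.0e-12        # assumed scaling: 1 THz of light -> 1 Hz of sound

    def audible_pitch(rest_wavelength_nm, velocity_m_s=0.0):
        """Map an emission line to an audio frequency, with a simple Doppler shift."""
        rest_freq_hz = C / (rest_wavelength_nm * 1e-9)               # light frequency
        observed_freq_hz = rest_freq_hz * (1.0 - velocity_m_s / C)   # receding gas -> lower frequency
        return observed_freq_hz * SCALE                              # compress into the audio band

    # H-alpha (656.3 nm) at rest versus gas receding at 300 km/s:
    print(audible_pitch(656.3))           # about 457 Hz
    print(audible_pitch(656.3, 3.0e5))    # a slightly lower note: an audible "redshift"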

Dr Chris Harrison, co-author of the papers and co-ordinator of the team’s wider project from Newcastle University, said: “One of the unique features of our code is its flexibility to be used for a wide variety of applications. We have demonstrated its effectiveness for everything from educational resources and planetarium shows, which are also accessible to blind and vision-impaired audiences, through to data analysis tools for cutting-edge astrophysics research.”
