How one astronomer hears the Universe

17 Nov 2021

Astronomy is inextricably associated with spectacular images and visualizations of the cosmos. But Wanda Diaz Merced says that by neglecting senses other than sight, astronomers are missing out on discoveries.

For 15 years, Diaz Merced, an astronomer at the International Astronomical Union (IAU) Office for Astronomy Outreach in Mitaka, Japan, has pioneered a technique called sonification. The approach converts aspects of data, such as the brightness or frequency of electromagnetic radiation, into audible elements including pitch, volume and rhythm. It could help astronomers to avoid methodological biases that come with interpreting data only visually, argues Diaz Merced, who lost her sight in her twenties.

Last month, she co-organized the IAU’s first symposium dedicated to diversity and inclusion. The event, in Mitaka from 12 to 15 November, showcased, among other topics, efforts aimed at presenting cosmic data in formats that are accessible through senses other than vision.

Diaz Merced spoke to Nature about how bringing these efforts into mainstream science would boost accessibility — and discoveries.

How did you begin your work with sonification?

Sonification has been around for a long time. In 1933, for example, US physicist Karl Jansky reported detecting the first radio waves from space, as an audible hiss in his antenna. But at some point, visualization came to dominate the way we interpret astrophysical data. When I was an intern at NASA in 2005, my mentor, Robert Candey, wanted me to create a prototype data analysis tool that would familiarize blind people with space-physics data. So we developed software that could map astronomical data into sound — its pitch, rhythm and volume. Then, in my 2013 PhD dissertation at the University of Glasgow, UK, I proved that it is useful.
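As a rough illustration of the kind of mapping she describes, the hypothetical Python sketch below turns a one-dimensional brightness series into a sequence of tones, with higher values played as higher pitches, and writes the result to a WAV file. It is not the NASA prototype or any specific sonification package; the function name and parameters are invented for this example.

```python
# A minimal sonification sketch (illustrative only, not Diaz Merced's software):
# each sample of a 1-D data series becomes a short sine tone whose pitch
# tracks the data value; the tones are concatenated and saved as a WAV file.

import wave
import numpy as np

def sonify(values, out_path="sonified.wav", rate=44100, tone_dur=0.12,
           f_min=220.0, f_max=880.0):
    """Render a data series as a sequence of sine tones (pitch follows value)."""
    values = np.asarray(values, dtype=float)
    lo, hi = values.min(), values.max()
    norm = (values - lo) / (hi - lo) if hi > lo else np.zeros_like(values)

    n = int(rate * tone_dur)
    t = np.arange(n) / rate
    chunks = []
    for v in norm:
        freq = f_min + v * (f_max - f_min)      # higher value -> higher pitch
        tone = 0.5 * np.sin(2 * np.pi * freq * t)
        tone *= np.hanning(n)                   # fade in/out to avoid clicks
        chunks.append(tone)

    pcm = (np.concatenate(chunks) * 32767).astype(np.int16)   # 16-bit PCM
    with wave.open(out_path, "wb") as w:
        w.setnchannels(1)
        w.setsampwidth(2)
        w.setframerate(rate)
        w.writeframes(pcm.tobytes())

if __name__ == "__main__":
    # Example input: a noisy light curve with one brightening event.
    rng = np.random.default_rng(0)
    brightness = rng.normal(1.0, 0.05, 200)
    brightness[120:140] += np.hanning(20) * 0.8
    sonify(brightness)
```

Listening to the output, the brightening event stands out as a brief rise in pitch, which is the basic idea behind mapping data attributes such as brightness onto audible ones such as pitch or volume.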

How did you do that?

I presented users with simulated data of astronomical spectra and asked them to look for a characteristic double peak that indicates a black hole. We had people try to identify signals that were masked by noise by using vision only, by combining visual interaction and sound, and by using audio only. We found that when you combine audio with visual interaction, your sensitivity to the signal improves. That tells us we need to focus on a transition to studying how these methods can benefit everyone, not only visually impaired people.
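The kind of test data she describes can be mimicked in a few lines: a double-peaked profile, of the sort that indicates a black hole, buried in enough noise that it is hard to spot. This is a hypothetical reconstruction for illustration, not the actual stimuli from the dissertation; the sonify() sketch above could then render it for audio or audio-plus-visual inspection.

```python
# Hypothetical stand-in for the dissertation's simulated spectra: a double-peaked
# profile (two Gaussians) hidden in noise strong enough to mask it.
import numpy as np

def gaussian(x, centre, width):
    return np.exp(-0.5 * ((x - centre) / width) ** 2)

rng = np.random.default_rng(1)
x = np.linspace(-10.0, 10.0, 400)                 # arbitrary spectral axis
signal = 0.6 * gaussian(x, -1.5, 0.7) + 0.6 * gaussian(x, 1.5, 0.7)
spectrum = signal + rng.normal(0.0, 0.4, x.size)  # noise masks the twin peaks

# sonify(spectrum, out_path="spectrum.wav")       # listen for the double peak
```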

Can you describe a real-world example?

There are many. Sonification can help us to study the habitability of an exoplanet, by understanding how much high-energy cosmic and solar radiation interacts with its magnetic field or atmosphere. Such interactions cause fluctuations in the electromagnetic emission from that star system that vary with frequency. But because astronomers usually separate out different frequency components into many graphs, this is easy to miss. With sonification, we can listen to all the different frequencies together and pick out the signal from the noise.
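A hedged sketch of that idea: instead of plotting each frequency channel on its own graph, give every channel its own carrier pitch and mix them into a single audio stream, so that fluctuations occurring across several channels at once can be heard together. The channel layout and carrier frequencies here are invented for illustration.

```python
# Mix several frequency channels into one audio stream (illustrative only):
# each data channel amplitude-modulates its own carrier tone, and the carriers
# are summed, so correlated fluctuations across channels stand out as a
# chord-like change rather than being spread across separate plots.
import numpy as np

def mix_channels(channels, rate=44100, duration=6.0,
                 carriers=(220.0, 330.0, 440.0, 550.0)):
    n = int(rate * duration)
    t = np.arange(n) / rate
    mix = np.zeros(n)
    for data, freq in zip(channels, carriers):
        data = np.asarray(data, dtype=float)
        # Stretch the slow data series to audio length and use it as an envelope.
        envelope = np.interp(np.linspace(0, len(data) - 1, n),
                             np.arange(len(data)), data)
        envelope = (envelope - envelope.min()) / (np.ptp(envelope) + 1e-12)
        mix += envelope * np.sin(2 * np.pi * freq * t)
    return mix / max(len(channels), 1)            # keep the summed signal in range
```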

Is it common to use sound as well as visual approaches to represent astrophysical data?

No. People are courageously focusing on developing software to allow interaction with sound and using tactile interactions. But there is no work for people who are sighted, or efforts to learn, for example, what timbres increase our ability to understand data with sound, and which hinder it. We have to have it available, and not just for people in outreach. Right now, we are missing discoveries because we are only focused on some visual ways of interacting with the data. I get disappointed when people focus on using these techniques for ‘learning playfully’, when we should be focusing on mainstreaming.

What do you mean by mainstreaming?

Mainstreaming means equalizing participation, and progress, through all the different ways of approaching research. It would mean that visually impaired people, as well as others who are marginalized, could participate equally in the mainstream of science and choose how they want to progress in the field. Right now, we do not enjoy that freedom.

Are there many visually impaired astronomers working in the field today?

There are only perhaps four visually impaired astronomers in the world — few are mainstreaming. If they are, it’s often because they still have a bit of visual-field perception. Even with text-to-speech readers, it is challenging to go through charts and footnotes, to manipulate data and to publish and read papers. We also need to be able to recuperate from losses of time because of disability. If I can’t come to work on Tuesday, perhaps I can come on Saturday, or work between 3 p.m. and 9 p.m.

Has the situation improved over the past decade?

Today, people are making a bit more of an effort to facilitate accessibility. My collaborators in Argentina — Beatriz Garcia and Johanna Casado — and I are analysing and developing a sonification software prototype that is designed around the user’s experience. At the meeting, we also found people who are realizing that tactile interactions can successfully represent data and be used for mainstreaming. Now we have to provide the scientific field with the evidence to help them to change their mindset.

What are your personal hopes for the future?

I would love to be a working astronomer, carrying out research. But right now I don’t see any opportunity, so I have to look at the bigger picture. When my contract runs out on 31 December, I’m going home to Puerto Rico, where I hope I can teach. I want a field where we all work as equals, and where factors such as age, disability, gender and socio-economic status do not control my progress. I want my field not to underuse, misuse or neglect the human potential we all have for exploration and inquiry, and to trust that we all may contribute just as we are.

You can read the complete article here.