By Elton Gomes
British scientists have turned the 5,000th photo of a sunrise on Mars, captured by NASA's rover Opportunity, into a two-minute piece of music.
The team used a technique called “data sonification.” “We are absolutely thrilled about presenting this work about such a fascinating planet,” said Domenico Vicinanza, Director of the Sound and Game Engineering (SAGE) research group at Anglia Ruskin University, Cambridge, as per an IANS report.
Here’s what a sunrise on Mars actually sounds like:
What is NASA’s Opportunity rover?
The Opportunity rover is a robotic rover that has been active on Mars since 2004. It was launched on July 7, 2003, as part of NASA’s Mars Exploration Rover program. The rover landed in Meridiani Planum on January 25, 2004, three weeks after its twin, Spirit, landed on the other side of Mars. However, since a dust storm earlier in 2018, the rover has been unable to communicate.
NASA said that it will continue its current strategy to make contact with the Opportunity rover. The American space agency believes that winds could increase in the next few months at Opportunity’s location on Mars, which could result in dust being blown off the rover’s solar panels. NASA said it would reassess the situation around January 2019.
What is data sonification?
In simple words, data sonification is the process through which data is converted into sound. The basic principles are similar to visualization. However, where visualizations generally use elements such as lines, shapes, and colours, data sonification uses sound properties such as volume, pitch, and rhythm.
“Image sonification is a really flexible technique to explore science and it can be used in several domains, from studying certain characteristics of planet surfaces and atmospheres to analysing weather changes or detecting volcanic eruptions,” Vicinanza said, as per a report in India Today.
How was the soundtrack made?
The scientists created the soundtrack by scanning the picture from left to right, pixel by pixel. They then combined the brightness and colour information of each pixel with terrain elevation data.
Through data sonification, they were able to assign each element a specific pitch and melody so that a photograph could be translated into music.
The quiet, slow harmonies are a result of the dark background, while the brighter, higher pitched sounds towards the middle of the piece are created by the sonification of the bright sun disk.
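The pixel-by-pixel mapping described above can be sketched in a few lines of code. The snippet below is an illustrative simplification, not the researchers' actual Mars Soundscapes pipeline: it assumes a simple linear mapping from brightness to pitch, and the frequency range and toy pixel values are invented for the example.

```python
# Minimal sketch of image sonification: scan "pixels" left to right and
# map each brightness value (0-255) to a pitch. Dark pixels yield low
# tones; bright pixels (like a sun disk) yield high tones.

def brightness_to_frequency(brightness, low_hz=110.0, high_hz=880.0):
    """Map a 0-255 brightness value linearly onto a frequency range.
    The 110-880 Hz range (A2 to A5) is an assumed choice for this sketch."""
    return low_hz + (brightness / 255.0) * (high_hz - low_hz)

def sonify_row(pixel_row):
    """Convert one row of pixel brightness values into a pitch sequence,
    one frequency per pixel, scanned left to right."""
    return [brightness_to_frequency(p) for p in pixel_row]

# A toy image row: dark background with a bright "sun disk" in the middle.
row = [10, 12, 15, 180, 250, 255, 250, 180, 15, 12, 10]
melody = sonify_row(row)
print([round(f, 1) for f in melody])
```

In a fuller version, colour and elevation data could drive additional sound properties such as volume or rhythm, mirroring the multi-channel mapping the team describes.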
Discussing the benefits of data sonification, the team said that the technique can be applied in health science, where scientists could devise new methods to assess the occurrence of certain shapes and colours – this could be useful in image diagnostics.
Audience to have first-person experience of sunrise on Mars
Dr Vicinanza, of Anglia Ruskin University, and Dr Genevieve Williams, of the University of Exeter, will present the world premiere of the piece. The soundtrack has been named Mars Soundscapes, and it will be presented at the NASA booth at the Supercomputing SC18 Conference in Dallas on November 13.
For a surreal experience, the piece will be presented using both conventional speakers and vibrational transducers so that the audience can feel the vibrations with their hands.
Other instances of creating music from data
Vicinanza has previously composed music based on particle data used to discover the Higgs boson. He was also involved in composing music from magnetometer readings from the Voyager mission.
In 2014, to mark CERN’s 60th anniversary, Vicinanza composed a 12-minute piece (for harp, guitar, two violins, a keyboard, a clarinet, and a flute) based on data from the four major experiments at the Large Hadron Collider.
Elton Gomes is a staff writer at Qrius