Welcome to our multisensory map – a portal to the captivating world of Artsoundscapes!

By Daniel Benítez and Lidia Alvarez-Morales

The dissemination strategy of the Artsoundscapes project has moved a step forward with the development of a multisensory interactive map, released on our website in May 2024. We envision this map as a direct glimpse into the scope of our research, serving as a window into a world where archaeology, acoustics, and technology converge.

In the first instance, the multisensory map was conceived as a comprehensive visual narrative of the geographical expanse covered by the Artsoundscapes project's extensive fieldwork campaigns in Siberia, Catalonia, Valencia, Cádiz and South Africa. In order to showcase the group's work around the world, other campaigns in which the project has collaborated, or that belong to previous research projects, have also been included. A second goal was for the map to serve as a multisensory repository for experiencing the acoustics of the great variety of rock art sites we have studied, transcending the complexities of data representation by offering a more intuitive and accessible rendering of the auditory landscapes we have documented.

In this entry of our blog, we explain the development process of the multisensory map, emphasizing the integration of spatial audio rendered by means of auralizations with visual material gathered on-site.

Where We’ve Been: Protecting the location of the rock art sites

The first task in developing this multisensory map involved setting the location of all the sites studied during each fieldwork campaign, so that the sonic journey was accurately represented. However, finding a compromise between accuracy and rock art preservation proved challenging. After we developed the first version of the map with the sites' exact locations, the archaeologists in the project raised the need to withhold such information, since many of these open-air rock art sites, lacking conventional fences or barriers, are particularly susceptible to the impacts of human interference. This cautious approach was therefore adopted to mitigate the risk of these fragile and often unprotected cultural manifestations falling victim to vandalism.

Fig 1. Example of two rock art motifs extracted with an angle grinder, Abric de Benirrama (municipality of La Vall de Gallinera, Province of Alicante – Spain). Photograph by Margarita Díaz-Andreu.

In view of the above, instead of using the exact coordinates for each rock art site on the interactive map, we decided to set them at an approximate location. This is clarified in each card on the map.

Fig 2. Screenshot of the map showing the approximate location of the studied sites.

The Wonders We’ve Seen: Selecting the Visual Material

For each rock art site, we gathered a comprehensive set of photographs documenting various aspects of the fieldwork, including general views of the rock art sites, details of the art, and the acoustic tests. Some of these photographs were taken with a 360-degree camera (Ricoh Theta) to capture not only panoramic views of the surroundings but, more importantly, a 360-degree view from the position of the loudspeaker (the emitter’s perspective) and from the microphone position (the listener’s perspective).

Fig 3. Top –Partial view of one of the rock art panels at Christmas shelter, Kamberg, South Africa. Bottom – Picture taken during the acoustic measurements at the same site from one of the listener positions. Photographs by Neemias Santos da Rosa @Artsoundscapes project.

In particular, the 360-degree photographs taken from the listener’s perspective are used in the map to create the immersive videos – commonly referred to as 360-degree videos or spherical videos – later posted on YouTube. In these videos, the orientation of the viewpoint can be modified using a mouse or any other compatible device. By adding Ambisonics audio to the 360-degree videos, it is also possible to dynamically change the aural perspective: the visual movement is synchronized with an appropriate sound representation while the video plays, making the videos interactive.

Hosting these immersive videos on YouTube serves as an excellent dissemination method in terms of interaction with, and accessibility to, the research findings.

Restoring Sound to Rock Art Sites: First-Order Ambisonics impulse responses & Auralizations

The sonic material included in our multisensory map comprises a selection of sound extracts related to prehistoric instruments, contextualized in each site by means of auralization techniques. Auralization is the process that allows a pre-recorded sound to acquire the acoustic properties of a specific place [1], i.e. it enables the reproduction of any audio as if it were emanating from a specific location in that place. Basically, two elements are required to generate an auralization: an anechoic sound – i.e. a sound recorded in a reflection-free environment, where background noise, echoes and/or reverberation are not present – and an Impulse Response (IR), which represents the acoustic signature of a site for a specific source and listener configuration.
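As a minimal illustration of the idea (not the project’s actual MATLAB pipeline), a mono auralization is simply the convolution of the anechoic signal with the impulse response; the toy signals below are hypothetical placeholders:

```python
import numpy as np

# Toy anechoic ("dry") signal and toy impulse response, standing in
# for a real recording and a measured IR of a rock art site.
anechoic = np.array([1.0, 0.5, 0.25])
ir = np.array([1.0, 0.0, 0.3])

# Auralization: the dry sound acquires the site's acoustics via convolution.
auralized = np.convolve(anechoic, ir)
# → array([1.   , 0.5  , 0.55 , 0.15 , 0.075])
```

The output is longer than the dry signal because the site’s reverberant tail extends the sound, which is exactly the effect the auralization is meant to capture.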

The anechoic recordings selected for this map were made by the Artsoundscapes project back in 2021 (find out more about the process in this link). They feature a variety of prehistoric instruments, such as flutes, shakers, and bass drums, serving as a homogeneous representation of how diverse sonic elements would manifest in the distinct rock art sites explored. While these sounds cannot be considered a plausible representation or recreation of the actual rhythms and melodies used by those who created the art, they allow us to sonically explore the acoustic characteristics of each site. Each sound (an Afro-Venezuelan flute, an Australian “big seeds” percussion shaker, a cow horn and a bass drum) lasts 15 seconds, forming a one-minute video in each card.

Obtaining the spatial IRs that represent the acoustics of each site involved a methodology described in detail in other entries of our blog (see this one or this one for further information). In summary, during each measurement campaign a comprehensive set of Higher-Order Ambisonics (HOA) recordings is gathered on site with a Zylia microphone, following international standards [2]. The recordings of the test signal are later processed in the MATLAB environment to obtain the spatial IRs, using the “MIMO script” developed by Adriano Farina (details here), or its updated version, the SIMO script, depending on the equipment (loudspeaker) used in the field. Then, the spatial IRs in HOA format (16 channels) are decoded to first-order Ambisonics (4 channels), offering a spatial representation that aligns with YouTube’s specifications. This process ensures a rigorous and standardized treatment of the acquired audio data, as detailed in several of our publications.
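As a hedged sketch of the decoding step: if the HOA channels are stored in ACN order with a consistent normalization (as YouTube’s ambiX-style input assumes), reducing a third-order IR to first order amounts to keeping the first four channels, since those already carry the first-order components. The function name and toy data below are hypothetical:

```python
import numpy as np

def hoa_to_foa(hoa_ir: np.ndarray) -> np.ndarray:
    """Reduce a third-order Ambisonics IR (samples x 16, ACN channel order)
    to first order (samples x 4). With ACN ordering, the first four channels
    (W, Y, Z, X) are the first-order components, so truncation suffices as
    long as the normalization convention is left unchanged."""
    assert hoa_ir.ndim == 2 and hoa_ir.shape[1] == 16
    return hoa_ir[:, :4]

# Toy 16-channel IR: 1 second at 48 kHz (random placeholder data).
ir_hoa = np.random.randn(48000, 16)
ir_foa = hoa_to_foa(ir_hoa)   # shape (48000, 4)
```

If the source material used a different normalization (e.g. N3D rather than SN3D), a per-channel scaling factor would also be needed before upload.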

This initial phase of processing the audio data feeds into the second phase: creating the auralized sound files. This involves convolving the anechoic recordings with the first-order Ambisonics IRs. The convolution process, along with other tasks such as audio joining and normalization, is also performed in MATLAB.
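The second phase can be sketched in a few lines. The following is a Python stand-in (using scipy) for the MATLAB processing described above, with hypothetical names and random placeholder data; it convolves a mono anechoic recording with each channel of a first-order Ambisonics IR and then peak-normalizes the result:

```python
import numpy as np
from scipy.signal import fftconvolve

def auralize_foa(anechoic: np.ndarray, foa_ir: np.ndarray) -> np.ndarray:
    """Convolve a mono anechoic recording with each of the four channels
    of a first-order Ambisonics IR, then peak-normalize the result."""
    channels = [fftconvolve(anechoic, foa_ir[:, ch]) for ch in range(4)]
    out = np.stack(channels, axis=1)      # (len(dry) + len(ir) - 1, 4)
    return out / np.max(np.abs(out))      # simple peak normalization

# Toy data standing in for a real anechoic recording and a measured IR.
dry = np.random.randn(48000)       # 1 s mono anechoic sound at 48 kHz
ir = np.random.randn(24000, 4)     # 0.5 s first-order Ambisonics IR
wet = auralize_foa(dry, ir)        # 4-channel auralized result
```

The resulting four-channel file is what a platform expecting first-order Ambisonics audio can then render binaurally for the listener.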

Although many source-receiver (S-R) combinations were considered at each studied site, only a representative one per site has been selected for the videos, in order to reduce the amount of data presented. Also, so that sites can be compared sonically with ease, the same audio track has been auralized with the selected S-R combination at every site.

Tips for Using the Multisensory Web Map

Using our map is really easy. You just need to have a look at all the locations marked on it. As you may see, each colour represents a different fieldwork campaign. Clicking on any of the pins opens a site card including some information about the site, the 360º video, and a picture of the selected S-R combination taken during the acoustic measurements.

The 360º videos are hosted on YouTube but linked to the map. As explained above, YouTube works with first-order Ambisonics audio and renders it into binaural audio in real time. Thus, using headphones is essential for the correct spatial audio representation! YouTube also adapts the acoustic experience to the viewing angle, so whenever you turn around, the sound follows your “head” movements. Try looking backwards to feel how the sound comes from behind.

Here’s a brief video demo to give you a sneak peek:

Note: This multisensory map has been designed and developed by Daniel Benítez and Lidia Alvarez Morales. @Artsoundscapes project.

Without further ado, we happily invite you to immerse yourself in this map and explore this selection of rock art sites not just visually but also sonically. We encourage you to visit several rock art sites on the map to compare how the sound experience varies (or stays the same) across them. Just click on the link below and enjoy!

REFERENCES

[1] Vorländer, M. (2020). Auralization. Berlin/Heidelberg, Germany: Springer International Publishing.

[2] ISO 3382-1:2009, Acoustics – Measurement of room acoustic parameters – Part 1: Performance spaces. International Organization for Standardization, Geneva, Switzerland, 2009.