👂 What if we could hear a caress?

Published by Adrien. Source: CNRS INS2I

At the Institute of Intelligent Systems and Robotics (ISIR - CNRS/Sorbonne University), Alexandra de Lagarde and Louise P. Kirsch ask a surprising question: what if we could hear a caress? From this reflection was born Audio-touch, a research project exploring new ways to recreate social touch at a distance.

Converting affective tactile gestures into interpretable sounds is the goal of the Audio-touch project, conducted at ISIR as part of the ANR MATCH program. The approach rests on a simple yet promising observation: both touch and hearing rely on the perception of vibrations. It also responds to a context in which physical interactions are becoming scarcer (social isolation, digitalization, virtual interactions), even though touch is essential to our development, emotional well-being, and stress regulation.

Members of ISIR, including Alexandra de Lagarde (PhD student), Catherine Pelachaud (CNRS research director), Louise P. Kirsch (lecturer at Université Paris Cité), and Malika Auvray (CNRS research director), have worked to determine whether gestures such as a caress, a pat, or a rub on the skin can be sonified to convey socio-affective information to a listener.

To achieve this, the researchers recorded skin-to-skin gestures using vibrotactile sensors, then transformed these recordings into sounds, called audio-touch stimuli (a minimal sketch of this kind of sonification step follows below). These sounds were then presented to participants in a series of four experiments designed to assess their ability to recognize the nature and intention of the gestures.
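To give a concrete sense of what such a sonification step can involve, here is a minimal Python sketch, not the authors' actual pipeline: it assumes a hypothetical single-channel vibration trace sampled at 8 kHz, and simply resamples and normalizes it into an audible WAV file. The sensor rate, file name, and synthetic "stroke" signal are all illustrative assumptions.

```python
# Minimal sonification sketch (illustrative only, not the study's pipeline):
# turn a raw vibration trace into an audible, normalized WAV file.
import numpy as np
from scipy.io import wavfile
from scipy.signal import resample_poly

SENSOR_RATE = 8_000    # hypothetical vibrotactile sensor sampling rate (Hz)
AUDIO_RATE = 44_100    # standard audio playback rate (Hz)

def sonify(vibration: np.ndarray) -> np.ndarray:
    """Resample a raw vibration trace to audio rate and normalize it."""
    # 44_100 / 8_000 reduces to 441 / 80, so polyphase resampling is exact.
    audio = resample_poly(vibration.astype(np.float64), up=441, down=80)
    audio -= audio.mean()                       # remove any DC offset
    peak = np.max(np.abs(audio))
    return audio / peak if peak > 0 else audio  # scale into [-1, 1]

# Illustrative input: 2 s of broadband noise shaped by a slow envelope,
# a crude stand-in for the vibration profile of a single stroking gesture.
t = np.linspace(0.0, 2.0, 2 * SENSOR_RATE, endpoint=False)
envelope = np.sin(np.pi * t / 2.0) ** 2
fake_stroke = envelope * np.random.randn(t.size)

wavfile.write("audio_touch_stimulus.wav", AUDIO_RATE,
              sonify(fake_stroke).astype(np.float32))
```

In the actual study, the mapping from sensor data to sound would depend on the sensors and calibration used; this sketch only shows the basic resample-and-rescale idea behind making a vibration signal audible.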

The results show that sounds derived from tactile gestures are interpretable in a coherent manner:
- The gestures are accurately categorized (1st experiment),
- Their emotional intentions are correctly identified (2nd experiment),
- And the surface on which the gesture is performed (human skin versus inanimate object) significantly influences the perception of the gesture and the emotions conveyed (3rd and 4th experiments).

These findings, published in the journal Proceedings of the National Academy of Sciences (PNAS), suggest that hearing can, to some extent, convey cues normally transmitted through touch. In other words, it is possible to "make heard" a tactile intention.

The project lies at the intersection of several fields: social cognition, sensory perception, human-agent interaction, haptics, and sound design. While the ANR MATCH project explores touch illusions in virtual reality through multisensory feedback, Audio-touch specifically examines the role of the auditory channel in transmitting social connection.

By demonstrating that affective gestures can be recognized through their sound translation alone, Audio-touch opens new avenues for designing affect-sensitive interfaces and, more broadly, for developing alternative forms of multisensory communication in digitized environments.