[LUM#20] The Metaverse in Motion

Improving communication between people who interact in virtual worlds is one of the goals of the European ShareSpace project, in which the EuroMov laboratory is participating. This challenge requires reintroducing movement, in all its dimensions, into these metaverses to facilitate social interactions. Benoît Bardy, a researcher specializing in the study of human movement, explains¹.

Imagine a coworker offers you a cup of coffee. Their hand reaches for the cup sitting on the table in front of them and picks it up. Will they bring it to their mouth to drink, or will they hand it to you? In reality, you probably don’t even need to ask yourself this question, because you’ve already figured out their intention. “Their gaze shifting from the cup to you, the way they reach for it, the angle of their elbow, the direction their wrist takes: these are all signs that provide very early clues about your conversation partner’s intention,” explains Benoît Bardy, a researcher at the EuroMov laboratory.

Sensorimotor communication

All these cues, which researchers refer to as “movement primitives,” form the foundation of sensorimotor communication, which plays a major role in social interactions. “Eye contact, gestures, emotions: these expressions are central to how we communicate,” explains the movement specialist. These are all cues we’re able to pick up on in face-to-face interactions, but what happens when the person you’re talking to isn’t in the same room? “Then we lose a lot of our ability to detect these intentions in our conversation partners’ gestures.”

This is a major research challenge at a time when virtual reality is on the rise and the metaverse—defined as a hybrid world that can transcend the real world, particularly through augmented reality—is garnering increasing attention. How can we identify these motor signatures of intention within digital environments? This is one of the key challenges addressed by the European research program ShareSpace, in which the EuroMov laboratory is participating.

A project with an innovative vision: the creation of future hybrid social spaces shared by humans and avatars. Here, well-known social sensorimotor primitives are captured using innovative connected mobile sensors, then reconstructed using a new extended-reality technology. “This allows us to facilitate virtual communication by recreating these gestures, which we can amplify or even dampen depending on the application,” explains Benoît Bardy.
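The amplification and dampening Bardy mentions can be pictured as a gain applied to a captured gesture’s deviation from a neutral pose. The sketch below is only an illustration of that idea, not the project’s actual pipeline; the function name, the joint-angle representation, and the baseline pose are all assumptions.

```python
def modulate_gesture(pose, baseline, gain):
    """Scale a captured pose's deviation from a neutral baseline pose.

    gain > 1 exaggerates the gesture (making an intention easier to read),
    0 < gain < 1 attenuates it, and gain == 1 reproduces it unchanged.
    Poses are represented here as flat lists of joint angles in degrees.
    """
    return [b + gain * (p - b) for p, b in zip(pose, baseline)]

# Toy example: three joint angles against a hypothetical neutral pose.
neutral = [0.0, 10.0, 20.0]
captured = [5.0, 25.0, 20.0]

amplified = modulate_gesture(captured, neutral, gain=1.5)  # exaggerated reach
dampened = modulate_gesture(captured, neutral, gain=0.5)   # subdued reach
```

Applied frame by frame to an avatar’s skeleton, the same gain either makes a reaching movement more legible or, as in the competitive-cycling scenario described later, masks it.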

Avatar

And to explore the metaverse, ShareSpace incorporates four levels of interaction into its scenarios that blend the real and virtual worlds. At the first level, called L0, humans interact in the real world to better assess, identify, and calibrate sensorimotor situations. “At this level, we offer synchronization games in particular, which are recognized as vehicles for empathy and social bonding,” explains the movement specialist.

At the next level, or L1, everyone finds themselves in a virtual world, where each person has an avatar that mimics their movements. “This is a non-autonomous avatar that replicates our movements and interacts with the avatars of other people.” In practice, each participant stays “at home” wearing a virtual reality headset and meets up with others in the metaverse.

Level L2 is characterized by a certain degree of autonomy granted to the avatars. “For example, when playing a synchronization game together, if one of the participants begins to fall out of sync with the rest of the group, their avatar can adjust its sensorimotor response to realign with the group.” Finally, at the highest level (L3), we meet Sarah, an artificial intelligence built using a mathematical model. “If one of the participants falls behind in the synchronization game, Sarah is able to notice this, go find them, and guide them in a personalized way to resynchronize them with the group. We’re creating a hybrid world that blends reality, virtual reality, and augmented reality.”
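Group synchronization of the kind Sarah monitors is often modeled with coupled oscillators, where each participant is a phase being pulled toward the others. The sketch below uses a Kuramoto-style update as a stand-in; this is my assumption for illustration, not ShareSpace’s published model, and the coupling value standing in for Sarah’s “guidance” is arbitrary.

```python
import math

def kuramoto_step(phases, freqs, coupling, dt=0.01):
    """One Euler step of the Kuramoto model: each oscillator's phase is
    pulled toward the group in proportion to the coupling strength."""
    n = len(phases)
    new = []
    for i in range(n):
        pull = sum(math.sin(pj - phases[i]) for pj in phases) / n
        new.append(phases[i] + dt * (freqs[i] + coupling * pull))
    return new

def order_parameter(phases):
    """Synchrony index r in [0, 1]; r near 1 means the group moves as one."""
    n = len(phases)
    re = sum(math.cos(p) for p in phases) / n
    im = sum(math.sin(p) for p in phases) / n
    return math.hypot(re, im)

# Three in-phase participants and one laggard; a strong coupling term
# (playing the role of Sarah's guidance) pulls the group back together.
phases = [0.0, 0.1, -0.1, 2.5]
freqs = [1.0, 1.0, 1.0, 1.0]
for _ in range(2000):
    phases = kuramoto_step(phases, freqs, coupling=2.0)
```

In this framing, “noticing” a desynchronized participant amounts to watching the order parameter drop, and personalized guidance amounts to increasing that participant’s coupling to the group.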

Virtual Rehabilitation

What are the practical applications of this shared metaverse? “ShareSpace focuses on three areas: health, sports, and art,” replies Benoît Bardy. In the health sector, the project aims to treat lower back pain by enabling patients with chronic back pain to participate in group rehabilitation sessions with a therapist. These sessions are unlike any others: real-world and virtual patients alike follow the guidance of an autonomous avatar. “A therapist represented by their avatar can closely monitor whether the patient is performing the necessary exercises correctly and correct them if needed,” explains the specialist from the EuroMov laboratory.

On the athletic side, ShareSpace helps cyclists learn to ride together in a group in a safe environment. “It’s an app that’s particularly well suited for children, who will learn, for example, how to navigate around an obstacle with the help of Sarah, who amplifies parents’ movements to make their intentions easier to understand,” explains Benoît Bardy. An app will also be available for cyclists with a competitive mindset, “where, on the contrary, you sometimes need to know how to hide your intentions to better prepare a breakaway from the peloton,” the researcher explains. See you at the ShareSpace booth in Paris during the 2024 Olympics.

Finally, on the artistic front, the project’s creators have given carte blanche to a collective of artists from the European Ars Electronica Foundation, who will use this technology to create an immersive performance at the Ars Electronica art and technology festival in Linz in September 2024. The performance will be accessible both in person and virtually.


  1. EuroMov (UM, IMT Mines Alès)