23-25 Apr 2026 Montpellier (France)

Programme

MOCO’26 In the pink of Health: Conference programme

 

Conference Day 1 – Thursday 23 April

Location: Cité des Arts

Time

Programme

8.00 – 8.30

Conference registration - Coffee & tea

 

8.30 – 9.00

Welcome to MOCO'26 – Patrice Guyot – Grégoire Bosselut – Director CDA

 

9.00 – 10.30

Doctoral Consortium: MOCO Emerging Scholars

Chairs: Théo Velletaz, Martin Leguennec

 

5 min + 2 min Q&A

  • Brenda San Germán Bravo. Body-Informed Effects for Supporting Emotional Self-Regulation in a Mixed Reality space
  • Théo Dupuy, Victor Lopes de Souza. Cautious predictions to support decision makers in movement-related areas.
  • Lili M. Rampre. Cyborg Sensing: A Kinotechnic Inquiry into the Epistemic Infrastructures of Movement and Perception.
  • Léo Chédin. Exploring Choreographic Processes Involving AI.
  • Atilla Juliana Vrasdonk et al. Kinetic Energy and Flow in Co-Improvising Flamenco Dyads.
  • Romaric Sichler. Learning to Teach Gestures: Adaptive Feedback for Human–Machine Co-Learning in craft.
  • Léo Mercier et al. Movement Sonification Integrated to Rehabilitation-Readaptation.
  • Roos Van Berkel. Moving with Care: The Agency of Digital Movement in Socio-Material Practices.
  • Botao 'Amber' Hu. On Improvisation and Open-Endedness: Insights for Experiential AI.
  • Hadil Abba et al. The Sense of Touch in Healthcare HAPTIMED: A Digital Twin of Haptic Perception for Educational Purposes

 

10.30 – 11.00

Coffee Break

11.00 – 12.30

Paper Session #1 - Embodied Interaction, Movement & Perception

Chair: TBA

12 min + 3 min Q&A

  • Lottridge et al. Moving Contexts: How Culture, Context, and Movement Histories Shape Whole-Body Interaction in Aesthetic Environments
  • Preisler et al. When Bodies Resonate in Sound: Sonifying Interpersonal Movement Dynamics in Dance
  • Weber et al. Dynamic Abstract Avatars Impact Dancers' Sense of Embodiment and Movement Choices
  • Mardamootoo. Moving Through Volume
  • De Blanc et al. Weight-sharing trust and wooden floors: Identifying moderating factors in physically integrated dance
  • Velletaz et al. Automatic and perceptual assessment of motion coordination in dyadic dance

 

12.30 – 14.00

Lunch Break

 

14.00 – 15.30

Paper Session #2 - Dance, Choreography & Creative Practice with Technology

Chair: TBA

12 min + 3 min Q&A

  • Rajko et al. Choreographic and Improvisational Approaches To Interrogating Robotic Systems
  • Correia et al. Fantasies, Obscurities and (Dis)Connections: Three Case Studies of Dance Artists' Creative, Embodied and Political Engagement with AI
  • Hou. Playing the Museum: The Body as Interface with Central African Traditions
  • Stergiou et al. Digital Queens: A case study on cloth simulation, motion capture and XR technologies for addressing costume-choreography challenges
  • Baltas. Extending the Site: XR modalities for Site-Specific Dance – A Comparative Study of XR Technologies in Studio-Based Practice
  • Sicchio. p5score: A Computational Framework for Choreographic Notation and Real-Time Movement Composition

 

15.30 – 16.00

Coffee Break

16.00 – 18.00

Practice Works and Posters

 

Practice Works #1 - Chair: Julien Laroche

  • START: Science, arT, reseARch and Transgression
  • PosePilot-GOM: A Web-based application for dexterity analysis of human movement
  • A pen "IMU inside": a Sensor-Enhanced Pen for Exploring Sound While Writing
  • PosePilot-Ergo: A web-based application for ergonomic analysis and human motion quantification
  • PyEyesWeb: An open source toolkit for multimodal movement feature extraction
  • Drifting Bodies Through Algorithms

 

Poster #1 - Chair: Stéphane Perrey

  • Gasparotti et al. Effects of cognitive-motor training in virtual reality on anticipatory brain functions and balance of professional dancers
  • Rokeby et al. Enriching the Kinematic: Approaching New Methods for Machine Learning with Bodies That Move at the Edge
  • Zhu. How AI Leads in Creative Practice: From Mentorship Dialogues to Extended Narratives
  • Whatley et al. Dance, disability and robots: interdisciplinary possibilities for reframing 'healthy bodies' in performance
  • Chiu. Embodied Ethics in Digital Futures: Choreoethics and Motion Capture in Digital Dancescapes
  • Ayache et al. The Choreography of Thought: How Interpersonal Coordination Reveals Shared Cognition
  • Sutton-Chanari et al. On the fractal complexity of sacrum motion during walking
  • Ioannis. Musicians' Movement Repertoires and Emergent Coordination: Scapular Kinematics, EMG, and Struggle in Higher Music Education
  • Zhang. Reframing Human–Machine Movement through Laban Spatial Logic: Toward a Temporal and Embodied Framework of Relational Vitality
  • Daveau et al. Embodied Gestures: recognizing static hand movements with lightweight neural models
  • Taleb-Salah et al. Motion Capture for Ergonomic Assessment: Inertial vs. Computer Vision Based on YOLOv11
  • Pyaraka et al. Humanoid Robot Navigation in Shared Care Spaces: A Human-Aware Navigation Framework and Implementation
  • Chafik et al. IMU-Based Detection of Load Carriage for Ergonomic Risk Assessment
  • Lahya et al. Deep Learning for Physical Load Estimation: Insights from ViLoad Video Dataset

 

19.00

Evening

Gala

 

 

Conference Day 2 – Friday 24 April

Location: Cité des Arts

8.30 – 9.00

Conference registration - Coffee & tea

 

9.00 – 12.00

Practice Works and Posters

 

Practice Works - Chair: Patrice Guyot

  • Tethered: Biophysical Sensing toward Affective Somatic Integration
  • Holding Time Main-Tenant as a Practice of Palliative Health
  • Creative Movement Hacking: Can We Combine Ideokinesis and Immersive Technologies to Enhance Embodiment?
  • Creativity Tools for Movement-based Artistic Practices in Extended Reality: Performances based in Fantasticos
  • The Emergence of a Dance: A Sensitive Experience of Movement
  • The Z of Touch: Crystallizing the Interoceptive Axis of Blended Touch

 

Poster #2 (10.00 – 12.00) - Chair: Stéphane Perrey

  • Kobayashi et al. GenreMix Analyzer: Visualizing Probabilistic Composition of Dance Styles for Supporting Dance Learning
  • Skjeldal et al. Studying Embodied Expression in Drumming for Virtual Systems
  • McKendrick et al. Don't Let Me Be Misunderstood: Guiding Acting Practice through Negative Robot Behaviour and Contextual Intentions
  • Grebel et al. Battles as Interactive Ecologies: Designing with Embodied Roles in Hip-Hop Performance
  • Glover et al. Sample entropy analysis of variability in sit-to-stand-to-sit movements of people with or without chronic pain
  • Faux et al. Dynamical 2D-DFA for movement analysis in obstetrics
  • Kolokotroni et al. Illuminating Emotions: Evaluating the Emotional Impact of Lighting on Animated Characters in Animation and Video Games through Motion Capture
  • Neville. Agiles: Creativity and Mobility through embodied participation in Immersive Environments
  • Vincs et al. Virtual Volumetric Bodies Interacting with Squishy Balls and Shiny Fish: Towards a more inclusive XR interaction system
  • D’adamo et al. SoniFootsteps: Movement-Triggered Footstep Sounds to Modulate Body-Weight Perception, Gait and Emotion
  • Soga and Sra. VR Dance Puppet: Movement Creation by Controlling Partial Body Parts Using a VR Device
  • Stein et al. Tapxophone: Towards Engaging Finger Rehabilitation using Computer Vision and Music
  • Hollerweger et al. Streaming Open Sound Control data from a commercially available IMU suit in real time for performative sonic arts projects
  • Gong et al. DVF-Generator: A Physics-Aware Conditional Generative Model for Respiratory Motion Synthesis in Liver SPECT
  • Kantan. Beyond Deterministic Mappings: Audiovisual Correspondence in Movement-Controlled Generative Music

 

12.15 – 13.30

Lunch Break

 

13.30 – 15.30

Paper Session #3 - Machine Learning, AI & Generative Systems for Movement

Chair: Gérard Dray

12 min + 3 min Q&A

  • Lawrence et al. Interactive Machine Learning can recognise complex movements, but does it make us happy?
  • Yang et al. Designing Generative AI for Real-Time Multi-User Interaction in Co-Creative Dance
  • Faurent et al. Learning Human Rhythmic Movements: Adaptive CPGs for Synchronized Virtual Agents
  • Akbas et al. Cross-Modal Retrieval-Augmented Generation for Craft Gestures Learning: Enabling Dialogue with Multimodal Pedagogical Contents
  • Trolland et al. Exploring Movement-Led Co-Design for Interactive Lighting in Performance
  • Beller. Exploring “Synekinian Pairs”: Manual-Vocal Gesture Integration in Experimental Contexts

 

15.15 – 16.30

Keynote #1

A. Refsum Jensenius & L. Bishop

Chair: J. Laroche

50 min + 20 min Q&A

Bodies in motion in the concert hall and beyond

Motion is at the core of how we experience and understand the world, express ourselves, and interact with others. At RITMO, we explore motion that is artistic, expressive, and interactive within a highly interdisciplinary research environment that integrates musicology, psychology, and technology. Increasingly, we are moving our research out of the lab and into the real world, where we can study people behaving as they normally do and capture unique occurrences that cannot be simulated in laboratory conditions. This transition is illustrated by the MusicLab and Bodies in Concert projects, which investigate bodily activity and experiences in performers and audience members during live concerts. From data collected during a series of symphony orchestra concerts, we are learning how musicians coordinate their expressive motion and how audiences synchronize in their bodily responses to the music. We are also learning how to adapt new technologies for real-world data capture, and exploring how captured data can be used for artistic purposes. In this talk, we will discuss some of the concepts, methods, and findings from this line of research.

Alexander Refsum Jensenius is a music researcher and research musician, working as a professor at the University of Oslo, where he directs the fourMs Lab, RITMO Centre for Interdisciplinary Studies in Rhythm, Time, and Motion, and MishMash Centre for AI and Creativity. His research explores human musicality by combining artistic and scientific research approaches. Web page https://people.uio.no/alexanje

 

Laura Bishop is a researcher at RITMO Centre for Interdisciplinary Studies in Rhythm, Time, and Motion and the Department of Musicology at the University of Oslo. Her research investigates the cognition of music performance, expressivity, interactivity, and the role of the body in imagining, articulating, and responding to music. Web page https://people.uio.no/laurabi

 

16.30 – 17.00

Coffee Break

 

 

17.30 – 19.00

Performance Promenade

  • ZAGHAREED: A Human and AI Co-Created Film Extending a Dance
  • The origins of intelligence: A performative statement on the primacy of movement
  • Real-Time Full-Body Multi-Player Interaction with AI Dance Models
  • Performance of "SensualMap 2.0 Meets The Source"
  • The Emergence of a Dance: A Sensitive Experience of Movement
  • The Body Knows the Pattern: A Performance System Exploring Gesture Mapping and Embodied Rhythm
  • The mv lab spatial trainer MR demo
  • Sonification of dance during the performance promenade
  • Anonymous - A Participatory Installation for Creative Improvisation

 

19.00+

Dinner on your own

 

 

 

Conference Day 3 – Saturday 25 April

Location: Cité des Arts

8.30 – 9.00

Conference registration - Coffee & tea

 

9.00 – 10.30

Paper Session #4 - XR, Virtual Environments & Multimodal Interaction Systems 

Chair: TBA

12 min + 3 min Q&A

  • Gaugne et al. Blow based collaboration in a digital art virtual environment
  • Saint-Cast et al. A Full-Stack Web-Based Ecosystem for Movement–Sound Interactions
  • McKendrick. Mask Work and Performance Techniques for VR Embodiment
  • Guo et al. Liquid Connections: Reimagining Social Touch in Virtual Reality
  • Brendel et al. Low-Latency Real-Time Volumetric Reconstruction for Interactive and Dynamic Stage Productions
  • Odonnell et al. Gesture Mapping for Embodied Rhythmic Expression: A Case Study on Expressive Affordances

 

10.30 – 11.00

Coffee Break

11.00 – 12.10

Keynote #2

V. Cochen De Cock & B. Bardy

Chair: J. Laroche

50 min + 20 min Q&A

Beyond Fixed Beats: Adaptive Musical Entrainment for Movement, Health and Wellness

Music can do more than accompany movement: it can shape, stabilize, and transform it. In this keynote, we revisit the scientific foundations of musical entrainment of movement and physiology, and the clinical legacy of Rhythmic Auditory Stimulation (RAS). We argue that much of the traditional literature has relied on fixed cueing paradigms that insufficiently capture the dynamic and reciprocal nature of real human movement. To address this limitation, we present the BeatHealth project, which introduced adaptive rhythmic cueing as a new generation of movement support, and describe how this approach led to the development of BeatMove, a smartphone application designed to synchronize musical feedback with the user’s ongoing behavior in real time. We then summarize results obtained (i) with runners, demonstrating how adaptive cueing can support performance and sensorimotor regulation in ecological conditions; (ii) with patients suffering from Parkinson’s disease, where adaptive auditory cueing opens promising perspectives for fall prevention, gait support and rehabilitation; and (iii) with patients with obesity, revealing that adaptive music-based interventions improve movement synchronization, locomotor performance, and desire to move. Across these use cases, the central idea is that cueing should not be imposed on movement from the outside, but negotiated with it. We conclude by discussing how adaptive musical systems such as BeatMove may contribute to mobility, autonomy and healthy aging. More broadly, this work positions movement, music, and real-time computation within a common framework of interactive, embodied health and wellness.

 

Valérie Cochen De Cock is a neurologist specialized in movement disorders treating patients with Parkinson’s disease. She is also a neuroscientist expert in movement, rhythm and health at EuroMov lab in Montpellier University. Her work explores how rhythm and especially music can improve gait and motivate physical activity in persons with Parkinson’s disease. She is co-founder and the medical expert of BeatHealth, a neurotechnology company developing adaptive music-based systems to support gait, rehabilitation, and wellbeing.

 

Benoît G. Bardy is a movement scientist and innovation leader working at the intersection of human movement, rhythm, health, and interactive technology. He is Full Professor at the University of Montpellier and founder of EuroMov, a European center dedicated to research and innovation in movement, health, and digital sciences. His work explores how perception, action, rhythm, and social coordination shape human behavior in both physical and virtual environments. Across neuroscience, embodied cognition, and movement science, he has developed a distinctive approach to movement as a dynamic, meaningful, and interactive process. He is also co-founder and Chief Scientific Officer of BeatHealth, a neurotechnology company developing adaptive music-based systems to support gait, rehabilitation, and wellbeing. His research and innovation projects bridge science, computation, and care, with a particular focus on how interactive systems can synchronize with living bodies in real time.

12.15 – 13.30

Lunch

 

13.30 – 15.00

Paper Session #5 - Human–Robot Interaction & Bio-Inspired Systems

Chair: TBA

12 min + 3 min Q&A

  • Ouhssain et al. Reinforcement Learning with Musculoskeletal Models to Study Fatigue Effects on Human Muscle Synergies
  • Alcubilla et al. Designing Relational Care: Speculative and Participatory Approaches to Movement-Based Human-Robot Interaction through the Performing Arts
  • Guevara. Reflections: Health, Technology, and the CCL Experience
  • Hu et al. “We Move Like an Octopus”: Exploring Decentralized Tentacular Coordination via Inter-Bodily Electromyostimulation Relays
  • Neuhauser et al. Estimating Piano Piece Difficulty via Embodied Robotic Hand Performance Analysis

 

15.00 – 16.30

Practice Works and Posters

 

Practice Works - Chair: Leonardo Montecchia

  • Interactive Dance Performance as a Dialogue: Choreographing through Sound and Grief
  • Gone Fabulous VR: Virtual Reality Installation through Choreographic Process
  • SyncOff™: A Speculative Symposium on Coordination Collapse

 

Poster #3 - Chair: Stéphane Perrey

  • Tadayoni et al. SensualMap 2.0 Meets The Source
  • Di Donato et al. British Sign Language in Embodied Music Interaction: An exploratory study of British Sign Language music interpretation
  • Marin-Bucio. Machinic Movement Matrix: A framework and tool for human-AI dance creation
  • Ardaiz et al. Teams of Sport Science and Computer Engineering Students Learning Together
  • Bosselut et al. Exploring multimodal neurophysiological synchrony and behaviour in choir performance: a preliminary study.
  • Siman. The Recorded Performance as Virtual Event: Archival Vitality in Preljocaj's Swan Lake
  • San German Bravo et al. Laban Inspired Visual Effects Influence Perception and Movement
  • Laroche et al. Multi-agent Coordination in Shared Hybrid Spaces - How Digital Environments and Adaptive Agents Shape Collective Embodied Timing
  • Akbas et al. Reflective Embodiment through Avatar Abstraction: Insights from Movement Practitioners
  • Corbellini et al. Slow Mood, Aesthetic Resonance, and Embodied Interaction: Design Principles for Art-Aided Rehabilitation

 

16.30 – 17.30

Paper Session #6 - Movement Analysis, Motion Capture & Computational Modeling

Chair: TBA

12 min + 3 min Q&A

  • Pilkov et al. Estimating Pianists' Hand and Finger Kinematics with Markerless Motion Capture
  • Pataranutaporn et al. Phylogenetic Tree of Dance: Computational Reconstruction of Movement Lineages Through Motion Capture Analysis
  • Serdar et al. Mixed Method Audio-Video Analyses of Felt Togetherness in a Networked Music-Dance Performance

 

17.30 – 18.00

Closing Remarks – Closing MOCO's 10th Edition

 

18.00 – 21.00

Jam session

 

 

 