Meta-Ensemble

A NEW INTERFACE FOR MUSICAL EXPRESSION

An experimental dance performance exploring the relationship between body language and "augmented emotion"

TEAM

Xiaofei Hong | GSD, Harvard

Koi Ren | MDes, Berkeley

FakeRabbit | Xinghai Conservatory of Music

TIME

5/2021 - 8/2021

SHOWN AT
Aotuspace, Beijing

MY ROLE

Concept Development

Interaction Design

Audio Visualization

TOOL

Visual: TouchDesigner

Sound: Ableton + Max/MSP

Hardware: IMU sensor + Arduino

Audio Visual Design (Xiaofei)


Concept Development


Inspiration

Emotions are Embodied

"Emotions are not just an idea in your head, they are physically embodied. So, the way you feel affects the way you hold your body. For example, if you’re anxious you’ll probably get tense, if you’re confident you’ll stand tall and at ease, if you’re depressed you’ll slump.

Because emotions are embodied, they have implications on things that in dance are normally considered on purely physical terms, things like posture, line, coordination and injury. "

What if ... ?

Diagram: body language, emotion, and music, linked by "express," "generate," "influence," and "affect," with the question "interactive?" placed on the connections between them.

Interactive Instrument Research

Ideation

Workflow

Hardware and Sound Prototyping

We researched the basic movement forms in modern dance and simplified them into three basic movement prototypes: linear motion, curved motion, and rotary motion, each performed at constant speed and under acceleration. The user simulates these movements while an IMU sensor collects the three-axis attitude angles and acceleration, and we run a preliminary analysis of the extreme and average values of the raw data. An ESP32 Feather board then handles wireless data transmission, and the device is fixed into a wearable to capture the dancer's hand-movement data.
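Before mapping anything to sound, we looked at the raw IMU stream's per-axis extremes and averages. The sketch below illustrates that preliminary analysis in Python, assuming the ESP32 streams comma-separated roll, pitch, yaw, and acceleration values over serial; the port name, baud rate, and frame layout are placeholders rather than the exact firmware output.

```python
# Sketch of the preliminary data analysis: read IMU frames from the ESP32
# over serial and report min / max / mean per axis. Port, baud rate, and the
# comma-separated frame layout are assumptions for illustration only.
import serial  # pyserial

FIELDS = ["roll", "pitch", "yaw", "ax", "ay", "az"]

def collect(port="/dev/ttyUSB0", baud=115200, n_frames=500):
    stats = {f: {"min": float("inf"), "max": float("-inf"), "sum": 0.0} for f in FIELDS}
    with serial.Serial(port, baud, timeout=1) as ser:
        count = 0
        while count < n_frames:
            line = ser.readline().decode(errors="ignore").strip()
            parts = line.split(",")
            if len(parts) != len(FIELDS):
                continue  # skip incomplete frames
            try:
                values = [float(p) for p in parts]
            except ValueError:
                continue  # skip malformed frames
            for name, value in zip(FIELDS, values):
                s = stats[name]
                s["min"] = min(s["min"], value)
                s["max"] = max(s["max"], value)
                s["sum"] += value
            count += 1
    for name in FIELDS:
        s = stats[name]
        print(f"{name}: min={s['min']:.2f} max={s['max']:.2f} mean={s['sum'] / n_frames:.2f}")

if __name__ == "__main__":
    collect()
```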

“Max/MSP and Ableton Live are used as the sound source. Parameters captured by the Arduino are read through the serial object, then scaled into a suitable MIDI value range. These MIDI values are connected to the makenote and noteout objects, so the patch behaves as a virtual MIDI controller.

The sound design draws its inspiration from emotion and from the environment, and it takes the diversity of body-motion feedback into account so that the sound effects are multi-layered and dynamic. Each sound material is chosen to complement the movement: "the sound is in motion and the motion is in the sound; the sound moves, and the sound comes alive."

When dancers move their limbs, they feel the energy of the sound feeding back, and the sound in turn suggests new physical arrangements to them. This kind of sound design does not follow a conventional, time-linear arrangement; it is a real-time, feedback-based way of creating sound. That challenge gave me a new way of thinking about sound creation: I had to plan a multi-layered sound arrangement around the dancer's bodily feedback, and only through a large number of experiments, continuously revising the value-triggered instrument channels, could I obtain sound material with the best quality, hierarchy, and dynamic real-time feedback.”
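The chain described in the quote above (serial, scale, makenote, noteout) lives inside Max/MSP. Purely to illustrate the same scaling and note-triggering logic in code, here is a sketch in Python using the mido library; the value ranges and the choice to map acceleration to velocity and yaw to pitch are our assumptions, not a reproduction of the actual patch.

```python
# Illustrative Python equivalent of the Max/MSP chain (serial -> scale ->
# makenote -> noteout): scale raw sensor values into MIDI pitch and velocity
# and emit a note. The mido library and the value ranges are assumptions,
# not the patch used in the performance.
import mido

def scale(value, in_lo, in_hi, out_lo, out_hi):
    """Linearly map value from [in_lo, in_hi] to [out_lo, out_hi], clamped."""
    value = max(in_lo, min(in_hi, value))
    t = (value - in_lo) / (in_hi - in_lo)
    return int(round(out_lo + t * (out_hi - out_lo)))

def trigger_note(outport, accel_mag, yaw_deg):
    # Acceleration magnitude drives velocity; yaw angle drives pitch.
    velocity = scale(accel_mag, 0.0, 4.0, 20, 127)   # assumed 0-4 g range
    pitch = scale(yaw_deg, -180.0, 180.0, 36, 84)    # roughly C2-C6
    outport.send(mido.Message("note_on", note=pitch, velocity=velocity))
    outport.send(mido.Message("note_off", note=pitch, velocity=0))

if __name__ == "__main__":
    with mido.open_output() as port:  # default system/virtual MIDI output
        trigger_note(port, accel_mag=2.3, yaw_deg=45.0)
```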

Visual Prototyping (by Xiaofei)

Installation Design


Test

"This is a multiple experience. At the beginning, I was attracted by sound and tried various sounds triggered by different actions. After gradually understanding and mastering the rules of sound triggering, I paid attention to vision. The vision will be more random but there will be some rules. After mastering the triggering rules of the two, dance is more like a dialogue or a game of hide-and-seek of mutual probing. You can control it but you can’t completely dominate it. And the combination of sound and vision can also inspire dancers to improvise ."

- Xueyang Cao

Iteration


Concept Development (by Xiaofei)

We want to study:

1. How body movements express emotions and influence them

2. How body language can affect the dialogue between different individuals

Building on that, we used the MSVR and the arousal-valence emotion quadrant to divide the presets into four emotional dimensions.
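A minimal sketch of that quadrant mapping, assuming arousal and valence estimates normalized to [-1, 1]: each combination of signs selects one of the four preset emotional dimensions (anger, happy, sad, calm); the function name and thresholds are illustrative.

```python
# Minimal sketch of the arousal-valence quadrant mapping: given arousal and
# valence estimates in [-1, 1], pick one of the four preset emotional
# dimensions. Thresholds and labels are illustrative assumptions.
def emotion_quadrant(arousal: float, valence: float) -> str:
    if arousal >= 0 and valence < 0:
        return "anger"   # high arousal, negative valence
    if arousal >= 0 and valence >= 0:
        return "happy"   # high arousal, positive valence
    if arousal < 0 and valence < 0:
        return "sad"     # low arousal, negative valence
    return "calm"        # low arousal, positive valence

print(emotion_quadrant(0.7, -0.4))  # -> "anger"
```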

Sound Design

Ableton
Max/MSP

Multi-track sound design
Controlled pitch and velocity

Data transmission

Visual Design (by Xiaofei)

Anger - Noise Harmonic Gain
Happy - Arc Anchor
Sad - Noise Amplitude
Calm - Color & Alpha
Particle Effect
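As a rough illustration of how these pairings could be driven from the emotion classification above, the sketch below maps each emotional dimension to its visual parameter and scales it by an intensity value before it is sent on to the TouchDesigner network; the parameter keys and the linear scaling are placeholders, not the actual operator names in the patch.

```python
# Sketch of the emotion-to-visual parameter mapping driving the TouchDesigner
# patch. Parameter names mirror the list above; base values and scaling are
# illustrative assumptions, not the real network settings.
VISUAL_MAP = {
    "anger": {"noise_harmonic_gain": 0.9},
    "happy": {"arc_anchor": 0.8},
    "sad":   {"noise_amplitude": 0.6},
    "calm":  {"color_alpha": 0.4},
}

def visual_params(emotion: str, intensity: float) -> dict:
    """Scale the base parameter set for an emotion by a 0-1 intensity value."""
    base = VISUAL_MAP.get(emotion, {})
    return {name: value * intensity for name, value in base.items()}

print(visual_params("anger", 0.75))  # -> {'noise_harmonic_gain': 0.675}
```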
Performance


Credits

