The Subtle Body Project is an interactive multimedia biofeedback system that uses the human body and brain to generate data and to control the media elements of the installation. The project integrates the disparate disciplines of interactive media, physical computing, biofeedback research, sound design, video mixing, stagecraft, and theatrics in an attempt to create a profound and artful expression of the subtle sensitivity of the human body: a multi-sensory, multi-modal interactive installation and performance environment in which the physical body is the content-generating device. The arc of the experience is controlled directly by the user's brainwave activity and skin conductivity. To achieve this aim, an array of highly sensitive sensors is attached to the user's body to collect data from the brain, the skin, and muscle movement, and an additional array of sensors monitors the user's movement and location as they participate in the experience.
The Subtle Body Project has undergone several major revisions since its initial proposal in June of 2008. Although the focus of the project remains the human body and its interaction with sensors and MIDI, it has grown substantially in scope. The project now employs 7 data streams from 10 biosensors attached to the body of the user. These sensors collect a diverse range of data, including brainwaves (EEG), eye motion (EOG), muscle activity (EMG), galvanic skin response (GSR), and body motion and proximity (PNG). This sensor data is captured through several microcontrollers, both manufactured devices and devices built by the Subtle Body team, and then piped into Max/MSP, which serves as our central data processing hub. The data is heavily processed with sequencers, scaling devices, and metronomes, then routed to the audio application Ableton Live, which in turn hosts multiple instances of VST synthesizers such as Atmosphere, Predator, Absynth, and Reason, among others, for audio processing. The same data channels generated by the biosensor array are also sent to the video management application Jitter, also from Cycling '74 (the makers of Max/MSP), for the visual component of the experience. The video elements interact with the user in a variety of ways depending on the phase of the experience, using the filtering, spatial mapping, video mixing, and playback features found in Jitter.
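The core of the scaling step described above is mapping a raw sensor reading into the 0–127 range that MIDI-driven applications like Ableton Live expect, much as Max/MSP's [scale] object does. A minimal sketch in Python (the function name, input range, and clamping behavior are illustrative assumptions, not the project's actual patch):

```python
def scale(value, in_min, in_max, out_min=0, out_max=127):
    """Linearly map a raw sensor reading into an output range
    (default 0-127, the MIDI control-change range), clamping
    out-of-range input. Mirrors the behavior of a scaling object
    in a Max/MSP-style data pipeline."""
    if in_max == in_min:
        return out_min
    t = (value - in_min) / (in_max - in_min)
    t = max(0.0, min(1.0, t))  # clamp so noisy spikes stay in range
    return round(out_min + t * (out_max - out_min))

# Example: a hypothetical 10-bit GSR reading (0-1023) mapped to a MIDI value
print(scale(512, 0, 1023))  # -> 64
```

In a real patch the smoothing, sequencing, and metronome stages would sit around a mapping like this before the value is sent out as a MIDI control change.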
Video assets have been collected and created by the Subtle Body team, processed and edited in Final Cut Pro and After Effects, and then exported to Jitter for interactive playback in the experience. There are over 80 video cues used in the project.
The result of this effort is a three-phase experience: the user first encounters the raw data of their body's electromagnetic energy, then a smoothed artistic interpretation of that energy data, and finally an elaborate third section in which they control the playback of audio and video clips with energy gathered from their brain, their body, and the blinking of their eyes in a real-time audio/visual remix mash-up.
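Using eye blinks to trigger clip playback, as in the third phase, amounts to detecting spikes in the EOG signal. A minimal sketch of threshold-based blink detection with a debounce window (the threshold, window size, and sample values are illustrative assumptions, not measured project parameters):

```python
def detect_blinks(samples, threshold=300, refractory=5):
    """Return indices where the EOG signal crosses the threshold.
    Crossings within `refractory` samples of the previous blink are
    ignored, so one physical blink fires only one trigger."""
    blinks = []
    last = -refractory  # allow a blink at index 0
    for i, s in enumerate(samples):
        if s >= threshold and i - last >= refractory:
            blinks.append(i)
            last = i
    return blinks

# Hypothetical EOG samples: two blink spikes amid baseline noise
eog = [10, 12, 350, 340, 15, 11, 9, 400, 20]
print(detect_blinks(eog))  # -> [2, 7]
```

Each detected index would then be mapped to a cue trigger for the audio/video remix, alongside the continuous brainwave and GSR channels.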