This is a music video work representative of my use of real-time simulations of physical, mathematical, and biological systems as instruments in visual live performance. The video was recorded in real time from direct controller inputs (not control scripts) and has not been edited or processed after the initial recording. The aim is to capture the immediacy and spontaneity of visual live performance, in contrast to the sterile feel of highly scripted visual content. This is taken further by the use of real-time 3D data capture and of dynamical systems that naturally transition between ordered and chaotic states (the Logistic Map and the Lorenz system).
Simulations and rendering are based on the cinder library, porting a tool commonly used in installation, advertising, and mobile applications into the realm of audio-visual live performance.
C++; cinder; cinder blocks: cinder-MIDI, cinder-freenect, cinder-syphon; syphon recorder
Computation of the dynamical systems simulation, as well as the processing and rendering of particles, had to be moved entirely to the GPU. With this approach, it was possible to achieve a sufficiently high particle count (~300,000) at a rendering speed of ~30 fps on a laptop system.
The visual rendering is based on a fixed set of particles used throughout the entire video; no particles are generated or destroyed. All changes are effected by updating particle coordinates, which adhere to the different coordinate sets supplied by the Kinect input and the simulated dynamical systems.