Saturday 24 February 2007

Hyper-Shaku at Creativity and Cognition '07

Our paper, Gestural Hyper Instrument Collaboration with Generative Computation for Real Time Creativity, written with Sam Ferguson, has been accepted for the Creativity and Cognition Conference (Seeding Creativity: Tools, Media, and Environments) in Washington D.C., with keynote speakers Mitchel Resnick from the MIT Media Lab and Thecla Schiphorst from Simon Fraser University. We are quite chuffed because only 24 papers were chosen. Sam will present the paper, as I will be in Japan at that time.

The paper describes an environment for creative engagement that utilises idiomatic musical performance gestures and expression to elicit responsive generative augmentation of the audio and visual delivery. The system employs artificial biological systems to generate new artistic material meshed with the musical performance. The generative process is triggered and moderated by the gestural interaction of the human performer, sensed via motion sensors, computer vision and computer hearing. The model's scalability and modularity allow different generative processes to be interchanged, making it possible to explore the effect of their interaction with each other and their responsiveness to the performer. The technical implementation is demonstrated in the Hyper-Shaku environment.
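
For anyone curious what that modular, gesture-moderated arrangement could look like in code, here is a minimal, purely hypothetical sketch in Python. It is not the actual Hyper-Shaku implementation, and every class and function name below is invented for illustration: interchangeable generative modules sit behind a common interface, and a sensed gesture's intensity triggers and moderates each generative step.

    # Illustrative sketch only (not Hyper-Shaku code): swappable generative
    # modules moderated by a normalised gesture-intensity signal.
    from dataclasses import dataclass
    import random


    @dataclass
    class GestureEvent:
        """A sensed performance gesture, e.g. from motion, vision or hearing."""
        source: str       # hypothetical label: "motion", "vision", "hearing"
        intensity: float  # normalised 0.0 .. 1.0


    class GenerativeProcess:
        """Interface any interchangeable generative module implements."""
        def step(self, intensity: float) -> list:
            raise NotImplementedError


    class RandomWalkMelody(GenerativeProcess):
        """Toy generator: a pitch random walk whose step size is moderated
        by the performer's gesture intensity."""
        def __init__(self, start_pitch: int = 60):
            self.pitch = start_pitch

        def step(self, intensity: float) -> list:
            self.pitch += random.choice([-1, 1]) * max(1, round(intensity * 5))
            return [self.pitch]


    class AugmentationEngine:
        """Routes gestures to whichever generative module is currently loaded."""
        def __init__(self, process: GenerativeProcess):
            self.process = process

        def swap_process(self, process: GenerativeProcess) -> None:
            # Modularity: exchange generative processes at run time.
            self.process = process

        def on_gesture(self, event: GestureEvent) -> list:
            # The gesture both triggers and moderates the generative output.
            return self.process.step(event.intensity)


    if __name__ == "__main__":
        engine = AugmentationEngine(RandomWalkMelody())
        for ev in [GestureEvent("motion", 0.2), GestureEvent("hearing", 0.9)]:
            print(ev.source, engine.on_gesture(ev))

In this toy version, swapping in a different GenerativeProcess subclass is all it takes to change what the performer's gestures drive, which is the spirit of the interchangeability described above.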