As long-time fans of electronic music, we know that its devotees and show attendees often lean toward introversion. In partnership with Microsoft, we set out to see whether we could design a gestural interface that would break the ice and get showgoers to interact with each other. The result was completely homegrown — #dBcube is a four-sided, audio-reactive, multi-user gestural interface controlled by dance.
As a dancer approaches one of the glowing sides of Microsoft’s Cube, the music visualization spawns a digital avatar that follows the dancer’s moves. When dancers arrive on opposite sides, their two avatars link at the hands with flowing ribbons, turning their individual moves into twisting, intertwined shapes — collaborative visual expressions that evolve with the music. Dancers and their friends see themselves in a whole new way, and interact with dancers they might not know and can’t actually see. #dBcube visualizes a stream of sensor data from the four Kinects in the Cube, reacts in real time to the beats of performers in the venue, and cycles through a variety of virtual environments as the evening progresses. Our experiment was successful — #dBcube turned complete strangers into new friends, and made even the most determined wallflower into a social creature. The dancefloor may never be the same again.
Drawing on the success of the experience, we released an open-source toolkit that lets other artists and creative coders author their own experiences to deploy on the Cube. Our goal was to open this unique platform to a wide range of creative exploration and expression. The toolkit is also helpful for anyone building applications that use multiple Kinect for Windows v2 sensors, even if you haven’t had the chance to work on the physical Cube.
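To give a flavor of the kind of multi-sensor logic involved, here is a minimal sketch of one piece of the experience described above: deciding which dancers on opposite faces of a four-sided cube should have their avatars linked. Everything here — the function name, the data shapes, the side numbering — is hypothetical for illustration and is not the toolkit’s actual API.

```python
# Hypothetical sketch: pairing dancers tracked on opposite faces of a
# four-sided display. Sides are numbered 0-3 going around the cube, so
# sides 0/2 and 1/3 face each other. Each sensor contributes a list of
# tracked body IDs for its side.
from itertools import product

OPPOSITE = {0: 2, 1: 3, 2: 0, 3: 1}

def link_opposite_dancers(bodies_by_side):
    """Given {side: [body_id, ...]}, return (side_a, body_a, side_b, body_b)
    tuples for every avatar pair that should be joined with ribbons."""
    links = []
    for side in (0, 1):  # visit each opposing pair of faces once
        other = OPPOSITE[side]
        for a, b in product(bodies_by_side.get(side, []),
                            bodies_by_side.get(other, [])):
            links.append((side, a, other, b))
    return links

# Example: one dancer on side 0, two on the facing side 2, none elsewhere.
print(link_opposite_dancers({0: ["d1"], 2: ["d2", "d3"]}))
# [(0, 'd1', 2, 'd2'), (0, 'd1', 2, 'd3')]
```

In the real installation this decision would run per frame against live Kinect body-tracking data; the sketch only captures the pairing rule itself.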
Microsoft / Decibel Festival / EMP