INSTINT 2017 Highlights

Last week I had the privilege of attending the fourth edition of the annual INSTINT conference, a gathering of artists, activists, and engineers who work in the field of installation-based experiences. Aside from the charm of New Orleans and the friendliness of the brilliant and good-looking attendees, several of the speakers shared work that's extremely relevant to what we try to do at Stimulant.

Rafael Lozano-Hemmer presented a survey of his recent works, most of them built around literally making visible the voices of the underserved and ignored. The most technically exciting piece he's working on is Population Theater, which aims to combine 3,617 displays powered by $5 computers into a canvas of 7.5 billion pixels, enough to visualize data about the global population with every person on earth represented by an individual point of light. Combining large numbers of inexpensive displays and computers opens up enormous possibilities for sculptural, interactive forms.
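
Some back-of-the-envelope math, entirely my own and not from the talk: if each of those $5 computers (a Raspberry Pi Zero, say) drives a standard 1080p panel, the numbers land almost exactly on the global population figure.

```python
# Rough arithmetic for Population Theater's pixel count -- my own back-of-the-
# envelope check, not the artist's published spec. The 1080p-per-display
# resolution is an assumption; the talk only cited the display and pixel totals.

displays = 3_617                   # number of displays cited in the talk
pixels_per_display = 1920 * 1080   # assumed full-HD panel = 2,073,600 pixels

total_pixels = displays * pixels_per_display
print(f"{total_pixels:,}")         # 7,500,211,200 -- roughly one pixel per person
```

In other words, each panel would stand in for roughly two million people.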

Daily tous les jours, the team behind studio favorite The Swings, walked through a case study of Mesa Musical Shadows, a public space where visitors make music by casting shadows over sensors integrated into the pavement. Projects like this combine the engineering challenges of sensing in public space with the practical struggles of permitting, weatherization, and construction.

https://vimeo.com/163596013
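
The core loop behind a piece like this is simple to sketch, even if the real engineering is anything but. Below is a minimal, hypothetical version in Python: a simulated light sensor stands in for the pavement hardware, printing stands in for the synth, and none of the names or thresholds come from the actual project.

```python
# A minimal, hypothetical sketch of a shadow-to-sound loop in the spirit of
# Mesa Musical Shadows -- not the studio's implementation. A simulated light
# sensor stands in for the real hardware, and printing stands in for the synth.

import random
import time

NUM_TILES = 8          # hypothetical number of sensing tiles
BASELINE = 0.8         # calibrated ambient-light reading for an uncovered tile
SHADOW_THRESHOLD = 0.5 # readings below this fraction of baseline count as shadow

def read_light(tile: int) -> float:
    """Stand-in for a real sensor read (e.g. a photocell sampled through an ADC)."""
    return random.uniform(0.2, 1.0)

def play_note(tile: int) -> None:
    """Stand-in for triggering audio (e.g. sending a MIDI or OSC message)."""
    print(f"tile {tile}: note on")

shadowed = set()
for _ in range(100):                       # a few seconds of polling at ~20 Hz
    for tile in range(NUM_TILES):
        in_shadow = read_light(tile) < BASELINE * SHADOW_THRESHOLD
        if in_shadow and tile not in shadowed:
            play_note(tile)                # trigger only when the shadow arrives
            shadowed.add(tile)
        elif not in_shadow:
            shadowed.discard(tile)
    time.sleep(0.05)
```

A real deployment would also need calibration against changing daylight, debouncing, and plenty of weatherproofing, which is exactly the kind of work the case study dug into.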

Refik Anadol presented his studio's inspiring work in architectural-scale real-time graphics, including the 350 Mission building just up the street from Stimulant's studio and the conductor-controlled graphics performance at the Walt Disney Concert Hall. These works illustrate what's possible when you work with canvases of unusual shapes and sizes and use them to convey emotion more than content.

https://vimeo.com/147304811

Rebecca Fiebrink shared a technical demonstration of her Wekinator project, which accepts input from nearly any sort of sensor and, with a bit of neural-network magic, maps it to whatever outputs an installation might need. Examples include triggering events as an object moves across a camera feed, or modifying a value with a hand tracked by a Leap Motion. A technique like this could eliminate much of the manual interpretation of sensor data that installation projects currently require.

https://www.youtube.com/watch?v=dPV-gCqy9j4
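
Wekinator itself is a standalone tool that talks to other software over OSC, so the sketch below is not its code or its API; it's just an illustration of the underlying idea, using scikit-learn to learn a mapping from demonstrated sensor frames to a single output parameter. All data and dimensions here are invented.

```python
# Not Wekinator's code or API -- just an illustration of the kind of mapping it
# provides: record (sensor frame, desired output) pairs by demonstration, fit a
# small model, then let it drive an installation parameter in real time.

import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Pretend demonstrations: 200 frames of three sensor values (say, a tracked
# hand's x/y/z) paired with the output we want at that pose (say, a 0..1
# brightness or filter-cutoff value).
X_train = rng.random((200, 3))
y_train = 0.7 * X_train[:, 0] + 0.3 * X_train[:, 1]   # stand-in "desired output"

model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
model.fit(X_train, y_train)

# At show time, every incoming sensor frame is pushed through the model.
new_frame = np.array([[0.5, 0.2, 0.9]])
print(model.predict(new_frame))   # -> the mapped control value for this frame
```

The appeal is that the mapping is defined by showing examples rather than by hand-tuning thresholds, which is what makes this approach so attractive for installation work.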

Thanks again to the INSTINT team for gathering such a great group of practitioners and speakers to inspire one another's practices. Let's do it again next year.