When Microsoft showed SecondLight at PDC 2008, we were inspired to make something similar work with our current Surface unit. What you see here is a prototype that uses Surface’s object recognition to track the position of one or more iPhones on the table, letting those phones “see through” the on-screen imagery to reveal a second layer of information. The possibilities here are fairly extensive; what’s most interesting to us is the potential for adding a layer of personalized information on top of a public computing experience. This could let users capture content and take it with them, or have the system display a personalized information layer (translated text, larger-print type, private messages) for individual users of a multi-user system.
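To give a feel for the kind of geometry involved, here's a minimal sketch (not our actual code, and all names are hypothetical) of one piece of the problem: given a phone's tracked position and rotation on the tabletop, work out which region of the hidden second layer sits under the phone, so that region can be rendered on the phone's screen.

```python
import math

def xray_footprint(center_x, center_y, angle_deg, width, height):
    """Hypothetical helper: compute the axis-aligned bounding box
    (x0, y0, x1, y1), in table coordinates, of a phone of the given
    width x height footprint, centered at (center_x, center_y) and
    rotated by angle_deg, as object tracking might report it. The
    hidden layer would be cropped to this box (and counter-rotated)
    before being shown on the phone."""
    a = math.radians(angle_deg)
    hw, hh = width / 2, height / 2
    # The four corners of the unrotated footprint, relative to center.
    corners = [(-hw, -hh), (hw, -hh), (hw, hh), (-hw, hh)]
    xs, ys = [], []
    for cx, cy in corners:
        # Rotate each corner, then translate to table coordinates.
        rx = cx * math.cos(a) - cy * math.sin(a)
        ry = cx * math.sin(a) + cy * math.cos(a)
        xs.append(center_x + rx)
        ys.append(center_y + ry)
    return (min(xs), min(ys), max(xs), max(ys))

# An unrotated 60x100 phone centered at (100, 100) covers the box
# from (70, 50) to (130, 150) on the table.
print(xray_footprint(100, 100, 0, 60, 100))
```

The same bookkeeping runs in reverse for orientation: the phone counter-rotates the cropped region by the tracked angle so the hidden layer stays registered with the table as the phone moves.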
iPhone was the first mobile platform we dug into, but we’ve also got XRay working on Android-based and Windows Mobile-based phones. Big props to Josh for pulling this all together, and to long-time friend Arthur Mount for the use of his fantastic illustrations.