We’re always getting new sensors in the lab and are excited to share them with you. The latest thing we’ve been experimenting with is the Structure sensor from Occipital. We’ve added it as the 11th (!) sensor on our in-depth comparison page, and we have some ideas for experiences we could create with it.
One of the most interesting ideas is to use the sensor to scan a space to create a 3D model, and then use that model to create a mixed reality experience on a device that doesn’t have the sensor. This would enable site-specific augmented experiences on visitors’ own smartphones or on venue-provided tablets, in which virtual objects appear to interact with the physical world, with proper occlusion and physics.
Another type of experience could involve 3D scanning heads or entire bodies and using that model in a sort of photo booth experience, or as a visitor’s avatar in some type of gaming environment.
Artificial Experiences by CODAME on Sketchfab (may not appear if an ad-blocker is enabled)
Below we’ll get into the detailed capabilities of the system, but if these sound like interesting ideas for your venue or project, get in touch to explore the possibilities.
The Structure sensor is designed to attach physically to iOS devices to provide 3D scanning capabilities and enable mixed reality scenarios. There is also some support for Windows, macOS, Linux and Android using the OpenNI 2 project.
The “hello world” of Structure is the Canvas app, which creates 3D scans of spaces and objects up to a certain size, and provides standard 3D model files.
Larger objects or whole rooms can be scanned with the help of the Skanect app on Windows or macOS. During the scan, mesh data is sent over the network to Skanect, which can export a mesh of any size.
There are a couple of easy workflows for getting scan data from the sensor into Unity3D.
StructureUnityAR (Augmented Reality): In this workflow, the SDK first scans the real world into a 3D mesh. That mesh is then available as a regular Unity game object, so it can participate in collisions and Unity physics like any other object in the scene.
StructureUnityUBT (Unbounded Tracking): Unbounded tracking lets you move physically through a virtual scene to create augmented reality experiences. The system sends translation and rotation updates to a camera or object in a Unity scene, so when the Structure sensor moves in the real world, the camera moves the same amount in the virtual world, without any prior knowledge of the physical space.
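The core idea behind unbounded tracking is simple: apply the sensor's relative motion, one-to-one, to a virtual camera. Here is a minimal Python sketch of that math (not Occipital's API; the function names and the pose format of a translation vector plus a unit quaternion are our assumptions, and Unity's tracking scripts handle all of this internally):

```python
import numpy as np

def quat_to_matrix(q):
    """Convert a unit quaternion (w, x, y, z) to a 3x3 rotation matrix."""
    w, x, y, z = q
    return np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)],
    ])

def apply_pose(cam_position, cam_rotation, delta_translation, delta_quat):
    """Move a virtual camera by the relative pose the tracker reports.

    No map of the space is needed -- just the sensor's motion since the
    last update, which is exactly what unbounded tracking provides.
    """
    new_rotation = cam_rotation @ quat_to_matrix(delta_quat)
    # The translation is expressed in the camera's own frame, so rotate
    # it into world space before adding it to the camera position.
    new_position = cam_position + cam_rotation @ delta_translation
    return new_position, new_rotation

# Example update: the sensor moved 0.5 m forward with no rotation.
pos, rot = apply_pose(
    np.zeros(3), np.eye(3),
    np.array([0.0, 0.0, 0.5]),       # half a metre along the view axis
    np.array([1.0, 0.0, 0.0, 0.0]),  # identity quaternion (no rotation)
)
```

In Unity this amounts to updating a `Transform`'s position and rotation each frame from the tracker's pose stream; the sketch above just makes the underlying arithmetic explicit.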
There is also a way to scan a space with the Structure sensor, but use the resulting mesh on a device without a sensor. This would allow for location-specific mixed-reality experiences using typical smartphones.
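One common way a pre-scanned mesh provides occlusion on a sensorless device is to render it invisibly into the depth buffer, so virtual objects are hidden wherever the scanned geometry is closer to the camera. A toy numpy sketch of that per-pixel test (the depth values here are invented for illustration; a real renderer does this on the GPU):

```python
import numpy as np

# Hypothetical 4x4 depth buffers, in metres from the camera.
# room_depth would come from rendering the pre-scanned mesh; here it's faked.
room_depth = np.full((4, 4), 2.0)    # a wall 2 m away
room_depth[1:3, 1:3] = 1.0           # a nearer scanned object, e.g. a column

virtual_depth = np.full((4, 4), 1.5) # a virtual object 1.5 m away

# A virtual pixel is drawn only where it is closer than the scanned geometry,
# so the column correctly occludes the middle of the virtual object.
visible = virtual_depth < room_depth
```

The net effect is that a virtual character can appear to walk behind real furniture, even though the device doing the rendering has no depth sensor of its own.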
We’ve only been able to test the regular Structure sensor so far, but Occipital has been working on other products as well.
The Bridge headset works with the Bridge Engine SDK to create mixed reality experiences that blend the physical and the digital, similar to a HoloLens.
Their new Structure Core sensor apparently provides very accurate tracking, has a range of 0.3 m to 5 m, and works well both indoors and outdoors. The new sensor is used in the Occipital tracking platform, which provides 6-DoF inside-out tracking for AR/VR with a 160-degree viewing angle, a laser projector, and global-shutter IR cameras.
Overall, it's pretty exciting to see a fairly small company, local to San Francisco, innovating on such a wide range of AR applications, and we're hoping to use the tech in a project sometime soon.