Our engineer made this little demo in Processing using the currently available Kinect drivers. Rojas is using an iPad to set up different planes of interest, each with its own level of detail. The Processing app sends out OSC messages with depth information based on the level of detail and the defined plane. The iPad app is TouchOSC.
Translated, this means the iPad app is controlling the Kinect software running on the MacBook Pro.
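Under the hood, OSC messages like the ones the Processing app sends are just small binary packets over UDP. A minimal sketch of how a depth reading might be packed and sent (the `/plane/1/depth` address and the single float payload are our assumptions for illustration, not the demo's actual message schema):

```python
import socket
import struct

def _osc_string(s: str) -> bytes:
    """Encode an OSC string: ASCII bytes, NUL-terminated, padded to 4 bytes."""
    b = s.encode("ascii")
    return b + b"\x00" * (4 - len(b) % 4)

def osc_message(address: str, *args: float) -> bytes:
    """Pack a minimal OSC message whose arguments are all float32."""
    type_tags = "," + "f" * len(args)          # e.g. ",f" for one float
    payload = b"".join(struct.pack(">f", a) for a in args)
    return _osc_string(address) + _osc_string(type_tags) + payload

# Send a hypothetical depth value for plane 1 to a listener on port 8000.
packet = osc_message("/plane/1/depth", 0.42)
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(packet, ("127.0.0.1", 8000))
```

TouchOSC works the same way in reverse: each control on the iPad surface is bound to an OSC address, and moving it fires a packet like this one at the laptop.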
We have been working on new ways to bridge the gap between humans and computers, and one of the most popular new interfaces uses gestures as an input controller. The idea is to let a user's natural hand and arm gestures replace the mouse.
Case in point: the image on the left shows one of our gesture-based educational experiences in action at a recent urology conference. The image on the right depicts what the 3D machine-vision camera sees; notice the user's left hand highlighted in red. The software tracks the hand and sends its coordinates to the game shown on the left.
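The core of that coordinate hand-off is a mapping from the camera's image space to the game's screen space. A minimal sketch, assuming a 640x480 depth camera and a 1920x1080 display (both figures are our assumptions, as is the horizontal mirroring, which makes on-screen motion match the user's own):

```python
def hand_to_screen(hand_x, hand_y,
                   cam_w=640, cam_h=480,
                   screen_w=1920, screen_h=1080):
    """Map a tracked hand position from depth-camera pixels to screen pixels.

    Mirrors the x axis so that moving a hand to the user's left moves the
    on-screen cursor to the left as well.
    """
    nx = 1.0 - hand_x / cam_w   # normalize and mirror horizontally
    ny = hand_y / cam_h         # normalize vertically
    return int(nx * screen_w), int(ny * screen_h)

# A hand at the centre of the camera frame lands at the centre of the screen.
print(hand_to_screen(320, 240))  # -> (960, 540)
```

In practice the tracked coordinates would then be shipped to the game over OSC, just as in the Kinect demo above.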
We created this subtle device to allow hosts to control the content within our Black Lab experience.
When the host changes the cube's orientation, the Arduino microcontroller broadcasts the new orientation to all subscribing computers, and the software on each computer reacts accordingly.
Technology: Arduino, OSC
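On the receiving side, each subscribing computer only has to decode the OSC packet it pulls off the network. A minimal sketch of that decode step (the `/cube/orientation` address and the yaw/pitch/roll float payload are our assumptions about the message layout, not the actual protocol):

```python
import struct

def parse_osc_floats(packet: bytes):
    """Minimally parse an OSC message whose arguments are all float32.

    Returns (address, [values]); assumes a well-formed packet.
    OSC strings are NUL-terminated and padded to 4-byte boundaries.
    """
    end = packet.index(b"\x00")
    address = packet[:end].decode("ascii")
    offset = (end // 4 + 1) * 4              # skip past padded address
    tag_end = packet.index(b"\x00", offset)
    tags = packet[offset:tag_end].decode("ascii")
    offset = (tag_end // 4 + 1) * 4          # skip past padded type tags
    n = tags.count("f")
    values = list(struct.unpack(">" + "f" * n, packet[offset:offset + 4 * n]))
    return address, values

# Example: a hand-built "/cube/orientation" packet carrying yaw, pitch, roll.
sample = (b"/cube/orientation\x00\x00\x00"   # 17 chars + 3 NULs = 20 bytes
          b",fff\x00\x00\x00\x00"            # type tags, padded to 8 bytes
          + struct.pack(">fff", 90.0, 0.0, 45.0))
print(parse_osc_floats(sample))  # -> ('/cube/orientation', [90.0, 0.0, 45.0])
```

Because the Arduino broadcasts rather than addressing any one machine, every computer running a listener like this sees the same orientation update at the same time.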