The newly launched Microsoft Kinect has been causing a lot of excitement in hacker circles thanks to its open interfaces, and the Smalltalk community has already found some great uses for the device.

The Kinect is a device, intended as an accessory for the Xbox, which extracts 3D scene information from continuously projected infrared structured light, allowing live, controller-less interaction by interpreting the user's movement and posture. This makes it an attractive input device for teams already working on projects in Smalltalk.

Ricardo Moran was a member of the team from the Grupo de Investigación en Robótica Autónoma at CAETI in Argentina that won the 2010 ESUG Innovation Technology Awards with Physical Etoys, their Arduino-based interface to Squeak, which allowed them to monitor and control robots as they drove around the conference hall. They spent their prize money wisely, buying a Kinect! Building on Stephen Howell's work getting the Kinect talking to Scratch (a visual programming environment developed at MIT), they have now shown how to use the Kinect to control activities in Etoys, relying on the existing OSCeleton framework to provide the skeleton-tracking interface. Their approach is documented in more detail on their blog.
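OSCeleton works by broadcasting skeleton data as OSC messages — typically to the address `/joint`, carrying a joint name, a user ID, and x, y, z coordinates. A client only has to listen for these messages and keep a table of joint positions per user. The sketch below illustrates the idea in Python rather than the team's actual Squeak code; the message shapes, joint names, and the assumption that smaller y means higher up are based on OSCeleton's common configuration and should be checked against its documentation:

```python
# Minimal sketch of consuming OSCeleton-style skeleton messages.
# OSCeleton (as commonly configured) sends OSC messages of the form:
#   /joint <joint_name> <user_id> <x> <y> <z>
# A real client would receive these over UDP with an OSC library;
# here the messages are simulated as plain tuples for illustration.

skeletons = {}  # user_id -> {joint_name: (x, y, z)}

def handle_joint(address, args):
    """Update the tracked skeleton from one /joint message."""
    if address != "/joint":
        return
    name, user_id, x, y, z = args
    skeletons.setdefault(user_id, {})[name] = (x, y, z)

def right_hand_raised(user_id):
    """Hypothetical gesture test: right hand above the head.
    Assumes image-style coordinates where smaller y is higher up."""
    joints = skeletons.get(user_id, {})
    if "r_hand" not in joints or "head" not in joints:
        return False
    return joints["r_hand"][1] < joints["head"][1]

# Simulated messages, as they might arrive from OSCeleton:
handle_joint("/joint", ("head", 1, 0.50, 0.20, 1.8))
handle_joint("/joint", ("r_hand", 1, 0.65, 0.10, 1.7))
print(right_hand_raised(1))  # the hand is above the head
```

In Etoys the same idea maps naturally onto tile scripting: each incoming joint position updates a morph's properties, and gesture tests like the one above become watchers that trigger scripts.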

Nikolay Suslov's separate and equally impressive implementation takes another approach to integrating data from the Kinect. It uses the lower-level OpenKinect driver to access the raw colour and depth information and passes this into his bespoke Krestianstvo images, where the detailed video processing and interpretation are done in Smalltalk, reducing his reliance on external, platform-specific code. His blog gives more details of his implementation, as well as source code and pre-built images.
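Working from the raw frames means the image itself must do the segmentation that OSCeleton would otherwise provide. The Kinect's depth stream arrives as 640x480 frames of 11-bit values (0..2047), where smaller values are nearer and 2047 usually marks "no reading". A toy sketch of that kind of in-image processing, in Python rather than Suslov's actual Smalltalk code, segmenting the nearest object from a depth frame:

```python
# Toy sketch of in-image depth processing over raw Kinect data.
# Raw 11-bit depth: smaller values are nearer; 2047 means no reading.
# Illustrative Python only — not Suslov's Smalltalk implementation.

NO_READING = 2047

def nearest_mask(frame, band=50):
    """Return a boolean mask selecting pixels within `band` raw units
    of the nearest valid depth reading in the frame."""
    valid = [d for row in frame for d in row if d != NO_READING]
    if not valid:
        return [[False for _ in row] for row in frame]
    nearest = min(valid)
    return [[d != NO_READING and d - nearest <= band for d in row]
            for row in frame]

# A tiny simulated 4x4 depth frame (a real one is 640x480):
frame = [
    [2047,  900,  905, 2047],
    [2047,  610,  620, 1500],
    [2047,  600,  615, 1500],
    [2047, 2047, 2047, 2047],
]
mask = nearest_mask(frame)
print(sum(cell for row in mask for cell in row))  # 4 pixels near the front
```

Doing this inside the image, rather than in external driver code, is what lets the interpretation stay portable across the platforms Krestianstvo runs on.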