Kinect Based Gaming

GE Goes Flying for Healthcare IT from SuperTouch Group on Vimeo.


I’ve done quite a bit of work with the Microsoft Kinect. In fact, one of our team, Chris Rojas, was among the first to play with a hacked Kinect, and we created a video that went a bit viral during the Kinect’s first hours running from a PC or Mac. The Kinect was, and is, important to redefining user interfaces and physical computing within certain contexts. It also helped blow the doors open on getting as far away as possible from a mouse or touchscreen. CCV (Community Core Vision) was a huge paradigm shift, and really interesting devices like the Leap Motion would follow. Throw in the race to fulfill the sci-fi promises of VR (sometimes overlapping with AR) with undefined rules (read: Oculus Rift, Google Glass, and Sony’s Project Morpheus), and we have a really interesting time ahead of us. Here’s one project that was fun for all involved…

With the help of fellow teammate Noah Zerkin, we built a Kinect-based game for GE (General Electric). It was one of the bigger booths at RSNA, and we were situated on the aisle right in front of a main entrance. GE wanted to tell a story through a selection of factoids about their radiology equipment. We pitched a Kinect-based game experience in which the user selected between a helicopter and a biplane, then between New York City and New Orleans (the city hosting the trade show). The user then flew around catching rewards in the sky, each of which revealed a factoid, while avoiding birds and balloons, which would end the game. We found many people replaying to try to beat the time of their colleagues. It educated visitors on occasionally dry data through “gamification” and an interactive experience. Of course, the more passersby see a game in action, the more willing they are to give it a try; it’s a human thing. We consistently had a queue. It was a success from the first day of the show to the last.

Some technical tidbits: the HD output of a beefy gaming laptop fed a large nine-screen video wall. We used the extremely capable Unity3D as our engine. The Kinect peeked out of a small square cutout in the wall structure, and we found that a homebrew “blanking” device improved the Kinect’s ability to lock on to a user’s skeleton… It was basically a small motor with a piece of black Sintra taped to the motor arm. Have you ever covered the sensors of the Kinect with your hand to effectively reset what it sees? That’s essentially what we were doing, while relying on our timed sequence of on-screen instructions to ensure that the Kinect would see the intended person, in the perfect position, with arms out. Coddling the Kinect was necessary at the super-crowded trade show because its sensors were pointing out at an aisle full of constant foot traffic. It all worked out well. We used Processing to communicate with Unity and to control the motor via an Arduino.
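The blanking sequence can be sketched roughly like this. The real rig used Processing talking to an Arduino over serial; this is a minimal Python illustration of the timing logic only, with the serial link abstracted behind a callback. The `BlankingController` class, the command names, and the delay values are all illustrative assumptions, not the actual show code.

```python
import time

class BlankingController:
    """Hypothetical sketch: drives a motor-mounted flag that covers the
    Kinect's sensors (forcing it to drop any stray skeletons from aisle
    traffic), then uncovers it once the intended player is in position."""

    def __init__(self, send, sleep=time.sleep):
        self.send = send    # callable that would write a command to the Arduino
        self.sleep = sleep  # injectable so the sequence can be tested without waiting

    def reset_for_new_player(self, instruction_delay=3.0, settle=0.5):
        # 1. Cover the sensors: the Kinect goes blind and loses tracking.
        self.send("COVER")
        # 2. While blind, on-screen instructions put the player on the
        #    mark, facing the wall, arms out.
        self.sleep(instruction_delay)
        # 3. Uncover: the first skeleton the Kinect acquires is the player.
        self.send("UNCOVER")
        self.sleep(settle)

# Dry run with a recording stub in place of the serial port:
sent = []
ctrl = BlankingController(send=sent.append, sleep=lambda s: None)
ctrl.reset_for_new_player()
print(sent)  # → ['COVER', 'UNCOVER']
```

The point of the dance is simply that the Kinect latches onto whatever skeleton it sees first when the sensors are revealed, so blinding it between players keeps it from locking onto passersby.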

