Here at Yelp, we’re passionate about building things; it’s at the core of our engineering philosophy. In fact, we enjoy it so much that many of us keep on building after we finish work. I recently found some spare time to work on an interesting project with the Microsoft Kinect. I think it’s a cool start and I’ve open sourced the code so that others can build something even cooler.

Easy Skeletal Tracking

If you’re reading this blog, you’re likely familiar with the Microsoft Kinect. It combines an RGB camera, an infrared laser projector, and an infrared camera to determine the depth of objects in a scene. This makes computer vision problems that were previously very difficult, like interpreting skeletal motion, much easier. Recently, PrimeSense, the company that built the core technology behind the Kinect, released an open-source version of their natural interaction framework called OpenNI. OpenNI provides APIs for interacting with the hardware as well as for incorporating higher-level computer vision modules that can recognize objects and gestures. This lets us solve really complex problems (like tracking skeletons in 3D space) very simply, since OpenNI has already done the heavy lifting for us.
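To give a flavor of what that looks like, here’s a minimal sketch of the usual OpenNI setup and calibration flow in C++. This is illustrative rather than the project’s actual code: error handling is omitted, and the callback names and main loop are mine (g_UserGenerator matches the global used in the snippet further down).

#include <XnCppWrapper.h>

xn::Context g_Context;
xn::UserGenerator g_UserGenerator;

// When OpenNI spots a new user, ask it to calibrate a skeleton for them
// (older NITE builds require the user to strike a calibration pose first)
void XN_CALLBACK_TYPE OnNewUser(xn::UserGenerator& gen, XnUserID user, void*) {
    gen.GetSkeletonCap().RequestCalibration(user, TRUE);
}

void XN_CALLBACK_TYPE OnLostUser(xn::UserGenerator&, XnUserID, void*) {}

void XN_CALLBACK_TYPE OnCalibrationStart(xn::SkeletonCapability&, XnUserID, void*) {}

// Once calibration succeeds, start tracking joint positions
void XN_CALLBACK_TYPE OnCalibrationEnd(xn::SkeletonCapability& cap, XnUserID user,
                                       XnBool success, void*) {
    if (success) cap.StartTracking(user);
}

int main() {
    g_Context.Init();
    g_UserGenerator.Create(g_Context);
    g_UserGenerator.GetSkeletonCap().SetSkeletonProfile(XN_SKEL_PROFILE_ALL);

    XnCallbackHandle hUser, hCalibration;
    g_UserGenerator.RegisterUserCallbacks(OnNewUser, OnLostUser, NULL, hUser);
    g_UserGenerator.GetSkeletonCap().RegisterCalibrationCallbacks(
        OnCalibrationStart, OnCalibrationEnd, NULL, hCalibration);

    g_Context.StartGeneratingAll();
    while (true) {
        g_Context.WaitAndUpdateAll();  // block until the next frame of data
        // ...read joint positions and send them over UDP (see below)...
    }
    return 0;
}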

Fun and Games

I thought it’d be neat to get Kinect data into a game engine that supported physics so I could interact with objects in a virtual world. I chose Garry’s Mod for Valve’s Source engine because it can be easily scripted using Lua.

Theory of Operation

My code consists of two major parts: a backend that interprets Kinect data and a Lua script that controls the game. The backend (based on one of the OpenNI example projects) reads data from the Kinect over USB, uses OpenNI to track the user’s skeleton, then sends UDP packets containing the (x, y, z) coordinates of each joint in the user’s skeleton. The Lua script in Garry’s Mod parses those packets and maps each coordinate to a sphere that moves itself to the corresponding position in the game world (think of a Garry’s Mod hoverball, but free to move in all three dimensions). By attaching these position-tracking balls to different entities or objects, the user can move objects in the game just by moving around!
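Per frame, the backend’s job is small: ask OpenNI which users it currently sees and send a packet per joint for each of them. A rough sketch (SendUDPLeftHandData is the real function shown below; the rest is illustrative):

// Called once per frame, after the context has updated
void SendSkeletonData() {
    XnUserID users[15];
    XnUInt16 nUsers = 15;  // in: array capacity, out: number of users found
    g_UserGenerator.GetUsers(users, nUsers);

    for (XnUInt16 i = 0; i < nUsers; ++i) {
        SendUDPLeftHandData(users[i]);  // shown below
        // ...and likewise for the right hand, head, and the other joints...
    }
}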

Interfacing with the Kinect is super easy with OpenNI. Here’s an example of all it takes to get and send data about the player’s left hand:

void SendUDPLeftHandData(XnUserID player) {
    // Make sure the player is being tracked
    if (!g_UserGenerator.GetSkeletonCap().IsTracking(player)) return;

    // Get the player's left hand position
    XnSkeletonJointPosition leftHand;
    g_UserGenerator.GetSkeletonCap().GetSkeletonJointPosition(player, XN_SKEL_LEFT_HAND, leftHand);

    // Create a data packet with the hand coordinate
    char packet[40];
    sprintf(packet, "lhx%0.3fy%0.3fz%0.3f", leftHand.position.X, leftHand.position.Y, leftHand.position.Z);

    // Uses standard POSIX calls to send a UDP packet with the data
    SendUDPPacket(packet, strlen(packet));
}
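SendUDPPacket itself isn’t part of the snippet above; since it’s described as standard POSIX calls, a minimal stand-in could look like the following. The destination address and port are assumptions (the real values live in the repo), and a real implementation would open the socket once rather than per packet.

#include <arpa/inet.h>
#include <netinet/in.h>
#include <string.h>
#include <sys/socket.h>
#include <unistd.h>

// Sends one datagram to the machine running Garry's Mod.
// 127.0.0.1 and port 26000 are placeholder values.
void SendUDPPacket(const char *data, size_t length) {
    int sock = socket(AF_INET, SOCK_DGRAM, 0);
    if (sock < 0) return;

    struct sockaddr_in dest;
    memset(&dest, 0, sizeof(dest));
    dest.sin_family = AF_INET;
    dest.sin_port = htons(26000);                   // assumed port
    dest.sin_addr.s_addr = inet_addr("127.0.0.1");  // assumed address

    sendto(sock, data, length, 0, (struct sockaddr *)&dest, sizeof(dest));
    close(sock);
}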

What Now?

This project is definitely still in the proof-of-concept stage, but that’s where you come in. The code is available on GitHub at github.com/johnboiles/JBKinectHacks. Feel free to fork it and hack it into your own, better creation. Drop me a line to let me know how it goes: johnb at yelp dot com
