
Kinect Point Cloud Visualizer

The underlying idea here is something along the lines of a real-time VJ-style performance tool. This is a work in progress. Using openFrameworks, Kinect point-cloud data is collected as a sequence of frames that can be saved to disk and played back on demand. Points beyond a specified depth are filtered out, and a bounding box is calculated to form the basis of some simple dynamic interactions.
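The per-frame filtering step described above can be sketched roughly as follows. This is my own plain-C++ illustration, not the project's actual code (which uses openFrameworks types); the names and structure are assumptions:

```cpp
#include <cassert>
#include <limits>
#include <vector>

// Sketch: discard points past a depth cutoff while accumulating an
// axis-aligned bounding box over the points that survive.
struct Point3 { float x, y, z; };

struct BoundingBox {
    float minX =  std::numeric_limits<float>::max(), maxX = -std::numeric_limits<float>::max();
    float minY =  std::numeric_limits<float>::max(), maxY = -std::numeric_limits<float>::max();
    float minZ =  std::numeric_limits<float>::max(), maxZ = -std::numeric_limits<float>::max();
    void expand(const Point3& p) {
        if (p.x < minX) minX = p.x;  if (p.x > maxX) maxX = p.x;
        if (p.y < minY) minY = p.y;  if (p.y > maxY) maxY = p.y;
        if (p.z < minZ) minZ = p.z;  if (p.z > maxZ) maxZ = p.z;
    }
};

std::vector<Point3> filterByDepth(const std::vector<Point3>& raw,
                                  float maxDepth, BoundingBox& box) {
    std::vector<Point3> kept;
    for (const Point3& p : raw) {
        if (p.z > maxDepth) continue;  // drop points beyond the cutoff
        box.expand(p);
        kept.push_back(p);
    }
    return kept;
}
```

The bounding box then drives the simple dynamic interactions mentioned above, without any heavier computer-vision machinery.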

Various effects with variations can be triggered and combined in real time on both saved and live video:

An ostentatious particle emitter effect:

Explosions which can be triggered on demand or based on sudden changes to the bounding box:

“Alpha trails” (with more hands)…

Strobe, with a quick time-shifting effect:
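The bounding-box trigger behind the explosion effect could be sketched as a simple frame-to-frame comparison. This is my own rough reconstruction under assumptions, not the project's actual code; the threshold and class names are illustrative:

```cpp
#include <cassert>
#include <cmath>

// Sketch: detect a "sudden change" by comparing successive bounding-box
// volumes; a jump larger than the threshold fires the effect.
struct Box { float w, h, d; };  // extents of the current frame's bounding box

class ExplosionTrigger {
public:
    explicit ExplosionTrigger(float threshold) : threshold_(threshold) {}

    // Returns true when the box volume changed abruptly since the last frame.
    bool update(const Box& box) {
        float volume = box.w * box.h * box.d;
        bool fired = hasPrev_ && std::fabs(volume - prevVolume_) > threshold_;
        prevVolume_ = volume;
        hasPrev_ = true;
        return fired;
    }

private:
    float threshold_;
    float prevVolume_ = 0.0f;
    bool hasPrev_ = false;
};
```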

The priority for this has been to develop my until-now shallow knowledge of C++, focusing on code design and performance, rather than the exploration of novel uses of the Kinect as an input device (or so I tell myself).

Given more time, I’d like to do more with the visual effects, look into shaders, write a polygon-based renderer (rather than simple points), and find excuses to do more with multi-threading (currently the loading routine runs on a separate thread, and the ‘video capture’ routine runs on three). And find a clever use for OpenCV to do more than what I’m doing now with just a bounding box. And play with MIDI to trigger video clips, and map various parameters DAW-style. And then maybe work on a real user interface, possibly broken out into a separate window, using Flash or what have you.

I believe the core recording and playback system and my implementation of a “Point Cloud Video” file format are logical candidates for a public release on GitHub, but that will entail some rationalization and massaging of the code.

11 Responses to “Kinect Point Cloud Visualizer”

  1. Mikko says:

    Wow, some really great results.

    How were you able to calibrate the depth camera with the colour camera? I was looking into using the Kinect for the original 3 section on:

    But I couldn’t figure out how to calibrate the two cameras. Spent a few days trying to figure it out.

  2. felix says:

    Wow! Lots of potential for live performance / interactive pieces. I love how you can move the heads in 3D space in the top one. Seems like you could do a time shift and have a conversation with yourself.

  3. admin says:

    Hi Mikko, I couldn’t find a proper tool to calibrate using a printed-out grid, so I had to “eyeball it” by tweaking the magic numbers found in “ofxKinectCalibration.c” in the ofxKinect addon of openFrameworks: specifically, the values for fx_d, fy_d, cx_d, and cy_d.

    The results aren’t perfect, but in the office where I took the video captures, the walls are a medium-gray color, so the fringing around the edges isn’t painfully obvious.
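    For context, those four values are the standard pinhole intrinsics of the depth camera: fx_d and fy_d are the focal lengths in pixels, and cx_d and cy_d are the principal point. The back-projection they feed into looks roughly like this; it's a sketch of the well-known formula rather than the ofxKinect source itself, and the numbers below are placeholders, not calibrated values:

```cpp
#include <cassert>

// Standard pinhole back-projection from a depth pixel (u, v, depthMeters)
// to a 3D point in the depth camera's frame. Intrinsic values here are
// illustrative placeholders, not calibrated numbers.
struct Intrinsics { float fx_d, fy_d, cx_d, cy_d; };

struct Point3 { float x, y, z; };

Point3 depthToWorld(int u, int v, float depthMeters, const Intrinsics& k) {
    Point3 p;
    p.x = (u - k.cx_d) * depthMeters / k.fx_d;
    p.y = (v - k.cy_d) * depthMeters / k.fy_d;
    p.z = depthMeters;
    return p;
}
```

    Tweaking fx_d/fy_d stretches the cloud, and tweaking cx_d/cy_d shifts it, which is why eyeballing against the colour image can get you reasonably close.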

  4. admin says:

    Thanks Felix. I do have a preset in the app where two instances of a video are played with one running a couple of seconds behind the other (though it isn’t demonstrated in any of the videos). Time-shifting is built into the “engine”, as with the alpha trails and the strobe effect.
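    A time-shifted second instance like that can be sketched as a ring buffer of frames with a delayed read index. This is just my illustration of the idea, not the engine's actual code; `Frame` stands in for a full point-cloud frame:

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

// Sketch: keep the last N frames in a ring buffer and read a second
// stream `delay` frames behind the live one. `Frame` is an int stand-in
// for a point-cloud frame. Callers must keep delay < capacity and only
// read after at least (delay + 1) frames have been pushed.
using Frame = int;

class FrameHistory {
public:
    explicit FrameHistory(std::size_t capacity) : frames_(capacity) {}

    void push(const Frame& f) {
        frames_[head_ % frames_.size()] = f;
        ++head_;
    }

    // Frame `delay` steps behind the newest one (delay = 0 is the live frame).
    Frame delayed(std::size_t delay) const {
        std::size_t idx = (head_ - 1 - delay) % frames_.size();
        return frames_[idx];
    }

private:
    std::vector<Frame> frames_;
    std::size_t head_ = 0;
};
```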

  5. roger says:

    awesome work!

    I am looking forward to that GitHub repo. I’m just starting with point clouds and was looking for some info when I ran into your astonishing videos. I really like them and see lots of potential in them.

    I struggled with calibration a while ago and found a quite simple solution:

    DepthGenerator depthGenerator; // NOT ofxDepthGenerator!!
    UserGenerator userGenerator; // NOT ofxUserGenerator!!

    found it on a thread in OF’s forum:,5641.msg29079.html#msg29079

    I hope that helps.


  6. admin says:

    Thanks a lot, Roger, I will check that out.

    It will also be interesting to look into the new Microsoft SDK and see what it has to offer.

  7. Jason says:

    Hi Lee,

    Some really amazing work here. I’d love to see how you added the particle emitter. Would you mind sharing your code?


  8. admin says:

    Thanks Jason. Here’s the code from the particle emitter class. If it’s too opaque without context, let me know and I’ll provide a high-level description.

  9. nic says:

    Cool. Now dance the algorithmic dance with yourself!!

  10. anthony says:

    Can I also see that particle code? That link in the comments seems dead.

  11. Nathan says:

    Very cool work! I am trying to create a similar effect with your particle emitter but following/tracking a specific color instead. I would really appreciate it if you could share the code for the particle emitter as I have just started using C++ etc etc, and I am completely lost! Your link for the code noted above no longer works I’m afraid. Thnx and amazing work!
