Kinect experiment with Ogre3D

November 20th, 2010 by Henri

I’ve just bought a Kinect and decided to do some experiments with it:

This demo is a rip-off of the Kinect-v11 demo made by Zephod. In fact I’ve designed a new Ogre::Kinect library that provides Kinect connectivity through Zephod’s library. Then I replaced the Zephod OpenGL demo with an Ogre3D demo using my library. The nice part is that I’ve managed to move some of the depth-to-RGB conversion to the GPU (using a pixel shader).
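For illustration, here is a minimal CPU-side sketch of a depth-to-RGB conversion of the kind the demo performs in a pixel shader. The color ramp (near = red, far = blue), the 11-bit depth range, and the 2047 "no reading" sentinel are assumptions about raw Kinect depth data, not details taken from the Ogre::Kinect source.

```cpp
#include <array>
#include <cstdint>

// Map a raw 11-bit Kinect depth value (0..2047) to an RGB color.
// Hypothetical CPU version of the conversion the demo runs on the GPU;
// the near=red / far=blue ramp is an arbitrary choice.
std::array<uint8_t, 3> depthToRgb(uint16_t depth)
{
    if (depth >= 2047)             // 2047 is commonly the "no reading" sentinel
        return {0, 0, 0};          // render invalid pixels as black

    float t = depth / 2047.0f;     // normalize depth to [0, 1)
    uint8_t r = static_cast<uint8_t>(255.0f * (1.0f - t));  // close -> red
    uint8_t b = static_cast<uint8_t>(255.0f * t);           // far   -> blue
    return {r, 0, b};
}
```

In the actual demo this per-pixel mapping runs in a fragment program, so the raw depth texture can be uploaded once and colorized entirely on the GPU.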


Binary demo:
Source code: svn on
Documentation: doxygen
License: MIT



  1. HegerService says:

    Thanks. But:

    12:25:27: Loading library .\RenderSystem_Direct3D9
    12:25:33: OGRE EXCEPTION(7:InternalErrorException): Could not load dynamic library .\RenderSystem_Direct3D9. System Error: Das angegebene Modul wurde nicht gefunden. (The specified module could not be found.)

    in DynLib::load at ..\..\OgreMain\src\OgreDynLib.cpp (line 91)

  2. admin says:

    @HegerService: Maybe you need to install the latest DirectX runtime, or just update your video driver? BTW I don’t speak German ;-)

  3. HegerService says:

    It works after adding the missing D3DX9_41.dll.

  4. Eugene says:

    Any chance this can be used as a type of stereo scanner for objects?

  5. admin says:

    @Eugene: the Kinect can indeed be used as a 3D scanner (check out this YouTube video for example). But the main issue is the resolution (640×480)…
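To give an idea of how depth frames become scan data, here is a sketch of back-projecting one depth pixel into a 3D camera-space point with a pinhole model. The intrinsics (focal lengths, principal point) are placeholder values I chose for illustration, not calibrated Kinect parameters.

```cpp
#include <cmath>

struct Point3 { float x, y, z; };

// Back-project a depth pixel (u, v) with metric depth z (in meters) into
// a 3D point in camera space, using a simple pinhole camera model.
// fx/fy/cx/cy are assumed placeholder intrinsics for a 640x480 sensor.
Point3 backProject(int u, int v, float z)
{
    const float fx = 594.0f, fy = 594.0f;   // assumed focal lengths (pixels)
    const float cx = 320.0f, cy = 240.0f;   // assumed principal point
    return { (u - cx) * z / fx,
             (v - cy) * z / fy,
             z };
}
```

Applying this to every valid pixel of a 640×480 depth frame yields at most ~307k points per frame, which is why the resolution limits scan detail.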

  6. admin says:

    You can take a look at this really impressive demo using a very well-known mascot: created with OpenNI using PrimeSense NITE (source code is available).

  7. Eugene says:

    Astre: I guess the optimal scanning distance means they have to be further back from the object. I have always had the idea that it would be much better to create a scanning system using two (or three) HD web cameras. How hard would it be to do that? I read that the real challenge would be simultaneous capture from the USB cameras.

  8. admin says:

    @Eugene: it depends… You are referring to a solution using calibrated, fixed HD cameras, whereas the video link uses a single moving camera: this is very different. If your subject is not moving you don’t need HD camera synchronization (still shots of the scene are enough, using my SFMToolkit for example). Otherwise it is indeed more complicated…

  9. Gandi ( Pierre de Catopsys ) says:

    Argh, I should have talked with you more at Ju’s defense :p.
    *Goes off to take a look at the rest of your work… yum yum*

  10. Salem Sayed says:

    Can the Sinbad + Kinect demo be ported to Mac/Linux? I can’t understand why it is exclusive to PC when OGRE3D is cross-platform.

  11. admin says:

    @Salem Sayed: this demo uses Ogre3D for rendering and PrimeSense/OpenNI for body tracking. So if PrimeSense/OpenNI is available on the Mac/Linux platforms, the demo could be ported.
