Irrlicht + Kinect

jawdy
Posts: 14
Joined: Wed Jan 13, 2010 4:20 pm

Irrlicht + Kinect

Post by jawdy »

Hey all,

I've just gotten Kinect working in Irrlicht!

I'm just writing up a quick tutorial on how to do this - unfortunately, Windows only. Would this tutorial be something people would want here?

A quick rundown, if anyone wants to try and get things going, without waiting for me.
You'll need:
- A Kinect Sensor
- OpenNI
- NITE
- The PrimeSense driver (with the Kinect mod, this is taken from the official driver and then modified to allow Kinect)
- Irrlicht 1.7.2

- Connect the Kinect sensor to your PC.
- Install the modified driver for Windows.
- Download and install OpenNI (for Windows only, at the moment)
- Download and install NITE (again, Windows only... but they're hard at work doing cross-platform support!)
- Update the XML files in the install directory (you need to add license keys and tweak the settings)
- Test that you can run ALL the Samples
- Modify an Irrlicht example to include the OpenNI/NITE items
- Enjoy motion interaction with an accurate depth buffer :-D
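For the "Update the XML files" step, the file in question is SamplesConfig.xml in the OpenNI Data directory. A rough sketch of the shape of the edit is below — the exact node layout varies between OpenNI releases, and you should verify the license key against the one shipped with your own NITE install (the key shown is the one PrimeSense published for general use):

```xml
<OpenNI>
  <Licenses>
    <!-- NITE license key published by PrimeSense; check it against
         the key that came with your NITE download. -->
    <License vendor="PrimeSense" key="0KOIk2JeIBYClPWVnMoRKn5cdY4="/>
  </Licenses>
  <ProductionNodes>
    <!-- Which generators to create; Depth + User is enough for skeleton tracking. -->
    <Node type="Depth" />
    <Node type="User" />
  </ProductionNodes>
</OpenNI>
```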

I WILL have some more in-depth instructions, links to downloads, potential problems, screens, etc... but there's a LOT to it, so please bear with me!

[MODS: If this is in the wrong section, can you please move? My apologies if it's wrong! Thanks]
XXChester
Posts: 95
Joined: Thu Oct 04, 2007 5:41 pm
Location: Ontario, Canada

Post by XXChester »

I would be very interested in seeing how to do this. I have been waiting to try porting a game to Kinect until Microsoft officially supports something, but I think it would be neat to try even beforehand.

Great work.
jawdy
Posts: 14
Joined: Wed Jan 13, 2010 4:20 pm

Post by jawdy »

XXChester wrote:I would be very interested in seeing how to do this. I have been waiting to try porting a game to Kinect until Microsoft officially supports something, but I think it would be neat to try even beforehand.

Great work.
Cool! As long as it's useful to someone, I'll get the write-up finished ASAP.

I'll be interested to see what MS' official Kinect for Windows SDK/driver will support. The OpenNI/NITE stuff is from PrimeSense (the company that designed the hardware inside Kinect), so you get a LOT of stuff for *free*, plus some other things that are PrimeSense-generic rather than Kinect-specific.

Still, MS have said it'll be "Spring 2011"... so we can expect the official Kinect for Windows some time in October ;-)
polylux
Posts: 267
Joined: Thu Aug 27, 2009 12:39 pm
Location: EU

Post by polylux »

OpenNI is cross-platform. They provide it as source as well as in package form for various Linux distros.
Not sure about NITE, but I don't see why it shouldn't be cross-platform as well.

As for an OpenNI production node on Linux using the Xbox Kinect hardware, I recommend having a look at "freenect" (see www.openkinect.org). It can be wrapped in a production node with ease, if that hasn't been done already.
beer->setMotivationCallback(this);
anoki
Posts: 58
Joined: Fri May 05, 2006 8:31 am

Post by anoki »

Hey,

it sounds really interesting. If it's running well, I'd be happy
if you posted a small description here.
I've also got a Kinect, but have only tried it with KinEmote,
which is not very accurate.

Anoki
Tekkai
Posts: 24
Joined: Tue Mar 30, 2010 2:02 pm
Contact:

Post by Tekkai »

Hey! How is the project going? I'm fighting with this stuff and I'm a bit lost about connecting the output of Kinect to the bones in Irrlicht... Any example would help :D
rookie
Posts: 16
Joined: Thu Dec 29, 2011 9:44 pm

Re: Irrlicht + Kinect

Post by rookie »

Can you please help out with examples? Right now I have worked out how to make them work separately, but I want them to work together: something like displaying meshes with Irrlicht on top of the video from Kinect?
jawdy
Posts: 14
Joined: Wed Jan 13, 2010 4:20 pm

Re: Irrlicht + Kinect

Post by jawdy »

Hi All,

As has happened to many other people, work got in the way and basically took all my time (and still does!).
Tekkai wrote:Hey! How is the project going? I'm fighting with this stuff and I'm a bit lost about connecting the output of Kinect to the bones in Irrlicht... Any example would help :D
My first foray into this was to use skeletal detection to track the user's position in real space and map it to the camera in virtual space (the Irrlicht camera). I didn't get as far as handling more than a single point.
rookie wrote:Can you please help out with examples? Right now I have worked out how to make them work separately, but I want them to work together: something like displaying meshes with Irrlicht on top of the video from Kinect?
If you look at the OpenNI/NITE examples, more specifically the simple skeleton example, you'll see the rendered output of the skeleton. Is this the "video" you're after? If so, that's the skeleton points being rendered to a GLUT surface every frame, rather than a video feed. I'm sure you *can* get the video feed from the Kinect camera, but that was never something I was after.
anoki wrote:Hey,
it sounds really interesting. If it's running well, I'd be happy
if you posted a small description here.
I've also got a Kinect, but have only tried it with KinEmote,
which is not very accurate.
Anoki
The current version I built was limited to 30 fps, the frame rate of the Kinect itself. By adding simple multi-threading, you can have Irrlicht render at its maximum (some 2,000 fps on simple scenes on my machine) while Kinect feeds in data whenever it has it.
(I used the same technique with OpenCV, where the web-cam was restricted to some 25 fps.)


I've only just started to dig out the project again - sorry for the lack of updates or the sheer length of time it's taken me to get back on this!
The first thing I've done is compile a 64-bit version of Irrlicht.
The Kinect demos in OpenNI/NITE refused to build in 32-bit mode on my 64-bit machine... this should be a simple matter of changing the target, but I figured having everything target 64-bit would be better anyway.
Next, I used the 64-bit libs from OpenNI/NITE; also make sure you get the latest releases of both.
The driver I'm using is the SensorKinect available on GitHub.
I then trawled through the SimpleSkeleton demo provided by OpenNI to get a feel for what would be needed. OpenNI/NITE is event-driven: you register callbacks, and they fire when something happens (gesture recognition, skeleton detection, etc.).
And finally, I went through the OpenNI documentation, where they explain how to do the "simple" things. This helped me strip out the GLUT items that are heavily used in the demos.

If you look at the SimpleSkeleton demo, you can see where they iterate through all of the detected skeletal points and render them to the GLUT texture (they also draw lines between them).
I would imagine it would be fairly simple to track a detected point on the Kinect skeleton and map it to the corresponding point on an Irrlicht skeleton... but that assumes they use the same number and arrangement of joints. This is something I'll take a look at!

My first plan, as I've said above, was to make the 3D camera move in virtual space relative to real space.
After that, I was using the "wave" gesture to detect and track hand points, which are separate from the skeleton. I then wanted to interact with the virtual world by moving around and pushing objects with the detected hand.

Hopefully things will quieten down a little and I can take a look... I'm still doing Kinect stuff, but it's wrappers and native interfaces for other languages/engines/applications, so I may not be able to pick this up for some time :-/

Good luck!