Nuitrack With Astra SDK

Hello Everyone…

I have an Astra Pro sensor and I’m using Astra SDK for software development. I preferred Astra SDK over OpenNI because it’s cleaner and better organised, although it is based on the OpenNI architecture…

Currently I’m using Nuitrack for body and hand tracking. All the code and libraries in Nuitrack use OpenNI, and I’m unable to use the Astra SDK APIs.

So, I want to use Astra SDK for sensor control and image acquisition (both RGB and depth), and Nuitrack for the body tracking and hand tracking tasks.

I hope I made my point clear.

Any kind of help, or a small snippet such as hand tracking using Nuitrack with Astra SDK, would be highly appreciated.

I would like to bypass OpenNI so as not to mix up Astra SDK, OpenNI, and Nuitrack.

Prasanna Kumar Routray

FWIW - Nuitrack needs direct access to the depth stream to do body tracking and hand tracking.
It’s not a matter of just using a bit of Nuitrack - and bits of something else.

Nuitrack reads the depth data - does segmentation - defines users - defines skeletons - defines hands.

You might be able to use the Astra SDK for the RGB stream as it’s a separate stream - but I suspect that two SDKs both trying to access the depth frame may put serious load on the system - if they can even share it at all.



Hello Westa,
Thanks for your insightful response to the concern I raised.

I would like to know more about the load issue. What if I get the depth stream with Astra SDK and convert it to OpenCV image format in the first place (of course that will not be a single image, but a stream of images)? At a later stage I make a copy of each depth image and use the first one for hand tracking and the second for some simple image processing. Is that possible?

If not, would you please tell me something about how body tracking compares across the different SDKs?

To my knowledge, Orbbec also provides a body tracking SDK, but that is now completely integrated into Astra SDK in the newest release.

So, my question is: which one performs better at body tracking, Nuitrack or the Astra body tracking SDK?

Thanks and Regards,

Hi Prasanna,

Firstly, no - you can’t just grab a depth frame in EITHER SDK, do some processing, and then hand it off after the fact to either the Astra SDK or the Nuitrack SDK for skeletal/hand tracking. Neither system supports this sort of workflow.

First, some explanation of how both SDKs work … both frameworks are best thought of as black boxes.

You initialise them - which performs a bunch of actions that find a hardware sensor and wire up the system in preparation for processing. Once you have an initialised system - you tell both SDKs to wait for a frame of data to be captured from a stream.

Each SDK then has a mechanism to call back into your software with the latest frame of data already processed.

The mechanisms behind how each framework functions to do this are substantially different - but the net result is the same - the sdk does the processing and returns you back a set of data constructs containing fully calculated and processed information.

This may be - depending on the framework - a full frame of depth, a full frame of RGB, a skeletal array of all the currently tracked data points, an array of hand data, or a gesture array.

How the wiring up of these systems works varies for each framework - but in essence - you tell the SDK all the things you want to PROCESS for a depth stream - and then the framework takes over almost complete control of the process from there.

In terms of which is best … there are major differences between the Orbbec and the Nuitrack SDKs on many levels. But I would put it like this - if you have very limited needs for tracking, then Orbbec may be OK … but if you want accuracy or repeatability, then the Nuitrack SDK beats the Orbbec offering hands down. Further, if you want functional hand tracking and gesture recognition that is in sync with the skeletal tracker, then Nuitrack is your only option.

So, a little further discussion of how the Nuitrack framework operates:

Currently you have no direct control over the init() process beyond setting up parameters in the nuitrack.config file and letting it do its job. init() does all the connection work - finding a sensor and turning it on - which allows it to work with a wide variety of different sensor hardware in a completely generic way.

Once you have the system initialised, you decide what sorts of data you want to receive from the SDK during each frame, and you then set up a set of callback functions - one for each type of tracking output you want to receive.

Then you effectively start a data pump loop that sits and tells Nuitrack to wait until it gets a full frame of data from the connected sensor - during this process, Nuitrack will call each of the callback functions you declared, passing to those functions the calculations it has performed for that tracking type on the current frame.

So - one function receives the Depth Frame, one function receives the RGB Frame, one function the User Frame, one function the Skeleton Frame, one function the Hand Frame, and one function a Gesture Frame if any has been recently captured.
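To make that wiring concrete, here is an untested sketch using the public Nuitrack C++ API (error handling and exit condition omitted for brevity; the tracker classes and connect/wait calls are from the Nuitrack headers, the callback bodies are illustrative):

```cpp
// Untested sketch of the Nuitrack flow described above.
#include <nuitrack/Nuitrack.h>
#include <iostream>

using namespace tdv::nuitrack;

// Called once per processed frame with the full skeleton set.
void onSkeletonUpdate(SkeletonData::Ptr data)
{
    std::cout << "skeletons tracked: " << data->getNumSkeletons() << "\n";
}

// Called once per processed frame with the hand data.
void onHandUpdate(HandTrackerData::Ptr data)
{
    std::cout << "users with hands: " << data->getNumUsers() << "\n";
}

int main()
{
    Nuitrack::init("");  // reads nuitrack.config, finds the sensor

    // One tracker (and one callback) per type of output you want.
    auto skeletonTracker = SkeletonTracker::create();
    skeletonTracker->connectOnUpdate(onSkeletonUpdate);

    auto handTracker = HandTracker::create();
    handTracker->connectOnUpdate(onHandUpdate);

    Nuitrack::run();  // start the internal processing pipeline

    // The data pump loop: each waitUpdate() blocks until a new frame
    // has been processed, then the registered callbacks fire.
    for (;;)
        Nuitrack::waitUpdate(skeletonTracker);

    Nuitrack::release();
    return 0;
}
```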

The Orbbec SDK works a little differently, in that you define a set of STREAM readers which declare what is to be processed … you then set up the same sort of data pump loop and define a single callback point - which is called AFTER the SDK has processed the current frame completely. In that callback you have access to all of the processed data (DEPTH, RGB, SKELETON).
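Again as an untested sketch, based on the Astra SDK C++ samples - declare streams on a reader, register a single listener, and pump frames:

```cpp
// Untested sketch of the Astra SDK flow described above.
#include <astra/astra.hpp>
#include <iostream>

class Listener : public astra::FrameListener
{
    // Single callback point: fires after the SDK has finished the
    // current frame, with every started stream's data available.
    void on_frame_ready(astra::StreamReader& reader,
                        astra::Frame& frame) override
    {
        auto depth = frame.get<astra::DepthFrame>();
        if (depth.is_valid())
            std::cout << "depth frame " << depth.frame_index() << "\n";
    }
};

int main()
{
    astra::initialize();

    astra::StreamSet streamSet;
    astra::StreamReader reader = streamSet.create_reader();

    // Declare which streams you want processed.
    reader.stream<astra::DepthStream>().start();
    reader.stream<astra::ColorStream>().start();

    Listener listener;
    reader.add_listener(listener);

    // The data pump loop.
    for (;;)
        astra_update();

    reader.remove_listener(listener);
    astra::terminate();
    return 0;
}
```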


Of the two - the Nuitrack system wins hands down in terms of tracking performance. And it has HAND tracking that is instant and responsive (within the limitations of the sensor).
The Orbbec SDK does have a hand tracker - but it is slow to lock on (up to 5 seconds) and very quick to lose tracking of the hands.

Also significantly, the Nuitrack hand and gesture tracking is intrinsically mapped to the SKELETON system, which means the same IDs can be used to reference a USER, SKELETON, HAND JOINT and HAND TRACKER. This is not the case with the Orbbec SDK, where the HAND tracker and skeleton tracker don’t share the same reference IDs.



Wow Westa,
That was a thorough rundown. Thanks for the quick and complete info.

Thank you very much.


Hello Westa,
What I intend to do now is to use Nuitrack as the primary SDK, and as usual it will do its job of body tracking. In the meantime, I would like to create another function alongside the skeleton callback and use some OpenCV tools to do some processing.

Why is that not possible? When I try to create an image matrix with cv::Mat image, it fails to compile.

I was wondering if it is possible to use some OpenCV tools inside the example code given by Nuitrack.

I think if that’s possible I will be able to solve my problem.


Hi Prasanna,

It is likely a versioning issue - Nuitrack uses OpenCV libraries for some of its own internal processing.
When you say it fails to compile - what are the actual errors? Is it a compile error or a linker error?


Yes, the problem seems to be with compilation.

This (upper image) is what I get when I compile without
find_package(OpenCV REQUIRED) and
target_link_libraries(${PROJECT_NAME} ${LIBS} ${OpenCV_LIBS})
in CMakeLists.txt

Image (Upper)


and this (lower image) is what I get when using
find_package(OpenCV REQUIRED) and
target_link_libraries(${PROJECT_NAME} ${LIBS} ${OpenCV_LIBS})
in CMakeLists.txt
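For reference, a minimal CMakeLists.txt along these lines might look like the following sketch - the Nuitrack install path, library directory, and project name are placeholders, not the actual Nuitrack sample layout:

```cmake
cmake_minimum_required(VERSION 3.5)
project(nuitrack_opencv_sample)

# Placeholder - point this at your Nuitrack installation.
set(NUITRACK_DIR /path/to/nuitrack/sdk)

find_package(OpenCV REQUIRED)

include_directories(${NUITRACK_DIR}/include ${OpenCV_INCLUDE_DIRS})
link_directories(${NUITRACK_DIR}/lib/linux64)

add_executable(${PROJECT_NAME} main.cpp)
target_link_libraries(${PROJECT_NAME} nuitrack ${OpenCV_LIBS})
```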

May I know the exact version of OpenCV being used by Nuitrack?
I have installed


Yes, that does look like a linker issue with OpenCV.


How do I solve that? I tried replacing the .so files inside the Nuitrack folder with the OpenCV ones and it does the same thing. What can be done to solve this? Which OpenCV version does Nuitrack use? Should I use that version?


You would need to match the exact OpenCV build that Nuitrack uses internally, it would seem - the Nuitrack .so files are specifically bound to one build of OpenCV, so trying to swap those out would cause a set of complications.
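Once a matching OpenCV build is linked, the kind of depth callback Prasanna describes might look like this untested sketch - the DepthFrame accessors are from the Nuitrack C++ API, while the function name is illustrative:

```cpp
// Untested sketch: hand Nuitrack's depth buffer to OpenCV.
#include <nuitrack/Nuitrack.h>
#include <opencv2/core.hpp>

using namespace tdv::nuitrack;

// Depth callback: wrap Nuitrack's 16-bit depth buffer in a cv::Mat.
void onDepthUpdate(DepthFrame::Ptr frame)
{
    const int rows = frame->getRows();
    const int cols = frame->getCols();

    // Wrap without copying - valid only while 'frame' is alive.
    cv::Mat depthView(rows, cols, CV_16UC1,
                      const_cast<uint16_t*>(frame->getData()));

    // Deep copy, safe to keep for later OpenCV processing while
    // Nuitrack keeps using the original buffer for tracking.
    cv::Mat depthCopy = depthView.clone();
}
```

This leaves the tracking pipeline untouched - Nuitrack still owns the depth stream, and OpenCV only sees a copy.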


Recently, I have been developing a project for hand and body tracking. My sensor is an Astra Pro and I am using OpenNI for development. When I try to create a hands node:
xn::HandsGenerator _handsGen;
the return value was 65565, and the system reported that it can’t create any node.
I am confused about this error - is it a problem with the device? Please help me, thank you!
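For what it’s worth, OpenNI 1.x calls return an XnStatus that xnGetStatusString() can decode, which usually narrows down why node creation fails - an untested sketch:

```cpp
// Untested sketch: decode OpenNI 1.x status codes during node creation.
#include <XnCppWrapper.h>
#include <cstdio>

int main()
{
    xn::Context context;
    XnStatus rc = context.Init();
    if (rc != XN_STATUS_OK)
    {
        std::printf("Context init failed: %s\n", xnGetStatusString(rc));
        return 1;
    }

    xn::HandsGenerator handsGen;
    rc = handsGen.Create(context);  // fails if no hands node is available
    if (rc != XN_STATUS_OK)
        std::printf("HandsGenerator::Create failed: %s\n",
                    xnGetStatusString(rc));

    context.Release();
    return 0;
}
```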

I think you are using nuitrack_ni_gl_sample. Why don’t you use nuitrack_gl_sample? At any point you can just use cout to debug. That 65565 looks like a 16-bit value, though I have no idea where you are getting it from.

I would suggest you give nuitrack_gl_sample a try.


Thank you! I tried nuitrack_gl_sample and the skeleton drawing is OK. I would really like to know why nuitrack_ni_gl_sample does not work.

Hello Chris,
This is because of OpenNI library linking problems. I tried once and gave up.


Using the OpenNI examples requires the Nuitrack library to be directly linked with OpenNI - there are some details in the installation notes about the steps required - BUT we found them more trouble than they were worth.

Stick with the native Nuitrack SDK - it puts a wrapper over all the OpenNI nightmares - for the most part.