Test video showing poor tracking quality on RealSense cameras

Hi,

We’re evaluating the skeletal tracking quality of different RealSense cameras. The following video shows a comparison of Nuitrack v32 / v35 with the RealSense D435 and D455. It’s a video capture of the Nuitrack sample application with default settings.

I personally find the tracking quality very unstable on both sensors. There are multiple frames where the detected pose jumps from a good detection into a completely off pose that could not have been reached by a human. Tracking is worst for hands in front of the body.

The video also shows that on average the tracking results are worse for the D455, which I found surprising.

  1. What are your thoughts on these results?
  2. Are there any options to improve the tracking of hands in front of the body?
  3. Why is the tracking worse on a camera with a better sensor?
  4. Is it advisable to switch to Kinect Azure and the KinectSDK for projects that depend on precise and stable tracking for human interaction?

I’m happy for all suggestions and thoughts.
Tom

No suggestions? Anybody?

It looks like your camera view is top-down? Maybe change your camera position to follow the recommendations in the Nuitrack documentation: https://github.com/3DiVi/nuitrack-sdk/blob/master/doc/General_preparations.md

This might improve tracking reliability.

@UomoCaffeLatte thank you for that link. The tracking quality indeed improves significantly in a horizontal configuration (i.e. without pitch).

Well, Nuitrack support seems to be on vacation. And since, in our last couple of projects, things like Nuitrack’s requirement to run Unity with administrative rights caused additional headaches, I spent some time looking for alternatives. I found Cubemos, and in all our tests its detection stability with RealSense cameras appears to be significantly better.

So we decided to port our apps to this new framework.

@pixtur we are investigating the tracking quality issue.
Could you tell us whether you have tried the AI version of Nuitrack for this viewpoint position? Nuitrack AI relies on the same type of processing as Cubemos (neural-network inference on RGB data) and is free of the Nuitrack Pro limitations on viewpoint position.
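For anyone else who wants to try the AI version: as far as I can tell from the Nuitrack AI documentation, the CNN-based skeletonization is switched on via a config value in `nuitrack.config` (the key names below are taken from those docs, not from this thread; verify them against your SDK version, and note the file path differs per platform):

```json
"Skeletonization": {
    "Type": "CNN_HPE"
}
```

Alternatively, the same value can apparently be set at runtime with `Nuitrack::setConfigValue("Skeletonization.Type", "CNN_HPE")` after `Nuitrack::init()`.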

We’ve had similar experiences with the D455. The tracking is significantly worse when hands are in front of the body, and joints jump to undesired positions in some situations, such as a hand being in front of the elbow from the camera’s perspective. The AI version did not improve results for us.

We had better tracking results with Cubemos, although it has quite severe problems of its own, like the depth position jumping to distant positions and every chair, jacket, and many random objects being detected as people. Cubemos staff are not always on vacation, but their documentation and updates are so non-existent that you will likely have to get in touch with them personally at some point.

Azure Kinect is comparatively good if you can deal with the somewhat limited FOV. The wide FOV setting is so bad it’s unusable.

You might also want to check out the ZED 2 if you haven’t done that already. In my experience, they have the best support of this bunch, which is, to be honest, quite refreshing.