Hand Tracking sensitivity and GRAB location

We have been doing a lot of testing with the HandTracker and Orbbec sensors.

We are still seeing a lot of jitter in the reported location of hands, even when the hands are held perfectly still. Is there any way to improve the accuracy of the tracking?

The HAND Tracker does report somewhat more stable locations than the SKELETON hand joint, but the locations and random movements are still not ideal for commercial applications, especially when the hands are moved in front of or close to the body.

Secondly, the location of the hand pointer seems to vary significantly on the Y axis when switching between the hand OPEN and hand GRAB states.

This is making it challenging to detect drag actions accurately. Is there any way to refine the accuracy of the HAND position reporting so that a perfectly still hand is reported at the same location whether it is OPEN or CLOSED?

Westa

Hi Westa,

Hand Tracker provides an accurate hand position without random movements if the hands are located in a predefined area in front of the person. You can configure the parameters in the “HandTracker” --> “HandPointerFrame” section of the nuitrack.config file.
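
The same values can also be overridden at runtime with Nuitrack.SetConfigValue instead of editing the file. A small sketch of that approach; note the subkey names used below are only placeholders, please check the actual key names in the “HandTracker” --> “HandPointerFrame” section of your nuitrack.config:

```csharp
using nuitrack;

public static class HandFrameConfig
{
    public static void Apply()
    {
        Nuitrack.Init("");

        // SetConfigValue takes a dotted key path and a string value.
        // "Width" and "Height" here are illustrative placeholders; substitute
        // the real subkey names from your nuitrack.config.
        Nuitrack.SetConfigValue("HandTracker.HandPointerFrame.Width", "0.35");
        Nuitrack.SetConfigValue("HandTracker.HandPointerFrame.Height", "0.25");
    }
}
```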

Please clarify this phrase: what movements do you mean?

Please specify what values you mean by “significant”. In the current Nuitrack version, a slight variation of the Y coordinate is possible due to the changing shape of the palm.

Looking forward to your reply.

Hi Olga,

Been a busy week … I am happy to show you some screen captures of our current implementation.

We are seeing noticeable random jitter movements in the hand tracker that we are finding hard to consistently track down. I would have to say that we are NOT seeing repeatable, accurate hand positioning with an ORBBEC camera and the hand approximately 90 cm from the sensor.

The results seem highly dependent on the exact position of the body in relation to the sensor; even a small step left or right of the center of the sensor's view area can significantly vary the results.

When the hands are directly in front of the body we see more jitter than when the hands are to the left or right of the body.

Y deviation on grip is around -10 units

regards

Westa

Hi,

@olga.kuzminykh: I am also facing the same issue with the stability of the hand tracker. Changing the drag sensitivity in the Pointer script (MyPointer.cs in the sample) doesn't seem to have any effect on the sensitivity of the hand.
@Westa: Have you found any solution for this, or maybe a workaround?

Thanks

Hi nitish,

There are a number of issues that impact the performance of hand tracking as a pointer.
The first is the type of sensor you are using; the second is the distance between the hand and the sensor.

The first issue relates to how much RMS depth noise the sensor is delivering. Basically, depth sensors in the hobby range (anything under $1000) all suffer from different types of error noise, which means that from frame to frame the exact location reported for any given point can vary by an optically observable amount.

For example, an Intel D435 might report a point at one depth in the first frame, and in the next frame, depending on how far you are from the sensor, that same point might be reported as up to 5 mm different, or much worse at greater depths.
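
You can put numbers on this yourself: log a single depth pixel over a few hundred frames with everything held perfectly still and look at the spread. A rough sketch, assuming raw 16-bit depth frames in millimetres where a zero value means no data:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

public static class DepthNoiseProbe
{
    // Temporal RMS noise of one depth pixel across N frames captured with
    // the scene held still. Assumes 16-bit depth values in millimetres and
    // that 0 means "no data" for that pixel.
    public static double RmsNoiseMm(IReadOnlyList<ushort[]> frames, int pixelIndex)
    {
        List<double> samples = frames
            .Select(f => (double)f[pixelIndex])
            .Where(mm => mm > 0)
            .ToList();

        if (samples.Count < 2)
            return double.NaN; // not enough valid readings to estimate noise

        double mean = samples.Average();
        double variance = samples.Sum(mm => (mm - mean) * (mm - mean))
                          / (samples.Count - 1);
        return Math.Sqrt(variance);
    }
}
```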

So the first job is finding a sensor that works well at the distance range you are interested in tracking across.

The second issue is that these sensors can't see around corners very well. If, for example, you look at most raw depth samples from an Orbbec sensor converted to a color gradient, you will likely see quite noticeable dark halos around the edges of the closest objects. These halos are essentially dead spots: the sensor has no idea what is in those areas, so it reports nothing. This is a serious weakness of what are called structured-light sensors.
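
You can get a feel for how big those dead areas are by simply counting the no-data pixels in each raw frame. Again a rough sketch, assuming the usual convention that a zero depth value means no reading:

```csharp
public static class DeadZoneProbe
{
    // Fraction of a raw depth frame with no valid reading, i.e. the halos
    // and shadowed edges described above. Assumes 0 means "no data".
    public static double DeadZoneFraction(ushort[] depthFrame)
    {
        int dead = 0;
        foreach (ushort d in depthFrame)
            if (d == 0) dead++;
        return (double)dead / depthFrame.Length;
    }
}
```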

The Intel sensors aim to reduce this particular issue by using stereo sensors instead of mono sensors, and by applying some clever post-processing to fill in holes where they can, but this too is really just another form of processed noise, or misinformation.

Even the Kinect 2.0 sensor still suffered from quite severe noise issues at times, even though it was, in theory, a superior time-of-flight based sensor.

All of this compounds into what is key to many of these issues: most of the current technologies fall apart severely when you move the hands in front of the body, which increases the noise and errors noticeably.

When it comes to using all this noisy information to try and track a hand, well, let's just say the results are rarely good out of the box.

The Kinect SDK for Windows looked to solve this in part by adding filtering code to the front end, after the skeleton was tracked. Unscented Kalman filters and other techniques such as moving-window filters, some forms of exponential filters, and predictive motion filters all help in some ways, though some cause latency issues and noticeable lag.
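
To make the latency trade-off concrete, here is a minimal sketch of the simplest of those options, a single-pole exponential filter (not the Kinect SDK's actual code): alpha near 1 tracks the raw data closely, alpha near 0 smooths heavily but adds the lag mentioned above.

```csharp
using System.Numerics;

public sealed class ExponentialFilter3D
{
    // Single-pole exponential smoothing of a 3D hand position.
    private readonly float _alpha;
    private Vector3 _state;
    private bool _initialised;

    public ExponentialFilter3D(float alpha) => _alpha = alpha;

    public Vector3 Update(Vector3 raw)
    {
        _state = _initialised ? _alpha * raw + (1f - _alpha) * _state : raw;
        _initialised = true;
        return _state;
    }
}
```

In practice you would chain something like this behind the raw HandTracker output and tune alpha for your use case.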

We have developed our own custom implementations in-house that merge a number of these techniques with discrete spatial filtering based on predictive comparisons and rejection of improbable jitter movements.
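
As a much-simplified illustration of the jitter-rejection idea (not our actual in-house code): hold the reported position while the raw point stays inside a small dead band, and follow it only when it moves further than jitter plausibly could.

```csharp
using System.Numerics;

public sealed class DeadBandFilter3D
{
    // Movements smaller than the dead-band radius are treated as noise and
    // the last position is held; anything larger is treated as deliberate
    // motion and followed.
    private readonly float _radius; // same units as the input, e.g. millimetres
    private Vector3 _anchor;
    private bool _initialised;

    public DeadBandFilter3D(float radius) => _radius = radius;

    public Vector3 Update(Vector3 raw)
    {
        if (!_initialised || Vector3.Distance(raw, _anchor) > _radius)
        {
            _anchor = raw; // real movement: follow it
            _initialised = true;
        }
        return _anchor; // inside the dead band: hold position
    }
}
```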

But this is all quite aggressive in terms of processing power, and it is best suited to C++ implementations given C#'s performance limitations.

Westa