Sample for Hand Clenching and Click

Hello everyone,
I'm working with hand tracking and gestures.

Can anyone provide some info or a small sample of how to use "Hand Clenching" and "Hand Click"?

I hope the question is clear.

Thanks,
Prasanna

Hi Prasanna,

Kind of an open-ended question … could you maybe be a bit more specific about what you are looking to achieve?

The Nuitrack hand tracker reports an estimate of the open or closed state of each hand it is tracking as a grab, and the rate of closure as pressure.

The examples in the SDK include a demo of how this is read. What more beyond that are you looking to work on?

Westa

Hello Westa,
What I intend to do is just push my palm towards the camera from about 2 meters away, and then see a message that I pushed my palm.

At a later stage I could use that for some other application.

I would also like to use that for screen control, like we use fingers on a smartphone screen.
I would like to use the click function for swiping or selecting on a larger display.

I hope you understand what I mean.

Regards,
Prasanna

Hi Prasanna,

That's code that you would need to implement yourself, depending on your hardware platform and design.

I would start by doing some research into coding an event management system for your application.
Once you have the event management system running, you would then add Nuitrack as an input to that system.

We have similar code running in our current application framework. As it is a commercial application, I would be happy to talk offline about it (wtatters@live.com.au).

Westa

Thanks for the reply,

Yes, sure, we cannot discuss commercial things openly here.

But I would like to know if Nuitrack provides that click function?

Can I just use that to check for and get a push detection - the way I said, where I just push my palm and it gets detected?

BR,
Prasanna

Hi Prasanna,

Nuitrack does not have an event management system as such - you need to code that sort of thing yourself.

What it can do is show you when a hand is open or closed - you do this by examining the hands array for each frame and reading the click and pressure values. These values are updated every frame and show the state of each hand on that frame.
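Very roughly it looks something like this in the C++ API - this is just a sketch from memory rather than cut-and-paste code, so check the SDK headers and samples for the exact names:

#include <nuitrack/Nuitrack.h>
#include <iostream>

using namespace tdv::nuitrack;

int main()
{
    Nuitrack::init();
    HandTracker::Ptr handTracker = HandTracker::create();

    // called once per frame with the latest hand data
    handTracker->connectOnUpdate([](HandTrackerData::Ptr data)
    {
        for (const UserHands& user : data->getUsersHands())
        {
            Hand::Ptr right = user.rightHand;
            if (!right)
                continue; // hand not tracked this frame

            // click is true while the hand is closed (grab),
            // pressure is how far closed the hand is (0..100)
            std::cout << "user " << user.userId
                      << " click=" << right->click
                      << " pressure=" << right->pressure << std::endl;
        }
    });

    Nuitrack::run();
    while (true)
        Nuitrack::waitUpdate(handTracker); // pump frames; fires the callback above
}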

This is an entirely separate system from the gesture system, which only reports a gesture when it detects one. There is a PUSH action - but FWIW it's not really an intuitive system from an end-user perspective, and we have found the results a bit hit and miss.
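The gestures come from a separate GestureRecognizer module - again only a rough sketch of the C++ API from memory, meant to sit next to the hand tracker setup above:

// create alongside the HandTracker, before Nuitrack::run()
GestureRecognizer::Ptr recognizer = GestureRecognizer::create();

// fires only when a gesture is actually detected, not every frame
recognizer->connectOnNewGestures([](GestureData::Ptr data)
{
    for (const Gesture& g : data->getGestures())
    {
        if (g.type == GestureType::GESTURE_PUSH)
            std::cout << "user " << g.userId << " pushed" << std::endl;
        else if (g.type == GestureType::GESTURE_SWIPE_LEFT)
            std::cout << "user " << g.userId << " swiped left" << std::endl;
    }
});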

But regardless - what you would do is create some sort of event mechanism that keeps track of both the hand states and any gestures that happen to be reported, and use this as the basis of your own system - say something like handOpen, handClosing, handJustClosed, handStillClosed, handOpening, handJustOpened - and, to get more fancy, maybe handDragging while in a closed state.
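As a rough illustration of what I mean - nothing here is Nuitrack-specific, it is just one way of turning the per-frame click and pressure values into events, and all the names are made up:

// hypothetical per-hand state machine, fed once per frame from the hand tracker callback
enum class HandEvent { HandOpen, HandClosing, HandJustClosed,
                       HandStillClosed, HandOpening, HandJustOpened };

class HandStateMachine
{
public:
    // click and pressure come straight from Nuitrack's Hand data for one hand
    HandEvent update(bool click, int pressure)
    {
        HandEvent ev;
        if (click && !wasClosed_)           ev = HandEvent::HandJustClosed;
        else if (click)                     ev = HandEvent::HandStillClosed;
        else if (!click && wasClosed_)      ev = HandEvent::HandJustOpened;
        else if (pressure > lastPressure_)  ev = HandEvent::HandClosing;
        else if (pressure < lastPressure_)  ev = HandEvent::HandOpening;
        else                                ev = HandEvent::HandOpen;
        wasClosed_ = click;
        lastPressure_ = pressure;
        return ev;
    }

private:
    bool wasClosed_ = false;
    int  lastPressure_ = 0;
};

Your event management system would then dispatch those events (plus anything from the gesture recognizer) to whatever parts of the application care about them.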

Westa

Hi Westa,
I am confused about the hand tracking and gesture detection behaviour of Nuitrack on my Astra Pro. In nuitrack_gl_sample the skeleton tracking is OK, but the hand tracking is wrong and it cannot detect any gestures. Why does this happen? Have you met this problem? Please help me, thank you!

Hi Chris,

The hand tracking and gesture system is somewhat different to what you would expect.

Firstly - it seems to work best when the system detects a full skeleton. As such, sitting down will not give a reliable hand track, from what we have found.

Secondly - this requires that your hands and skeleton are some distance from the sensor to work optimally.
We have found that at distances under 1.5m your results will vary greatly.

Thirdly - the HAND position reported is based on a plane through a reference point in space, normal to the center of gravity of the body it is associated with.

As such, the POINT that is reported will seem off if you are sitting or not standing with a full skeleton in view.

We have wound up just working with the XYZ world point for each hand as the basis of the hand location, as opposed to the projected point - the projected point calculation was just TOO unreliable for any of our commercial applications.
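In code that just means reading the real-world coordinates off the hand instead of the projected ones - field names from memory, so treat this as a sketch:

// inside the hand tracker callback, for one tracked hand
Hand::Ptr hand = user.rightHand;
if (hand)
{
    // projected point - normalised coordinates relative to the virtual plane
    float px = hand->x;
    float py = hand->y;

    // world point in millimetres - this is what we base our hand location on
    float wx = hand->xReal;
    float wy = hand->yReal;
    float wz = hand->zReal;
}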

Now with regard to the gestures - IF you are set up in the optimum position, with a full skeleton in frame, then the GESTURES do work reasonably reliably. We have found about 90%-95% reliability for SWIPE UP, SWIPE LEFT and SWIPE RIGHT under those conditions. PUSH seems to require FULL extension of the arm, and is a very unnatural movement that is hard to rely on. GRAB is about 95% reliable - HOWEVER, the vertical reported position of the hand can move considerably between OPEN and GRAB, which makes it hard to utilise as a (click) type action for many people, as the hand moves outside the click region if those areas are too small.

The other major issue is a hardware one - the quality of the depth data from the Astra Pro is, well, let's just say noisy at best. This results in many potential reporting errors over time. The nature of the hardware means that there are often "holes" in the depth data, especially when the hands are positioned directly in front of the body - basically, the sensor can't report what is behind the hands, and the Nuitrack algorithm starts making guesses that result in bad tracking, leading to the hand jumping around and making the system all but unusable without extensive post filtering of the hand position data.
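By post filtering I just mean something simple like an exponential moving average over the hand's world position, applied per hand per frame - this is generic smoothing code, nothing to do with Nuitrack itself:

// damp frame-to-frame jumps in the reported hand position
struct SmoothedHand
{
    float x = 0, y = 0, z = 0;
    bool  initialised = false;

    void update(float nx, float ny, float nz, float alpha = 0.3f)
    {
        if (!initialised) { x = nx; y = ny; z = nz; initialised = true; return; }
        x += alpha * (nx - x); // alpha near 0 = heavy smoothing, near 1 = almost raw
        y += alpha * (ny - y);
        z += alpha * (nz - z);
    }
};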

Westa

Hi Westa,
Thanks very much for your answer and your time helping me. I will test the effect based on your suggestions.
Best wishes!