# Hand Attributes

Hello :-),

I don’t understand how the x and y attributes of the Hand class are computed.
Can you explain, please?

I think the normalized projective x from Hand is not the same as the one given by JOINT_RIGHT_HAND or JOINT_LEFT_HAND from Joint, is it?

Class `tdv::nuitrack::Joint`:

```cpp
/**
 * @brief %Joint position in normalized projective coordinates
 * (x, y from 0.0 to 1.0, z is real).
 */
Vector3 proj;
```

I can map Joint proj onto the depth texture, but I can’t do the same with the Hand coordinates!
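The joint-to-texture mapping mentioned above can be sketched like this (the `Vector3` and `Pixel` structs here are minimal stand-ins, not the SDK types):

```cpp
#include <algorithm>

// Minimal stand-in for tdv::nuitrack::Vector3 (illustrative only).
struct Vector3 { float x, y, z; };

struct Pixel { int u, v; };

// Map normalized projective coordinates (x, y in [0, 1]) onto a depth
// texture of the given size; clamp so x == 1.0 stays inside the image.
Pixel projToPixel(const Vector3& proj, int width, int height) {
    return { std::min(static_cast<int>(proj.x * width),  width  - 1),
             std::min(static_cast<int>(proj.y * height), height - 1) };
}
```

For a 640x480 depth map, a joint at proj (0.5, 0.5) lands at pixel (320, 240).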

Is the Hand position computed relative to the person’s centre of gravity?

Class `tdv::nuitrack::Hand`:

Public Attributes
float x
The normalized projective x coordinate of the hand (in range [0, 1]).

float y
The normalized projective y coordinate of the hand (in range [0, 1]).

bool click
True if the hand makes a click, false otherwise.

int pressure
Rate of hand clenching.

float xReal
The x coordinate of the hand in the world system.

float yReal
The y coordinate of the hand in the world system.

float zReal
The z coordinate of the hand in the world system.
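Taken together, those attributes can be mirrored in a small struct, e.g. to decide when to treat the hand as “pressed”. The struct below is a stand-in for the SDK class, and the clench threshold is purely our own guess, not a documented value:

```cpp
// Stand-in mirroring the Hand attributes listed above (not the SDK class).
struct Hand {
    float x, y;                 // normalized projective coords, [0, 1]
    bool  click;                // true while the hand performs a click
    int   pressure;             // rate of hand clenching
    float xReal, yReal, zReal;  // world-space position
};

// Example policy: "pressed" on an explicit click, or when clenching
// exceeds a threshold (the threshold value is an assumption).
bool isPressed(const Hand& h, int clenchThreshold = 80) {
    return h.click || h.pressure >= clenchThreshold;
}
```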

`nuitrack.config`:

```json
"HandTracker": {
    "SkeletonSupport": true,
    "HandMode": "grab3d",
    "CoordsMode": "abs",
    "TrainedClassificator": "handtracker/svm_grab.xml",
    "HandPointerFrame": {
        "DistanceBetweenFrames": 150.0,
        "Width": 640.0,
        "Height": 480.0
    }
},
```

What are HandMode and CoordsMode?

CoboTrack

From what we have been able to observe, the hand algorithm and the joint algorithm use different methods to calculate the world-space location.

BUT they are pretty close to each other, for the most part.

HOWEVER, the projected math is entirely different:

• the joint projection is a direct one-to-one projection onto the depth map;
• the hand projection is a projection onto a normalized point in space in front of the BODY detected by the system;
• if you want the hand projection based on the same system as the joints, take the hand’s world position and pass it through the world-to-projection function.
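As a rough illustration of what that world-to-projection step does, here is a simple pinhole model. In Nuitrack itself you would call the SDK’s conversion function rather than rolling your own, and the field-of-view angles below are assumptions, not values queried from the sensor:

```cpp
#include <cmath>

struct Vector3 { float x, y, z; };

// Illustrative pinhole projection: world coords (camera at origin)
// to normalized projective coords. FOV angles are assumed inputs.
Vector3 realToProj(const Vector3& real, float hfovRad, float vfovRad) {
    Vector3 p;
    p.x = 0.5f + real.x / (2.0f * real.z * std::tan(hfovRad / 2.0f));
    // y is flipped because image rows grow downward.
    p.y = 0.5f - real.y / (2.0f * real.z * std::tan(vfovRad / 2.0f));
    p.z = real.z;  // z stays in real units
    return p;
}
```

A point on the optical axis, e.g. (0, 0, 1000), projects to the image centre (0.5, 0.5) regardless of the FOV.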

As for the settings in the config, the only ones we have found useful are Width/Height: they determine the area in space, relative to the body, through which the hand is tracked.

We would love to know more about DistanceBetweenFrames!

Westa

Each hand is associated with a virtual frame with the size of `Width` (mm) x `Height` (mm). `DistanceBetweenFrames` is the distance between centers of these frames.


Hi Olga o/

I apologise, but I don’t understand what “DistanceBetweenFrames” is. We tried to modify this parameter, but nothing special happened. Could you explain it again, or link to some documentation?

Thanks a lot.

Hi Julien,

`HandPointerFrame` can be represented as a rectangular area (or a virtual frame) in which a user’s hand moves. There are two `HandPointerFrames`: one for the left hand and one for the right hand. Hand movements are projected onto the `HandPointerFrames`, and the position of a hand on a virtual frame is then converted to the position of a pointer on a screen. The bigger the frame, the less sensitive the pointer, and vice versa. `Width` and `Height` are the size of the rectangle (`HandPointerFrame`). `DistanceBetweenFrames` is the distance between the centers of these two rectangles (one for the right hand and one for the left hand). You can set your own values of `Width`, `Height`, and `DistanceBetweenFrames` in order to adjust the sensitivity of the cursor.
Hope this explanation helps.
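The mapping described above can be sketched in code. The function name and the clamping policy are illustrative choices of ours, not part of the SDK:

```cpp
#include <algorithm>

struct Cursor { int x, y; };

// Scale a hand's offset from the centre of its virtual frame
// (Width x Height, in mm) to screen pixels. A bigger frame means
// the same hand motion moves the cursor less.
Cursor frameToCursor(float handXmm, float handYmm,  // offset from frame centre, mm
                     float frameW, float frameH,    // HandPointerFrame Width/Height, mm
                     int screenW, int screenH) {
    // Normalize to [0, 1] with the frame centre at 0.5, then clamp.
    float nx = std::clamp(0.5f + handXmm / frameW, 0.0f, 1.0f);
    float ny = std::clamp(0.5f + handYmm / frameH, 0.0f, 1.0f);
    return { static_cast<int>(nx * (screenW - 1)),
             static_cast<int>(ny * (screenH - 1)) };
}
```

With the default 640x480 mm frame and a 1920x1080 screen, a hand at the frame centre sits mid-screen, and moving it 320 mm to the right pins the cursor to the right edge.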


Thanks a lot, Olga, that’s very helpful. We will try to do something with this.