I am developing a program that calculates body angles using Nuitrack’s Body Tracking.
So far I have been able to calculate the angles of many shoulder and elbow movements; however, I am unable to correctly track the pronation and supination of the elbow/hand.
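For example, elbow flexion is simple to get from the real-world joint positions. Here is a simplified sketch of the kind of calculation I mean (the class and helper names are just for illustration):

```csharp
using UnityEngine;

public static class ArmAngles
{
    // Elbow flexion: angle between the upper arm (elbow->shoulder) and the
    // forearm (elbow->wrist), using Nuitrack's real-world joint coordinates (mm).
    public static float LeftElbowFlexion(nuitrack.Skeleton skeleton)
    {
        Vector3 shoulder = ToVector3(skeleton.GetJoint(nuitrack.JointType.LeftShoulder).Real);
        Vector3 elbow    = ToVector3(skeleton.GetJoint(nuitrack.JointType.LeftElbow).Real);
        Vector3 wrist    = ToVector3(skeleton.GetJoint(nuitrack.JointType.LeftWrist).Real);

        return Vector3.Angle(shoulder - elbow, wrist - elbow);
    }

    static Vector3 ToVector3(nuitrack.Vector3 v) => new Vector3(v.X, v.Y, v.Z);
}
```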
I thought that through joint.Rotation I would be able to find the value that changes when the hand tilts left or right, but it seems that it isn't the one I need.
The following image represents the kind of movement I'd like to be able to track and calculate angles for:
Hello @Tora24, currently this kind of rotational movement is "too subtle" for our full-body tracking and can't be estimated directly from the current skeleton tracking result, but there are two workarounds:
You could use the 3D position from the dedicated HandTracker module. It processes data on top of the main full-body tracking, based on a segmented view of the hand/forearm, so it provides additional degrees of freedom for assessing the hand pose (though it only gives position, not rotation, so you have to estimate the remaining parameters yourself). Basically, it provides an estimate of the center of the palm, which can be used in combination with the SkeletonTracker wrist joint to understand the angles from your (a) and (b) pictures (see the sketch at the end of this post).
Please refer to this page for additional information.
You could segment the hand/palm/forearm based on the 3D skeleton and do any kind of additional processing to estimate pronation and supination.
As mentioned, neither is a ready-made solution, so this requires additional engineering on your side.
But in the mid term we plan to expand our current, incomplete parametrization, as we already have plenty of internal estimates that aren't yet exposed properly through the API.
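To illustrate the first workaround, here is a rough sketch (not production code) of combining the HandTracker palm center with the SkeletonTracker shoulder/elbow/wrist joints, treating pronation/supination as rotation of the wrist-to-palm direction around the forearm axis. All positions are assumed to be real-world coordinates converted to UnityEngine.Vector3; the class and method names are just placeholders.

```csharp
using UnityEngine;

public static class PronationEstimator
{
    // Signed angle (degrees) of the wrist->palm direction around the forearm
    // axis, measured relative to the upper-arm direction. Tracking how this
    // value changes over time gives a rough pronation/supination estimate.
    // shoulder/elbow/wrist come from SkeletonTracker (joint.Real),
    // palm comes from the HandTracker palm-center real coordinates.
    public static float AngleAroundForearm(Vector3 shoulder, Vector3 elbow,
                                           Vector3 wrist, Vector3 palm)
    {
        Vector3 forearmAxis = (wrist - elbow).normalized;

        // Reference direction and hand direction, both projected onto the
        // plane perpendicular to the forearm axis.
        Vector3 reference = Vector3.ProjectOnPlane(elbow - shoulder, forearmAxis);
        Vector3 handDir   = Vector3.ProjectOnPlane(palm - wrist, forearmAxis);

        if (reference.sqrMagnitude < 1e-6f || handDir.sqrMagnitude < 1e-6f)
            return float.NaN;   // arm fully extended or palm on the axis: undefined

        return Vector3.SignedAngle(reference, handDir, forearmAxis);
    }
}
```

Keep in mind that the palm center lies close to the forearm axis, so the projected component is small and the resulting angle can be noisy.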
Thank you for your answer!
Unfortunately, after trying a lot of different methods and scenes/tutorials, I am coming to the conclusion that the center of the palm moves only slightly when I try to replicate the movements shown in pictures (a) or (b), even from different positions. I have the hand tracking module active; I tried all kinds of scenes and even the demo in the activation tool, but, as you said, the movements are too small to be captured by the camera. My company is looking forward to the next update you mentioned; hopefully it will measure these movements accurately.
Edit: wouldn't it be possible to have the AI track a long object held in the hand, so that when the hand rotates the object rotates with it, and then calculate that motion from the object's initial and final positions?
We're 99% sure there is a misunderstanding here: the HandTracker 3D palm position isn't the same as the SkeletonTracker JOINT_HAND / JOINT_WRIST joint positions.
Currently it isn't presented in the main demo scenes or in Nuitrack.exe; it has to be accessed directly in your code through the API (it's not sufficient for the HandTracker module to be "active").
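For reference, here is a minimal sketch of reading the palm center in a Unity script. It assumes the NuitrackScripts prefab from the Unity package is in the scene, so NuitrackManager.HandTrackerData and CurrentUserTracker.CurrentUser are available (exact class names may differ between SDK versions); otherwise create the module with nuitrack.HandTracker.Create() and subscribe to its OnUpdateEvent.

```csharp
using UnityEngine;

public class PalmReader : MonoBehaviour
{
    void Update()
    {
        // Null until the HandTracker module has produced at least one frame
        nuitrack.HandTrackerData data = NuitrackManager.HandTrackerData;
        if (data == null)
            return;

        int userId = CurrentUserTracker.CurrentUser;   // 0 when nobody is tracked
        if (userId == 0)
            return;

        nuitrack.UserHands hands = data.GetUserHandsByID(userId);
        if (hands == null || hands.LeftHand == null)   // palm not visible
            return;

        nuitrack.HandContent leftPalm = hands.LeftHand.Value;
        // Real-world palm-center coordinates (mm), comparable with joint.Real
        Vector3 palm = new Vector3(leftPalm.XReal, leftPalm.YReal, leftPalm.ZReal);
        Debug.Log("Left palm center: " + palm);
    }
}
```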
Technically anything is possible, but such custom processing designs aren't generally within the scope of Nuitrack as a product (only as part of major commercial development projects).
Oh, I didn't know that, my bad.
But how can I actually access it? I followed the GitHub instructions, but LeftHand is always null despite the hand tracking module being active. I cannot access Current.LeftHand or Current.RawUserHands.LeftHand; I also checked the SensorsData code, but it looks okay.
Check hand tracking in Nuitrack.exe (the "Try Nuitrack!" button, C:\Program Files\Nuitrack\nuitrack\nuitrack\activation_tool\Nuitrack.exe). Two cursors, blue and red, should appear on the screen if the palms are visible.
Check the UIExample in Unity. Two cursors should also appear (Pointer.cs script).
If neither Nuitrack.exe nor the UIExample works, send us the Unity project and we will test it.
Fortunately, I was able to make it work correctly; I think it didn't work before because I was standing too far to the right and my legs weren't completely visible.
However, I still get very imprecise values and coordinates for the position of the hand, and I cannot calculate the movements shown in the pictures accurately enough.
This is the function I use to get the angle I need for my hand:
and this is where I call it: leftHandAngle = (float)Math.Round(HandAngle(user.Skeleton.RawSkeleton, user.LeftHand, nuitrack.JointType.LeftWrist, nuitrack.JointType.LeftElbow));
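The body of HandAngle isn't shown above; a hypothetical version consistent with that call (not the original code) would compute the angle between the forearm and the wrist-to-palm direction, assuming the hand argument maps to the raw nuitrack.HandContent for that arm:

```csharp
// Hypothetical sketch, not the poster's original code: one way a HandAngle
// function with this signature could work.
public static double HandAngle(nuitrack.Skeleton skeleton, nuitrack.HandContent hand,
                               nuitrack.JointType wristType, nuitrack.JointType elbowType)
{
    nuitrack.Joint wrist = skeleton.GetJoint(wristType);
    nuitrack.Joint elbow = skeleton.GetJoint(elbowType);

    // All positions are real-world coordinates in millimetres
    UnityEngine.Vector3 wristPos = new UnityEngine.Vector3(wrist.Real.X, wrist.Real.Y, wrist.Real.Z);
    UnityEngine.Vector3 elbowPos = new UnityEngine.Vector3(elbow.Real.X, elbow.Real.Y, elbow.Real.Z);
    UnityEngine.Vector3 palmPos  = new UnityEngine.Vector3(hand.XReal, hand.YReal, hand.ZReal);

    // Angle between the forearm (elbow->wrist) and the wrist->palm direction
    return UnityEngine.Vector3.Angle(wristPos - elbowPos, palmPos - wristPos);
}
```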