I am evaluating hand tracking in various lighting environments.
I use D435 and Nuitrack SDK 1.4.0.
I have two questions below:
-
How can I recognize user gestures and draw the palm and fingers on the display? I evaluated this with nuitrack_gl_sample, but it does not show which gesture was detected, and I could not find any palm or finger data.
-
Are there any requirements for the lighting environment? I have to hide the D435 lenses from users with NIR filters, and this degrades the RGB camera image. Does this affect tracking?
Best regards.
Hi tanaka.shigeyasu,
-
Nuitrack detects only the gestures from the list below:
- GESTURE_WAVING,
- GESTURE_SWIPE_LEFT,
- GESTURE_SWIPE_RIGHT,
- GESTURE_SWIPE_UP,
- GESTURE_SWIPE_DOWN,
- GESTURE_PUSH
Fingers are not recognized. You can detect gestures using nuitrack_gl_sample; detected gestures are written to the console (they're not visualized).
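If you want to react to gestures in your own code rather than watch the sample's console, here is a minimal C++ sketch (not part of nuitrack_gl_sample; error handling omitted for brevity) that subscribes to the GestureRecognizer module and prints every detected gesture:

```cpp
#include <nuitrack/Nuitrack.h>
#include <iostream>

using namespace tdv::nuitrack;

// Map the GestureType values listed above to readable names.
const char* gestureName(GestureType type)
{
    switch (type)
    {
        case GESTURE_WAVING:      return "WAVING";
        case GESTURE_SWIPE_LEFT:  return "SWIPE_LEFT";
        case GESTURE_SWIPE_RIGHT: return "SWIPE_RIGHT";
        case GESTURE_SWIPE_UP:    return "SWIPE_UP";
        case GESTURE_SWIPE_DOWN:  return "SWIPE_DOWN";
        case GESTURE_PUSH:        return "PUSH";
        default:                  return "UNKNOWN";
    }
}

// Called whenever the recognizer reports newly completed gestures.
void onNewGestures(GestureData::Ptr data)
{
    for (const Gesture& gesture : data->getGestures())
        std::cout << "User " << gesture.userId << ": "
                  << gestureName(gesture.type) << std::endl;
}

int main()
{
    Nuitrack::init();  // uses the default nuitrack.config

    GestureRecognizer::Ptr recognizer = GestureRecognizer::create();
    recognizer->connectOnNewGestures(onNewGestures);

    Nuitrack::run();
    for (int i = 0; i < 1000; ++i)        // fixed-length demo loop
        Nuitrack::waitUpdate(recognizer); // blocks until new recognizer data

    Nuitrack::release();
    return 0;
}
```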
- As far as we know, the D435 creates its depth map using a stereo pair of imagers, so anything that degrades the image those sensors receive (such as a filter) can also degrade the depth map and, in turn, the tracking quality. You can check the depth quality using the RealSense Viewer from the RealSense SDK (without / with the filter).
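If you would rather compare the two conditions programmatically than by eye, here is a rough librealsense2 sketch (my own illustration, not an official metric) that prints the depth "fill rate", i.e. the percentage of pixels with a valid depth value; run it with and without the filter and compare the numbers:

```cpp
#include <librealsense2/rs.hpp>
#include <iostream>

int main()
{
    rs2::pipeline pipe;
    pipe.start();  // default configuration enables the depth stream

    // Skip a few frames so auto-exposure can settle.
    for (int i = 0; i < 30; ++i)
        pipe.wait_for_frames();

    rs2::frameset frames = pipe.wait_for_frames();
    rs2::depth_frame depth = frames.get_depth_frame();

    const int w = depth.get_width();
    const int h = depth.get_height();
    long valid = 0;
    for (int y = 0; y < h; ++y)
        for (int x = 0; x < w; ++x)
            if (depth.get_distance(x, y) > 0.0f)  // zero means "no depth here"
                ++valid;

    std::cout << "Depth fill rate: "
              << 100.0 * valid / (static_cast<double>(w) * h) << " %"
              << std::endl;
    return 0;
}
```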