How to recognize gestures, and lighting environment requirements


#1

I am evaluating hand tracking in various lighting environments.
I use a D435 and Nuitrack SDK 1.4.0.
I have two questions:

  1. How can I recognize user gestures and draw the palm and fingers on the display? I evaluated this using nuitrack_gl_sample, but it does not show which gesture was detected, and I could not find palm and finger data.

  2. Are there any requirements for the lighting environment? I must hide the D435 lens from users with NIR filters, which degrades the RGB camera image. Does this affect tracking?

Best regards.


#2

Hi tanaka.shigeyasu,

  1. Nuitrack detects only the gestures from the list below:

    • GESTURE_WAVING,
    • GESTURE_SWIPE_LEFT,
    • GESTURE_SWIPE_RIGHT,
    • GESTURE_SWIPE_UP,
    • GESTURE_SWIPE_DOWN,
    • GESTURE_PUSH

Fingers are not recognized. You can detect gestures using nuitrack_gl_sample; detected gestures are written to the console (they are not visualized).

  2. The D435 builds its depth map from a stereo pair of imagers (the left and right infrared cameras), so a filter placed over the lenses can degrade the depth map and, in turn, tracking quality. You can compare the depth quality with and without the filter using the RealSense Viewer from the RealSense SDK.

#3

Thank you, olga.

  1. Are there any conditions for gesture recognition? I tried the gestures above using nuitrack_gl_sample, but it could detect only swipe right and swipe down, and even those failed often. Does detection require conditions such as distance from the camera, hand shape, or velocity?

  2. I checked the quality of the depth image using the Viewer; there is a small difference between the images. For better accuracy, should I use the IR emitter in the D400 cameras or an external projector?


#4
  1. The recommended distance between a user and a RealSense sensor is ~1–1.5 meters. For best results, we also recommend not tilting the sensor. The only recommendation for gesture detection is to make broad gestures (for example, a swipe right should cover about twice the distance between your shoulders), with your palm at shoulder level.

  2. If the difference is small, skeleton tracking should be fine. You can hide the lens and check the skeleton tracking quality using our native samples (such as nuitrack_sample). We also recommend checking out the following article from Intel (see sections 9 and 13 for your case).