Best settings for nuitrack.config for skeleton and face tracking with Intel D415

Hello guys,

I was wondering if someone can help me figure out the best settings for nuitrack.config. I've already read that there is no documentation for this file, and I've been checking different posts about it; every post suggests slightly different things. So what I'd like to do is have this post serve as a nice reference for other users as well.

I am doing skeleton and face tracking with Intel D415.

```json
"Realsense2Module": {
    "Depth": {
        "ProcessMaxDepth": 5000,
        "ProcessWidth": 640,
        "ProcessHeight": 480,
        "FPS": 60,
        "RawWidth": 848,
        "RawHeight": 480,
        "Preset": 5,
        "PostProcessing": {
            "SpatialFilter": {
                "spatial_iter": 0,
                "spatial_alpha": 0.5,
                "spatial_delta": 20
            },
            "DownsampleFactor": 1
        },
        "LaserPower": 1.0
    },
    "FileRecord": "",
    "RGB": {
        "ProcessWidth": 640,
        "ProcessHeight": 480
    }
}
```
  • I am not using the RGB camera, just the depth stream to perform the tracking. Is the RGB section needed there?
    Help will be very much appreciated!
  • "Preset": 5 — can this be changed? Is that the same as in the RealSense Viewer? Does Preset 5 mean HighDensity or MediumDensity?

Also, I am trying to track only one person, and even with ActiveUsers set to 1 it is still tracking other people around. Is there a workaround for this, other than the one in this post? Select a specific skeleton - Unity / Orbbec Astra Pro

```json
"Skeletonization": {
    "MaxDistance": 4000,
    "AutoTracking": true,
    "Type": "RegressionSkeletonization",
    "ActiveUsers": 1,
    "FeedbackThreshold": 0.1
},
```

Hi Dee,

  • You can track skeletons using the depth map only, but for face tracking you have to enable the RGB stream from the sensor.
  • Preset 5 means MediumDensity
  • If you set ActiveUsers to 1, only 1 skeleton will be tracked. However, this value doesn't affect user tracking: Nuitrack always tracks up to 6 users, but this doesn't add any processing overhead.

Thanks Olga for your answer,

One of the problems I am having is that even though ActiveUsers is set to 1, I am still losing tracking of the person in front of the camera when people walk around. Is there a way to avoid that?

```json
"Skeletonization": {
    "MaxDistance": 4000,
    "AutoTracking": true,
    "Type": "RegressionSkeletonization",
    "ActiveUsers": 1,
    "FeedbackThreshold": 0.1
},
```

My next question is about the values present in the config file:
```json
"Depth": {
    "ProcessWidth": 640,   // Needed? Does it have to be the same as Raw?
    "ProcessHeight": 480,  // Needed? Does it have to be the same as Raw?
    "RawWidth": 848,
    "RawHeight": 480,
    ...
    ...
"RGB": {
    "ProcessWidth": 640,   // Does it have to be the same as the values above?
    "ProcessHeight": 480   // Does it have to be the same as the values above?
}
```
And about tracking: what would be the easiest way to enable/disable tracking in code? I am assuming that to control tracking from code I'd first need to set this value to false?

```json
"AutoTracking": false,
```

This can happen if Nuitrack loses tracking of a user segment. Could you please describe your case in more detail — are people walking in front of the tracked user or behind him?

"ProcessWidth" and "ProcessHeight" set the resolution that the raw input stream is resized to (it is upscaled/downsampled and cropped to fit the aspect ratio if needed). Please note that the ProcessWidth:ProcessHeight aspect ratio should be 4:3 (e.g. 960x720).
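As a quick illustration of the 4:3 constraint (the helper function is hypothetical, just to make the rule concrete — it is not part of the Nuitrack API):

```python
from fractions import Fraction

def is_valid_process_size(width: int, height: int) -> bool:
    """Check that a ProcessWidth x ProcessHeight pair has the 4:3
    aspect ratio Nuitrack expects for the processed stream."""
    return Fraction(width, height) == Fraction(4, 3)

print(is_valid_process_size(640, 480))  # True  (4:3)
print(is_valid_process_size(960, 720))  # True  (4:3)
print(is_valid_process_size(848, 480))  # False (16:9 raw mode; gets cropped)
```

So a 848x480 raw depth stream is fine, but the Process values themselves should stay 4:3, like the 640x480 in the config above.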

You can try to use startTracking and stopTracking methods from Nuitrack API to solve this issue. In this case, you have to set AutoTracking to false.

When AutoTracking is set to true, startTracking and stopTracking are performed automatically when a user is detected or lost.
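To make the manual-control flow concrete, here is a minimal Python sketch of the logic, assuming AutoTracking is false. The stub tracker and the user list are stand-ins mirroring the startTracking/stopTracking calls mentioned above — they are not the real Nuitrack bindings:

```python
class StubSkeletonTracker:
    """Minimal stand-in for a skeleton tracker with manual tracking control."""
    def __init__(self):
        self.tracked = set()

    def startTracking(self, user_id):
        self.tracked.add(user_id)

    def stopTracking(self, user_id):
        self.tracked.discard(user_id)

def track_only_nearest(tracker, users):
    """Keep exactly one tracked skeleton: the user closest to the sensor.

    `users` is a list of (user_id, distance_mm) pairs from the current
    frame; every other ID is explicitly untracked.
    """
    if not users:
        # Nobody in the frame: release all tracked IDs.
        for uid in list(tracker.tracked):
            tracker.stopTracking(uid)
        return None
    nearest_id = min(users, key=lambda u: u[1])[0]
    for uid in list(tracker.tracked):
        if uid != nearest_id:
            tracker.stopTracking(uid)
    tracker.startTracking(nearest_id)
    return nearest_id

tracker = StubSkeletonTracker()
print(track_only_nearest(tracker, [(1, 1500), (2, 3000)]))  # 1
```

The same pattern should map onto the real API: call startTracking for the user you want to keep and stopTracking for any other ID reported in the user frame.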

Hello Olga, thanks for the message,

About the tracking:

  • People are walking behind the user. Since this is going to be a demonstration, I am expecting people to walk around, so I've been testing with the main person in front and other people walking around behind the user — never in front and never crossing. Sometimes the tracking picks up other people around and loses tracking of the main person, even though the main person is still directly in front of the camera and within the main FOV. "ActiveUsers" is set to 1. I'll investigate with AutoTracking set to false and try the methods in your message, but in the meantime I was wondering if you know a solution to the current problem, which seems to have nothing to do with AutoTracking being true or false.

Thank you very much,
Dee

Could you please advise what the distance is between the tracked user and the other people walking behind him? If they're too close, incorrect segment tracking is possible, which can lead to "merging" of skeletons.

You can try to restrict the processing range for depth data. To do this:

  1. open %NUITRACK_HOME%/data/nuitrack.config file in a text editor;
  2. add this line

```json
"BoxCutter": {"x": "0", "y": "0", "z": "0", "width": "1000", "height": "4000", "depth": "3000", "alpha": "0", "beta": "-1.5"},
```

as the first line of the "Segmentation" section.

Notes about 2nd step:

  • the depth map will be thresholded at height/2 mm (only depth values in [0…height/2] will be processed);
  • the "beta" parameter is the BoxCutter's rotation angle in radians around the X-axis of the sensor. So the box above will have the following dimensions after rotation: "width" x "depth" x "height/2" (in X, Y, Z order).
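As a quick numeric check of the notes above (the helper names are mine, and the parameter interpretation follows the bullet points, not official documentation): with "height": 4000, only depth values up to 2000 mm survive the cut.

```python
def depth_threshold_mm(box_height_mm: float) -> float:
    """Depth cut-off implied by the BoxCutter 'height' parameter:
    only values in [0, height/2] are processed."""
    return box_height_mm / 2.0

def filter_depth(depth_values_mm, box_height_mm):
    """Zero out depth samples beyond the threshold, the way the
    segmentation would ignore them."""
    cutoff = depth_threshold_mm(box_height_mm)
    return [d if 0 <= d <= cutoff else 0 for d in depth_values_mm]

print(depth_threshold_mm(4000))               # 2000.0 -> processed up to 2 m
print(filter_depth([500, 1800, 2500], 4000))  # [500, 1800, 0]
```

So to keep everyone more than about 2 m away out of the segmentation, the example "height" of 4000 is what does the work.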

Thank you very much Olga, that worked really well for what I needed.


Hello Olga,

Is there any way I can visualize that box in the demo (activation_tool/Nuitrack.exe)? Thank you

The visualization of the box is not implemented in these samples.

Hello Olga.
I have a scenario where Nuitrack gets stuck tracking a skeleton that doesn't exist, and it always happens around the same position. I have the same situation in other places, but in this scenario it seems to be even more frequent.
I've been thinking that if I could change the processing range for the depth data, maybe I could work around the situation.
I saw this "BoxCutter" option but, as far as I understand it, it adds a threshold behind the region of interest, and what I need is to be able to tell Nuitrack not to track some part of the image — or at least the opposite: place the threshold before the region of interest.

The red circle is where Nuitrack gets stuck when the body that was being tracked leaves the area.
The green rectangle is my ROI.

I would like to somehow tell Nuitrack not to track this black area.


Or to apply this "BoxCutter" so that the black rectangle is not processed.


Thank you,

Regards

Hi Victor,

You can try to remove the background, which can help to solve your issue (please use the latest version of Nuitrack):

  1. Find the "Segmentation.Background" section in the nuitrack.config file and replace the line "BackgroundMode": "dynamic" with these two lines: "BackgroundMode": "static_first_frame", "CalibrationFramesNumber": 20
    This turns on a static background model.
  2. Wait for CalibrationFramesNumber frames, and only then enter the scene. If you set 20 frames as mentioned above, the scene should be static for those 20 frames (no users, only background). This is very important: if a user appears in the frame immediately, during calibration, the background won't be calculated and you won't see any effect. You have to wait for some time after running the program to let Nuitrack calculate the background.

You can also set these options using Nuitrack API:

```cpp
nuitrack::setConfigValue("Segmentation.Background.BackgroundMode", "static_first_frame");
nuitrack::setConfigValue("Segmentation.Background.CalibrationFramesNumber", "20");
```

Hi Olga,

I applied these changes to the nuitrack config file to remove the background and, at least as of yesterday, we haven't seen the problem again, so it helped a lot. Thank you.
Is there another "secret" section that can be modified when the sensor is always in the same static position? In a setup like the one I showed you, the sensor is placed at a height of 2.8 meters.

Thanks in advance.
Regards.

Actually, there are no other "secret" sections for this case.