We are glad to announce the new version of Nuitrack SDK.
Apart from skeleton data, you can now detect, track and analyse the faces of detected users using RGB input! Data such as gender, age and emotions is now available via the API.
The new version of Nuitrack 0.23.3 is released with the following improvements:
Added the first implementation of the Instance-based API (Beta) - face data is available as a JSON string (Ubuntu amd64 and Windows x86/x86_64)
RealSense2 library updated to version 2.15.0
Fixed an error that occurred when depth-to-color registration was enabled for the RealSense D435
Updated drivers for Asus Xtion 2
Please note that the new Nuitrack SDK (1.3.5) works only with the updated Nuitrack runtime component (0.23.3). You can download it from our website: https://nuitrack.com/
Yes, but face tracking for Android in this case will require serious optimization work, so we are not yet ready to give exact dates for implementation.
So far this is Beta and works only with Ubuntu amd64 and Windows x86/x86_64.
For TVico (OS Android), we plan to release support for face tracking a little later.
If someone gets close to the sensor and Nuitrack loses the skeleton information, is the RGB camera’s face tracking function still available?
Is face tracking dependent on skeleton information, or can it work without it?
The face is linked to the skeleton (more precisely, to the head joint). Thus, if Nuitrack does not detect the skeleton, face tracking will not work. However, with partial occlusion of the skeleton, face tracking should still work.
Yes, you can use face tracking in Unity with Nuitrack.GetInstancesJson; instructions can be found here: http://download.3divi.com/Nuitrack/doc/Instance_based_API.html
Please note that this is still Beta and is supported only on Ubuntu amd64 and Windows x86/x86_64.
I think we will do a tutorial on this topic soon. Thank you for your interest!
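The string returned by Nuitrack.GetInstancesJson can be parsed with any standard JSON library. As a minimal sketch (in Python here purely for illustration; the top-level fields Timestamp, Instances, id and class follow the sample payload posted later in this thread):

```python
import json

# Sample payload in the shape returned by Nuitrack.GetInstancesJson
# (taken from the example posted in this thread).
raw = '{"Timestamp": "1542635785173600", "Instances": [{"id": "1", "class": "human"}]}'

data = json.loads(raw)
print("Timestamp:", data["Timestamp"])
for inst in data["Instances"]:
    # Each entry describes one tracked instance; "class" is "human"
    # for tracked users.
    print("Instance", inst["id"], "is a", inst["class"])
```

In Unity the same walk-through would be done in C# with a JSON parser of your choice; the structure of the document is identical.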
If I choose the D415 or the D435, which one has better skeleton tracking stability?
Especially when both hands are near the center of someone’s body, which one is better?
I’ve followed the instructions on: http://download.3divi.com/Nuitrack/doc/Instance_based_API.html
I tried it in Unity on Windows with a RealSense D415,
but the JSON I’m getting contains only the following: { "Timestamp": "1542635785173600", "Instances": [ { "id": "1", "class": "human" } ] }
Is there something specific I need to define in Unity to get back a “Face” object?
Yes, I know I could use Boost to parse the JSON, but since you already have a working example, it would be very helpful to see your sample code (just trying to avoid duplicate work), at least the portion that walks through the resulting data structures.
OK, so I went ahead and figured this out. You were right, it’s not too hard. But, for anyone interested, here is a basic snippet that accesses some of the face info.
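For anyone following along, here is a sketch of that kind of walk-through (in Python for brevity; the nested face fields used below - face, gender, age, emotions - are assumptions based on the Instance-based API documentation, so check the actual output of GetInstancesJson for your Nuitrack version):

```python
import json

# Hypothetical payload in the shape described by the Instance-based API
# docs; exact field names may differ between Nuitrack versions.
raw = '''{
  "Timestamp": "1542635785173600",
  "Instances": [{
    "id": "1",
    "class": "human",
    "face": {
      "gender": "male",
      "age": {"type": "adult", "years": "32.5"},
      "emotions": {"neutral": "0.8", "happy": "0.15",
                   "angry": "0.03", "surprise": "0.02"}
    }
  }]
}'''

for inst in json.loads(raw)["Instances"]:
    face = inst.get("face")
    if face is None:
        # No face data for this instance (e.g. the head is occluded,
        # or the skeleton was not detected).
        continue
    print("Instance", inst["id"],
          "gender:", face["gender"],
          "age:", face["age"]["years"])
    # Pick the dominant emotion by its confidence value.
    dominant = max(face["emotions"].items(), key=lambda kv: float(kv[1]))
    print("Dominant emotion:", dominant[0])
```

Note the guard for a missing "face" object: as discussed above, the face is tied to the head joint, so an instance can appear in the JSON without face data.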