Face Tracking added to Nuitrack SDK!

We are glad to announce the new version of Nuitrack SDK.
Apart from skeleton data, you can now detect, track, and analyse the faces of detected users using RGB input! Data such as gender, age, and emotions are now available via the API.

Nuitrack 0.23.3 has been released with the following improvements:

  • Added first implementation of Instance-based API (Beta) - face data is available via JSON string (Ubuntu amd64 and Windows x86/x86_64)
  • RealSense2 library updated to version 2.15.0
  • Fixed an error that occurred when depth-to-color registration was enabled for RealSense D435
  • Updated drivers for Asus Xtion 2

Please note that the new Nuitrack SDK (1.3.5) works only with the updated Nuitrack runtime component (0.23.3). You can download it from our website: https://nuitrack.com/



Looks cool! Is Android support on your roadmap?

Dear Ashih,

Yes, but face tracking on Android will require serious optimization work, so we are not yet ready to give exact dates for implementation.

Is the face tracking function compatible with all supported 3D sensors? Which sensor is used in this demo video?

Dear NK57,

Yes, it should work with all supported sensors (Ubuntu amd64 and Windows x86/x86_64).
In the example, we used an Astra Pro.

Does it work with TVico? If yes, how do I install it?

So far this is a Beta and works only on Ubuntu amd64 and Windows x86/x86_64.
For TVico (Android OS), we plan to release face tracking support a little later.

If someone gets close to the sensor, Nuitrack loses the skeleton information.
In that case, is the RGB camera's face tracking function still available?
Does face tracking depend on skeleton information, or can it work without it?

Is Face Tracking available in Unity too?
Are we getting tutorials on how to use it? :heart_eyes:


Dear NK57,

The face is linked to the skeleton (specifically, to the head joint). Thus, if Nuitrack does not find the skeleton, face tracking will not work. However, with partial occlusion of the skeleton, face tracking should still work.

Dear Guilherme,

Yes, you can use face tracking in Unity with Nuitrack.GetInstancesJson; instructions can be found here: http://download.3divi.com/Nuitrack/doc/Instance_based_API.html
Please note that this is still Beta and is supported only on Ubuntu amd64 and Windows x86/x86_64.

I think we will do a tutorial on this topic soon. Thank you for your interest!


Thank you, a.potopahin9h.

If I have to choose between the D415 and the D435, which one has better skeleton tracking stability?
Especially when both hands are near the center of someone's body, which one is better?

Dear NK57,

Based on our user experience, we tend to use the D415, but to choose between these devices you will need to test both for your use case.

I’ve followed the instructions on: http://download.3divi.com/Nuitrack/doc/Instance_based_API.html
I tried it in Unity on Windows with a RealSense D415,
but the JSON I'm getting contains only the following objects:
{ "Timestamp": "1542635785173600", "Instances": [ { "id": "1", "class": "human" } ] }

Is there something specific I need to define in Unity to get back a “Face” object?

Thanks!

Oh sorry, sorted it now. I forgot to
"open the \data\nuitrack.config file and set Faces.ToUse and DepthProvider.Depth2ColorRegistration to true."

Works well now!
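For reference, those two settings live in the nuitrack.config JSON file. The exact surrounding structure may differ between Nuitrack versions, so treat this as a sketch of the relevant entries rather than a complete config file:

```json
{
  "Faces": {
    "ToUse": true
  },
  "DepthProvider": {
    "Depth2ColorRegistration": true
  }
}
```

With both options enabled, getInstancesJson should start returning the face subtree alongside the id and class fields.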


It looks like you have a nice demo that shows how to access the JSON fields and mark up the image. Is that available as a C++ sample?

Dear Dave,

You can use any JSON parsing library that is convenient for you, for example, boost.

Yes, I know I could use boost to parse the JSON, but since you already have a working example, it would be very helpful to see your sample code (just trying to avoid duplicate work)… at least the portion that walks through the resulting data structures.

OK, so I went ahead and figured this out. You were right, it’s not too hard. But, for anyone interested, here is a basic snippet that accesses some of the face info.

I’ll add this to NuiTrack ROS Node at:
https://github.com/shinselrobots/nuitrack_body_tracker

#include <boost/property_tree/ptree.hpp>
#include <boost/property_tree/json_parser.hpp>
#include <boost/foreach.hpp>

…

void onSkeletonUpdate(SkeletonData::Ptr userSkeletons)
{

  std::string face_info = tdv::nuitrack::Nuitrack::getInstancesJson();
  //std::cout << face_info;  // uncomment to print the entire JSON object
  // Good examples at: http://zenol.fr/blog/boost-property-tree/en.html

  try
  {
    std::stringstream ss;
    ss << face_info;
    boost::property_tree::ptree root;
    boost::property_tree::read_json(ss, root);

    // Find all instances of objects (usually people)
    for(boost::property_tree::ptree::value_type &instance : root.get_child("Instances"))
    {

      for (boost::property_tree::ptree::value_type &found_object : instance.second)
      {

        boost::property_tree::ptree face = found_object.second; // see if there is a subtree
        
        if(face.empty()) 
        {
          // This is key:value data
          std::string name = found_object.first;
          std::string val = found_object.second.data();
          std::cout << "FOUND: " << name << " : " << val << std::endl;

          if( "id" == found_object.first)
          {
            std::cout << "FIELD: id = " << name << " : " << val << std::endl;

          }
          else if( "class" == found_object.first)
          {
            std::cout << "FIELD: class = " << name << " : " << val << std::endl;
          }

        }
        else
        {
          // this is a face subtree
          std::cout << "SUBTREE FOUND" << std::endl;
          std::string name = found_object.first;
          std::cout << "SUBTREE NAME: " << name  << std::endl;
          if( "face" == found_object.first)
          {
            std::cout << "SUBTREE: face FOUND " << std::endl;

            for(boost::property_tree::ptree::value_type &rectangle : face.get_child("rectangle"))
            {
              // rectangle is a set of key:value pairs
              std::string name = rectangle.first;
              std::string val = rectangle.second.data();
              std::cout << "FACE RECTANGLE: " << name << " : " << val << std::endl;
            }
            for(boost::property_tree::ptree::value_type &angles : face.get_child("angles"))
            {
              // angles is a set of key:value pairs
              std::string name = angles.first;
              std::string val = angles.second.data();
              std::cout << "FACE ANGLES: " << name << " : " << val << std::endl;
            }
            for(boost::property_tree::ptree::value_type &age : face.get_child("age"))
            {
              // age is a set of key:value pairs
              std::string name = age.first;
              std::string val = age.second.data();
              std::cout << "FACE AGE: " << name << " : " << val << std::endl;
            }
            std::string gender = face.get<std::string>("gender");
            std::cout << "GENDER: " << gender << std::endl;

          }
        }
      }
    }
  }
  catch (std::exception const& e)
  {
    std::cerr << e.what() << std::endl;
  }
}