My tests on Unity and questions

Hi!

Congratulations on the work you achieved on Vico.
Latency is not bad (but Bluetooth is unstable).

I tested your SDK in Unity to see what the possibilities are from a developer's point of view.
I ran some tests:
• ROOM HMD demo: I made my own version of Room HMD; I wanted to test the skeleton data.
• “WALL DODGE” demo: I made a quick game where the player has to dodge moving walls (don't pay attention to the graphics, I only spent 30 minutes on it as a test).
• “LASER MASTER” demo: I made a fireball shooting game to work with the hands (click, pressure…).
• In Wall Dodge, I also built menu navigation with your basic gesture implementation.
• Position tracking test: I wanted to test positional tracking inside a room (2 x 2 m), but it is still very unstable.

I have some questions about the SDK:

• Joints.real: what do the Real coordinates of joints represent? They look like millimeters to me.

• Skeleton tree: I need a tree to represent the links between joints, but I did not find one in the SDK, so I built my own. Do you have additional code for that? I think developers will need it for many purposes, so it may be annoying for them to code a tree themselves and manually add all the joints.

• Hands' coordinates: we have coordinates for the hands and also for the hand joints. Also, the “Real” variable for hands is empty. Do you have any advice about this?

• “Push” gesture: I didn't understand what the “push” gesture is.

• “Issues” module: I don't know how issues work; apparently it is not the same type of module (I tried createModule but the app freezes).

• When the body is not facing the camera: do you have any solution for people not facing the camera? (When the body is seen from the side, the skeleton is twisted.)

Thank you for your time.

Hi!

Yes, Real coordinates are measured in millimeters, and they are in the sensor coordinate system: the sensor position is the origin, X points to the right from the sensor's point of view, Y points up, and Z is the depth.
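In a Unity script you would typically just scale them down to meters; something like this sketch (ToUnity is my own helper name, and depending on how your sensor and scene are set up you may also need to mirror an axis):

// Sketch: convert a joint position from Nuitrack sensor space (millimeters)
// to Unity units (meters). Lives inside a MonoBehaviour (UnityEngine).
Vector3 ToUnity(nuitrack.Joint joint)
{
   // Real is millimeters relative to the sensor, so scale by 0.001 to get meters.
   return new Vector3(joint.Real.X, joint.Real.Y, joint.Real.Z) * 0.001f;
}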

Unfortunately, we don't have any documentation about it yet. You can see how the bones are connected in the RoomHMD sample. We will release general documentation about the Nuitrack modules soon.
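If it helps while you wait, one way to keep the connections in a single place is a simple parent map. This is just a sketch, not an official API; I'm writing the JointType names from memory, so check them against your SDK version and against the RoomHMD sample:

using System.Collections.Generic;

// Sketch: parent joint for each tracked joint (verify the JointType names in your SDK).
static readonly Dictionary<nuitrack.JointType, nuitrack.JointType> Parent =
   new Dictionary<nuitrack.JointType, nuitrack.JointType>
{
   { nuitrack.JointType.Head,          nuitrack.JointType.Neck },
   { nuitrack.JointType.Neck,          nuitrack.JointType.Torso },
   { nuitrack.JointType.LeftCollar,    nuitrack.JointType.Torso },
   { nuitrack.JointType.RightCollar,   nuitrack.JointType.Torso },
   { nuitrack.JointType.LeftShoulder,  nuitrack.JointType.LeftCollar },
   { nuitrack.JointType.LeftElbow,     nuitrack.JointType.LeftShoulder },
   { nuitrack.JointType.LeftWrist,     nuitrack.JointType.LeftElbow },
   { nuitrack.JointType.RightShoulder, nuitrack.JointType.RightCollar },
   { nuitrack.JointType.RightElbow,    nuitrack.JointType.RightShoulder },
   { nuitrack.JointType.RightWrist,    nuitrack.JointType.RightElbow },
   { nuitrack.JointType.Torso,         nuitrack.JointType.Waist },
   { nuitrack.JointType.LeftHip,       nuitrack.JointType.Waist },
   { nuitrack.JointType.LeftKnee,      nuitrack.JointType.LeftHip },
   { nuitrack.JointType.LeftAnkle,     nuitrack.JointType.LeftKnee },
   { nuitrack.JointType.RightHip,      nuitrack.JointType.Waist },
   { nuitrack.JointType.RightKnee,     nuitrack.JointType.RightHip },
   { nuitrack.JointType.RightAnkle,    nuitrack.JointType.RightKnee },
};

Drawing a bone is then just a matter of connecting each joint's position to its parent's position.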

Do you mean the Hand joint from HandTracker or from SkeletonTracker? In skeletal tracking the Hand joints are not tracked well; I recommend using the Wrist joint instead.
In HandTracker, the hand position is measured in screen units ranging from 0.0 to 1.0, and it takes the special value -1 when the hand is outside the virtual plane borders.
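Roughly like this in a hand tracker update handler (a sketch; I'm writing the UserHands / HandContent names from memory, so double-check the exact fields in your SDK version):

// Sketch: read normalized hand coordinates from the hand tracker data.
void OnHandsUpdate(nuitrack.HandTrackerData handTrackerData)
{
   foreach (nuitrack.UserHands userHands in handTrackerData.UsersHands)
   {
      nuitrack.HandContent? rightHand = userHands.RightHand;
      if (rightHand == null)
         continue;

      // X and Y are screen units in [0.0, 1.0]; -1 means the hand is outside the virtual plane.
      if (rightHand.Value.X < 0 || rightHand.Value.Y < 0)
         continue;

      // Use X/Y as a 2D cursor; Click and Pressure can drive menu interaction.
      float cursorX = rightHand.Value.X;
      float cursorY = rightHand.Value.Y;
   }
}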

It's a gesture where you push with your hand towards the sensor (like pressing a big button in front of you while looking at the sensor). For menu controls in VR it's preferable to use the hand tracker; the gesture API was developed mostly for TV control.
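If you still want to experiment with it, listening for gestures looks roughly like this (a sketch only; I'm writing the GestureRecognizer and GestureType names from memory, so verify them against the C# wrapper in your SDK version):

// Sketch: react to the push gesture through the gesture recognizer (assumed API names).
nuitrack.GestureRecognizer gestureRecognizer;

void Start()
{
   gestureRecognizer = nuitrack.GestureRecognizer.Create();
   gestureRecognizer.OnNewGesturesEvent += OnNewGestures;
}

void OnNewGestures(nuitrack.GestureData gestureData)
{
   foreach (nuitrack.Gesture gesture in gestureData.Gestures)
   {
      if (gesture.Type == nuitrack.GestureType.GestureTypePush)
      {
         // gesture.UserID tells you which user pushed towards the sensor.
      }
   }
}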

Issues do not work as a separate module; they are delivered as an event from the Nuitrack class: nuitrack.Nuitrack.onIssueUpdateEvent += OnIssuesUpdate;
Quote from the manual:

Currently there are two possible issues: OcclusionIssue and FrameBorderIssue.
Check whether the user is occluded (OcclusionIssue):
if (issuesData.GetUserIssue<OcclusionIssue>(userId) != null)
{
   //user is occluded
}

Check whether the user touches the FOV borders (FrameBorderIssue):
FrameBorderIssue borderIssue = issuesData.GetUserIssue<FrameBorderIssue>(userId);
if (borderIssue != null)
{
   if (borderIssue.Left)
   {
     //user touches left border
   }
   if (borderIssue.Right)
   {
     //user touches right border
   }
   if (borderIssue.Top)
   {
     //user touches top border
   }
}

You may check issues only for the users that matter to your application, or not check them at all. The main purpose of Issues is to let you give users feedback in cases where they may not be tracked correctly.
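Putting it together, the handler is just a method that receives the issues data (a sketch; IssuesData, OcclusionIssue and FrameBorderIssue come from the Nuitrack C# wrapper, so add the corresponding using directive for your SDK version):

// Sketch: minimal subscription and handler for issue updates.
void OnEnable()
{
   nuitrack.Nuitrack.onIssueUpdateEvent += OnIssuesUpdate;
}

void OnDisable()
{
   nuitrack.Nuitrack.onIssueUpdateEvent -= OnIssuesUpdate;
}

void OnIssuesUpdate(IssuesData issuesData)
{
   int userId = 1; // only check the user(s) your application cares about
   if (issuesData.GetUserIssue<OcclusionIssue>(userId) != null)
   {
      // warn the player that tracking may be unreliable
   }
}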

We do not have a solution for that use case, at least not yet. I can only recommend putting important game objects in front of the user and not placing anything behind or to the sides, so that they never need to rotate their whole body more than about ±45 degrees away from the sensor.
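One thing you can do on the application side (just an illustration, not something the SDK provides) is estimate how far the user has turned using the two shoulder joints from the skeleton, and warn them or pause when the angle goes past roughly 45 degrees. Inside a Unity script it could look like this:

// Sketch: rough body yaw from the two shoulder joints (Real coordinates, millimeters).
// 0 degrees = shoulders parallel to the sensor (facing it), 90 degrees = fully sideways.
float EstimateBodyYaw(nuitrack.Joint leftShoulder, nuitrack.Joint rightShoulder)
{
   float dx = rightShoulder.Real.X - leftShoulder.Real.X;
   float dz = rightShoulder.Real.Z - leftShoulder.Real.Z;
   return Mathf.Atan2(Mathf.Abs(dz), Mathf.Abs(dx)) * Mathf.Rad2Deg;
}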