# Integrating character models with skeletal tracking

Does anyone have any pointers about integrating a character model from the unity asset store with the skeletal tracking system VicoVR provides?

I understand classic Unity character animations are a completely different thing, just wondering if anyone can point me in the right direction.

Thanks,
Joe

Hi Joe,

You can get the rotations of the skeleton joints (relative to T-pose) this way:

```csharp
Vector3 jointRight   = new Vector3( joint.Orient.Matrix[0], joint.Orient.Matrix[3], joint.Orient.Matrix[6] ); // X (Right), may need scaling (1; -1; 1)
Vector3 jointUp      = new Vector3( joint.Orient.Matrix[1], joint.Orient.Matrix[4], joint.Orient.Matrix[7] ); // Y (Up), may need scaling (-1; 1; -1)
Vector3 jointForward = new Vector3( joint.Orient.Matrix[2], joint.Orient.Matrix[5], joint.Orient.Matrix[8] ); // Z (Forward), may need scaling (1; -1; 1)
Quaternion result = Quaternion.LookRotation(jointForward, jointUp);
```
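For anyone puzzling over the index pattern: `joint.Orient.Matrix` is a flat, row-major 3×3 rotation matrix, so indices (1, 4, 7) and (2, 5, 8) pick out its Y (up) and Z (forward) columns. Here is a quick NumPy sketch (not Unity code; `look_rotation` is my own stand-in for `Quaternion.LookRotation`) showing that rebuilding the rotation from just those two columns recovers the whole matrix:

```python
import math
import numpy as np

def look_rotation(forward, up):
    """Stand-in for Unity's Quaternion.LookRotation: builds a 3x3 rotation
    matrix whose Z column is `forward` and whose Y column is closest to `up`."""
    f = np.asarray(forward, dtype=float)
    f = f / np.linalg.norm(f)
    r = np.cross(up, f)             # X = Up x Forward, as Unity computes it
    r = r / np.linalg.norm(r)
    u = np.cross(f, r)              # re-orthogonalized Y
    return np.column_stack([r, u, f])

def rot_x(a): c, s = math.cos(a), math.sin(a); return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])
def rot_y(a): c, s = math.cos(a), math.sin(a); return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])

# A non-trivial sample orientation standing in for real sensor data.
m = rot_y(math.radians(40)) @ rot_x(math.radians(30))
flat = m.flatten()                  # row-major 3x3, like joint.Orient.Matrix

up      = flat[[1, 4, 7]]           # Y (up) column
forward = flat[[2, 5, 8]]           # Z (forward) column
rebuilt = look_rotation(forward, up)

assert np.allclose(rebuilt, m)      # the two columns determine the rotation
```

Since the columns of a rotation matrix are orthonormal, the X column is redundant (it is the cross product of the other two), which is why the commented-out `jointRight` line isn't needed.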

Then, if you update the model's bone rotations from the root down to the children, it will mimic the user's movements.

Best regards,
Yuriy


Hi,

I tried to integrate character models too, with Robot Kyle from Unity, but it seems the coordinate spaces are not the same (a user in T-pose has joint quaternions set to identity, but Robot Kyle in T-pose has other values, so if I move my arm up, Kyle will move his arm to the right, for example). Do you know an easy approach to make the 3D model mimic the user's movements when the coordinate spaces are not the same?

Thanks,

Hi,
you may set the model in T-pose, then at Start save the joints' initial rotations, and when updating set each rotation as (rotation from sensor) * (initial rotation).
Here's a script we use in one of our projects to do this:

```csharp
using UnityEngine;
using System.Collections;
using System.Collections.Generic;

public class ModelAvatar : MonoBehaviour
{
    [SerializeField] Transform headTransform;
    [SerializeField] float lerpFactor = 0.5f;

    nuitrack.JointType[] availableJoints;
    Dictionary<nuitrack.JointType, GameObject> joints;
    Dictionary<nuitrack.JointType, Quaternion> prevOrientations;
    Dictionary<nuitrack.JointType, Quaternion> baseRotationOffsets;

    [SerializeField] GameObject
        basePivot,
        torso,
        hipLeft,
        hipRight,
        kneeLeft,
        kneeRight,
        shoulderLeft,
        shoulderRight,
        elbowLeft,
        elbowRight,
        collarLeft,
        collarRight;

    static Quaternion sensorOffset = Quaternion.Euler(0f, 0f, 0f);
    static Vector3 mirrorScale = new Vector3(-1f, 1f, -1f);

    void Start()
    {
        availableJoints = new nuitrack.JointType[]
        {
            nuitrack.JointType.Torso,

            nuitrack.JointType.LeftCollar,
            nuitrack.JointType.RightCollar,
            nuitrack.JointType.LeftShoulder,
            nuitrack.JointType.RightShoulder,
            nuitrack.JointType.LeftElbow,
            nuitrack.JointType.RightElbow,
            //nuitrack.JointType.LeftWrist,
            //nuitrack.JointType.RightWrist,

            nuitrack.JointType.LeftHip,
            nuitrack.JointType.RightHip,
            nuitrack.JointType.LeftKnee,
            nuitrack.JointType.RightKnee,
            //nuitrack.JointType.LeftAnkle,
            //nuitrack.JointType.RightAnkle
        };

        // Map each tracked joint to the model bone assigned in the inspector.
        joints = new Dictionary<nuitrack.JointType, GameObject>()
        {
            { nuitrack.JointType.Torso,         torso },
            { nuitrack.JointType.LeftCollar,    collarLeft },
            { nuitrack.JointType.RightCollar,   collarRight },
            { nuitrack.JointType.LeftShoulder,  shoulderLeft },
            { nuitrack.JointType.RightShoulder, shoulderRight },
            { nuitrack.JointType.LeftElbow,     elbowLeft },
            { nuitrack.JointType.RightElbow,    elbowRight },
            { nuitrack.JointType.LeftHip,       hipLeft },
            { nuitrack.JointType.RightHip,      hipRight },
            { nuitrack.JointType.LeftKnee,      kneeLeft },
            { nuitrack.JointType.RightKnee,     kneeRight },
        };

        prevOrientations = new Dictionary<nuitrack.JointType, Quaternion>();
        baseRotationOffsets = new Dictionary<nuitrack.JointType, Quaternion>();

        // The model must be in T-pose here: remember each bone's initial
        // rotation so sensor rotations can be applied on top of it later.
        for (int i = 0; i < availableJoints.Length; i++)
        {
            prevOrientations[availableJoints[i]] = Quaternion.identity;
            baseRotationOffsets[availableJoints[i]] = joints[availableJoints[i]].transform.rotation;
        }
    }

    void FixedUpdate()
    {
        JointsUpdate();
    }

    public Vector3 GetJointPosition(nuitrack.JointType joint)
    {
        return joints[joint].transform.position;
    }

    public Transform GetJointTransform(nuitrack.JointType joint)
    {
        return joints[joint].transform;
    }

    void JointsUpdate()
    {
        if (NuitrackManager.CurrentUser != 0)
        {
            // Convert the torso position from millimeters to meters, mirror it,
            // correct for the sensor tilt, and move the avatar's pivot.
            Vector3 torsoPos = 0.001f * (TPoseCalibration.SensorOrientation * Vector3.Scale(NuitrackManager.CurrentSkeleton.GetJoint(nuitrack.JointType.Torso).ToVector3(), mirrorScale));
            Vector3 newTorsoPos = new Vector3(torsoPos.x, basePivot.transform.position.y, torsoPos.z);
            basePivot.transform.position = newTorsoPos;

            for (int i = 0; i < availableJoints.Length; i++)
            {
                nuitrack.Joint joint = NuitrackManager.CurrentSkeleton.GetJoint(availableJoints[i]);
                Quaternion jointOrient = TPoseCalibration.SensorOrientation * (joint.ToQuaternionMirrored() * sensorOffset);

                // Smooth the rotation, then apply it on top of the bone's T-pose rotation.
                prevOrientations[availableJoints[i]] = Quaternion.Slerp(prevOrientations[availableJoints[i]], jointOrient, lerpFactor);
                joints[availableJoints[i]].transform.rotation = prevOrientations[availableJoints[i]] * baseRotationOffsets[availableJoints[i]];
            }
        }
    }
}
```
And the extension methods used:

```csharp
using UnityEngine;
using System.Collections;

public static class NuitrackUtils
{
    public static Vector3 ToVector3(this nuitrack.Joint joint)
    {
        return new Vector3(joint.Real.X, joint.Real.Y, joint.Real.Z);
    }

    public static Quaternion ToQuaternion(this nuitrack.Joint joint)
    {
        //Vector3 jointRight =  new Vector3( joint.Orient.Matrix[0], joint.Orient.Matrix[3], joint.Orient.Matrix[6] );   //X(Right)
        Vector3 jointUp =       new Vector3( joint.Orient.Matrix[1], joint.Orient.Matrix[4], joint.Orient.Matrix[7] );   //Y(Up)
        Vector3 jointForward =  new Vector3( joint.Orient.Matrix[2], joint.Orient.Matrix[5], joint.Orient.Matrix[8] );   //Z(Forward)
        return Quaternion.LookRotation(jointForward, jointUp);
    }

    public static Quaternion ToQuaternionMirrored(this nuitrack.Joint joint)
    {
        //Vector3 jointRight =  new Vector3(  joint.Orient.Matrix[0], -joint.Orient.Matrix[3],  joint.Orient.Matrix[6] );   //X(Right)
        Vector3 jointUp =       new Vector3( -joint.Orient.Matrix[1],  joint.Orient.Matrix[4], -joint.Orient.Matrix[7] );   //Y(Up)
        Vector3 jointForward =  new Vector3(  joint.Orient.Matrix[2], -joint.Orient.Matrix[5],  joint.Orient.Matrix[8] );   //Z(Forward)
        return Quaternion.LookRotation(jointForward, jointUp);
    }
}
```
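As a side note on why `ToQuaternionMirrored` flips exactly those signs: negating those entries is the same as conjugating the original orientation matrix by the `mirrorScale` (-1, 1, -1) used in `JointsUpdate`, i.e. M' = D·M·D with D = diag(-1, 1, -1), which flips the X and Z axes of both the world and the joint frame. A small NumPy check of that identity (not Unity code; `look_rotation` is my own stand-in for `Quaternion.LookRotation`):

```python
import math
import numpy as np

def look_rotation(forward, up):
    # Stand-in for Unity's Quaternion.LookRotation as a 3x3 matrix.
    f = np.asarray(forward, dtype=float); f = f / np.linalg.norm(f)
    r = np.cross(up, f); r = r / np.linalg.norm(r)
    return np.column_stack([r, np.cross(f, r), f])

def rot_x(a): c, s = math.cos(a), math.sin(a); return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])
def rot_y(a): c, s = math.cos(a), math.sin(a); return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])

m = rot_y(math.radians(25)) @ rot_x(math.radians(35))   # sample joint orientation
flat = m.flatten()                                      # row-major, like joint.Orient.Matrix

# Sign flips exactly as in ToQuaternionMirrored:
up_m      = np.array([-flat[1],  flat[4], -flat[7]])
forward_m = np.array([ flat[2], -flat[5],  flat[8]])
mirrored = look_rotation(forward_m, up_m)

# Equivalent formulation: conjugation by D = diag(-1, 1, -1).
D = np.diag([-1.0, 1.0, -1.0])
assert np.allclose(mirrored, D @ m @ D)
```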

Best regards,
Yuriy

Thank you for your answer and code,

May I ask what the TPoseCalibration.SensorOrientation variable represents? How do you calculate it?

It should be in the roomHMD example in the SDK. It's a quaternion that corrects the pitch angle of the sensor. We assume that the user stands vertically during calibration and that the vector from neck to torso is aligned with gravity:

```csharp
Vector3 torso = SkeletonJointToVector3(NuitrackManager.CurrentSkeleton.GetJoint(nuitrack.JointType.Torso));
Vector3 neck = SkeletonJointToVector3(NuitrackManager.CurrentSkeleton.GetJoint(nuitrack.JointType.Neck));
Vector3 diff = neck - torso;

sensorOrientation = Quaternion.Euler(-Mathf.Atan2(diff.z, diff.y) * Mathf.Rad2Deg, 0f, 0f);
```
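The idea in plain math (a Python sketch, not the SDK code; the joint positions here are made-up (x, y, z) tuples in sensor space): if the sensor is pitched, the upright spine vector gains a Z component in sensor coordinates, and `-atan2(diff.z, diff.y)` recovers the pitch angle needed to cancel that tilt (the sign convention depends on the coordinate setup):

```python
import math

def sensor_pitch_deg(neck, torso):
    # neck, torso: hypothetical (x, y, z) joint positions in sensor space
    diff = [n - t for n, t in zip(neck, torso)]
    return -math.degrees(math.atan2(diff[2], diff[1]))

# An upright user seen by a tilted sensor: the spine reads
# (0, cos 20deg, sin 20deg) instead of (0, 1, 0).
a = math.radians(20.0)
pitch = sensor_pitch_deg((0.0, math.cos(a), math.sin(a)), (0.0, 0.0, 0.0))
# pitch == -20.0 degrees; Quaternion.Euler(pitch, 0, 0) then cancels the tilt
```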

Thanks, my robot is starting to mimic the user's movements nicely.

I will optimize a bit and create a new game soon.

Thanks !

Hi all,
I modified my code according to the responses, but the result is bad. For example, when the person is in T-pose, the model is not (the arms are not aligned), and when the person jumps or bends, the model does not reflect the same behaviour. I am using the default character Ethan in Unity with the Nuitrack SDK. Is there any recent work or example on this subject?

EDIT:
I followed the recently added tutorial and it works perfectly.
