I’m creating something similar where I need joints in 3D space to line up with the person in the RGB feed in the background, but instead of controlling an avatar we’ll be pinning objects to them.
I’ve made sure I’m orienting the joints the same way you do in your example and followed the rest of the advice above, but I’m really struggling to get them to line up.
I’ve also noticed the head joint doesn’t rotate when I turn my head, which is a problem since my app needs to pin armour and a helmet to the user. Is this the correct behaviour? If so, I assume the best way around it would be to use the skeleton head joint for the helmet’s position and your face-tracking API to get the face angle for its rotation.
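To show what I mean, here’s a rough Python sketch of the idea (position from the skeleton head joint, orientation from face-tracking angles). The yaw/pitch convention and the numbers are just assumptions for illustration, not your API:

```python
import math

def face_angles_to_forward(yaw_deg, pitch_deg):
    """Forward direction vector (x right, y up, z forward) from face
    yaw/pitch in degrees -- a minimal sketch of driving the helmet's
    rotation from face tracking while its position comes from the
    skeleton head joint. The angle convention here is an assumption."""
    yaw, pitch = math.radians(yaw_deg), math.radians(pitch_deg)
    return (math.sin(yaw) * math.cos(pitch),
            math.sin(pitch),
            math.cos(yaw) * math.cos(pitch))

# Position comes from the skeleton head joint (made-up coordinates).
head_pos = (0.0, 1.6, 2.0)            # metres, camera space
# Orientation comes from the face-tracking API (made-up angles).
fwd = face_angles_to_forward(20.0, -5.0)
helmet_pos = head_pos                 # pin the helmet at the joint
# The helmet model would then be oriented to look along `fwd`.
```

In Unity I’d presumably turn `fwd` into a rotation with `Quaternion.LookRotation` and apply it to the helmet transform each frame.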
I’m jumping ahead there, though, because if I can’t get the skeleton joints to line up with the user in the RGB feed, none of that will work anyway.
I’ve run the demo project you posted above with the Unity Chan avatar and get the same problem, even after tweaking the camera position/rotation/FOV to try to get it to line up. Anything I do that places the wrist joints wide enough apart to line up leaves all the other joints out of place.
I feel like the points are never going to match properly because the image is being stretched to fill the screen. Is that the case, or have you had this demo working perfectly? Maybe I need to scale the skeleton’s parent object by the same factor the image is stretched by?
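To sanity-check that intuition, here’s the projection maths in Python. Assuming a simple pinhole model with matching vertical FOVs, stretching the RGB image to a screen with a different aspect ratio should be compensated by scaling the skeleton’s x by `screen_aspect / image_aspect`. The FOV and aspect numbers here are made up for illustration:

```python
import math

def project_x(x, z, v_fov_deg, aspect):
    """Horizontal NDC coordinate of a camera-space point (x right,
    z forward, metres) under a pinhole projection with the given
    vertical FOV and width/height aspect ratio."""
    f = 1.0 / math.tan(math.radians(v_fov_deg) / 2.0)
    return f * (x / z) / aspect

IMG_ASPECT = 16 / 9      # physical RGB camera aspect (assumption)
SCREEN_ASPECT = 4 / 3    # virtual Unity camera / display aspect (assumption)
V_FOV = 42.0             # shared vertical FOV in degrees (assumption)

x, z = 0.3, 1.5          # a joint 0.3 m right of centre, 1.5 m away

# Where the real camera puts the joint in the stretched-to-fill image:
target = project_x(x, z, V_FOV, IMG_ASPECT)
# Where the virtual camera renders the unscaled joint (doesn't match):
naive = project_x(x, z, V_FOV, SCREEN_ASPECT)
# Scaling the joint's x by screen_aspect / image_aspect lines them up:
corrected = project_x(x * SCREEN_ASPECT / IMG_ASPECT, z, V_FOV, SCREEN_ASPECT)
assert abs(corrected - target) < 1e-9
```

So if that reasoning is right, only the horizontal axis needs scaling (by the ratio of the two aspect ratios), not a uniform scale of the whole parent.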
If you have any advice on how to solve this problem, I’d be very grateful.
I’m using a RealSense D415 with this, if that makes a difference.
Thanks for your help