For my project, I am trying to create and upload new models to Unreal, then use Nuitrack's motion capture to give them real-time body tracking, similar to the example avatar in the plugin. I am having quite a bit of difficulty, though, figuring out how the example avatar is set up so that I can copy this process onto a different model/avatar and achieve the same results. Is it possible to use Nuitrack with other Unreal models, or is the example avatar the only one that works?
Hello @Slhelus
You can use it with your own avatar.
Note on point 1 in the second section of that instruction: do not create a child class; instead, make a copy and change the parameters in it for your model.
I have been following the process as you described, but I am running into an issue when updating some of the Blueprint components. When updating the Current Avatar Anim BP, the editor says the Set node is not compatible with my animation blueprint and still looks for AB_indirect_animation_blueprint. I don't think that should be happening, as I have updated all the related variables I could find, namely the "currentAvatarAnimBP" and "AvatarAnimBP" variables in the side panel, and it still won't update. I tried replacing the Set node, but I can't seem to find the one you used. Is there any guidance you could provide on properly changing the copied Blueprint class?
Never mind, I found out how to recreate the variable: you have to drag from the Get variable node and then find it.
Can you list exactly what I need to change for the Blueprint to work? I've switched everything over to my new animation blueprint and skeleton assets, but it still won't track during simulation, while the Nuitrack example does. I'm not sure what I haven't changed yet, so any guidance would be appreciated.
Yes, I believe I completed all the steps correctly. I figured out why it wasn't tracking (I forgot to bind the skin — unrelated to Nuitrack), but now the arms track oddly: they bend forward and inward when they should be out and upright, and I'm not sure what the issue is. I can post an example when I'm back in our lab.
Perhaps the hands are simply falling out of the sensor's field of view. Try adding a BP_FrameViewer to the level, which visualizes the sensor frames.