Hi all~ I am making an avatar game. There are two modes: avatar movement driven by animation, and driven by a RealSense sensor. Because the avatar's dress is large, I don't want the hand to collide with and go inside the dress, so I need to control the hand joint's rotation limit.
In Unity, I can adjust this on the "Muscles & Settings" tab of the Avatar, and it works in animation mode. However, in RealSense sensor mode the hand still goes inside the dress. Is there any way to control the joint rotation limit?
I saw this post about joint angle measurement. Can it be used to find the shoulder joint angle and limit how close the arm can bend toward the body?
Unfortunately, we do not yet have a component for controlling the Humanoid Avatar that takes into account the configured rotation ranges of the limbs, but we plan to add this soon.
It should be understood that on the "Muscles & Settings" tab you are not setting up constraints for the model, but rather a range of limb rotations used for mapping animations; this range does not affect the model itself.
For now, you need to take care of limiting collisions of your model yourself.
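As a starting point for doing this yourself, here is a minimal sketch of the underlying math: measure the angle at the shoulder between the torso and the upper arm from joint positions, and clamp it so the arm can never fold fully into the body (and the dress). The function names, joint positions, and the 25-degree lower bound are all hypothetical illustration values, not part of any SDK.

```python
import numpy as np

def joint_angle(parent, joint, child):
    """Angle in degrees at `joint` between the bones joint->parent and joint->child."""
    a = np.asarray(parent, dtype=float) - np.asarray(joint, dtype=float)
    b = np.asarray(child, dtype=float) - np.asarray(joint, dtype=float)
    cos = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    # Clip guards against tiny floating-point overshoot outside [-1, 1].
    return float(np.degrees(np.arccos(np.clip(cos, -1.0, 1.0))))

def clamp_shoulder(angle, min_deg=25.0, max_deg=170.0):
    """Keep the arm from pressing into the torso by enforcing a minimum angle."""
    return max(min_deg, min(max_deg, angle))

# Hypothetical positions: shoulder at origin, hip below, elbow hanging close to the body.
hip, shoulder, elbow = (0.2, -0.5, 0.0), (0.0, 0.0, 0.0), (0.05, -0.3, 0.0)
raw = joint_angle(hip, shoulder, elbow)   # small angle: arm pressed against the body
safe = clamp_shoulder(raw)                # clamped up to the 25-degree minimum
```

You would then rebuild the upper-arm rotation from the clamped angle before applying it to the avatar; the clamp itself is the part that replaces what "Muscles & Settings" does in animation mode.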
Um, I'm still trying to understand how body movement affects the joint angle, because I'm not sure whether the constraint needs to take several joints' angles into account. Any clues or methods you could suggest?
Moreover, I'm trying to set the avatar to an A-pose. When someone or something occludes the current user, some skeleton joints go missing or get misinterpreted, and the avatar's pose becomes weird and abnormal. In that case I want to reset the avatar to an A-pose as the default pose.
So I'm planning to record the joint rotations while holding an A-pose in front of the camera, then copy and apply them in a script. Is that the correct approach, or is there a better method?
You can calculate the angle between the joints using this method:
To avoid "ridiculous" poses when one user crosses in front of another or exits the frame, you can check the confidence of each joint and take over control of low-confidence joints in code, for example snapping them to an A-pose or to your own animation.
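The confidence-check idea can be sketched as follows. This is not a specific SDK's API: the joint names, rotation tuples, and the 0.5 threshold are assumptions for illustration, and the A-pose values stand in for rotations you recorded yourself in front of the camera.

```python
# Rotations captured once while standing in an A-pose (hypothetical placeholder values).
A_POSE = {
    "left_shoulder": (0.0, 0.0, -35.0),
    "right_shoulder": (0.0, 0.0, 35.0),
}

CONFIDENCE_THRESHOLD = 0.5  # below this, the tracked joint is considered unreliable

def resolve_pose(tracked, a_pose=A_POSE, threshold=CONFIDENCE_THRESHOLD):
    """tracked maps joint name -> (rotation, confidence).

    Joints that are missing or below the confidence threshold fall back
    to the stored A-pose rotation instead of the (possibly garbage) data.
    """
    resolved = {}
    for name, default_rot in a_pose.items():
        rot, conf = tracked.get(name, (default_rot, 0.0))
        resolved[name] = rot if conf >= threshold else default_rot
    return resolved

tracked = {
    "left_shoulder": ((10.0, 0.0, -60.0), 0.9),   # tracked reliably, keep it
    "right_shoulder": ((80.0, 5.0, 120.0), 0.1),  # occluded, data is unreliable
}
pose = resolve_pose(tracked)
# left_shoulder keeps the tracked rotation; right_shoulder snaps back to the A-pose
```

In practice you would also blend toward the fallback over a few frames rather than snapping, so the avatar does not pop when a joint is briefly occluded.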