Hello everyone!
I don’t have much experience with Nuitrack yet. I want to expand the Fitting Room Demo and develop an app (for university) based on it. Here’s how it currently looks for me:
However, I would like the clothing not just to float in front of the body but to actually adapt to the body, as shown at the end of the following video.
I have already tried several approaches, but unfortunately, I can’t get it to work. Can someone help me figure out what I need to adjust?
I am using an Intel RealSense D435i and working in Unreal Engine 5.2 (I can’t run it in 5.4 and also ran into some issues in 5.3).
Update: I managed to get it running in UE 5.3 and 5.4. However, it still doesn’t look the way it’s supposed to. I’d really appreciate it if someone could help me out. If any further information is needed, I’ll provide it as quickly as possible upon request. Thank you in advance!
Hello @30_Samu_03
Thank you for your interest in the plugin. We are preparing improvements for this case and will try to update the plugin as soon as possible.
Thank you for your reply @Stepan.Reuk .
Is there an estimate of when the plugin will receive this update? I want to use Nuitrack for a project that needs to be completed by January 20th.
Is there already a way to make the demo look like it does in the video (maybe Unreal Engine 4 or Unity)?
Is there a tutorial for aligning a mixed reality camera view in Unreal Engine 5? We’re launching a project for a client, with installed testing beginning on Jan 8th, and we’re hoping to use Nuitrack for it.
We’re comfortable doing custom development work if a description of the necessary steps is available; perhaps this could even be contributed back to the plugin. Thanks and happy holidays!
Are you able to provide a quick bullet point list of steps to align a mixed reality camera, in lieu of a full tutorial?
What material is necessary to have the arm occlusion over the shirt as shown in the blueprint video above?
I notice that the hips don’t seem to rotate much, causing stretching of the garment. Is this expected and/or part of the pending plugin updates? (any potential solutions would be greatly appreciated!)
This scenario hasn’t been fully worked out yet, and it currently involves quite a few manual steps. The general recommendations are as follows (a rough code sketch follows the list):
- You need to align the real sensor, the virtual camera, and the model. The camera in the level must be at the same coordinates as the model (for example, (0,0,0)) and have the same FOV as the sensor.
- The model should use not only the joint rotations but also the joint positions (in the Animation Blueprint). That way the model will stretch to match the tracked body.
- For 3D, it’s currently better to use the classic tracking mode (not the AI mode).
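For illustration only, here is a rough C++ sketch of the camera alignment and the joint-position idea. It is not plugin code: the FDemoJoint struct and the 69-degree FOV value are placeholders/assumptions, and a UPoseableMeshComponent is used instead of the demo’s Animation Blueprint purely to show applying positions as well as rotations.

```cpp
#include "CoreMinimal.h"
#include "Camera/CameraComponent.h"
#include "Components/PoseableMeshComponent.h"
#include "GameFramework/Actor.h"

// Hypothetical joint record: replace with whatever your skeleton data source provides.
// Positions are assumed to be already converted to UE units (cm) and UE axes;
// Nuitrack reports millimetres, so a conversion step is needed somewhere.
struct FDemoJoint
{
    FName   BoneName;
    FVector WorldPosition;
    FQuat   WorldRotation;
};

// Camera and model share the same origin, and the virtual camera FOV matches the
// physical sensor. The 69-degree value is only an example (roughly the D435i RGB
// horizontal FOV); use whatever your sensor/stream actually reports.
void AlignCameraWithSensor(AActor* CameraActor, UCameraComponent* Camera, AActor* AvatarActor)
{
    CameraActor->SetActorLocation(FVector::ZeroVector);
    AvatarActor->SetActorLocation(FVector::ZeroVector);
    Camera->SetFieldOfView(69.0f);
}

// Apply joint positions as well as rotations so the mesh stretches to the tracked
// body. A UPoseableMeshComponent is used here for illustration only; the demo
// drives the bones from an Animation Blueprint instead.
void ApplyJoints(UPoseableMeshComponent* Mesh, const TArray<FDemoJoint>& Joints)
{
    for (const FDemoJoint& Joint : Joints)
    {
        Mesh->SetBoneTransformByName(
            Joint.BoneName,
            FTransform(Joint.WorldRotation, Joint.WorldPosition),
            EBoneSpaces::WorldSpace);
    }
}
```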
Regarding the material for arm occlusion: a standard one-sided material. Just make sure that the inside of the model has no inward-facing polygons and that you can look through the model (through a sleeve or another opening in the clothes).
As for the hip rotation issue, I don’t quite understand; could you send me a video?
It seems that the BP_AvatarMannequinBlueprint updates both position and rotation via Update Joints Transform, is this correct?
Thank you for getting back to me on all those points. Knowing that the camera and the model should both be positioned at (0,0,0) and that the FOV should match is very helpful. From our testing we agree that classic tracking is better than AI for now.
For #3, I was able to fix this issue by replacing the skeletal mesh of the clothes directly within the Mannequin of BP_AvatarMannequinBlueprint. Previously I had replaced the mesh in BP_T-ShirtBlueprint.
I am trying to integrate cursors (hand tracking) into UE5 for my program. The cursors work the way I envisioned, but I can’t manage to simulate a click so that the cursor can interact with widgets. The click gesture is already detected from my hand, but it doesn’t interact with the widgets. Is there a simple approach to this that I might be missing?
I’ve also tried using hand tracking to move my actual cursor (and it even moves outside of Unreal Engine), but I still can’t simulate a click (e.g., a simple left-click). I would be very grateful if someone could help me with this!
Hello @30_Samu_03, have you checked the information on this page? It provides general guidelines on implementing NUI / gesture control through hand tracking.
Another, more complete tutorial is here (it is for Unity, but the same principles apply to UE).
Please let us know if there are uncovered issues / topics in these materials, we’ll try to help.
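For UE specifically, one approach worth trying (a sketch only, not plugin code, and it assumes your widgets are 3D UWidgetComponents placed in the level) is to route the detected gesture through a UWidgetInteractionComponent:

```cpp
#include "Components/WidgetInteractionComponent.h"
#include "InputCoreTypes.h" // EKeys::LeftMouseButton

// Attach a UWidgetInteractionComponent to the hand/cursor actor and aim it at the
// widget each frame (its default InteractionSource is "World", i.e. it traces along
// its forward vector). Call this whenever your click gesture is detected.
void SimulateClick(UWidgetInteractionComponent* Interaction)
{
    Interaction->PressPointerKey(EKeys::LeftMouseButton);
    Interaction->ReleasePointerKey(EKeys::LeftMouseButton);
}
```

Note that this only reaches widgets rendered in the world via a WidgetComponent; for screen-space viewport widgets you would need to synthesize Slate pointer events instead, which is more involved.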