Hello Stefan, thanks again for your support and responses. This is my first interactive installation, so your input is especially valuable to me. I'd like to give you a quick update on where the project stands.
Nuitrack has been an excellent tool; it made the body-tracking integration straightforward and effective. The goal is to create a particle avatar that follows people's movements: when visitors raise their hands, words made of particles are generated, creating a visually captivating effect. I've included a small sketch of our trigger logic below.
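For context, the hand-raise trigger is just a simple joint comparison on our side. Here is a minimal sketch of the idea, assuming the Nuitrack C++ API's Skeleton/Joint types and real-world coordinates in millimeters with Y pointing up (the confidence and height thresholds are placeholders for our setup):

```cpp
#include <nuitrack/Nuitrack.h>

using namespace tdv::nuitrack;

// Sketch: treat a hand as "raised" once it is clearly above the head joint.
// Assumes Nuitrack real-world coordinates (millimeters, Y up).
bool IsHandRaised(const Skeleton& skeleton)
{
    const Joint& head      = skeleton.joints[JOINT_HEAD];
    const Joint& leftHand  = skeleton.joints[JOINT_LEFT_HAND];
    const Joint& rightHand = skeleton.joints[JOINT_RIGHT_HAND];

    const float kMinConfidence = 0.5f;   // ignore low-confidence joints
    const float kRaiseMarginMm = 100.0f; // hand roughly 10 cm above the head

    auto raised = [&](const Joint& hand) {
        return head.confidence > kMinConfidence
            && hand.confidence > kMinConfidence
            && hand.real.y > head.real.y + kRaiseMarginMm;
    };
    return raised(leftHand) || raised(rightHand);
}
```

When this fires, we spawn the word particles on our side.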
For video transmission, we will use Spout to send the rendered output from Unreal Engine to Resolume, which will handle the final projection.
Regarding tracking, we're trying to limit it to a specific area along the wall so that people are tracked only when they are close to the projection. We've implemented a trigger system in Unreal Engine to handle tracking activation/deactivation without using Nuitrack's exposed variables. Do you think this is the best approach? We were also wondering whether Nuitrack provides any variables or nodes we could use directly in Blueprints to activate tracking more easily.
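In case it helps to see what I mean, here is a rough sketch of the alternative we considered: filtering skeletons by position in the sensor's own coordinates instead of via the Unreal trigger volume. I'm assuming Nuitrack's real coordinates in millimeters with Z pointing away from the sensor, and the bounds below are placeholders for our space:

```cpp
#include <nuitrack/Nuitrack.h>
#include <cmath>

// Sketch: accept only skeletons whose torso lies inside a band in front
// of the wall. Assumes Nuitrack real coordinates (millimeters, Z away
// from the sensor); the bounds are placeholders for our installation.
bool IsInsideActiveZone(const tdv::nuitrack::Skeleton& skeleton)
{
    using namespace tdv::nuitrack;
    const Joint& torso = skeleton.joints[JOINT_TORSO];
    if (torso.confidence < 0.5f)
        return false;

    const float kMinZMm    = 500.0f;  // at least 0.5 m from the sensor
    const float kMaxZMm    = 2500.0f; // no farther than 2.5 m
    const float kMaxAbsXMm = 2000.0f; // within +/- 2 m laterally

    return torso.real.z > kMinZMm
        && torso.real.z < kMaxZMm
        && std::abs(torso.real.x) < kMaxAbsXMm;
}
```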
Finally, we've noticed that the mannequin's movements appear a bit choppy. Do you have any advice on how to make them smoother?
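For what it's worth, the workaround I'm currently testing is a simple exponential smoothing of the joint positions before they drive the mannequin. A minimal sketch using Unreal's FVector and FMath::Lerp, with Alpha as a tunable 0..1 factor:

```cpp
#include "CoreMinimal.h"

// Sketch: exponential moving average per joint position. A lower Alpha
// gives smoother but laggier motion; we plan to tune it on site.
FVector SmoothJointPosition(const FVector& Previous, const FVector& Current,
                            float Alpha = 0.3f)
{
    return FMath::Lerp(Previous, Current, Alpha);
}
```

If Nuitrack has built-in smoothing we should be using instead, that would be good to know.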
On the hardware side, we're using a 15-meter active USB cable to connect the sensor, which seems to be working well.
I'm currently using the classic version of Nuitrack, but I'd like to try the AI version. Is the Nuitrack software itself (where Nuitrack AI is activated) the right place to change this setting? Also, I recall reading that the AI version might not work properly in Unreal Engine; do you have any information on this?
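From the documentation, my understanding is that the AI skeletonization can also be enabled per application through the config API, along these lines (please correct me if the key or value is wrong):

```cpp
#include <nuitrack/Nuitrack.h>

void EnableAISkeletonTracking()
{
    using tdv::nuitrack::Nuitrack;
    Nuitrack::init();
    // My understanding of the docs: switching "Skeletonization.Type"
    // to "CNN_HPE" enables the AI skeleton tracking. Please correct me
    // if this is not the right key/value pair.
    Nuitrack::setConfigValue("Skeletonization.Type", "CNN_HPE");
}
```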
Lastly, do you have any specific suggestions for improving tracking under the variable lighting conditions often found in museums? And are there best practices for positioning and angling the sensor to ensure optimal tracking accuracy, given that the installation is close to a wall?
Thanks again for your help!