
Using face tracking AI to drive Unreal Engine cameras

- August 6, 2024

So much great tech is now available for artists and technologists to assemble in novel ways. We all have the means to invent the grammar of tomorrow's language. In this case, can we make interaction with virtual spaces more natural and intuitive? The clip below shows a prototype that uses an AI face tracker to move a virtual camera in Unreal Engine. The Unreal camera translates left and right when I move my head left and right, and it rotates when I turn my head from side to side. Moving forward and backward is achieved by moving my head towards or away from the physical camera.
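The head-to-camera mapping described above can be sketched as a small update function. This is a minimal illustration, not the actual project code: the input names (`face_x`, `face_size`, `face_yaw`), the scale factors, and the smoothing constant are all assumptions. Apparent face size is used here as a rough proxy for distance to the camera, and an exponential moving average damps tracker jitter.

```python
from dataclasses import dataclass

@dataclass
class CameraPose:
    x: float = 0.0    # left/right offset (cm)
    y: float = 0.0    # forward/back offset (cm)
    yaw: float = 0.0  # rotation (degrees)

def update_pose(pose: CameraPose, face_x: float, face_size: float,
                face_yaw: float, alpha: float = 0.2) -> CameraPose:
    """Map face-tracker readings onto camera motion with smoothing.

    face_x:    horizontal face position, normalized to [-1, 1] (assumed)
    face_size: apparent face size relative to a calibrated baseline of 1.0,
               used as a proxy for distance (assumed)
    face_yaw:  head rotation in degrees
    alpha:     exponential-smoothing factor to damp tracker jitter
    """
    target_x = face_x * 100.0            # assumed scale: full frame -> +/-100 cm
    target_y = (face_size - 1.0) * 200.0 # closer face -> camera moves forward
    pose.x += alpha * (target_x - pose.x)
    pose.y += alpha * (target_y - pose.y)
    pose.yaw += alpha * (face_yaw - pose.yaw)
    return pose
```

Smoothing toward a target rather than applying raw values is what keeps the virtual camera from twitching with every small tracking fluctuation.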

This setup uses the OAK-D AI-assisted computer vision camera. The camera supports Google's MediaPipe library of edge AI models and has a robust Python API, which is used to capture, calibrate and stream the data to Unreal Engine in real time. The data is sent over the OSC protocol and made available in Blueprints via a plugin, where it drives the camera transforms and a UI overlay that provides the necessary visual feedback. While not necessarily designed for permanent installations, this setup makes a very efficient toolset for rapid prototyping of ideas, particularly since MediaPipe includes many other models, such as full-body pose estimation and hand gesture recognition.
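To show what the streaming side might look like, here is a sketch of building and sending an OSC message over UDP using only the standard library. The address `/head/pose`, the argument layout (x, y, yaw), and the port `8000` are assumptions for illustration; in practice a library such as python-osc, or the OSC support in the DepthAI ecosystem, would typically handle the encoding.

```python
import socket
import struct

def osc_pad(data: bytes) -> bytes:
    # OSC strings are null-terminated and padded to a 4-byte boundary
    return data + b"\x00" * (4 - len(data) % 4)

def osc_message(address: str, *floats: float) -> bytes:
    # Minimal OSC message: padded address, padded type-tag string (",fff"),
    # then the arguments as big-endian 32-bit floats
    msg = osc_pad(address.encode())
    msg += osc_pad(("," + "f" * len(floats)).encode())
    for value in floats:
        msg += struct.pack(">f", value)
    return msg

# Hypothetical head-pose values: x offset, y offset, yaw
packet = osc_message("/head/pose", 12.5, -3.0, 15.0)

# Fire-and-forget UDP send to the Unreal OSC server (port is an assumption)
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(packet, ("127.0.0.1", 8000))
```

On the Unreal side, an OSC server Blueprint node bound to the same address would receive these floats each frame and feed them into the camera transform.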