“To see the results of the light on the actor—even when light was not present on set—and to change the light later on, you want to work in this kind of environment,” he says. “It felt natural and the workflow was really, really amazing.”
For added realism, real-world movie cameras, complete with dolly tracks, were brought into the mocap studio. These were tracked, along with Bukvić’s body, and of course, his face. All this enabled the team to precisely recreate the camera motions in Unreal Engine, with Bukvić’s MetaHuman acting directly to the camera.
“One of the main engines of acting is imagination,” he says. “If you follow your inner emotions, your body will move spontaneously in front of the camera. [...] In this work, I was trying to explore my inner world and the result is really stunning.”
Getting up close and personal
Perhaps the key challenge the team set for themselves was that the action should focus almost entirely on the character’s face.
“The close-up shot is really the base of the movie and the cinema,” says Šijak. “That’s really how the viewer is communicating and exchanging emotions with the character. Everything shows up.”
3Lateral Business Development Lead Uroš Sikimić explains how this challenge was used to drive the development of MetaHuman Animator. “The human face is tremendously complex,” he says. “We needed a tool powerful enough to be able to analyze all of that information, instead of trying manually to recreate each individual muscle movement on a human face.”
Aleksandar Popov, who is Art Director at 3Lateral, takes up the thread. “One of the most important aspects is obviously eyes,” he says. “That is what makes or breaks the whole impression. And so we put in a lot of effort to actually design from that kind of artistic standpoint and allow the technology to do the rest.”
Bukvić’s facial performance was recorded using a pair of stereo head-mounted cameras. The video and calculated depth data were then processed in MetaHuman Animator using a 4D solver, capturing every subtle detail and nuance of the performance and even reconstructing the eye gaze movements.
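For readers unfamiliar with how a stereo camera pair yields "calculated depth data," the underlying geometry is straightforward: the same point lands at slightly different pixel positions in the two cameras, and that disparity maps to metric depth. The sketch below is purely illustrative of that principle, using standard stereo triangulation (Z = f·B/d) and hypothetical camera parameters; it does not represent MetaHuman Animator's proprietary solver or pipeline.

```python
import numpy as np

def depth_from_disparity(disparity: np.ndarray,
                         focal_px: float,
                         baseline_m: float) -> np.ndarray:
    """Convert a disparity map (pixels) to metric depth (meters).

    Uses the standard stereo relation Z = f * B / d, where f is the
    focal length in pixels, B is the baseline between the two cameras
    in meters, and d is the per-pixel disparity in pixels. Pixels with
    zero or negative (invalid) disparity are returned as np.inf.
    """
    disparity = np.asarray(disparity, dtype=np.float64)
    depth = np.full_like(disparity, np.inf)
    valid = disparity > 0
    depth[valid] = focal_px * baseline_m / disparity[valid]
    return depth

# Hypothetical head-mounted rig: 60 mm baseline, 900 px focal length.
disp = np.array([[30.0, 45.0],
                 [0.0, 90.0]])  # 0.0 marks a pixel with no stereo match
print(depth_from_disparity(disp, focal_px=900.0, baseline_m=0.06))
# Larger disparity = closer point; the invalid pixel comes back as inf.
```

Note how small the depths are at these disparities: a face a fraction of a meter from the lenses is exactly the regime a head-mounted rig is built for.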
With MetaHuman Animator, the results are available to review in minutes. If required, the animation can be further tweaked in Unreal Engine for refinement or dramatic effect—however, the Blue Dot team found that minimal manual intervention was required to polish the animation output from MetaHuman Animator.
“I was not expecting that my performance would be as natural as it is,” says Bukvić. “I was blown away when I saw all the tiny details are there.”
It’s the immediacy of these results and the iterative process that this facilitates between artists, animators, and actors—combined with the fidelity of the capture—that makes MetaHuman Animator such a powerful tool for creating cinematics. Animation studios can now work with an actor on set, using their creative intuition to direct a performance that will faithfully translate into the animated content to create emotional cinematic pieces.
“You cannot distinguish the quality of the output that was done through this technology or shot on set with a real camera,” says Šijak. “The camera work and the performance of the actor and everything that gets the audience involved in the shot, it’s there. Nothing’s lost.”