December 15, 2025

MetaHuman 5.7 is now available

Earlier this year, we announced that MetaHuman was leaving Early Access with a slew of new features, including MetaHuman Creator becoming part of Unreal Engine on Windows, as well as new integrations and licensing options that dramatically extended its reach.
 
Our latest release, MetaHuman 5.7, shipped alongside Unreal Engine 5.7 on November 12, delivering powerful improvements that offer greater precision and control, broader platform support, and enhanced tools to style, shape, and animate MetaHumans, along with updates to our plugins for third-party tools.

Let’s take a look at all the updates.

What’s new in MetaHuman 5.7

MetaHuman Creator enhancements


The MetaHuman Creator plugin for Unreal Engine is now available on Linux and macOS in addition to Windows, meaning that you can enjoy all the benefits of fully integrated character creation on your platform of choice. MetaHuman Animator support for Linux and macOS is planned for a future release.

If you were previously using the MetaHuman Creator web application, you can follow this handy guide to migrate your existing characters and continue working on them.
 
We’ve enhanced the body mesh conformation process with support for arbitrary poses, enabling more accurate results with less effort. We’ve also introduced the option for UV-space vertex correspondence between the conform template and model meshes, supporting mesh round-tripping with external DCC tools via FBX.

The parametric body controls are also now more intuitive, making it easier to stay within human proportions with increased precision and control.

And with this release, you can automate and batch process nearly all editing and assembly operations for MetaHuman character assets using Python or Blueprints, as well as build additional tools to integrate MetaHuman Creator more deeply into your pipeline and creative workflow.
 
The API supports sculpting, conforming, wardrobe, rigging, and textures, streamlining full character assembly. You can preview edits live in MetaHuman Creator, and run them either interactively in the Unreal Editor or on a compute farm for offline processing.

New tools for styling, shaping, and animating MetaHuman grooms


The latest MetaHuman for Houdini plugin update (available on Fab) brings a guide-driven workflow for parametrically creating hairstyles using pre-authored data—a fast and approachable way to block out and iterate on new hairstyles.
 
The toolset—which comes with a number of adjustable preset hairstyles for artists to use as starting points—includes a follicle generator to handle density and positioning around the hairline; guides are generated from these based on flow, length, and volume parameters. There are also features for hair strand interpolation and styling for advanced detailing such as clumpiness.

Meanwhile, for those wanting to learn how to create more advanced hairstyles, such as ponytails and braids, we’ve published the new MetaHuman Groom Advanced Kit for Houdini on Fab as an extra learning resource. The example project and assets also demonstrate the process of integrating a groom into Unreal Engine.

Grooms can also be created in MetaHuman for Maya using XGen. The latest version of that plugin—which also enables you to assemble your MetaHumans and customize their faces and bodies to apply technical fixes, art-direct expressions, and build unique characters—is now compatible with Maya 2023–2026.
 
Once you’ve created your grooms, you can take advantage of the updated Unreal Engine Groom plugin that now provides finer control in shaping and animating hair strands with a joint-based workflow directly in Unreal Engine. You can now art-direct motion using skeletal mesh animation workflows, blending keyframed animation with rigid-body simulation—for example, to achieve hero poses such as a ponytail lying over a shoulder, or to ensure that hair is not in the way at key action moments. There’s also the Experimental ability to blend keyed animation with hair physics simulation.

Extended performance capture pipeline


Also on the animation front, you can now generate real-time animation using an external camera connected to an iPad or supported Android device running Live Link Face. This opens the door to more pocket-sized, budget-friendly real-time facial capture setups that can deliver higher-quality animation than the on-device cameras currently supported. You can also record video directly from an external or on-device camera on iPads. Note that iPhones do not support external camera hardware.

For those choosing the stereo head-mounted camera (HMC) route, the workflow for generating calibration data from the checkerboard take has been improved to provide better automatic frame selection, new visualization to aid manual frame selection, and repeatable configurations across multiple takes. The new MetaHuman Animator Calibration Diagnostics plugin (Experimental) also enables you to validate calibration against performance footage to more easily determine if it is the best calibration to use.

You can check out all the updates in MetaHuman 5.7 in the Release Notes.

MetaHumans everywhere!


It’s been a busy year for MetaHumans: we’re seeing them pop up all over the place, from the Belle Époque France of Clair Obscur: Expedition 33, to the Lovecraftian horror of The Sinking City 2, set in gloomy 1920s America. And they’re also making inroads into the world of fashion, as the recent Interline report into the industry’s real-time roadmap noted.

And check out MetaHuman creation in action in Behind the Scenes of Quentin Tarantino’s The Lost Chapter: Yuki’s Revenge, where you can discover how The Third Floor and Epic Games merged authentic performances with stylized animation using MetaHuman technology.

Get MetaHuman today!

MetaHuman is now part of Unreal Engine. Download the latest release from the Unreal Engine website and extend your MetaHuman pipeline with our DCC plugins.
Download Unreal Engine
See the plugins