Metahumans are getting TOO REALISTIC in Unreal Engine 5.4

ENFANT TERRIBLE
20 Mar 2024 (29:46)

TL;DR: The script discusses the integration of MetaHumans into Unreal Editor for Fortnite (UEFN) as non-player characters, highlighting the optimization from roughly 1 GB for a hero MetaHuman down to approximately 60 MB. It covers the workflow for creating costumes in Marvelous Designer and the new USD export option for garments. The introduction of cloth physics in UEFN as Early Access is also mentioned, along with the use of machine learning for realistic facial expressions and performances. The script concludes with a live demonstration of the technology and a look at the character's MetaHuman DNA, emphasizing the real-time capabilities and the potential for detailed, high-fidelity character animation.

Takeaways

  • 💻 MetaHumans can now be imported into UEFN as non-player characters, significantly reducing file sizes from nearly 1 GB to about 60 MB.
  • 🖼 The process for importing custom MetaHumans into UEFN has been streamlined: save in MetaHuman Creator, then import via the new MetaHuman importer.
  • 🏆 Collaboration with CLO, makers of Marvelous Designer and CLO 3D, integrates MetaHuman body data for creating realistic digital clothing.
  • 🧪 Cloth physics are now available in UEFN, offering cinema-quality simulations for more realistic character and environmental interactions.
  • 💪 MetaHuman Animator tools from Unreal Engine are now accessible in UEFN, enhancing character animation with real-time performance capture.
  • 👨‍💻 The Live Link Hub application supports a wide range of capture devices for streamlined data integration into UEFN.
  • 🛡️ The authenticity of character outfits is crucial for immersion, achieved through detailed material properties and machine learning-driven deformations.
  • 👤 Great facial performances are essential, demonstrated by high-fidelity animation captures that preserve the nuances of actor performances.
  • 📺 Performance capture technology aims for a mirror-like accuracy in capturing actors' emotions, further enhanced by MetaHuman DNA for personalized facial rigs.
  • 📖 The narrative explores themes of unity and survival in a fragmented world, with characters working towards bridging divides and overcoming existential threats.

Q & A

  • What is the significance of the new MetaHuman import feature in UEFN (Unreal Editor for Fortnite)?

    -The new MetaHuman import feature allows for the easy integration of custom MetaHuman characters into UEFN. The process reduces the file size from about a gigabyte to approximately 60 megabytes (a reduction of roughly 94%), making it far more practical for creators to use high-quality, detailed characters in their projects.

  • How does the Metahuman Creator work in conjunction with Marvelous Designer for clothing creation?

    -The MetaHuman Creator works in tandem with Marvelous Designer through the integration of MetaHuman body data into the digital clothing software. This allows creators to design garments that fit MetaHuman characters accurately. The software also provides a new USD export option, which includes geometry, materials, and simulation setup data for the garments.

  • What are the benefits of the Auto Sim Setup introduced in the upcoming UE 5.4?

    -The Auto Sim Setup in UE 5.4 automates the creation of simulation data, auto LOD generation, and auto skinning. This streamlines the workflow, letting creators achieve realistic, efficient cloth simulations without manual adjustment.
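
The Auto Sim Setup itself is a black box from the outside, but the idea behind "auto LOD generation" is easy to illustrate. The toy sketch below (pure Python, every name hypothetical, nothing from Epic's API) keeps every 2^level-th vertex per LOD; production tools use real mesh simplification such as quadric error metrics.

```python
def generate_lods(vertices, num_lods=3):
    """Toy LOD chain: level k keeps every 2**k-th vertex, so each level
    is roughly half the size of the previous one. Only illustrates the
    concept of progressively coarser meshes, not a real simplifier."""
    return [vertices[::2 ** level] for level in range(num_lods)]

garment = [(i * 0.1, 0.0, 0.0) for i in range(16)]  # 16 dummy vertices
lods = generate_lods(garment)
print([len(lod) for lod in lods])  # → [16, 8, 4]
```

The point is only that lower LODs carry a fraction of the data, which is exactly why auto-generating them matters for the file-size budgets discussed above.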

  • How does the cloth physics feature in UEFN enhance character realism?

    -The cloth physics feature in UEFN adds dynamic, realistic movement to characters' clothing. It simulates how cloth moves and interacts with the character's body and environment, resulting in more lifelike, convincing characters in the game.
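
Epic's actual solver (Chaos Cloth) is not reproduced here; the sketch below only illustrates the general family of techniques that real-time cloth solvers build on: Verlet-style integration followed by iterative distance-constraint relaxation, shown for a toy 2-D chain of particles.

```python
import math

def step_cloth(pos, prev, dt=1/60, g=(0.0, -9.8), rest=1.0,
               damping=0.95, iters=8):
    """One step of a toy 2-D cloth chain: damped Verlet integration, then
    relaxation of the rest-length constraints between neighbours.
    pos/prev are lists of (x, y); particle 0 is pinned in place."""
    new = [(x + (x - px) * damping + g[0] * dt * dt,
            y + (y - py) * damping + g[1] * dt * dt)
           for (x, y), (px, py) in zip(pos, prev)]
    new[0] = pos[0]  # pinned anchor never moves
    for _ in range(iters):
        for i in range(len(new) - 1):
            (x1, y1), (x2, y2) = new[i], new[i + 1]
            dx, dy = x2 - x1, y2 - y1
            dist = math.hypot(dx, dy) or 1e-9
            diff = (dist - rest) / dist
            if i == 0:  # only the free endpoint moves next to the pin
                new[i + 1] = (x2 - dx * diff, y2 - dy * diff)
            else:       # otherwise split the correction between both ends
                new[i] = (x1 + 0.5 * dx * diff, y1 + 0.5 * dy * diff)
                new[i + 1] = (x2 - 0.5 * dx * diff, y2 - 0.5 * dy * diff)
    return new, pos  # (new positions, new "previous" positions)

# A horizontal chain of four particles swings down under gravity and settles.
cur = prev = [(float(i), 0.0) for i in range(4)]
for _ in range(120):  # two simulated seconds at 60 fps
    cur, prev = step_cloth(cur, prev)
print(cur[-1])  # the free end ends up hanging well below its start height
```

Real engine cloth adds collision with the body, wind, self-collision, and LODs on top of this same position-based core.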

  • What is the role of machine learning in achieving high-quality facial performances for Metahumans?

    -Machine learning is used to train models that produce high-fidelity facial animations in real-time. By running complex simulations and training the ML model with the data, the system can generate facial expressions and movements that closely match the original performance capture data, ensuring a high level of realism and nuance in the final animation.
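
As a deliberately tiny illustration of that pattern (train offline on expensive simulation output, evaluate a cheap model at runtime), here is a sketch in which the data, the "elbow angle", and the linear model are all invented for illustration; Epic's ML Deformer trains neural networks on full offline simulations.

```python
def fit_linear(xs, ys):
    """Least-squares fit of y = w*x + b, standing in (very loosely) for
    training an ML deformer on offline simulation results."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    w = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return w, my - w * mx

# "Offline simulation": a vertex offset that grows with a joint angle.
angles  = [0.0, 0.5, 1.0, 1.5, 2.0]
offsets = [0.02, 0.26, 0.51, 0.74, 1.01]
w, b = fit_linear(angles, offsets)

# "Runtime": evaluating the trained model is a couple of multiplies,
# cheap enough to run per frame where the full simulation is not.
predicted = w * 1.2 + b
print(round(predicted, 2))  # → 0.61
```

The asymmetry is the whole trick: training may take hours, but inference is fast enough for real time.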

  • How does the Live Link Hub application facilitate the integration of performance capture data?

    -The Live Link Hub application allows performance capture data from various devices to be streamed and recorded directly into UEFN. This simplifies the integration of motion capture data, making it easier for creators to incorporate realistic movements and expressions into their digital characters.
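
The actual Live Link wire protocol is not reproduced here; the sketch below only illustrates one common pattern for real-time capture streams, in which the consumer always reads the newest frame so playback never lags behind the device (class and field names are hypothetical).

```python
from collections import deque

class FrameStream:
    """Toy stand-in for a capture-subject stream: the device pushes
    timestamped frames, the consumer reads only the latest one, so
    stale frames are dropped rather than queued up."""
    def __init__(self):
        self._frames = deque(maxlen=1)  # keep only the newest frame

    def push(self, timestamp, data):
        self._frames.append((timestamp, data))

    def latest(self):
        return self._frames[-1] if self._frames else None

stream = FrameStream()
for t in range(5):  # the device streams five frames faster than we consume
    stream.push(t / 60.0, {"head_yaw": t * 2.0})
print(stream.latest()[1])  # → {'head_yaw': 8.0}
```

Dropping stale frames is what keeps a live preview feeling like a mirror instead of drifting further behind the performer.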

  • What is the significance of the MetaHuman DNA, and how is it generated?

    -MetaHuman DNA is a rig that predicts all facial expressions of an actor, generated from video and depth data captured during the performance. It is created using a custom Epic facial solver and landmark detector, and it requires only a few frames of video and depth data to produce a highly accurate representation of the actor's facial movements and expressions.
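
Epic's facial solver is proprietary, but the underlying idea (fit rig parameters so that the rig reproduces the observed landmarks) can be shown at toy scale. The sketch below solves for just two hypothetical expression weights by least squares; a real solver handles hundreds of controls and far more landmarks.

```python
def solve_two_weights(neutral, basis_a, basis_b, observed):
    """Solve w_a, w_b so that neutral + w_a*A + w_b*B best matches the
    observed landmark values (least squares via 2x2 normal equations).
    A crude sketch of rig fitting, not Epic's solver."""
    r = [o - n for o, n in zip(observed, neutral)]  # residual to explain
    aa = sum(a * a for a in basis_a)
    bb = sum(b * b for b in basis_b)
    ab = sum(a * b for a, b in zip(basis_a, basis_b))
    ra = sum(x * a for x, a in zip(r, basis_a))
    rb = sum(x * b for x, b in zip(r, basis_b))
    det = aa * bb - ab * ab
    return (ra * bb - rb * ab) / det, (rb * aa - ra * ab) / det

neutral  = [0.0, 0.0, 0.0, 0.0]
smile    = [1.0, 0.0, 1.0, 0.0]   # hypothetical "smile" landmark deltas
frown    = [0.0, 1.0, 0.0, 1.0]   # hypothetical "frown" landmark deltas
observed = [0.3, 0.1, 0.3, 0.1]   # a 30% smile blended with a 10% frown
wa, wb = solve_two_weights(neutral, smile, frown, observed)
print(round(wa, 2), round(wb, 2))  # → 0.3 0.1
```

Because the rig only has to be fitted to the actor once, every later performance can reuse the same solved DNA, which is why only a few frames are needed up front.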

  • How does the performance capture process differ from traditional methods?

    -The performance capture process described in the script is significantly faster and more efficient than traditional methods. It allows for the conversion of captured data into high-fidelity animation in real-time, taking less than a minute for a performance of considerable length. This rapid processing and conversion enable creators to see the results of their work almost immediately.

  • What is the purpose of the Live Link Face mobile app, and how does it enhance the performance capture process?

    -The Live Link Face mobile app is designed to capture facial data at the highest possible resolution. It uses video and depth data to produce high-fidelity performance animation, and it can even use audio to produce convincing tongue animation, making the capture process more comprehensive and detailed.

  • How does the 4D rig work in the performance capture process?

    -The 4D rig is a bespoke tool created in collaboration with Ninja Theory for Hellblade II, but it can be used on any MetaHuman, or any other rig that follows the new MetaHuman standard. It works in conjunction with the performance capture data to accurately reproduce the actor's facial expressions and movements in the digital character.

  • What is the overarching goal of the UCA (United Cities of America) in the context of the script?

    -The overarching goal of the UCA in the script is to bring regions outside of its network into the distribution network, thereby connecting the world and eliminating the need for human porters. The UCA aims to expand its influence and network coverage to ensure humanity is free from the need to move around in dangerous conditions, ultimately enhancing safety and efficiency in deliveries and transportation.

Outlines

00:00

🚀 Introducing MetaHumans and Cloth Physics in UE 5.4

The video begins with an introduction to the new MetaHuman technology, highlighting the ability to import MetaHuman characters into UEFN as non-player characters. The crew demonstrates how the process has been optimized for quality and efficiency, reducing the file size significantly. They discuss the MetaHuman Creator and the new importer, as well as the various quality options available. The video also covers the workflow for creating costumes with Marvelous Designer and the integration of MetaHuman body data into the software, allowing for realistic simulations. The crew showcases the cloth physics feature in UE 5.4, now available as Early Access, and explains how it enhances character realism by simulating dynamic cloth movement.

05:00

🎭 Capturing Nuance with Metahuman Animator

This segment focuses on the importance of capturing the nuances of an actor's performance and how MetaHuman Animator, now available in UEFN, allows for this. The crew introduces actors Drew Morline and Khary Payton, who play Captain America and Black Panther, respectively. They discuss the process of using MetaHuman Animator to honor the actors' performances and transform them into powerful digital performances. The video also touches on the technical aspects of using machine learning to run complex simulations, and on the use of the Live Link Hub application for capturing facial expressions in real time.

10:01

🌟 Showcase of Real-Time Metahuman Performance

The crew presents a special treat, showcasing the entire bridge scene with the character's mask removed to highlight the capabilities of MetaHuman technology. They emphasize the real-time nature of the demonstration, which combines the actors' performances, the detailed models, and the integration of various features into a seamless, immersive experience.

15:02

🎮 Behind the Scenes with Metahuman and Rigging

In this part, the crew discusses the behind-the-scenes process of using MetaHuman technology and rigging for character animation. They explain how the bespoke 4D rig created for Hellblade II can be used on any MetaHuman or rig that follows the new MetaHuman standard. The video also covers the use of performance capture to accurately reflect an actor's emotions, and the rapid processing of these performances into animation.

20:04

🎵 Music and Applause Break

This section of the video script includes moments of applause and music, indicating a live presentation or demonstration. It serves as a transition between different segments of the video, providing a pause for the audience to react and engage with the content presented.

25:04

🛸 Death Stranding and the UCA Network

The video transitions to a narrative from Death Stranding involving the UCA (United Cities of America) network. It introduces the concept of the UCA and its role in bringing regions together, the challenges of expanding the network, and the resistance from those who do not wish to join. The segment also touches on the role of the commander and the importance of maintaining the network for the survival and unity of humanity.

🌌 The Chrysalis and the Origins of Life

The final segment delves into the mystery surrounding the Chrysalis and the origins of life. It explores the discovery of amino acids within the Chrysalis, which are identical to those found in tar, suggesting a primordial soup theory. The video discusses the implications of these findings and the potential origins of the creatures that emerged from it. It also hints at a deeper story involving a character seeking revenge and the broader implications for the world.

Keywords

💡MetaHuman

MetaHuman refers to a highly realistic digital human character created using advanced 3D modeling and animation techniques. In the context of the video, it signifies the next generation of non-player characters (NPCs) that can be imported into UEFN, offering lifelike interactions and appearances. The term describes characters that have been optimized for quality and efficiency, going from a large file size to a much more manageable one while retaining high fidelity.

💡Unreal Editor for Fortnite (UEFN)

Unreal Editor for Fortnite (UEFN) is a version of Epic Games' Unreal Engine 5 tailored to creating Fortnite islands and experiences. In the video, UEFN gains support for importing and using MetaHuman characters, a significant leap toward more immersive and realistic experiences.

💡Marvelous Designer

Marvelous Designer is a 3D digital clothing software used for creating realistic garments in a virtual environment. In the context of the video, it is used in conjunction with the Unreal Engine to create lifelike clothing for metahuman characters, demonstrating an integrated workflow between character design, clothing simulation, and game engine implementation.

💡Cloth Physics

Cloth physics refers to the simulation of the physical behavior of cloth or fabric in a digital environment, producing realistic animations of how cloth moves and interacts with other objects or characters. In the video, cloth physics is introduced as an Early Access feature in UEFN, allowing for more dynamic, lifelike animation of clothing on characters.

💡Auto Sim Setup

Auto Sim Setup refers to an automated process for setting up simulations, which in the context of the video, involves the creation of realistic cloth and character animations. This feature streamlines the process of generating simulation data, Auto LOD (Level of Detail) generation, and skinning, making it easier for creators to achieve high-quality results without extensive manual work.

💡Performance Capture

Performance Capture is a technology that records the movements and expressions of actors, which are then used to animate digital characters. This technique allows for the translation of live-action performances into digital environments, ensuring that the nuances of the actor's performance are preserved in the final animation. In the video, performance capture is used to create highly realistic facial expressions and body movements for the metahuman characters.

💡Live Link Hub

Live Link Hub is an application that facilitates the streaming of performance capture data directly into the Unreal Engine. It supports various capture devices and allows for the recording and processing of high-quality animation data in real-time. This technology is crucial for integrating live performance data with digital characters and environments.

💡Machine Learning (ML)

Machine Learning is a subset of artificial intelligence that involves training algorithms to learn from and make predictions or decisions based on data. In the video, machine learning is used to create high-fidelity facial animations by training models on complex simulations, which can then produce film-quality deformations in real-time.

💡MetaHuman DNA

MetaHuman DNA, as used in the video, refers to the unique facial rig or data set generated for each actor through performance capture. This data set allows the actor's facial expressions to be accurately predicted and replicated on the digital character, ensuring a faithful reproduction of the original performance.

💡Cinematic Quality

Cinematic quality refers to the level of production value and visual fidelity that is comparable to what is seen in professional films. In the context of the video, it is used to describe the high standard of realism and detail achieved in the digital animations and simulations, creating an immersive and visually stunning experience.

💡Real-Time Animation

Real-Time Animation refers to the process of generating and playing back animations immediately as they are created, without the need for pre-rendering. This technology is crucial for interactive media like video games, where the animations must respond instantly to user input. The video emphasizes the use of real-time animation to create dynamic and responsive character performances.

💡Digital Performance

Digital Performance refers to the act of animating and bringing digital characters to life through motion capture and other digital techniques. It involves capturing the nuances of a live actor's performance and translating them into a digital medium, ensuring that the final animation is both realistic and expressive.

Highlights

MetaHumans are now available for import into UEFN as non-player characters.

The crew has optimized for both quality and efficiency, reducing the file size from almost 1 GB for a hero MetaHuman to approximately 60 MB in UEFN.

Saving custom MetaHumans in MetaHuman Creator makes them available in the new MetaHuman importer in UEFN.

Multiple quality options are provided for different project requirements.

The workflow for creating costumes involves using Marvelous Designer, a leading digital clothing software.

MetaHuman body data has been integrated into the Marvelous Designer and CLO 3D software, providing a new USD export option for garments.

In UE 5.4, custom Chaos simulations are set up for realistic, cinema-quality looks.

An Auto Sim Setup is introduced in UE 5.4, including simulation data, auto LOD generation, and auto skinning.

Cloth physics are available in UEFN as Early Access, enhancing the realism of dynamic cloth objects.

The MetaHuman process allows for the faithful transformation of actor performances into digital animations.

The Live Link Hub application enables capture devices to stream directly into UEFN, with more third-party devices to be supported soon.

Machine learning is utilized to run complex simulations in Houdini and produce film-quality deformations that run in real time.

Facial performances are crucial for character realism, and AI is used to generate high-fidelity facial expressions.

The Live Link Face mobile app captures data at the best resolution possible for high-fidelity performance animation.

A single button click initiates processing, converting performance data into animation in less than a minute.

The MetaHuman DNA, generated from video and depth data, predicts all facial expressions and only needs to be created once per actor.

The performance capture process aims to work like a mirror, accurately reflecting the actor's emotions and expressions.

The 4D rig created for Hellblade II is ready to use on any MetaHuman or rig following the new MetaHuman standard.

The distribution network has evolved, with new groups like Drawbridge handling work outside the UCA, aiming to connect the world.

The UCA is not looking to expand its borders but rather bring new regions into the network, with Drawbridge supporting this effort.

The mission is to help humanity move beyond the need for human porters by leveraging advanced technology and networks.

The journey continues with the aim to finish the expedition and find the strength to carry on, despite the challenges faced.