Metahumans are getting TOO REALISTIC in Unreal Engine 5.4
TLDR
The script covers the integration of MetaHumans into Unreal Editor for Fortnite (UEFN) as non-player characters, highlighting the optimization from roughly 1 GB for a hero MetaHuman down to approximately 60 MB. It walks through the workflow for creating costumes in Marvelous Designer and the new USD export option for garments, and notes the introduction of cloth physics in UEFN as Early Access, along with the use of machine learning for realistic facial expressions and performances. The script concludes with a live demonstration of the technology and a look at a character's MetaHuman DNA, emphasizing the real-time capabilities and the potential for detailed, high-fidelity character animation.
Takeaways
- 💻 MetaHumans can now be imported into UEFN as non-player characters, with file sizes significantly reduced from nearly 1 GB to about 60 MB.
- 🖼 The process for importing custom MetaHumans into UEFN has been streamlined: save in the MetaHuman Creator, then import via the new MetaHuman importer.
- 🏆 Collaboration with CLO Virtual Fashion, makers of Marvelous Designer and CLO 3D, integrates MetaHuman body data for creating realistic digital clothing.
- 🧪 Cloth physics are now available in UEFN, offering cinema-quality simulations for more realistic character and environmental interactions.
- 💪 MetaHuman Animator tools from Unreal Engine are now accessible in UEFN, enhancing character animation with real-time performance capture.
- 👨💻 The Live Link Hub application supports a wide range of capture devices for streamlined data integration into UEFN.
- 🛡️ The authenticity of character outfits is crucial for immersion, achieved through detailed material properties and machine learning-driven deformations.
- 👤 Great facial performances are essential, demonstrated by high-fidelity animation captures that preserve the nuances of actor performances.
- 📺 Performance capture technology aims for a mirror-like accuracy in capturing actors' emotions, further enhanced by MetaHuman DNA for personalized facial rigs.
- 📖 The narrative explores themes of unity and survival in a fragmented world, with characters working towards bridging divides and overcoming existential threats.
Q & A
What is the significance of the new MetaHuman import feature in UEFN (Unreal Editor for Fortnite)?
-The new MetaHuman import feature allows for the easy integration of custom MetaHuman characters into UEFN. The optimization reduces file size from about 1 GB to approximately 60 MB, making it more efficient and accessible for creators to use high-quality, detailed characters in their projects.
How does the MetaHuman Creator work in conjunction with Marvelous Designer for clothing creation?
-The MetaHuman Creator works in tandem with Marvelous Designer by integrating the MetaHuman body data into the digital clothing software. This allows creators to design garments that fit MetaHuman characters accurately. The software also provides a new USD export option, which includes geometry, materials, and simulation setup data for the garments.
What are the benefits of the auto sim setup introduced in the upcoming UE 5.4?
-The auto sim setup in UE 5.4 automates simulation setup, LOD generation, and skinning. This streamlines the workflow, letting creators achieve realistic, efficient cloth simulations without manual adjustments.
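Epic's auto LOD generation is proprietary and not detailed in the script; purely to illustrate what automatic LOD generation means, here is a classic vertex-clustering decimator. The function name, the `cell` parameter, and the toy mesh are all invented for this sketch and are not Epic's algorithm:

```python
import numpy as np

def cluster_lod(vertices, triangles, cell=0.5):
    """Crude automatic LOD via vertex clustering.

    Snap each vertex to a voxel of size `cell`, merge vertices that share a
    voxel into their mean position, and drop triangles that collapse.
    vertices: (n, 3) floats; triangles: (m, 3) int indices into vertices."""
    keys = np.floor(vertices / cell).astype(np.int64)
    uniq, inverse = np.unique(keys, axis=0, return_inverse=True)
    inverse = inverse.ravel()
    # Representative vertex for each occupied voxel: the mean of its members.
    merged = np.zeros((len(uniq), 3))
    counts = np.zeros(len(uniq))
    np.add.at(merged, inverse, vertices)
    np.add.at(counts, inverse, 1.0)
    merged /= counts[:, None]
    # Re-index triangles and drop any that collapsed to an edge or point.
    tri = inverse[triangles]
    keep = (tri[:, 0] != tri[:, 1]) & (tri[:, 1] != tri[:, 2]) & (tri[:, 0] != tri[:, 2])
    return merged, tri[keep]
```

A coarser `cell` merges more vertices and discards more degenerate triangles, giving progressively cheaper LOD levels from the same source mesh.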
How does the cloth physics feature in UEFN enhance character realism?
-The cloth physics feature in UEFN adds dynamic, realistic movement to characters' clothing. It simulates how cloth moves and interacts with the character's body and environment, resulting in more lifelike, convincing characters in the game.
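The engine's cloth solver is closed source, but the general mechanism — integrate particles forward under gravity, then iteratively enforce spring-length constraints between neighbours — can be illustrated with a toy mass-spring cloth. All names, constants, and the grid layout below are invented for this sketch:

```python
import numpy as np

def step_cloth(pos, prev, pinned, dt=1 / 60.0, gravity=9.8, iters=8, rest=1.0):
    """One step of a toy mass-spring cloth on a rows x cols particle grid.

    pos, prev : (rows, cols, 3) current / previous particle positions.
    pinned    : (rows, cols) bool mask of particles that never move.
    Returns (new_pos, pos) so the caller can feed the pair back in."""
    rows, cols, _ = pos.shape
    # Verlet integration: velocity is implicit in (pos - prev).
    new = 2.0 * pos - prev
    new[..., 1] -= gravity * dt * dt          # gravity acts along -y
    new[pinned] = pos[pinned]
    # Gauss-Seidel relaxation of structural springs between grid neighbours.
    for _ in range(iters):
        for r in range(rows):
            for c in range(cols):
                for nr, nc in ((r + 1, c), (r, c + 1)):
                    if nr >= rows or nc >= cols:
                        continue
                    d = new[nr, nc] - new[r, c]
                    dist = float(np.linalg.norm(d))
                    if dist == 0.0:
                        continue
                    corr = d * (dist - rest) / dist * 0.5
                    if not pinned[r, c]:
                        new[r, c] += corr
                    if not pinned[nr, nc]:
                        new[nr, nc] -= corr
    return new, pos
```

Pinning the top row and stepping repeatedly makes the rest of the grid sag under gravity while the springs keep it roughly grid-shaped — the same qualitative behaviour a production solver delivers with far better constraints and collision handling.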
What is the role of machine learning in achieving high-quality facial performances for MetaHumans?
-Machine learning is used to train models that produce high-fidelity facial animations in real-time. By running complex simulations and training the ML model with the data, the system can generate facial expressions and movements that closely match the original performance capture data, ensuring a high level of realism and nuance in the final animation.
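The script does not describe the ML setup beyond "train a model on complex simulations", but the general pattern — fit a cheap model offline to expensive simulation output, then evaluate it per frame — can be sketched with ridge regression. The "simulation" here is a stand-in function, and every name and dimension is invented:

```python
import numpy as np

rng = np.random.default_rng(0)
n_samples, n_pose, n_out = 400, 12, 90   # 30 vertices * xyz; all sizes invented

# Stand-in for an expensive offline simulation: a fixed, mildly
# nonlinear map from pose parameters to vertex offsets.
W_sim = rng.normal(scale=0.1, size=(n_pose, n_out))
def offline_simulation(pose):
    return np.tanh(pose @ W_sim)

# --- Offline phase: sample poses and record the simulation's output. ---
poses = rng.normal(size=(n_samples, n_pose))
offsets = offline_simulation(poses)

# --- Training: ridge regression approximating the simulation. ---
lam = 1e-3
W = np.linalg.solve(poses.T @ poses + lam * np.eye(n_pose), poses.T @ offsets)

# --- Runtime phase: one matrix multiply per frame, cheap enough for real time. ---
def predict(pose):
    return pose @ W

pose = rng.normal(size=n_pose)
err = np.abs(predict(pose) - offline_simulation(pose)).mean()
```

A production deformer uses a neural network rather than a linear model, but the division of labour is the same: expensive fidelity is paid for once offline, and only the trained approximation runs per frame.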
How does the Live Link Hub application facilitate the integration of performance capture data?
-The Live Link Hub application allows performance capture data from various devices to be streamed and recorded directly into UEFN. This simplifies the process of integrating motion capture data, making it easier for creators to incorporate realistic movements and expressions into their digital characters.
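The actual Live Link wire protocol is Epic's own and is not shown in the script; purely to illustrate the shape of the problem — streaming timestamped capture frames from a device process into an editor process — here is a generic JSON-over-UDP sketch. The subject name, control names, and frame layout are all invented:

```python
import json
import socket
import time

def encode_frame(subject, timecode, transforms):
    """Pack one capture frame as JSON bytes.
    transforms maps a control name to [tx, ty, tz, rx, ry, rz]."""
    return json.dumps(
        {"subject": subject, "timecode": timecode, "transforms": transforms}
    ).encode("utf-8")

def decode_frame(data):
    return json.loads(data.decode("utf-8"))

# Receiver stands in for the editor; sender for the capture device.
recv_sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
recv_sock.bind(("127.0.0.1", 0))          # OS-assigned port for this sketch
recv_sock.settimeout(2.0)
port = recv_sock.getsockname()[1]

send_sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
frame = encode_frame("FaceCapture01", time.time(),
                     {"jaw": [0.0, 0.1, 0.0, 5.0, 0.0, 0.0]})
send_sock.sendto(frame, ("127.0.0.1", port))

data, _ = recv_sock.recvfrom(65535)
decoded = decode_frame(data)
send_sock.close()
recv_sock.close()
```

UDP fits this use case because capture runs at a fixed rate and a dropped frame is better replaced by the next one than retransmitted late; a real implementation adds device discovery, binary packing, and per-subject schemas.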
What is the significance of the MetaHuman DNA, and how is it generated?
-MetaHuman DNA is a rig that predicts all of an actor's facial expressions, generated from video and depth data captured during the performance. It is created using a custom Epic facial solver and landmark detector, and requires only a few frames of video and depth data to produce a highly accurate representation of the actor's facial movements and expressions.
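Epic's facial solver and landmark detector are proprietary, but the underlying idea of fitting rig weights to detected landmarks can be sketched as regularized least squares over a linear expression basis. All dimensions and data below are synthetic stand-ins:

```python
import numpy as np

rng = np.random.default_rng(1)
n_landmarks, n_expressions = 60, 20           # both counts invented

# Neutral-face landmark layout and per-expression landmark deltas
# (synthetic here; a real solver derives these from the actor's capture).
neutral = rng.normal(size=n_landmarks * 2)
basis = rng.normal(size=(n_landmarks * 2, n_expressions))

def solve_expression(landmarks, lam=1e-2):
    """Fit expression weights w minimising
    ||neutral + basis @ w - landmarks||^2 + lam * ||w||^2."""
    A = basis.T @ basis + lam * np.eye(n_expressions)
    b = basis.T @ (landmarks - neutral)
    return np.linalg.solve(A, b)

# Synthesize a frame from known weights, then recover them from landmarks.
w_true = rng.uniform(0.0, 1.0, size=n_expressions)
observed = neutral + basis @ w_true
w_est = solve_expression(observed)
```

The regularization term keeps the solve stable when landmarks are noisy or expressions are nearly redundant — the per-frame solve itself is just a small linear system, which is why this style of solver can keep up with live capture.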
How does the performance capture process differ from traditional methods?
-The performance capture process described in the script is significantly faster and more efficient than traditional methods. It allows for the conversion of captured data into high-fidelity animation in real-time, taking less than a minute for a performance of considerable length. This rapid processing and conversion enable creators to see the results of their work almost immediately.
What is the purpose of the Live Link Face mobile app, and how does it enhance the performance capture process?
-The Live Link Face mobile app is designed to capture facial data at the highest possible resolution. It uses video and depth data, converting this information into high-fidelity performance animation. The app can even use audio to produce convincing tongue animation, making the performance capture process more comprehensive and detailed.
How does the 4D rig work in the performance capture process?
-The 4D rig is a bespoke tool created in collaboration with Ninja Theory for Hellblade II, but it can be used on any MetaHuman, or on any other rig that follows the new MetaHuman standard. It works in conjunction with the performance capture data to accurately reproduce the actor's facial expressions and movements in the digital character.
What is the overarching goal of the UCA (United Cities of America) in the context of the script?
-The overarching goal of the UCA in the script is to bring regions outside of its network into the distribution network, thereby connecting the world and eliminating the need for human porters. The UCA aims to expand its influence and network coverage to ensure humanity is free from the need to move around in dangerous conditions, ultimately enhancing safety and efficiency in deliveries and transportation.
Outlines
🚀 Introducing MetaHumans and Cloth Physics in UE 5.4
The video begins with an introduction to the new MetaHuman technology, highlighting the ability to import MetaHuman characters into UEFN as non-player characters. The crew demonstrates how they've optimized the pipeline for quality and efficiency, significantly reducing file size. They discuss the MetaHuman Creator and the new importer, as well as the various quality options available. The video also covers the workflow for creating costumes using Marvelous Designer and the integration of MetaHuman body data into that software, allowing for realistic simulations. The crew showcases the cloth physics feature in UE 5.4, now available as Early Access, and explains how it enhances character realism by simulating dynamic cloth movement.
🎭 Capturing Nuance with MetaHuman Animator
This segment focuses on the importance of capturing the nuances of an actor's performance and how MetaHuman Animator, now available in UEFN, allows for this. The crew introduces actors Drew Moerlein and Khary Payton, who play Captain America and Black Panther, respectively. They discuss using MetaHuman Animator to honor the actors' performances and transform them into powerful digital performances. The video also touches on the technical aspects of using machine learning to run complex simulations and the use of the Live Link Hub application for capturing facial expressions in real time.
🌟 Showcase of Real-Time MetaHuman Performance
The crew presents a special treat, showcasing the entire bridge scene with the character's mask removed to highlight the capabilities of MetaHuman technology. They emphasize that the demonstration runs in real time, combining the actors' performances, the detailed models, and the integration of various features into a seamless, immersive experience.
🎮 Behind the Scenes with MetaHuman and Rigging
In this part, the crew discusses the behind-the-scenes process of using MetaHuman technology and rigging for character animation. They explain how the bespoke 4D rig created for Hellblade II can be used on any MetaHuman, or on any rig that follows the new MetaHuman standard. The video also covers the use of performance capture to accurately reflect an actor's emotions and the rapid processing of those performances into animation.
🛸 Death Stranding 2 and the UCA Network
The video transitions to a narrative segment from Death Stranding 2 involving the UCA (United Cities of America) network. It introduces the concept of the UCA and its role in bringing regions together, and discusses the challenges of expanding the network and the resistance from those who do not wish to join. The segment also touches on the role of the commander and the importance of maintaining the network for the survival and unity of humanity.
🌌 The Chrysalis and the Origins of Life
The final segment delves into the mystery surrounding the Chrysalis and the origins of life. It explores the discovery of amino acids within the Chrysalis, which are identical to those found in tar, suggesting a primordial soup theory. The video discusses the implications of these findings and the potential origins of the creatures that emerged from it. It also hints at a deeper story involving a character seeking revenge and the broader implications for the world.
Keywords
💡MetaHuman
💡Unreal Editor for Fortnite (UEFN)
💡Marvelous Designer
💡Cloth Physics
💡Auto Sim Setup
💡Performance Capture
💡Live Link Hub
💡Machine Learning (ML)
💡MetaHuman DNA
💡Cinematic Quality
💡Real-Time Animation
💡Digital Performance
Highlights
MetaHumans are now available for import into UEFN as non-player characters.
The crew has optimized for both quality and efficiency, reducing the file size from almost 1 GB for a hero MetaHuman to approximately 60 MB in UEFN.
Saving custom MetaHumans in the MetaHuman Creator makes them available in the new MetaHuman importer in UEFN.
Multiple quality options are provided for different project requirements.
The workflow for creating costumes involves using Marvelous Designer, a leading digital clothing software.
MetaHuman body data has been integrated into the Marvelous Designer and CLO 3D software, providing a new USD export option for garments.
In UE 5.4, custom Chaos cloth simulations are set up for realistic, cinema-quality looks.
An auto sim setup is introduced in UE 5.4, including sim data, auto LOD generation, and auto skinning.
Cloth physics are available in UEFN as Early Access, enhancing the realism of dynamic cloth objects.
The MetaHuman process allows actor performances to be faithfully transformed into digital animations.
The Live Link Hub application enables capture devices to stream directly into UEFN, with more third-party devices to be supported soon.
Machine learning is utilized to run complex simulations in Houdini and produce film-quality deformations that run in real time.
Facial performances are crucial for character realism, and AI is used to generate high-fidelity facial expressions.
The Live Link Face mobile app captures data at the best resolution possible for high-fidelity performance animation.
A single button click initiates processing, converting performance data into animation in less than a minute.
The MetaHuman DNA, generated from video and depth data, predicts all facial expressions and only needs to be created once per actor.
The performance capture process aims to work like a mirror, accurately reflecting the actor's emotions and expressions.
The 4D rig created for Hellblade II is ready to use on any MetaHuman, or on any rig following the new MetaHuman standard.
The distribution network has evolved, with new groups like Drawbridge handling work outside the UCA, aiming to connect the world.
The UCA is not looking to expand its borders but rather bring new regions into the network, with Drawbridge supporting this effort.
The mission is to help humanity move beyond the need for human porters by leveraging advanced technology and networks.
The journey continues with the aim to finish the expedition and find the strength to carry on, despite the challenges faced.