Epic’s new MetaHuman Creator delivers super-realistic digital actors

Epic Games today lifts the lid on MetaHuman Creator – a new tool designed to bring the highest-fidelity facial rendering to the wider development community. It’s part of Epic’s ongoing mission to democratise high-end graphics technology, giving a wider range of development studios the chance to deliver characters up there with the industry’s best and to get the most out of today’s hardware. According to Epic, we’re looking at the kind of facial quality and animation seen in high-end titles like The Last of Us Part 2 – and you can see for yourself just how close Epic’s technology gets via the embedded video on this page.

MetaHuman Creator takes the form of a browser-based app, plumbed into Unreal Engine Pixel Streaming. Vladimir Mastilovic, VP of Digital Humans Technology at Epic, told us that the initial process is as simple as playing a game, with no programming knowledge required as developers create and sculpt their digital actors – you get a sense of that in the video below. As changes and enhancements are made, MetaHuman Creator intelligently uses data from its cloud-based library to extrapolate a realistic digital person. At the end of the process, the final creation can be imported into Unreal Engine via Quixel Bridge, with full animation rigging and Maya source data provided. At that point, a massive degree of rendering customisation is available via the features of Unreal Engine itself – and the data is, of course, compatible with both UE4 and the upcoming UE5.
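
For a sense of how that final step might look in practice, here is a minimal sketch using Unreal's Python editor scripting to place an already-imported MetaHuman in a level. The asset path and character name are purely hypothetical, and it assumes the character has already been brought into the project via Quixel Bridge.

```python
# Minimal sketch: placing an already-imported MetaHuman in a level with
# Unreal's Python editor scripting. The asset path below is hypothetical
# and assumes the character was transferred via Quixel Bridge.
import unreal

# Hypothetical path to the character's Blueprint in the content browser
METAHUMAN_BP = "/Game/MetaHumans/MyCharacter/BP_MyCharacter"

# Load the Blueprint asset from the project
blueprint = unreal.EditorAssetLibrary.load_asset(METAHUMAN_BP)

if blueprint:
    # Spawn the character at the world origin with no rotation
    location = unreal.Vector(0.0, 0.0, 0.0)
    rotation = unreal.Rotator(0.0, 0.0, 0.0)
    actor = unreal.EditorLevelLibrary.spawn_actor_from_object(
        blueprint, location, rotation
    )
    if actor:
        unreal.log("Spawned MetaHuman actor: " + actor.get_name())
else:
    unreal.log_warning("Could not find asset at " + METAHUMAN_BP)
```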

Based on the quality of the samples (and there’s another one here), this looks like an impressive showing for Epic, though to reach the fidelity seen in the character rendering of first-party triple-A juggernauts, there is more to the process than graphics alone – the quality of performance and motion capture will be key. However, we are clearly seeing some cutting-edge technology here and these initial demos are striking. Skin shading, texture quality and geometric density are very impressive, while the eyes look expressive. Hair is always a particularly tricky part of rendering convincing characters, but MHC can tap into the very latest strand rendering technology to produce a convincing look – a ‘next-gen’ feature we’ve only really seen in proprietary engines so far. Strand-based hair is likely too demanding to run on anything other than next-gen consoles and high-end PCs, so MHC can fall back to more standard texture ‘cards’ for hair rendering. In fact, the system scales its creations across eight LOD levels, ensuring scalability from powerful systems down to mobile platforms.
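
To illustrate the kind of scalability logic described above, here is a purely hypothetical sketch (not Unreal's actual API) of how a project might choose between strand-based and card-based hair, and pick one of the eight LOD levels, based on platform tier and how much of the screen the character occupies.

```python
# Illustrative sketch only: selecting a hair technique and one of the eight
# MetaHuman LOD levels per platform. The tiers and thresholds are invented
# for the example, not taken from MetaHuman Creator itself.
NUM_LODS = 8  # MetaHuman assets ship with eight LOD levels

# Hypothetical platform tiers mapped to a hair technique and a minimum LOD
PLATFORM_PROFILES = {
    "high_end_pc":      {"hair": "strands", "min_lod": 0},
    "next_gen_console": {"hair": "strands", "min_lod": 0},
    "last_gen_console": {"hair": "cards",   "min_lod": 2},
    "mobile":           {"hair": "cards",   "min_lod": 4},
}

def pick_character_settings(platform: str, screen_coverage: float) -> dict:
    """Choose hair technique and LOD index for a platform.

    screen_coverage is the fraction of the screen the character fills
    (1.0 = full screen); smaller on-screen characters drop to cheaper LODs.
    """
    profile = PLATFORM_PROFILES[platform]
    # Map screen coverage onto the LOD range, then clamp to the cheapest
    # level this platform is allowed to use.
    distance_lod = int((1.0 - screen_coverage) * (NUM_LODS - 1))
    lod = max(profile["min_lod"], min(distance_lod, NUM_LODS - 1))
    return {"hair": profile["hair"], "lod": lod}

# A close-up on a high-end PC keeps strands at the richest LOD, while a
# distant character on mobile falls back to cards and a cheaper LOD.
print(pick_character_settings("high_end_pc", 0.9))  # {'hair': 'strands', 'lod': 0}
print(pick_character_settings("mobile", 0.2))       # {'hair': 'cards', 'lod': 5}
```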
