The Changing Face of Facial Mocap
Peter Busch, vice president of business development at Faceware, an industry leader in facial animation and mocap, shares Faceware’s vision of the future for this important aspect of content creation with GBR.
In Peter’s own words, “Facial animation has advanced significantly in the past several years, and it’s gearing up to advance even more in the next several. Here are five ways we expect that to happen”:
1) Details, details – Characters in today’s games are already pushing the realism envelope when it comes to facial performances, and that will only increase with time. Computing power keeps going up, driving up the level of detail that can be rendered in real time in a scene, both on mobile devices and in PC and console games. Look for more realistic facial movement, and in particular eye movement, in tomorrow’s games.
2) Not just for games and film – Content creation tools are getting easier to use, less expensive, and now support multiple languages. That means more people will have access to the hardware and software needed to create good facial performances. That, in turn, means facial animation, whether pre-rendered or rendered in real time, will appear in more indie games and expand globally beyond games and films into live shows, theme park attractions and more.
3) Replace your face – You saw it done in The Curious Case of Benjamin Button, Fast and Furious 7 and The Walk. Facial replacement is a thing. It can turn young people old, older people young, and famous people into the craziest stunt people ever. In other words, it opens up a world of possibilities for actors. Given the lowering barrier to entry, these techniques are now expanding into burgeoning markets like India and China, enabling those filmmakers to take on far more ambitious projects, like L.O.R.D and Ek Tha Tiger, where most of the scenes incorporated facial replacement.
4) Game engines get in the game – Game artists and animators can spend days creating amazing-looking faces, but in the past, most of that detail would fall apart when it got to the game engine. Game engines simply couldn’t support many of the details and techniques needed for realistic facial performances. That is changing. Unreal Engine 4 just released a new facial module to help standardize facial rigging. Look for this sort of support to become the norm, not the exception.
5) Interactivity – It’s one thing to watch an animated character in a game. It’s quite another to interact with one. Today, we’re able to interact with characters animated in real time via live performances or kiosks at theme parks, as Rooster Teeth recently did in Australia. It’s rudimentary, but it’s effective. Tomorrow, we’ll be able to interact with player-driven characters or AI-driven avatars, in game, in real time. Imagine saying something to a character in a game and having that character respond to you as they would in real life. This will change the face of games. We’ve been working toward this future for some time and will have more exciting things to announce shortly.
GBR Analyst’s view:
As facial animation tools and related AI continue to mature, we should see more realistic and engaging animated characters appearing in AR and VR games and applications as well. One can envision virtual teammates in VR games, human-like helpdesk “bots” that pop up in AR apps, and other characters like those Peter alludes to above, further driving the state of the art in facial animation, particularly for real-time rendering. Game engine developers, take note.