It's been an eventful week for Epic Games. The company has taken a minority stake in industry juggernaut SideFX and has acquired Hyprsense, a move that builds on Epic's Digital Human program through both research and expansion. Meanwhile, the major houses of the VFX industry are making their own advances with UE4 on projects beyond traditional VFX work, such as Digital Domain's move into digital assistants.
Epic Games took a minority stake in SideFX, the makers of Houdini, earlier this week. SideFX's press release indicates its intention to partner with Epic. Sci-Tech Oscar winner Kim Davidson remains the majority owner of SideFX as well as President and CEO. The company stated, "He continues his strong, unwavering commitment to SideFX employees, customers and partners." SideFX and Epic are both keen to see SideFX continue to work with other industry partners – including all other content creation applications and game engines. This new development will have no impact on the Houdini development roadmap as SideFX will continue to define its own path as the industry-leading 3D procedural platform for film, TV, advertising and games.
It bodes well for more real-time simulation work in UE and for a tighter path to get assets from Houdini into the Unreal Engine. Over the past several years, Epic has shown a great deal of commitment to real-time simulation work, first with Chaos, its high-performance physics and destruction system, and most recently with its strand-based hair and fur system. While Houdini is much more than just a simulation tool, it is generally considered best in class for 3D VFX simulation and is the de facto standard for high-end visual effects simulation, used in most (if not all) major visual effects pipelines. The most effective collaboration could be to create simpler data and asset round trips between UE and Houdini that would support game, VFX and virtual production work worldwide.
Hyprsense was founded in 2015 by Jihun Yu, Jungwoon Park and Kenneth Ryu. Hyprsense's goal is to provide real-time facial motion capture solutions for animation, 3D avatars, and digital humans. The company first developed its face tracking system for VR HMDs and then rolled out a 2D webcam-based generic face tracking solution for mobile, PC and embedded platforms.
Animating compelling digital characters is clearly an integral part of Epic's strategy. Hyprsense's technology and team are joining other key players who have become part of Epic Games, notably 3Lateral and Cubic Motion.
Epic hopes to give developers the ability to use digital humans in a range of applications that go beyond what is traditionally referred to as a "game." While Hyprsense has grown into a leader in the gaming industry over the past five years, with its technology adopted by the world's top AAA gaming companies, Epic is also interested in a wider audience. With this acquisition, Epic continues its work to make the production of digital human content more accessible. The acquisition will help accelerate Epic's efforts to build new tools and give Unreal Engine users the ability to deploy and control advanced character assets on virtually any platform.
“We are proud and excited to be adding our character animation technology to the Epic Games ecosystem. Joining Epic gives us the opportunity to deliver new solutions and experiences at scale,” said Jihun Yu, co-founder and CEO of Hyprsense. “We are very grateful to the Hyprsense team for their tireless work in getting us here, as well as our customers, partners and supporting investors.”
"By engaging the Hyprsense team, we can continue to innovate digital characters and move closer to the goal of giving all developers full control over how they can express their vision to the smallest nuance," said Kim Libreri, CTO of Epic Games. Since joining Epic Games as CTO six years ago, Libreri has worked closely with founder Tim Sweeney to bring movie industry technology to the gaming community and bring Epic's tools to a wider audience. He was also central to Epic's advances in digital human technology across all of the company's many industry segments.
Digital Domain's Douglas
Digital Domain, which had already shown Digital Doug as an avatar or digital puppet, is now expanding its work into realistic digital assistants. "Douglas" is at the forefront of a new wave of realistic, autonomous, real-time digital humans. Douglas is a real-time digital human built with the latest UE 4.25 software. He is still under development, but the goal is to make the digital human agent the new user interface (UI) for human-computer interaction (HCI). Douglas differs from Digital Doug in that the latter simply emulated or reproduced the facial reactions of the actual Doug Roble (Senior Director of Software Research and Development), as we described in our original fxpodcast on Digital Doug in 2018. The new Douglas has to do much more than just look good. This new digital agent needs to connect to Natural Language Processing (NLP) to hold conversations that feel natural: he must both make sense of the question asked and provide a plausible answer. Douglas is also a chameleon-like character in that he can switch faces, giving future customers even more flexibility when he launches in 2021.
"Everywhere you look you see virtual assistants, chatbots, and other forms of AI-based communication interacting with people," said Darren Hendler, director of the Digital Humans Group at Digital Domain. “As companies decide to expand on previous voice-only interactions, there will be a real need for photorealistic people to behave as we expect them to. This is where Douglas comes in. "
Douglas uses numerous types of machine learning, backed by substantial research and development, to reproduce the mannerisms people expect from a lifelike digital human. By focusing on language processing, expressions, vision tracking and more, Douglas can learn from conversations and, most importantly, remember people. Many current voice-only assistants, such as Siri and Alexa, do not visually identify the people who interact with them. Douglas's response rate currently matches Alexa's and Siri's in terms of natural conversation flow, but he also remembers his previous interactions and aims to eliminate the long processing pauses that slow down many other autonomous digital humans.
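Digital Domain has not published details of Douglas's architecture, but the idea of an agent that both parses an utterance and remembers individual speakers across interactions can be illustrated with a toy sketch. Everything below — the `DigitalAgent` class, its rule-based intent matching, and the per-speaker memory — is a hypothetical simplification for illustration, not DD's actual system (which uses machine learning rather than string matching):

```python
from dataclasses import dataclass, field

@dataclass
class ConversationMemory:
    """Stores facts about each individual speaker across interactions."""
    facts: dict = field(default_factory=dict)

    def remember(self, speaker: str, key: str, value: str) -> None:
        self.facts.setdefault(speaker, {})[key] = value

    def recall(self, speaker: str, key: str):
        return self.facts.get(speaker, {}).get(key)

class DigitalAgent:
    """Toy conversational agent: interpret the utterance, consult memory,
    and produce a plausible reply. Real systems would replace the string
    matching below with NLP models."""

    def __init__(self):
        self.memory = ConversationMemory()

    def respond(self, speaker: str, utterance: str) -> str:
        text = utterance.lower()
        if text.startswith("my name is "):
            # Extract and remember the speaker's name.
            name = utterance[11:].strip().rstrip(".")
            self.memory.remember(speaker, "name", name)
            return f"Nice to meet you, {name}."
        if "my name" in text:
            # Answer from memory of this particular speaker.
            name = self.memory.recall(speaker, "name")
            if name:
                return f"You told me your name is {name}."
            return "I don't know your name yet."
        return "Tell me more."

agent = DigitalAgent()
print(agent.respond("visitor_1", "My name is Doug."))   # Nice to meet you, Doug.
print(agent.respond("visitor_1", "What is my name?"))   # You told me your name is Doug.
```

The point of the sketch is the separation of concerns the article describes: understanding the question, recalling who is speaking, and generating a response quickly enough to avoid the processing pauses that break conversational flow.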
Digital Domain (DD) also creates and emulates new human voices. Using many of the same underlying advanced software approaches, a new digital human can be created with just 30 minutes of audio or 10 minutes of video. This is dramatically lower than many other approaches, including earlier versions of DD's own original technology from just 2 years ago.
DD will offer Douglas's core technology as a service to businesses that need a digital agent to answer questions or help customers with repetitive tasks. The current version is already designed to connect with most existing chatbot or assistant systems. Crucially, the realistic face delivers an emotionally intelligent response in real-time interactions. Starting next year, DD hopes the technology can be used online, on meeting platforms, and in kiosks around the world.
Douglas is the newest UE4 digital human, modeled on Dr. Doug Roble. DD's Digital Doug program has already led to several advances in both real-time digital humans and AI face capture. By comparing Douglas to the real Roble, DD was able to improve the realism of the design while preparing the technology for wider use.
To create Douglas, Roble underwent over a hundred hours of performance capture, including a live book reading that recorded his voice and expressions. A neural rendering tool was also trained by photographing Roble under different lighting conditions, as we described in our story on DD's Masquerade. From this data, the tool can now deliver a level of realism previously unattainable with conventional techniques, including replicating someone else's mannerisms from just a few expressions. In recent months, the team has also begun swapping faces.
Douglas is one of a number of projects through which visual effects companies are expanding and diversifying their businesses. Such projects increasingly rely on the Unreal Engine as the trend toward real-time applications continues.