Sneak peek: Adobe’s Speech-Aware Animation system

Thursday, August 20, 2020 | Written by Jim Thacker

Adobe has introduced Speech-Aware Animation, an interesting new AI-driven system for automatically generating head movements for a character in Character Animator.

The feature is available in a new public beta of the software – which generates real-time puppet-style animation from video footage of an actor – along with a new Limb IK system and updates to the timeline.

Adobe's blog post about the beta doesn't give it a specific version number, but much of the online documentation refers to it as Character Animator 3.4.

The new speech-aware animation system generates head and eyebrow movements from audio recordings
Character Animator 3.4 – or whatever it's called – is the first experimental version of the software Adobe has released since launching its beta program earlier this year.

The main feature is Speech-Aware Animation, a new AI-driven system that automatically generates head movements and eyebrow positions for an animated character from recorded speech.

The underlying technology is based on Project Sweet Talk, a demo that Adobe showed at Adobe MAX 2019.

In its original form, it generated mouth shapes from audio recordings, as well as head movements and eyebrow raises. However, since Character Animator already has its own lip-sync system, that part of the technology is not used here.

The implementation is a little unusual: Speech-Aware Animation has to be computed separately from lip sync and, unlike lip sync, is not generated from scene audio – you need to import a separate audio file.

However, it can automatically rotate and scale a character's head, as well as tilting it slightly, and users can adjust a number of control parameters to refine the animation manually.
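
Adobe hasn't said how Speech-Aware Animation works under the hood, so purely as a conceptual illustration, here is a deliberately naive Python sketch that maps the short-term loudness of an audio clip to a per-frame head-tilt curve. It is not Adobe's method – real speech-aware systems use trained models rather than raw signal energy – and every name and parameter in it is hypothetical.

```python
import numpy as np

def head_tilt_from_audio(samples, rate, fps=24, max_tilt_deg=8.0):
    """Toy illustration only: derive a per-frame head-tilt curve from
    the loudness of an audio signal. The input/output shape matches the
    real feature (audio in, animation curves out), but the mapping here
    is a naive stand-in, not a trained model."""
    hop = rate // fps                      # audio samples per animation frame
    n_frames = len(samples) // hop
    # Short-term RMS energy, one value per animation frame.
    frames = samples[:n_frames * hop].astype(float).reshape(n_frames, hop)
    energy = np.sqrt((frames ** 2).mean(axis=1))
    # Normalise to 0..1 and smooth so the head doesn't jitter.
    norm = energy / max(energy.max(), 1e-9)
    smooth = np.convolve(norm, np.ones(5) / 5.0, mode="same")
    # Map loudness to a tilt angle in degrees.
    return smooth * max_tilt_deg

# One second of fake 48 kHz audio -> 24 frames of tilt values.
tilt = head_tilt_from_audio(np.random.randn(48000), 48000)
print(tilt.shape, tilt.min(), tilt.max())
```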

Extended Limb IK system, plus updates to the animation timeline workflow
Other new features in Character Animator 3.4's Public Beta include Limb IK, which, as the name suggests, extends the software's existing Arm IK inverse kinematics system to the legs of a character.

The feature allows a character's entire arm or leg to be repositioned by moving their hand or foot (or paw, claw, or tentacle tip) with the rest of the limb automatically following and deforming.
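
For readers new to inverse kinematics, the underlying idea can be shown with a standard analytic two-bone solver. The sketch below is a generic textbook approach, not Character Animator's implementation, and all names in it are illustrative.

```python
import math

def two_bone_ik(l1, l2, tx, ty):
    """Minimal analytic two-bone IK in 2D. Given bone lengths l1, l2 and
    a target (tx, ty) relative to the limb's root, return the absolute
    angle of the first bone and the relative bend of the second bone
    (both in radians) that place the limb tip on the target."""
    # Clamp the target distance to the reachable range so acos stays valid.
    dist = max(abs(l1 - l2), min(l1 + l2, math.hypot(tx, ty)), 1e-9)
    # Interior angle at the elbow/knee, from the law of cosines.
    cos_interior = (l1 ** 2 + l2 ** 2 - dist ** 2) / (2 * l1 * l2)
    bend = math.pi - math.acos(max(-1.0, min(1.0, cos_interior)))
    # First bone: aim at the target, then rotate back by the root angle
    # of the triangle formed by the two bones and the target line.
    cos_root = (l1 ** 2 + dist ** 2 - l2 ** 2) / (2 * l1 * dist)
    base = math.atan2(ty, tx) - math.acos(max(-1.0, min(1.0, cos_root)))
    return base, bend

# Dragging a foot to (1.2, -0.8) with thigh and shin lengths of 1.0:
print(two_bone_ik(1.0, 1.0, 1.2, -0.8))
```

This is what "the rest of the limb automatically following" amounts to: the animator moves only the end effector, and the solver recomputes the joint angles every frame.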

Users also get a self-explanatory new "Pin Feet When Standing" option for walking behavior.

Other changes include the option to merge takes in the Timeline panel, as well as a number of other workflow improvements, including the option to color-code, filter, or isolate takes.

Pricing and availability
Speech-Aware Animation is available as part of the Character Animator public beta, which is free to existing users of the software; new users can install a trial version. It requires Windows 10 or macOS 10.15.

The current stable build, Character Animator 3.3, is available for Windows 10, Windows Server 2016+ and macOS 10.13, and only via Adobe's All Apps subscription, which costs $79.49/month.

Read an overview of upcoming features in Character Animator on the Adobe blog

Read Adobe's FAQs on public betas

Tags: 2D animation, 3D animation, Adobe, AI-based, AI-driven, automatic, cartoon animation, Character Animator, Character Animator 3.4, character rigging, eyebrow position, raising eyebrows, face animation, facial expression, generate face animation from audio, facial animation from recorded speech, head movement, IK, inverse kinematics, Isolate Take, Limb IK, merge takes, new features, preview, price, Project Sweet Talk, public beta, puppet animation, sneak peek, speech-aware animation, system requirements, timeline
