Adobe releases Character Animator 3.4

Tuesday, October 20, 2020 | Written by Jim Thacker

Originally published August 20, 2020. Scroll down for commercial release news.

Adobe has introduced Speech-Aware Animation, an interesting new AI-based system for automatically generating head movements for characters in Character Animator.

The feature is available in a new public beta of the software, Adobe's tool for generating real-time puppet-style animation from an actor's video footage, along with a new Limb IK system and updates to the timeline.

Adobe's blog post about the beta doesn't give it a specific version number, but the online documentation refers to it as Character Animator 3.4.

New speech-aware animation system generates head and eyebrow movements from audio recordings
Character Animator 3.4 is the first experimental version of the software Adobe has released since launching its public beta program earlier this year.

The main feature is Speech-Aware Animation, a new AI-based system that automatically generates head movements and eyebrow positions for an animated character from recorded speech.

The underlying technology is based on Project Sweet Talk, a demo that Adobe showed at Adobe MAX 2019.

In its original form, Project Sweet Talk generated mouth shapes as well as head movements and eyebrow raises from audio recordings. However, since Character Animator already has its own lip-sync system, the mouth-shape generation is not used here.

The implementation is a bit strange: Speech-Aware Animation has to be calculated separately from lip sync and, unlike lip sync, is not generated from scene audio – you have to import a separate audio file.

Updated September 2, 2020: Adobe says that keeping the two processes separate adds flexibility to the workflow, but that they may be combined in future, depending on feedback from the public beta.

However, the system can automatically rotate, scale, and tilt a character's head, and users can adjust a number of control parameters to refine the animation manually.

New Limb IK system, plus updates to the animation timeline workflow
Other new features in Character Animator 3.4's public beta include Limb IK, which, as its name suggests, extends the software's existing Arm IK inverse kinematics system to a character's legs.

The feature makes it possible to reposition a character's entire arm or leg by moving its hand or foot (or paw, claw, or tentacle tip), with the rest of the limb automatically following and deforming.

Users also get a self-explanatory new "Pin Feet When Standing" option for the Walk behavior.

Other changes include the option to merge takes in the Timeline panel, along with a number of other workflow improvements, including the ability to color-code, filter, or isolate takes.

Updated on October 20, 2020: Character Animator 3.4 is shipping now. In the online documentation, the update is also referred to as the October 2020 software release.

It is available for Windows 10, Windows Server 2016+, and macOS 10.13 (macOS 10.15 is required for Speech-Aware Animation), only through Adobe's All Apps subscriptions, which cost $79.49/month.

For a full list of what's new in Character Animator 3.4, see the online documentation.

Tags: 2D animation, 3D animation, Adobe, AI-based, AI-controlled, automatic, cartoon animation, Character Animator, Character Animator 3.4, Character Rigging, eyebrow position, raising eyebrows, face animation, facial expression, generate face animation from audio, Facial animation from recorded speech, head movement, IK, inverse kinematics, Isolate Take, Limb IK, merge takes, new functions, October 2020, preview, price, Project Sweet Talk, public beta, puppet animation, sneak peek, speech-aware animation, system requirements, Timeline
