CPU-intensive effects like Gaussian Blur now render faster thanks to GPU acceleration.
Lumetri Color has been similarly supercharged, and gains one-click white balance selection and HSL Secondary color correction, a feature also included in this year’s Premiere Pro CC update. There’s also a new Queue in AME option, which resolves issues with the older Add to Adobe Media Encoder Queue option, where project settings frequently weren’t retained during export.
Animators who rely on Cinema 4D also have something to love, courtesy of the new roundtrip workflow. This works by first exporting 3D text or extruded shapes in Cinema 4D format, then importing that file back into After Effects. This creates a Cineware link between the AE composition and Cinema 4D, so edits made in one application are automatically reflected in the other.
There’s still room for improvement. Having to first spit out a Cinema 4D file then place it back into After Effects isn’t as streamlined as it could be, but that minor inconvenience is largely negated by the ability to see live updates in the first place.
Although technically a standalone application, After Effects CC 2015.3 also includes a fourth preview of Character Animator. This software brings to life layered 2D character artwork created in Photoshop or Illustrator as a virtual puppet, which users control in real time by acting out movements in front of a webcam.
It’s easier to tag puppets in Character Animator, thanks to a new panel for applying alternative facial, mouth, or trigger behaviors in a few clicks.
Like many others, I was blown away by how fun and easy to use Character Animator was after playing with the initial release last summer. The ability to track facial movements, capture a performance, and synchronize mouth movements was the kind of innovation we all hope for but rarely receive.
However, that release was fairly rudimentary, and more proof of concept than a full-featured animation package. That’s still the case today, but Adobe inches closer to something useful with an easier way to visually tag layer and handle names, record multiple takes of a character’s movement, and directly export to Adobe Media Encoder.
The biggest limitation with Character Animator is that it currently only works with your head and face. Adobe attempts to rectify this by allowing puppets to respond to specific keyboard or mouse input, which triggers corresponding animations on-screen. Such improvements were first used earlier this year on a live segment of the long-running animated sitcom The Simpsons, in which veteran voice talent Dan Castellaneta brought Homer to life in a whole new way.