Adobe Character Animator – Adobe Character Animator is a distinctive animation application that lets users animate 2D characters in real time using their own face and voice. Part of Adobe’s Creative Cloud, it works by tracking an actor’s facial expressions and movements through a standard webcam and microphone.
When you speak, smile, or blink, the software captures those actions and makes your illustrated character (often called a “puppet”) mirror them instantly. In other words, you can perform as a cartoon character live: a streamer, for example, can appear as an animated avatar that talks and reacts in sync with them.
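To make the real-time capture concrete, here is a minimal, purely illustrative sketch of how webcam face landmarks might be turned into puppet parameters. Character Animator’s tracker is built in and closed, so everything here (the landmark layout, the `eye_openness` and `mouth_openness` helpers, the parameter names) is an assumption for illustration, not Adobe’s implementation:

```python
"""Illustrative sketch of webcam-driven puppeteering, NOT Adobe's code.

The general idea: derive normalized expression values from face
landmarks each frame and feed them to named puppet controls.
"""
from dataclasses import dataclass


@dataclass
class Point:
    x: float
    y: float


def distance(a: Point, b: Point) -> float:
    return ((a.x - b.x) ** 2 + (a.y - b.y) ** 2) ** 0.5


def eye_openness(upper_lid: Point, lower_lid: Point,
                 inner_corner: Point, outer_corner: Point) -> float:
    """Vertical lid gap normalized by eye width -> near 0 when blinking."""
    width = distance(inner_corner, outer_corner) or 1e-6
    return distance(upper_lid, lower_lid) / width


def mouth_openness(upper_lip: Point, lower_lip: Point,
                   left_corner: Point, right_corner: Point) -> float:
    """Lip gap normalized by mouth width -> grows as the jaw drops."""
    width = distance(left_corner, right_corner) or 1e-6
    return distance(upper_lip, lower_lip) / width


# One frame of made-up landmark positions from a hypothetical tracker.
frame = {
    "eye":   (Point(0.42, 0.38), Point(0.42, 0.41),
              Point(0.38, 0.40), Point(0.46, 0.40)),
    "mouth": (Point(0.50, 0.60), Point(0.50, 0.66),
              Point(0.44, 0.63), Point(0.56, 0.63)),
}

# Clamp the ratios into 0..1 puppet parameters (thresholds are guesses).
params = {
    "Blink":      1.0 - min(eye_openness(*frame["eye"]) / 0.25, 1.0),
    "Mouth Open": min(mouth_openness(*frame["mouth"]) / 0.5, 1.0),
}
print(params)  # e.g. {'Blink': 0.0, 'Mouth Open': 1.0}
```

The core trick, normalizing a lid or lip gap by a stable facial width so the value stays in a usable 0-to-1 range regardless of head size or camera distance, is standard in face-driven animation generally.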
Character Animator also supports triggering preset gestures (such as a wave or a point) via keyboard shortcuts or UI buttons, and it can track arm and body movement if you use additional input methods. Another strength is its automatic lip-sync: as you talk, it matches your speech to the character’s mouth shapes, sparing you hours of manual keyframing; a simplified sketch of the idea appears below.

Adobe provides a library of ready-to-use character puppets, and users can import their own artwork from Photoshop or Illustrator and rig it with Character Animator’s tools, defining which parts (eyes, mouth, limbs, and so on) are draggable or face-driven. The second sketch below shows the kind of layer structure this rigging reads.

The software is used both for live entertainment and for recorded animation: it has powered live animated characters on late-night TV shows, and YouTubers and educators use it to produce cartoon explainer videos quickly. In summary, Adobe Character Animator brings a playful and powerful approach to 2D animation, leveraging AI and motion tracking to let anyone puppeteer a character with their face and voice, dramatically speeding up the animation process and enabling live cartoon performances.
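To ground the lip-sync mechanism described above, here is a hedged sketch of the central mapping step: phonemes in, timed mouth shapes out. The mouth-shape names (Aa, D, Ee, F, L, M, Oh, R, S, Uh, W-Oo, Neutral) match the viseme layers Character Animator’s default puppets use, but the phoneme table itself is a simplified assumption rather than Adobe’s exact one, and Character Animator performs this analysis automatically from your audio:

```python
"""Hedged sketch of the lip-sync mapping step, assuming a phoneme stream.

Character Animator derives visemes from audio internally; this only
illustrates the idea of mapping phonemes onto named mouth layers.
"""

# Simplified, assumed phoneme-to-viseme table (ARPAbet-style symbols).
PHONEME_TO_VISEME = {
    "AA": "Aa", "AE": "Aa", "AH": "Aa",
    "IY": "Ee", "EY": "Ee", "IH": "Ee",
    "OW": "Oh", "AO": "Oh",
    "UW": "W-Oo", "W": "W-Oo",
    "UH": "Uh",
    "F": "F", "V": "F",
    "M": "M", "B": "M", "P": "M",
    "L": "L", "R": "R",
    "S": "S", "Z": "S",
    "D": "D", "T": "D", "N": "D",
}


def visemes_for(phonemes):
    """Turn (phoneme, start_sec, end_sec) tuples into a timed viseme
    track, collapsing consecutive runs of the same mouth shape."""
    track = []
    for phoneme, start, end in phonemes:
        shape = PHONEME_TO_VISEME.get(phoneme, "Neutral")
        if track and track[-1][0] == shape:
            track[-1] = (shape, track[-1][1], end)  # extend the run
        else:
            track.append((shape, start, end))
    return track


# "Hello" ~ HH AH L OW; HH is unmapped, so it falls back to Neutral.
print(visemes_for([("HH", 0.00, 0.05), ("AH", 0.05, 0.15),
                   ("L", 0.15, 0.25), ("OW", 0.25, 0.40)]))
# [('Neutral', 0.0, 0.05), ('Aa', 0.05, 0.15),
#  ('L', 0.15, 0.25), ('Oh', 0.25, 0.4)]
```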
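And to show what rigging imported artwork means in practice, the following sketch mirrors the kind of Photoshop layer hierarchy Character Animator reads. The real convention is that layer names act as tags and a “+” prefix marks a group as independent; the specific character layers below are invented for illustration, so treat the details as assumptions rather than a rigging reference:

```python
"""Sketch of a Photoshop layer tree a Character Animator puppet rigs from.

Layer names double as tags; the '+' prefix marks an independent group.
The exact tag vocabulary lives in Adobe's documentation.
"""
puppet_psd = {
    "+Character": {
        "+Head": {
            "Left Eyebrow": None,   # face-driven: follows your brow
            "Right Eyebrow": None,
            "Left Blink": None,     # shown when the tracker sees a blink
            "Right Blink": None,
            "Mouth": {              # viseme group driven by lip-sync
                "Neutral": None, "Aa": None, "D": None, "Ee": None,
                "F": None, "L": None, "M": None, "Oh": None,
                "R": None, "S": None, "Uh": None, "W-Oo": None,
            },
        },
        "+Body": {
            "Left Arm": None,       # can be made draggable in the rig
            "Right Arm": None,
        },
    },
}


def independent_layers(tree, path=""):
    """List every group whose '+' name prefix marks it as independent."""
    found = []
    for name, children in tree.items():
        full = f"{path}/{name}"
        if name.startswith("+"):
            found.append(full)
        if isinstance(children, dict):
            found += independent_layers(children, full)
    return found


print(independent_layers(puppet_psd))
# ['/+Character', '/+Character/+Head', '/+Character/+Body']
```

Keeping the rig as a named layer tree is what lets artists stay in Photoshop or Illustrator: redraw a layer, keep its name, and the puppet picks up the change.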