Cyberpunk 2077's dialogue was lip-synced by AI

The goal was to do all of that for every character in the open world. Due to the game's vast scope, CDPR needed to pull this off having captured zero facial motion. Instead, the studio turned to Jali Research's lip-syncing and facial animation tech, which procedurally generates how characters' faces move.

Some characters in Cyberpunk 2077 speak multiple languages, sometimes switching between them in a single sentence. To account for that, the system relies on tags in the transcript. Tags also adjust a character's facial expression when their emotional state changes within a line of dialogue, and the system uses audio analysis to replicate the emotion of a vocal performance in the animation.
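To illustrate the idea of transcript tagging, here is a minimal, hypothetical sketch in Python. The `[lang=…]` and `[emotion=…]` tag syntax is invented for illustration; CDPR and Jali have not published their actual markup format.

```python
import re

# Hypothetical inline tag format, e.g. "[lang=ja]" or "[emotion=angry]".
# This is an illustrative assumption, not the real CDPR/Jali markup.
TAG = re.compile(r"\[(lang|emotion)=(\w+)\]")

def parse_line(line):
    """Split a dialogue line into segments, each carrying the language
    and emotional state in effect when that segment is spoken."""
    state = {"lang": "en", "emotion": "neutral"}  # assumed defaults
    segments = []
    pos = 0
    for m in TAG.finditer(line):
        text = line[pos:m.start()].strip()
        if text:
            segments.append({**state, "text": text})
        # Update the running state: later text inherits the new tag value.
        state[m.group(1)] = m.group(2)
        pos = m.end()
    tail = line[pos:].strip()
    if tail:
        segments.append({**state, "text": tail})
    return segments

# A line that switches language and emotion mid-sentence:
for seg in parse_line("You got a problem? [lang=ja][emotion=angry] Urusai!"):
    print(seg["lang"], seg["emotion"], "->", seg["text"])
```

A downstream animation system could then pick per-segment mouth shapes and expression curves from each segment's language and emotion fields.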

CDPR used algorithms to help animate characters in The Witcher 3, and this system is an evolution of that approach. Many games motion-capture lip-synced dialogue in just one language, which can break immersion for players who play with audio in a different language.

While Cyberpunk 2077's procedurally generated animations may not be as elaborate or expressive as the hand-crafted work in some other AAA games, they should improve the experience for the many players who want to play in different languages. We'll be able to see how the tech works in practice when the game launches on November 19.