Kling Adds Lip Sync - a Game Changer for AI Video

Kling, one of the best artificial intelligence video generators, has added the most accurate lip-sync feature I have ever used. It even works with faces that are not looking directly at the camera.

Lip-sync is one of the holy grails of AI video. Done correctly and made to look realistic, it could pave the way for artificial actors, for better or worse. It would allow, for example, a single AI video producer to create an entire production with dialogue. Kling is getting close, but not as close as Danny DeVito.

There have been many updates to Kling in recent weeks, including the new v1.5 model, community features, and motion brushes.

Currently, lip-sync only works on human characters, but you can push it to work on humanoid aliens and animals by giving them flat, human-like faces (though I'm not sure why you would want to do that).

To use lip-sync with Kling, first generate a video. Then click the lip-sync option and Kling will track the character's mouth movements throughout the video. This takes about 10 minutes, but it is what makes Kling so effective.

Once the mouth movements are tracked and isolated, you can upload the audio. This can come from ElevenLabs, real recorded audio, or a recording of ChatGPT's Advanced Voice speaking.

Kling will then match the voice to the video and animate the mouth so that the character appears to be speaking (or singing) the words. A slight uncanny-valley effect may occur, but this depends partly on how accurately the mouth movements were tracked.

Applying lip-sync to a 10-second video costs 10 credits, and it cannot be applied to more than 10 seconds at a time. Kling advertises "no post-production required," but if you need a longer monologue, you will have to rely on another tool such as LipDub AI, HeyGen, or Hedra.

In addition to lip-syncing, the latest update introduces a community feature that lets you earn credits for more generations by sharing your work. I'm not sure how long you can keep this credit-generation loop going, but it's worth a try.

Kling also introduced motion brushes in its latest update. This is similar to the motion brush feature in Runway Gen-2, which we are still waiting to see return to Runway. Basically, you select elements in an image and tell Kling how to move them. The best example I've seen is a yoga video animated this way, with a few selected elements each assigned their own movement.

Kling has also joined Runway and Luma Labs in confirming that it will release an API, allowing developers to integrate AI video generation into their own products.

Overall, Kling is firmly established as a leader in the generative AI video space. It combines a variety of useful production features with an understanding of motion and physics so realistic that no other model can match it.
