I recently published a co-authored paper at the 3rd IEEE Conference on Games, sponsored by the IEEE Computer Society. The paper, entitled ‘A Novel Lip Synchronization Approach for Games and Virtual Environments’, presents an algorithm for the offline generation of lip-sync animation. It redefines visemes as sets of constraints on facial articulators such as the lips, jaw, and tongue.
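To give a rough sense of the idea (not the paper's exact formulation), here is a minimal Python sketch of visemes represented as constraint sets over articulators, blended per articulator rather than as whole key poses. All names and parameters (`ArticulatorConstraint`, `Viseme`, `solve_pose`, the weights) are hypothetical illustrations, not the paper's API.

```python
from dataclasses import dataclass

# Hypothetical sketch: a viseme as a set of constraints on facial
# articulators rather than a single fixed key pose.

@dataclass
class ArticulatorConstraint:
    articulator: str      # e.g. "jaw_open", "lip_closure", "tongue_tip"
    target: float         # desired normalized pose value in [0, 1]
    weight: float = 1.0   # how strongly this viseme constrains the articulator

@dataclass
class Viseme:
    name: str
    constraints: list[ArticulatorConstraint]

def solve_pose(active: list[tuple[Viseme, float]]) -> dict[str, float]:
    """Blend the constraints of the active (viseme, activation) pairs
    into one pose, as a weighted average per articulator."""
    weighted_sum: dict[str, float] = {}
    weight_total: dict[str, float] = {}
    for viseme, activation in active:
        for c in viseme.constraints:
            w = c.weight * activation
            weighted_sum[c.articulator] = weighted_sum.get(c.articulator, 0.0) + w * c.target
            weight_total[c.articulator] = weight_total.get(c.articulator, 0.0) + w
    return {a: weighted_sum[a] / weight_total[a] for a in weighted_sum}

# Example: /m/ mainly constrains lip closure and leaves the tongue free,
# while /aa/ constrains jaw opening; overlapping activations blend per articulator.
m = Viseme("m", [ArticulatorConstraint("lip_closure", 1.0, weight=2.0)])
aa = Viseme("aa", [ArticulatorConstraint("jaw_open", 0.8),
                   ArticulatorConstraint("lip_closure", 0.0, weight=0.5)])
print(solve_pose([(m, 0.3), (aa, 0.7)]))
```

Because each viseme only constrains the articulators it actually cares about, neighbouring visemes can influence the remaining articulators, which is what makes a constraint-based formulation attractive for coarticulation.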
The algorithm was evaluated comparatively with 30 healthy participants, who were shown a set of phrases delivered verbally by a virtual character. Each phrase was presented in two versions: once animated with a traditional lip-sync method and once with our method. The results confirm that the proposed approach produces more natural animation than standard keyframe interpolation techniques.
More information: