Facial expressions of video game characters are becoming increasingly realistic, and they may take another big leap thanks to techniques that large studios have recently adopted.
Now it is Unreal Engine's turn: Epic has presented an iOS app that captures facial expressions and animates an on-screen character in real time.
Live Link Face is designed to work both in professional game production environments, on stage with actors, and for a single artist at a desk. It is available on the App Store, so anyone can try it out.
The app uses Apple's augmented reality platform, ARKit, and the iPhone's TrueDepth front camera (introduced with the iPhone X in 2017) to capture facial movement and stream the data to Unreal Engine. It also captures head and neck rotation, so those movements carry over into the animation as well.
The app uses multicast networking to stream to multiple machines at the same time and keep them synchronized, which is useful in both professional and amateur setups.
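The multicast idea can be sketched in a few lines: the capture device sends each frame of animation data to a multicast group, and every machine that has joined that group receives the same packet, so all of them stay in sync without the phone addressing each one individually. The Python sketch below is illustrative only: the blendshape names mirror ARKit's face-tracking coefficients, but the JSON payload, group address, and port are assumptions for the example, not Live Link Face's actual wire protocol.

```python
import json
import socket

def encode_frame(blendshapes: dict, subject: str = "iPhoneFace") -> bytes:
    """Serialize one frame of facial-capture data as JSON bytes.
    Curve names here follow ARKit's blendshape identifiers; the
    payload format itself is a stand-in, not Live Link's protocol."""
    return json.dumps({"subject": subject, "curves": blendshapes}).encode("utf-8")

def send_frame(payload: bytes, group: str = "239.255.0.1", port: int = 11111) -> int:
    """Send one frame to a multicast group: a single sendto() reaches
    every receiver on the network that joined the group."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    # Keep packets on the local network segment.
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_MULTICAST_TTL, 1)
    # Route via loopback so the sketch runs on a machine with no LAN;
    # a real setup would use the default (Wi-Fi/Ethernet) interface.
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_MULTICAST_IF,
                    socket.inet_aton("127.0.0.1"))
    try:
        return sock.sendto(payload, (group, port))
    finally:
        sock.close()

frame = encode_frame({"jawOpen": 0.42, "eyeBlinkLeft": 0.9, "browInnerUp": 0.1})
# Guarded so the sketch still runs on hosts without a multicast-capable interface.
try:
    sent = send_frame(frame)
except OSError:
    sent = 0
```

A receiver would join the same group with `IP_ADD_MEMBERSHIP` and read frames off a bound UDP socket; because delivery is one-to-many, adding another Unreal Engine instance costs the sender nothing.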
It is unclear whether something similar will come to Android: for now the app relies on technology internal to the Apple platform, which offers a far more controlled environment, since there are not nearly as many different front cameras as in the Android world.
This kind of technology makes more realistic games possible, and Epic now wants to democratize it so that virtual production reaches more places, not just professional video game studios.