Imagine having a 2D avatar that we can control with our body or even our facial movements.
That is the goal of Pose Animator: a system that animates avatars in real time so they mimic what we are doing in real life.
The project is open source, lives on GitHub, and can be tested at this address. Just select the desired option and grant webcam permission: the character will move its body in sync with ours, along with its mouth, nose, and eyes.
Pose Animator takes a 2D vector illustration and animates its curves in real time based on the recognition result of PoseNet and FaceMesh. It borrows the idea of skeleton-based animation from computer graphics and applies it to vector characters.
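In skeleton-based animation, each point on the illustration's curves is bound to one or more bones with skinning weights; when the recognized pose moves the bones, the points move as a weighted blend of the bone transforms. Here is a minimal sketch of that idea (linear blend skinning) in Python; the function names and the bone representation are illustrative assumptions, not taken from the Pose Animator codebase:

```python
import math

def rotate(point, angle):
    """Rotate a 2D point around the origin."""
    x, y = point
    c, s = math.cos(angle), math.sin(angle)
    return (c * x - s * y, s * x + c * y)

def skin_point(point, bones, weights):
    """Deform one curve point as a weighted blend of bone transforms.

    bones:   list of (pivot, angle, translation) tuples, one per bone
    weights: skinning weights for this point (should sum to 1)
    """
    out_x = out_y = 0.0
    for (pivot, angle, trans), w in zip(bones, weights):
        # Move into the bone's local space, rotate, move back, then translate.
        local = (point[0] - pivot[0], point[1] - pivot[1])
        rx, ry = rotate(local, angle)
        out_x += w * (rx + pivot[0] + trans[0])
        out_y += w * (ry + pivot[1] + trans[1])
    return (out_x, out_y)

# A point fully bound to a single bone that translates by (2, 0):
bones = [((0.0, 0.0), 0.0, (2.0, 0.0))]
print(skin_point((1.0, 1.0), bones, [1.0]))  # (3.0, 1.0)
```

In Pose Animator itself this kind of deformation runs in JavaScript on the SVG illustration's curve points every frame, driven by the keypoints PoseNet and FaceMesh detect.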
If you are a developer, you can apply the technique in your own programs: join video calls as an avatar instead of showing your real face, or create interactive games that players advance through by moving their bodies.
The possibilities are huge, and since the project is on GitHub it can be improved and adapted to bring more immersion to the programs we create.