The difference between the FaceRig versions is only in their licensing, which is based on how you use the software. FaceRig Classic is fully featured for private, non-commercial use, and you can also use it on your YouTube channel, provided you make less than $500 per month. Anything more, and you need to buy FaceRig Pro, which is just like Classic feature-wise but can be used by commercial users. FaceRig Studio, targeted at businesses, offers six types of licenses for enterprises and commercial establishments.
FaceRig was conceptualized, designed and developed by experienced game developers who went indie. It comes in three major ‘flavors’ – Classic, Pro and Studio – but there are no technical differences between the free FaceRig Classic and the paid-for Pro/Studio.
With the recent hype surrounding FaceRig and the release of their Live2D module, it was certainly interesting to see two really compelling pieces of technology merge into one neat application, with uses in areas like gaming and new human-computer interaction systems. With that, I wanted to take the opportunity to hack together a quick-and-dirty, web-based implementation of FaceRig’s Live2D module for the browser. In the past, I have implemented a demo of Live2D in the browser based on OpenGL; for this implementation, I utilized that same Live2D demo codebase and simply looked through the SDK for the parameters that deform the angular direction of the character’s face. For facial tracking, I implemented a Javascript-based facial tracker built on the Constrained Local Models of Saragih et al. Of course, further refinements to improve the tracking ability and the fluidity of the animation are possible in the future.

This works fairly well for a prototype (although it is not perfect); it could probably be truly optimized with either a better model or a completely new tracking algorithm, such as a Lucas-Kanade optical-flow-based tracker in Javascript. But for now, this would suffice to at least get some of the primary functions of facial feature tracking down. Below you can see the facial feature tracking algorithm in action:

The original source for the OpenGL-based Live2D demo implemented a mouse-based tracker, which used the mouse pointer’s coordinates to rotate and morph the face toward the angle at which the pointer sits. The plan here is to map the facial feature tracking coordinates to the mouse pointer and attempt to move the character’s face. As a very naive first approach, we can use the nose as a key point and have it track only the rotation of the head for our initial implementation; we will use coordinate point number 62 as our tracking point for the facial tracking. Further down the road we can utilize these parameters to further optimize the tracking of the facial features.

Using the above method, I was able to make the head rotate slightly – but not fluidly enough compared to the original mouse-based implementation, as the magnitude of movement is very small. To fix this, we may have to normalize the values so that the movement of the avatar’s face is greater for the corresponding head movements captured from the camera.

For now, this was a quick-and-dirty implementation of a FaceRig-style Live2D module. Down the road, we could use a better tracking algorithm to track certain features of the face with greater accuracy (using deep learning models, maybe), and we can dig into the Live2D SDK to see how we can morph some of the facial features to get better control over the visual aspects of the avatar. If you liked this article and think others should read it, please share it on Twitter or Facebook.
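The nose-point-to-head-rotation mapping described here can be sketched roughly as follows. This is an illustrative reconstruction, not the original source: it assumes a clmtrackr-style tracker that yields per-landmark `[x, y]` pixel positions, and Live2D Cubism-2-style angle parameters (`PARAM_ANGLE_X`/`PARAM_ANGLE_Y`, spanning roughly −30..30 degrees); all function names are mine.

```javascript
// Sketch: map one tracked face point (e.g. the nose key point) to
// Live2D head-angle values. Assumes the Cubism 2 convention that
// PARAM_ANGLE_X/Y span about -30..30 degrees; names are illustrative.

const ANGLE_RANGE = 30; // assumed Live2D angle span, in degrees

function clamp(v, lo, hi) {
  return Math.min(hi, Math.max(lo, v));
}

// Convert a landmark position (pixels) into head angles, given the
// dimensions of the video frame the tracker runs on.
function landmarkToAngles(point, frameW, frameH) {
  // Normalize to [-1, 1] around the frame centre. The y axis is
  // flipped because screen coordinates grow downwards, while looking
  // up should produce a positive angle.
  const nx = (point[0] - frameW / 2) / (frameW / 2);
  const ny = -(point[1] - frameH / 2) / (frameH / 2);
  return {
    angleX: clamp(nx * ANGLE_RANGE, -ANGLE_RANGE, ANGLE_RANGE),
    angleY: clamp(ny * ANGLE_RANGE, -ANGLE_RANGE, ANGLE_RANGE),
  };
}
```

In a render loop, the resulting angles would be fed to the model each frame (in Cubism 2 style, something like `model.setParamFloat("PARAM_ANGLE_X", angles.angleX)` – again, an assumption about the SDK surface rather than the author's exact code).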
This blog post is based on an original post I wrote for Qiita and was translated from Japanese. Also, due to TOS reasons with the Live2D SDK, I cannot post the source code for this implementation.
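Although the source itself cannot be shared, the normalization step mentioned earlier – making the avatar move over its full range even though the tracked head moves only a few pixels – can be sketched. One simple approach (my illustration, not necessarily the author's) is to keep a running minimum and maximum of the observed coordinate and remap each new value into [-1, 1] relative to the range of motion seen so far:

```javascript
// Sketch: amplify small tracked head movements by remapping each raw
// coordinate against the running min/max observed so far, instead of
// using a fixed gain. All names are illustrative.

class RangeNormalizer {
  constructor() {
    this.min = Infinity;
    this.max = -Infinity;
  }

  // Feed a raw value (e.g. the nose x position in pixels) and get back
  // a value in [-1, 1] relative to the motion range seen so far.
  normalize(v) {
    this.min = Math.min(this.min, v);
    this.max = Math.max(this.max, v);
    const span = this.max - this.min;
    if (span === 0) return 0; // no motion observed yet
    return ((v - this.min) / span) * 2 - 1;
  }
}
```

One normalizer per axis would be used, and its output scaled to whatever range the face parameter expects. The trade-off of this self-calibrating approach is that the mapping drifts as larger movements are observed; a short explicit calibration phase would make it more stable.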