
VIVE Facial Tracker support & automatic avatar setup, progress on desktop mode

Hello everyone!

We have some exciting news this week! HTC has released the VIVE Facial Tracker to the general public, giving everyone access to full real-time facial tracking inside VR. Neos has had native support for this hardware for a while, but with this release we have added an extra level of polish and automated the avatar setup to make it easy to get started.

If you haven't seen it in action, check out the video below. Combined with eye tracking, this hardware offers an unprecedented level of natural expression on avatars (huge thanks to Rezillo Ryker and GearBell for providing us with the most expressive avatars for demonstration) and allows scripting custom interactions that are triggered purely by your facial movements, like flying by puffing up your cheeks or activating fire particles on your head when baring your teeth.

To learn how to set up an avatar, whether a brand new one or an existing one, and even do some basic scripting, you can watch our tutorial video as well. It is split into several segments, so you can watch only the parts you're interested in!

And if you're looking to just give the face tracker a try, you can try it out on the new Neos Facebot, a completely free avatar by GearBell with full face tracking support working out of the box. You can find it in the “Face Tracking FaceBot Avatar World”.

We can't wait to see what you'll do with this new tech and what kind of amazing avatars and interactions you'll build!

In other news, we have made some progress on the desktop mode as well, adding support for proper aiming (a necessary step towards tools), control over the FOV and even multi-touch support! You can read more about what we're working on below.

And last, but not least, we have just crossed 1000 supporters and 15K on Patreon! We're overwhelmed by this level of support for the project. Without you, we wouldn't be able to keep working on it every day and remain independent, keeping the vision of the metaverse ours and its goal of providing the ultimate creative freedom intact. Thank you again, everyone!



[h2]VIVE Facial Tracker support[/h2]
Last week, HTC released the new VIVE Facial Tracker add-on for the Vive Pro headsets to the general public, providing real-time tracking of the lips, cheeks, jaw and tongue. Combined with the Vive Pro Eye headset, this gives you full real-time facial tracking while in VR, giving your avatar an unprecedented level of expressiveness.

Neos has supported eye tracking for almost 2 years at this point and face tracking since we got access to the devkit last year. With the release of this hardware to the public, we have applied an extra level of polish and functionality to make it much easier to use.

We're happy that this technology is now available to you, our community, as it takes avatar fidelity in Neos to a whole new level, increasing the immersion for both you and other users, as your natural expressions, both voluntary and involuntary, now transfer to your virtual representation.

To demonstrate some of the possibilities, we showcased the tracker with several different avatars (huge thanks to Rezillo Ryker and GearBell for providing us with highly expressive avatars), as well as some examples of basic scripting - triggering particles and special effects on the avatar with only facial movements and even flying by puffing up your cheeks.

[previewyoutube][/previewyoutube]
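To give a rough idea of what a facial-movement-triggered interaction like the cheek-puff flight boils down to, here is a small illustrative sketch in Python. In Neos this kind of logic is built with visual scripting rather than code, and the names and thresholds below are made up purely for the example:

[code]
def update_flight(cheek_puff_value, flying, on_threshold=0.8, off_threshold=0.5):
    """Toggle a hypothetical 'flying' state from a cheek-puff tracker value (0..1).

    Two thresholds (hysteresis) keep the state from flickering when the
    value hovers around a single cutoff. Illustrative only.
    """
    if not flying and cheek_puff_value >= on_threshold:
        return True   # cheeks puffed enough: start flying
    if flying and cheek_puff_value <= off_threshold:
        return False  # cheeks relaxed: stop flying
    return flying

# Simulated cheek-puff values over a few frames.
flying = False
for value in [0.1, 0.85, 0.9, 0.6, 0.4]:
    flying = update_flight(value, flying)
    print(value, flying)
# 0.1 False, 0.85 True, 0.9 True, 0.6 True (still above the off threshold), 0.4 False
[/code]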

[h3]Neos Face Bot - free avatar with full face tracking support by GearBell[/h3]
To make full use of the facial tracker, you need an avatar that has the necessary blendshapes (face deformations) to visualize your facial movements. At the moment, there aren't many avatars with full support, but GearBell, one of our prominent community avatar creators, has created the Neos Facebot, a completely free avatar with a full set of the necessary blendshapes.



If you'd like to give it a try, you can find it in the Inventory by going to Neos Essentials -> Avatars -> Face Tracking Ready. Alternatively, check out the “Face Tracking FaceBot Avatar World” in the Worlds tab, where you can equip the new face bot and take it for a spin.

The avatar has some cool features built in as well, including a customization UI, jets and grappling hooks to play with, showcasing some of the cool avatar interactions you can have in Neos.

A huge thanks to GearBell for building this avatar to showcase this new technology!

[h3]Automatic avatar face tracking setup[/h3]
When setting up a new avatar, you can now find a new option called “Setup Face Tracking” in the avatar creator. By checking this option, Neos will use heuristics to automatically map any blendshapes on the avatar to the tracking data coming from the face tracker.

The success rate and tracking coverage will depend heavily on the avatar. For best results, we recommend adding all the blendshapes from the sample models by HTC and following the same naming convention, to ensure that you get full use of the face tracker and maximum fidelity.

However, Neos will perform even a partial mapping, though some of the face tracking features will then be missing on that particular avatar. For example, we have added the Ready Player Me avatars to the heuristics, making them work with the new face tracker, but some of the face shapes, such as tongue movement, are missing.
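To give a rough idea of how name-based blendshape heuristics work in general, here is a small illustrative sketch in Python. The expression names, hint fragments and matching rules are invented for the example and are not the actual tables or code used by Neos:

[code]
import re

# Hypothetical hints mapping tracker expressions to name fragments commonly
# seen in blendshape naming conventions (illustrative assumptions only).
EXPRESSION_HINTS = {
    "JawOpen":        ["jaw_open", "mouth_open"],
    "MouthSmileLeft": ["mouth_smile_l", "smile_left"],
    "CheekPuffLeft":  ["cheek_puff_left", "puff_l"],
    "TongueOut":      ["tongue_out", "tongue_longstep1"],
}

def normalize(name):
    """Lower-case and strip separators so different naming styles compare equally."""
    return re.sub(r"[\s_\-.]", "", name.lower())

def map_blendshapes(blendshape_names):
    """Return a best-effort expression -> blendshape mapping.

    Avatars following a known convention get a full mapping; others get
    only a partial one, mirroring how partial face tracking support works.
    """
    normalized = {normalize(n): n for n in blendshape_names}
    mapping = {}
    for expression, hints in EXPRESSION_HINTS.items():
        for hint in hints:
            match = next((original for norm, original in normalized.items()
                          if normalize(hint) in norm), None)
            if match:
                mapping[expression] = match
                break
    return mapping

# Example: an avatar following an HTC-like naming convention (no tongue shapes).
print(map_blendshapes(["Jaw_Open", "Mouth_Smile_L", "Cheek_Puff_Left"]))
# -> {'JawOpen': 'Jaw_Open', 'MouthSmileLeft': 'Mouth_Smile_L', 'CheekPuffLeft': 'Cheek_Puff_Left'}
[/code]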

We have recorded the following tutorial, which showcases how to set up a brand new avatar, how to add support to existing avatars, how to customize the weights (strength) and even how to script custom behaviors based on your facial movements.

[previewyoutube][/previewyoutube]

[h3]Improved heuristics and new blendshapes[/h3]
As part of the polish, we have also added a few of the missing blendshapes to the face tracking and eye tracking, such as tongue roll, tongue movement (left, right, up and down), eye squeeze and eye frown (which currently doesn't seem to be tracked by the Vive Pro Eye, however), and corrective tongue shapes.

We have heavily expanded the list of supported expressions on the AvatarExpressionDriver component, which serves as the primary way to drive the avatar's blendshapes from the face tracking, and added automatic assignment using heuristics (the same ones used by the avatar creator described above) to ease the setup.

Currently the heuristics are aware of the HTC sample models, Ready Player Me avatars and some general common face shapes. We're working on adding a few more conventions, like the Autodesk Character Generator, so they work out of the box. If you have an avatar source that doesn't work well with the heuristics and needs manual setup, please let us know on our GitHub with a sample model/naming convention!

The component now also allows you to tune the strength of the blendshapes, in case they're too strong or too weak. By setting the value to a really large number, you can also channel the Garry's Mod spirit.
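As a rough illustration of what this per-expression strength tuning amounts to, here is a small Python sketch. The function and field names are invented for the example and do not reflect the actual AvatarExpressionDriver API:

[code]
def drive_blendshapes(tracked_values, mapping, strengths, default_strength=1.0):
    """Scale raw tracker values (0..1) by a per-expression strength and
    return the blendshape weights to apply to the avatar mesh.

    Values are deliberately not clamped, so a very large strength gives
    the exaggerated "Garry's Mod" style deformations mentioned above.
    """
    weights = {}
    for expression, raw in tracked_values.items():
        blendshape = mapping.get(expression)
        if blendshape is None:
            continue  # partial mapping: unmapped expressions are simply skipped
        weights[blendshape] = raw * strengths.get(expression, default_strength)
    return weights

# One frame of tracker data: jaw toned down, smile wildly exaggerated.
frame = {"JawOpen": 0.5, "MouthSmileLeft": 0.25}
mapping = {"JawOpen": "Jaw_Open", "MouthSmileLeft": "Mouth_Smile_L"}
strengths = {"JawOpen": 0.5, "MouthSmileLeft": 100.0}
print(drive_blendshapes(frame, mapping, strengths))
# -> {'Jaw_Open': 0.25, 'Mouth_Smile_L': 25.0}
[/code]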

[h3]Future of face tracking in Neos[/h3]
We have built Neos to offer a huge amount of freedom for expression and creativity, and supporting cutting-edge hardware like this is part of that. Our goal is to always design systems in a highly future-proof way, making sure we can adopt new hardware as it comes and make it available to you with as little effort as possible.

Face tracking is part of this as well, and we're planning to support more solutions as they come, including webcam-based face tracking for the desktop mode that's currently in development, without requiring any changes (or at least any significant ones) on your end, instead exposing them through the same set of components, like the EyeManager and AvatarExpressionDriver.

This is just the beginning for face tracking in VR and we can’t wait to see where it leads and what amazing stuff you’ll build with it yourself!

[previewyoutube][/previewyoutube]
On our last livestream, we showcased face tracking with the VIVE Facial Tracker and the Vive Pro Eye headset.

[h2]Desktop mode progress[/h2]
While a lot of our effort this week has been focused around the release of the face tracking, we have made some important progress on the official desktop mode as well. Currently our focus is on building out interaction systems, allowing the use of context menus and tools from the desktop mode.



This has a few major prerequisites, most of which have already been implemented. We have expanded the hand posing system, which now uses the avatar's actual laser offset and is able to aim the hand exactly at a particular spot in the world, making sure the laser travels in a straight line.

https://www.youtube.com/watch?v=t02AyebR_gc

This part is crucial for building the tool interactions, as it will ensure that the simulated hand aims the tool at the exact point your screen cursor is pointing at, whether you're in first-person or third-person mode. Other interactions will be built on top of this as well, like physical grabbing or placing the hand at a nearby target.

Another important piece is extending the pointer interaction system to support a free-form cursor, letting you interact with the world and the items within it while freely moving the cursor around. To enable this, we have extended multiple systems. One of the benefits is that the desktop mode now has explicit control over the field of view (which is necessary to calculate where in the world the laser hits).

https://www.youtube.com/watch?v=FAxGIBrYnPM
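To give a sense of why explicit FOV control matters here, the following back-of-the-envelope Python sketch converts a screen cursor position and a vertical FOV into a camera-space ray direction. It shows the general math involved, not the engine's actual code:

[code]
import math

def cursor_to_ray(cursor_x, cursor_y, screen_w, screen_h, vertical_fov_deg):
    """Convert a screen-space cursor position into a normalized camera-space
    ray direction (camera looks down -Z, Y is up).

    Without knowing the field of view, there is no way to tell where in
    the world the laser should hit for a given cursor position.
    """
    aspect = screen_w / screen_h
    half_h = math.tan(math.radians(vertical_fov_deg) / 2.0)
    half_w = half_h * aspect

    # Map pixel coordinates into the [-1, 1] range, with Y pointing up.
    ndc_x = (cursor_x / screen_w) * 2.0 - 1.0
    ndc_y = 1.0 - (cursor_y / screen_h) * 2.0

    direction = (ndc_x * half_w, ndc_y * half_h, -1.0)
    length = math.sqrt(sum(c * c for c in direction))
    return tuple(c / length for c in direction)

# A cursor at the screen center aims straight ahead, regardless of FOV.
print(cursor_to_ray(960, 540, 1920, 1080, vertical_fov_deg=60))  # -> (0.0, 0.0, -1.0)
[/code]

The resulting direction, transformed by the camera's position and rotation, is what the simulated hand can then be aimed along so the laser lands exactly where the cursor points.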

The second is that Neos now has support for multi-touch interfaces! Any existing UI can now be interacted with using a touchscreen. This will be particularly useful on mobile phones and tablets, providing a natural way to interact on those devices, although more work is still needed in this area, such as on-screen controls for movement.

https://www.youtube.com/watch?v=7CRCwSWCX60
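As a simple illustration of what multi-touch handling involves, here is a tiny Python sketch that tracks several simultaneous touch pointers by their IDs. It is purely illustrative and not the actual Neos interaction system:

[code]
class MultiTouchTracker:
    """Track several simultaneous touch pointers, each with its own ID,
    the way a multi-touch UI needs to. (Illustrative sketch only.)"""

    def __init__(self):
        self.active = {}  # pointer id -> current (x, y) position

    def touch_down(self, pointer_id, x, y):
        self.active[pointer_id] = (x, y)

    def touch_move(self, pointer_id, x, y):
        if pointer_id in self.active:
            self.active[pointer_id] = (x, y)

    def touch_up(self, pointer_id):
        self.active.pop(pointer_id, None)

# Two fingers interacting with the UI at the same time.
tracker = MultiTouchTracker()
tracker.touch_down(0, 100, 200)
tracker.touch_down(1, 400, 250)
tracker.touch_move(1, 420, 260)
print(tracker.active)  # -> {0: (100, 200), 1: (420, 260)}
[/code]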

[h2]Transitioning from gamma to linear color space & MTC progress[/h2]
One of our longer-running projects is transitioning the renderer from gamma color space to linear and providing more explicit control over color spaces for textures and colors in Neos. Linear color space will provide more accurate and consistent lighting, while managing the color space of assets and parameters will ensure that your content looks the same (or as close as possible) as it does in the software it was authored in.
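For context, here is what the conversion between gamma (sRGB) and linear looks like for a single color channel. These are the standard sRGB transfer functions, shown as a general illustration rather than the exact conversion the renderer will use:

[code]
def srgb_to_linear(c):
    """Decode a gamma-encoded sRGB channel value (0..1) to linear light,
    where lighting math behaves physically."""
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def linear_to_srgb(c):
    """Encode a linear channel value (0..1) back to the gamma-encoded
    form expected for display."""
    return c * 12.92 if c <= 0.0031308 else 1.055 * c ** (1 / 2.4) - 0.055

# Mid-gray in sRGB (0.5) is much darker in linear light, which is why
# doing lighting and blending directly in gamma space looks off.
print(round(srgb_to_linear(0.5), 3))    # -> ~0.214
print(round(linear_to_srgb(0.214), 3))  # -> ~0.5
[/code]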

There is much to do for this transition, spanning the asset variant system, the data model and the renderer, as well as making sure that existing content doesn't break or look significantly different. We have shown some early screenshots of the lighting and we'll likely have more in the upcoming weeks as we progress.



The Metaverse Training Center is continuing development as well, with the different rooms getting filled out with content and polish. Here are some of the latest screenshots of the Streamer room and the Creation Plaza:



[h2]Community Highlights[/h2]
[h3]Creator Jam 94: The Four Elements[/h3]
Creator Jam is back again! This week's focus was the four elements: earth, wind, fire and water. Fifth Element references abounded, but we had some nice creations in this jam, so stop by and check out what people have made. Thanks everyone!


[h3]Hidden Forbidden Holy Ground by Storm Zero[/h3]
In this zone created by the keywords above, you feel like someone’s always watching you. Inspired by the .hack franchise, we have a fan recreation of a notable map from the franchise. Come here and listen to a prayer, or check out the stained glass directly in VR. It’s quite a lovely map and I appreciate all the effort that’s gone into it! Thanks Storm!


[h3]DevTip’s Bizarre Adventure by Beaned[/h3]
This wonderful sculpture of a tooltip, brought to life by Beaned, showcases the magnificent beauty that is the Developer Tooltip. A tool made to make you a literal god in the Metaverse comes to life as it animates itself. Be wary that the Developer Tooltip doesn't come looking for you for disrespecting it. (Yes, this is a JoJo reference.) Thanks Beaned!


[h3]VR Fitness Group by Lewis_Snow, Floofboi, and Kressy[/h3]
The VR Workout Club has been developing exercise machines in VR! Currently they only have a “Squatatron” (name pending), but it really shows that VR and fitness can go hand in hand! I've participated myself and it's definitely a good workout - you really can feel the burn! If you want to join, try contacting one of them; anyone can join! Squat away!


--------------------------------------------

Anyway, that's all for this week. We hope it was an exciting one and that you'll have a lot of fun experimenting with the new hardware in the upcoming weeks. We'll continue bringing you more improvements over the coming days and weeks, as well as focusing on getting the desktop mode to basic feature parity, so we can perform an official swap.

You can keep an eye on the development on our official Discord in the #devlog channel and on any releases and update notes in #neos-updates.

See you next week!