
Neos VR News

2021.3.16.1287 - Face tracking improvements, improved sync robustness and more

Hello everyone! I wanted to do a bunch more work on the new desktop mode before releasing this build, but there's an important bugfix that heavily improves cloud API reliability (this should fix sync errors and other problems caused by intermittent network issues), as well as some important improvements to face tracking!

The heuristics have been expanded and improved, adding support for more avatars. Autodesk Character Generator avatars should now get set up with lip tracking (not 100% coverage either, but a bunch of blendshapes will be auto-mapped). There's also a new mechanism where the voice data suppresses certain blendshapes, to avoid overdriving the mouth when talking with the Vive Facial Tracker.

There's a bunch of underlying work on the interaction system for the new desktop mode as well. Not much of it is visible yet, but touchscreens now have initial official support! There's still more work to do, but you can play with it a bit if you'd like. More coming soon!

[h2]New Features:[/h2]
- Added Timestamp to AvatarRawEyeData, which is the relative time in seconds of the current snapshot of the eye tracking data (requested by @SHFR_H)
-- This allows calculating things like the current angular velocity of the eye movement with much better accuracy (see the sketch below this list)
- Added missing "Squeeze" and "Frown" properties to AvatarRawEyeData (based on report by @Ryuvi | Technical Artist)
-- Note that it's strongly recommended to use EyeManager rather than this component unless you're building a specialized application, as EyeManager is significantly more efficient and modular
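As a rough illustration of why this timestamp helps, here's a minimal Python sketch of estimating angular velocity from two consecutive gaze snapshots. The function and field names are hypothetical, not the actual component's members:

[code]
import math

def angular_velocity_deg_per_s(prev_dir, prev_t, curr_dir, curr_t):
    """Angle between two normalized gaze directions divided by the
    timestamp delta (seconds) -> degrees per second."""
    dt = curr_t - prev_t
    if dt <= 0:
        return 0.0
    # Clamp the dot product so float noise can't push acos out of its domain
    dot = max(-1.0, min(1.0, sum(a * b for a, b in zip(prev_dir, curr_dir))))
    return math.degrees(math.acos(dot)) / dt

# Two snapshots 10 ms apart with the gaze rotated by ~1 degree:
v = angular_velocity_deg_per_s((0.0, 0.0, 1.0), 0.000,
                               (0.0174524, 0.0, 0.9998477), 0.010)
print(round(v))  # ~100 deg/s
[/code]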

- Added "Tongue Up Left", "Tongue Up Right", "Tongue Down Left" and "Tongue Down Right" expressions to the AvatarExpressionDriver (based on request by @umbran)
-- These drive based on the combination of both directions. Typically used for corrective deformation blendshapes
-- The heuristics has been updated as well to auto-assign those blendshapes (you will need to re-run the heuristics to assign missing blendshapes on already setup models)
- Added SmileClosed(Left/Right) expressions to the AvatarExpressionDriver
-- This will drive a smile that's only active when the lips are covering the teeth and will go to 0 when the teeth are visible
-- This can be used when there are separate smiling blendshapes for when the teeth are visible and when not
- Added VolumeSource and SilenceSource to AvatarExpressionDriver, which allow suppressing facial expressions while the user is talking
-- VolumeSource is checked first, then SilenceSource (the inverse of the first) when calculating suppression
-- Each Expression now has "VolumeSupressionStrength", indicating how much the given expression is suppressed by volume (1.0 full suppression, 0.0 no suppression; see the sketch below this list)
-- Existing instances are auto-initialized using the VisemeAnalyzer on the avatar and default suppression weights are set up
-- By default, only the jaw open, lip raising and pouting expressions are suppressed. This should prevent the mouth from opening too much when talking
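As a hedged sketch of how this suppression could combine (illustrative Python, not the component's actual code; the parameter names mirror the description above but are assumptions):

[code]
def suppressed_value(expression_value, volume, suppression_strength):
    """Attenuate a driven expression by the user's voice volume.

    volume: 0.0 (silent) .. 1.0 (full volume)
    suppression_strength: 0.0 (never suppressed) .. 1.0 (fully suppressed)
    """
    return expression_value * (1.0 - volume * suppression_strength)

# e.g. a pout driven to 0.8 while talking:
print(suppressed_value(0.8, 1.0, 1.0))  # 0.0 -> fully suppressed
print(suppressed_value(0.8, 0.5, 1.0))  # 0.4 -> half suppressed
print(suppressed_value(0.8, 1.0, 0.0))  # 0.8 -> not suppressed at all
[/code]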

- Added initial multi-touch support - the new desktop mode now properly handles touchscreen inputs and multiple pointers for interacting with the UI
-- While touch inputs are active, mouse input is suppressed (this avoids double interactions, since Windows simulates mouse inputs from the primary touch; see the sketch after this list)
-- Note that it doesn't go through the laser interaction system yet; there's more work to be done on this
- You can now adjust the FOV (field of view) of the new desktop mode in the settings
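The touch-over-mouse priority boils down to a simple rule; here's a tiny illustrative Python sketch (hypothetical names, not the engine's input API):

[code]
def active_pointer_events(touch_events, mouse_events):
    """Prefer touch: while any touch is active, drop mouse events,
    since Windows synthesizes mouse input from the primary touch."""
    if touch_events:
        return touch_events
    return mouse_events

print(active_pointer_events([], ["mouse_move"]))              # ['mouse_move']
print(active_pointer_events(["touch_down"], ["mouse_move"]))  # ['touch_down']
[/code]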

[h2]Tweaks:[/h2]
- The FOV now increases slightly in the new desktop mode when running with physical locomotion
- Extended the heuristics of the AvatarExpressionDriver to provide better support for automatically assigning Autodesk Character Generator models
-- Note that you might need to tweak the strength of some of the blendshapes, like smiling
- Improved blendshape/bone name splitting to support naming conventions that start with a chirality letter (e.g. "RlipDown", which would previously split to "Rlip" and "Down")
-- This heavily improves auto-detection of blendshapes on the Autodesk Character Generator models
- AvatarExpressionDriver no longer needs "shape" name for the "Ape Shape" (Jaw Down) name heuristic (based on model by @GearBell)
-- E.g. "mouth_ape" is now sufficient for the blendshape to be mapped, instead of needing "mouth_ape_shape"
- Preset names when starting worlds on the headless are now case-insensitive (based on report by @Epsilion)

- Merged Korean locale additions by @MirPASEC
- Merged Russian locale additions and tweaks by @Shadow Panther [RU/EN, UTC+3]
- Merged Japanese, Esperanto and Chinese locale additions by @Melnus
- Merged Czech locale additions by @rampa_3 (UTC +1, DST UTC +2)

[h2]Bugfixes:[/h2]
- Fixed Neos not retrying Cloud API requests when the server returns 429 or 500 status codes
-- This should greatly increase the robustness of any cloud-based functionality under load or intermittent network errors, fixing cases where different processes would get stuck (see the sketch after this list)
-- This should also significantly reduce the number of sync errors due to network issues (e.g. the connection briefly dropping), as reported by @PlasmaRaven, @Elektrospy, @Honeypotbutt and @Shifty | Quality Control Lead
- Added a mechanism to manually correct Patreon support for accounts where the amount of support stopped being reported correctly by the Patreon API
-- If your Patreon account got messed up and you're no longer receiving rewards, even though the charges are going through, let us know and we'll update your account manually
- Fixed GrinLeft using the right side of the upper lip to modulate
- Fixed invite and restart headless commands throwing exceptions when no world is focused (reported by @Glitch)
- Fixed non-host users showing "Hidden Contacts Only World" as "Contacts Only World" (reported by @Shifty | Quality Control Lead and @Cyro)
- Fixed Neos adding an invalid entry to the configuration file when a world is started with an invalid template and the configuration is then saved (reported by @Epsilion)
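For illustration, here's a generic retry-with-backoff pattern in Python, in the spirit of the fix above (an assumption-laden sketch using only the standard library, not Neos' actual client code):

[code]
import time
import urllib.request
import urllib.error

RETRYABLE_STATUS = {429, 500}

def get_with_retry(url, attempts=5, base_delay=0.5):
    for attempt in range(attempts):
        try:
            with urllib.request.urlopen(url) as response:
                return response.read()
        except urllib.error.HTTPError as e:
            if e.code not in RETRYABLE_STATUS or attempt == attempts - 1:
                raise  # non-retryable status, or out of attempts
        except urllib.error.URLError:
            if attempt == attempts - 1:
                raise  # connection dropped and we're out of attempts
        time.sleep(base_delay * (2 ** attempt))  # exponential backoff
[/code]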

VIVE Facial Tracker support & automatic avatar setup, progress on desktop mode

Hello everyone!

We have some exciting news this week! HTC has released the VIVE Facial Tracker to the general public, giving everyone access to full real-time facial tracking inside VR. Neos has had native support for this hardware for a while, but with the release we have added an extra level of polish and automated the avatar setup to make it easy to get started.

If you haven't seen it in action, check out the video below. Combined with eye tracking, this hardware offers an unprecedented level of natural expression on avatars (huge thanks to Rezillo Ryker and GearBell for providing us with the most expressive avatars for demonstration) and allows scripting custom interactions that are triggered purely by your facial movements, like flying by puffing up your cheeks or activating fire particles on your head when baring your teeth.

To learn how to set up an avatar, whether a brand new one or an existing one, and even do some basic scripting, you can watch our tutorial video as well. It is split into several segments, so you can watch only the parts you're interested in!

And if you're looking to just give the face tracker a try, you can try it out on the new Neos Facebot, a completely free avatar by GearBell with full face tracking support working out of the box. You can find it in the “Face Tracking FaceBot Avatar World”.

We can't wait to see what you'll do with this new tech and what kind of amazing avatars and interactions you'll build!

In other news, we have made some progress on the desktop mode as well, adding support for proper aiming (a necessary step towards tools), control over FOV and even multi-touch support! You can read more about what we're working on below.

And last but not least, we have just crossed 1000 supporters and 15K on Patreon! We're overwhelmed by this level of support for the project. Without you, we wouldn't be able to keep working on it every day and remain independent, keeping the vision of the metaverse ours and its goal of providing ultimate creative freedom. Thank you again, everyone!



[h2]VIVE Facial Tracker support[/h2]
Last week, HTC released the new VIVE Facial Tracker add-on for the Vive Pro headsets to the general public, providing real-time tracking of the lips, cheeks, jaw and tongue. Combined with the Vive Pro Eye headset, this gives you full real-time facial tracking while in VR, giving your avatar an unprecedented level of expressiveness.

Neos has supported eye tracking for almost 2 years at this point and face tracking since we got access to the devkit last year. With the release of this hardware to the public, we have applied an extra level of polish and functionality to make it much easier to use.

We’re happy that this technology is now available to you, our community, as it brings the level of avatar fidelity in Neos to a whole new level, increasing the immersion for both you and other users, as your natural expressions, both voluntary and involuntary, now transfer to your virtual representation.

To demonstrate some of the possibilities, we have showcased the tracker with several different avatars (huge thanks to Rezillo Ryker and GearBell for providing us with highly expressive avatars), as well as some examples of basic scripting - triggering particles and special effects on the avatar with facial movements alone, and even flying by puffing up your cheeks.

[previewyoutube][/previewyoutube]

[h3]Neos Face Bot - free avatar with full face tracking support by GearBell[/h3]
To make full use of the facial tracker, you need an avatar that has the necessary blendshapes (face deformations) to visualize your facial movements. At the moment, there aren’t many avatars with full support, but GearBell, one of our prominent community avatar creators, has created the Neos Facebot, a completely free avatar with a full set of the necessary blendshapes.



If you’d like to give it a try, you can find it in the Inventory by going to Neos Essentials -> Avatars -> Face Tracking Ready. Alternatively, check out the “Face Tracking FaceBot Avatar World” in the Worlds tab, where you can equip the new face bot and take it for a spin.

The avatar has some cool features built in as well, including a customization UI, plus jets and grappling hooks to play with, showcasing some of the cool avatar interactions you can have in Neos.

A huge thanks to GearBell for building this avatar to showcase this new technology!

[h3]Automatic avatar face tracking setup[/h3]
When setting up a new avatar, you can now find a new option called “Setup Face Tracking” in the avatar creator. When you check this option, Neos will use heuristics to automatically map any blendshapes on the avatar to the tracking data coming from the face tracker.

The success rate and tracking coverage will depend heavily on the avatar. For best results, we recommend adding all the blendshapes from the sample models by HTC and following the same naming convention, to ensure that you get full use of the face tracker and maximum fidelity.

However, Neos will perform even a partial mapping; some of the face tracking features will simply be missing on that particular avatar. For example, we have added the Ready Player Me avatars to the heuristics, making them work with the new face tracker, even though some of the face shapes, such as the tongue movement, are missing.
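As a toy illustration of what name-based mapping looks like, here's a short Python sketch with a made-up keyword table (the actual heuristics in Neos are considerably more involved):

[code]
def map_blendshapes(blendshape_names):
    """Map avatar blendshape names to face tracking parameters
    by matching keyword tokens in the names."""
    keywords = {
        "jaw_open":    ("jaw", "open"),
        "smile_left":  ("smile", "left"),
        "smile_right": ("smile", "right"),
        "tongue_out":  ("tongue", "out"),
    }
    mapping = {}
    for name in blendshape_names:
        tokens = name.lower().replace("_", " ").split()
        for parameter, required in keywords.items():
            if all(token in tokens for token in required):
                mapping[parameter] = name
    return mapping

# A partial avatar gets a partial mapping (no tongue blendshape here):
print(map_blendshapes(["Jaw_Open", "Smile_Left", "Smile_Right"]))
[/code]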

We have recorded the following tutorial, which shows how to set up a brand new avatar, how to add support to existing avatars, how to customize the weights (strength) and even how to script custom behaviors based on your facial movements.

[previewyoutube][/previewyoutube]

[h3]Improved heuristics and new blendshapes[/h3]
As part of the polish, we have also added a few missing blendshapes to the face tracking and eye tracking, such as tongue roll, tongue movement (left, right, up and down), eye squeeze, eye frown (which currently doesn’t seem to be tracked with the Vive Pro Eye, however) and corrective tongue shapes.

We have heavily expanded the list of supported expressions on the AvatarExpressionDriver component, which serves as the primary way to drive the blendshapes on the avatar from the face tracking, and added automatic assignment using heuristics (the same ones used by the avatar creator mentioned above) to ease the setup.

Currently the heuristics are aware of the HTC sample models, Ready Player Me avatars and some generally common face shapes. We’re working on adding a few more conventions, like the Autodesk Character Generator, so they work out of the box. If you have an avatar source that doesn’t work well with the heuristics and needs manual setup, please let us know on our GitHub with a sample model/naming convention!

The component now allows you to tune the strength of the blendshapes as well, in case they’re too strong or too weak. By setting the value to a really large number, you can also channel the Garry’s Mod spirit.

[h3]Future of face tracking in Neos[/h3]
We have built Neos to offer a huge amount of freedom for expression and creativity, and supporting cutting-edge hardware like this is part of that. Our goal is to always design systems in a highly future-proof way, to make sure we can adopt new hardware as it comes and make it available to you with as little effort as possible.

Face tracking is part of this as well, and we’re planning to support more solutions as they come, including webcam-based face tracking for the desktop mode that’s currently in development. These shouldn't require any changes (or at least any significant ones) on your end; instead they'll be exposed through the same set of components, like the EyeManager and AvatarExpressionDriver.

This is just the beginning for face tracking in VR and we can’t wait to see where it leads and what amazing stuff you’ll build with it yourself!

[previewyoutube][/previewyoutube]
On our last livestream, we showcased the face tracking with the VIVE Facial Tracker and the Vive Pro Eye headset

[h2]Desktop mode progress[/h2]
While a lot of our effort this week has been focused on the release of the face tracking, we have made some important progress on the official desktop mode as well. Currently our focus is on building out the interaction systems, allowing the use of context menus and tools from the desktop mode.



This has a few major prerequisites, most of which have already been implemented. We have expanded the hand posing system, which now uses the actual avatar’s laser offset and is able to aim the hand exactly at a particular spot in the world, making sure the laser travels in a straight line.

https://www.youtube.com/watch?v=t02AyebR_gc

This part is crucial for building the tool interactions, as it ensures that the simulated hand aims the tool at the exact point your screen cursor is pointing at, whether in first-person or third-person mode. Other interactions will be built on top of this as well, like physical grabbing or placing the hand at a nearby target.
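Geometrically, this kind of aiming reduces to orienting the wrist so the ray leaving the laser's origin passes through the target. Here's a minimal Python sketch, with hypothetical names rather than the engine's actual API:

[code]
import math

def aim_direction(laser_origin, target):
    """Normalized direction from the laser's world origin to the target."""
    d = [t - o for o, t in zip(laser_origin, target)]
    length = math.sqrt(sum(c * c for c in d))
    return [c / length for c in d]

def yaw_pitch(direction):
    """Decompose an aim direction into yaw/pitch angles (degrees)
    that a hand pose could be driven towards."""
    x, y, z = direction
    yaw = math.degrees(math.atan2(x, z))
    pitch = math.degrees(math.asin(max(-1.0, min(1.0, y))))
    return yaw, pitch

# A hand at shoulder height aiming at a point 3 m straight ahead:
d = aim_direction([0.2, 1.4, 0.0], [0.2, 1.4, 3.0])
print(yaw_pitch(d))  # (0.0, 0.0) -> no yaw or pitch needed
[/code]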

Another important piece is extending the pointer interaction system to allow for a free-form cursor, letting you interact with the world and the items within it while freely moving the cursor around. To enable this, we have extended multiple systems. One of the benefits is that the desktop mode now has explicit control over the field of view (this is necessary to be able to calculate where in the world the laser hits).
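To see why the FOV must be known explicitly, consider the standard math for turning a screen cursor position into a world ray. This is a generic Python sketch (a camera at the origin looking down +Z with a vertical FOV; not the engine's actual code):

[code]
import math

def cursor_ray(cursor_x, cursor_y, width, height, vertical_fov_deg):
    """Return a normalized view-space ray direction for a screen pixel."""
    # Normalized device coordinates in -1..1 (screen y grows downwards)
    ndc_x = (2.0 * cursor_x / width) - 1.0
    ndc_y = 1.0 - (2.0 * cursor_y / height)
    # Half-extents of the image plane at distance 1 follow from the FOV
    tan_half = math.tan(math.radians(vertical_fov_deg) / 2.0)
    x = ndc_x * tan_half * (width / height)
    y = ndc_y * tan_half
    z = 1.0
    length = math.sqrt(x * x + y * y + z * z)
    return (x / length, y / length, z / length)

# The center of a 1920x1080 screen looks straight ahead:
print(cursor_ray(960, 540, 1920, 1080, 60.0))  # (0.0, 0.0, 1.0)
[/code]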

https://www.youtube.com/watch?v=FAxGIBrYnPM

The second one is that Neos now has support for multi-touch interfaces! Any existing UI can now be interacted with using a touchscreen. This will be particularly useful on mobile phones and tablets, providing a natural way to interact on those devices, although this area still needs more work, such as on-screen controls for movement.

https://www.youtube.com/watch?v=7CRCwSWCX60

[h2]Transitioning from gamma to linear color space & MTC progress[/h2]
One of our longer ongoing projects is transitioning the renderer from gamma color space to linear and providing more explicit control over color spaces for textures and colors in Neos. Linear color space will provide more accurate and consistent lighting, while managing the color space for assets and parameters will ensure that your content looks the same as (or as close as possible to) how it looks in the software it was authored in.

There is much to do for this transition, from the asset variant system and data model to the renderer, as well as making sure that existing content doesn’t break or look significantly different. We showed some early screenshots of the lighting, and we’ll likely have more in the upcoming weeks as we progress.



The Metaverse Training Center is continuing development as well, with the different rooms getting filled out with content and polish. Here are some of the latest screenshots of the Streamer room and the Creation Plaza:



[h2]Community Highlights[/h2]
[h3]Creator Jam 94: The Four Elements[/h3]
Creator Jam is back again! This week's focus was the four elements: earth, wind, fire, and water. Fifth Element references abounded, but we also had some nice creations in this jam, so stop by and check out what people have made. Thanks everyone!


[h3]Hidden Forbidden Holy Ground by Storm Zero[/h3]
In this zone created by the keywords above, you feel like someone’s always watching you. Inspired by the .hack franchise, we have a fan recreation of a notable map from the series. Come here and listen to a prayer, or check out the stained glass directly in VR. It’s quite a lovely map and I appreciate all the effort that’s gone into it! Thanks Storm!


[h3]DevTip’s Bizarre Adventure by Beaned[/h3]
This wonderful sculpture of a tooltip, brought to life by Beaned, showcases the magnificent beauty that is the Developer Tooltip. A tool made to make you a literal god in the Metaverse comes to life as it animates itself. Be wary that the Developer Tooltip doesn’t come looking for you for disrespecting it. (Yes, this is a JoJo reference.) Thanks Beaned!


[h3]VR Fitness Group by Lewis_Snow, Floofboi, and Kressy[/h3]
The VR Workout Club has been developing exercise machines in VR! Currently they only have a “Squatatron” (name pending), but it really shows that VR and fitness can go hand in hand! I’ve participated myself and it’s definitely a good workout! You really can feel the burn! If you wanna join, try contacting one of them; anyone can join! Squat away!


--------------------------------------------

Anyway, that's all for this week! We hope it was an exciting one and that you'll have a lot of fun experimenting with the new hardware in the upcoming weeks. We'll continue bringing you more improvements over the upcoming days and weeks, as well as focusing on getting the desktop mode to basic feature parity, so we can perform the official swap.

You can keep an eye on the development on our official Discord in the #devlog channel and on any releases and update notes in #neos-updates.

See you next week!

2021.3.12.44 - Face tracking & Desktop interaction improvements, tweaks, fixes

Hello everyone! With the release of the Vive Facial Tracker, here are a bunch more eye and lip/mouth tracking improvements to make avatar setup easier and more automated.

The heuristics for blendshape matching were improved significantly, and there's now a checkbox in the Avatar Creator to automatically set an avatar up for face tracking. It's off by default, as it can mess up avatars that aren't set up for it, but feel free to experiment and see how well it works. You'll get the best results following HTC's naming scheme (tutorial & samples coming soon).

There are some important improvements for the new desktop mode as well, notably to hand posing, which is now exact and capable of aiming the laser perfectly straight at the target. This is an important prerequisite for implementing tools, as it will allow the system to aim the tool exactly at the point where the screen cursor is pointing.

There's a bunch of other improvements, tweaks and additions as well: the VIVE Hand Tracking SDK was updated (they seem to have made quite good improvements to the tracking quality!) and there are a few new blendshapes for the eye tracking that Neos didn't register before.

More to come soon!

[h2]New Features:[/h2]
- Added "Setup Face Tracking" to the avatar creator, which will scan the blendshapes on the avatar and attempt to setup face tracking
-- This can be used to easily setup avatars for the Vive Facial Tracker
-- Note that the success rate will depend on the available blendshapes; Neos will use name heuristics to try to find the best matches. You'll get the best results by following the naming convention from the HTC sample model

- Added Squeeze and Frown eye tracking parameters (based on feedback from @Reactant)
-- Appropriate fields were added to EyeManager and EyeLinearDriver
-- Squeeze indicates how tightly the eye is closed
-- Frown indicates the eye frowning (note that from my testing this doesn't seem to be currently tracked with Vive Pro Eye)
- Added StrengthMultiplier to AvatarExpressionDriver, which allows applying a global scale to the driven targets/blendshapes
- Added "Show Laser in Desktop Mode" setting which will display the laser in the new desktop mode when pointing at interactable objects
-- I mostly use this for debugging, but you can enable it if you prefer it for immersion

[h2]Tweaks:[/h2]
- Heavily improved hand posing in desktop mode when grabbing items and interacting with them
-- The actual position/direction of the laser is now respected, posing the hand exactly so the laser is straight when at rest, and the hand is properly offset for different avatars
--- This is an important mechanism for tool support, as it allows posing the hand to aim at an exact point in the world
-- When the target point is near the face, the hand is pushed to the side and down to keep it from intersecting/obscuring the face/viewpoint
-- The hand interaction transition now tracks velocity, preventing instant snapping of the movement direction, and has tweaked velocities/smoothing
- The cursor reticle now fades away after 2 seconds of inactivity in the desktop mode

- Sliding items all the way towards the face will no longer equip/physically grab them in the new desktop mode
- Improved internal moderation tools to allow for more fine-grained control (based on feedback by the moderation team)
- Userspace laser in desktop mode now correctly overrides the world-space laser, rather than having both activate at the same time
- Holding items in desktop mode no longer uses the simulated hand's twist to rotate the item, to prevent unwanted rotations when moving the item around
-- Note that some partial rotation might still be transferred when initially grabbing
- Improved the AvatarExpressionDriver blendshape assignment heuristics
-- The heuristics now detect more keywords, improving face tracking support for Ready Player Me models and others
-- The assignment system now also has face target filtering, ignoring ambiguous blendshapes if non-ambiguous ones are present (e.g. "Mouth Smile" vs just "Smile") and preventing multiple similar blendshapes from all being assigned, which would cause over-driving (e.g. having both "Mouth Open" and "Jaw Open"); see the sketch after this list
- EyeManager parameters now use sliders to represent values within the 0...1 range
- Increased the number of decimal places on DynamicBoneChain Stiffness from 2 to 4 (based on feedback by @H3BO3)
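As a toy Python sketch of the filtering idea described above (purely illustrative; the real heuristics weigh more signals than this):

[code]
def pick_blendshape(candidates, preferred_keywords):
    """From several matching blendshape names, prefer the more specific
    ones (containing a preferred keyword) so an ambiguous name doesn't
    double up with a specific one and over-drive the same feature."""
    specific = [name for name in candidates
                if any(k in name.lower() for k in preferred_keywords)]
    pool = specific or candidates
    return pool[0] if pool else None

print(pick_blendshape(["Smile", "Mouth Smile"], ["mouth"]))  # 'Mouth Smile'
print(pick_blendshape(["Mouth Open", "Jaw Open"], ["jaw"]))  # 'Jaw Open'
[/code]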

- Updated VIVE Hand Tracking SDK to 0.10.0 (from 0.9.3)
- Updated VIVE SR Anipal SDK (for eye and face tracking) to 1.3.2.0 (from 1.3.1.1)

- Merged Czech locale additions and tweaks by @rampa_3 (UTC +1, DST UTC +2)
- Merged Korean locale tweaks by @MirPASEC

[h2]Bugfixes:[/h2]
- Fixed Interaction Laser raycast portals not working properly when the laser goes through another object that's ignored, causing UI to be unusable
-- This fixes cases of UI not being interactable in desktop mode when using portal raycast and the initial interaction hits the user's avatar head
-- This also fixes cases of UI not being interactable when the laser passes through a canvas that lets the hit pass through if it doesn't hit anything (e.g. the dash not being usable in VR when the laser passes through the notifications canvas)
- Fixed grabbing a VRIKAvatar causing the neck and hips position to be taken from the held avatar, even though it's not equipped
- Fixed Cameras rendering the overlay layer
-- This should stop the mirror facet from rendering the dash while the desktop mode is enabled (re-reported by @epicEaston197 and @Khosumi)
- Fixed DictionaryList getting into invalid state when an operation during enumeration results in a single remaining item
-- This fixes various random bugs and potential corruptions caused by using a list that's already been returned to the memory pool
- Fixed out of range inputs for the maxUsers command on headless throwing an exception (reported by @Glitch)
- Fixed RecordSyncStatus not reporting a sync error when the error is anything other than Out Of Space (based on report by @PlasmaRaven, @Elektrospy, @Honeypotbutt and @Shifty | Quality Control Lead)

2021.3.9.618 - Tweak & fix for laser reticle being too large or too small

Hey everyone, just a small patch to fix a few issues with the laser reticle being too big in some cases, so it doesn't have to wait on other stuff.

This build is compatible with the previous one, so there's no need to update immediately if you don't experience issues.

[h2]Tweaks:[/h2]
- Virtual Keyboard will no longer activate when using the new desktop mode and clicking into a text field
- Tweaked the laser reticle scaling logic in VR so it's non-linear, preventing the cursor from becoming too small when really close and scaling up more slowly at a distance (see the sketch below)
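For intuition, here's a hedged Python sketch of one possible non-linear scaling curve (the exponent and clamp values are made up, not Neos' actual tuning):

[code]
def reticle_scale(distance, base_scale=1.0, exponent=0.5,
                  min_scale=0.25, max_scale=4.0):
    """Scale sub-linearly with distance: shrinks more slowly up close
    and grows more slowly far away than a linear rule would."""
    scale = base_scale * (distance ** exponent)
    return max(min_scale, min(max_scale, scale))

for d in (0.1, 1.0, 10.0):
    print(d, round(reticle_scale(d), 3))
# 0.1 0.316   (a linear rule would give 0.1)
# 1.0 1.0
# 10.0 3.162  (a linear rule would give 10.0)
[/code]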

- Merged Japanese, Esperanto and Chinese locale updates by @Melnus
- Merged Russian locale update by @Shadow Panther [RU/EN, UTC+3]

[h2]Bugfixes:[/h2]
- Fixed cursor reticle size not compensating for user's scale (reported by @Cyro, @Delphox, @Sox and others)

Clicking, Grabbing & Scrolling in new desktop mode, GitHub reorganization

Hello and welcome to another weekly update!

This week we have some more important progress to share on the new desktop mode, which now has its core interactions implemented! You can click, grab, interact with UIs and scroll with your mouse after switching to the new desktop mode. The system will also pose your avatar's hand to make interactions look more natural.

There are still a lot of interactions and polish missing though - you can't use tools or context menus yet, and the simulated motions look rather robotic. Our focus has been on designing and building the underlying mechanisms, so there's a lot more to come before the new desktop mode is well rounded. For the time being it's still hidden behind the F8 key and fully usable only when Neos is launched in VR, but if you'd like to play with it in its current state, you can!

Thanks to some development momentum, we have also addressed some of the highly voted GitHub/Patreon issues, starting with the ones that were easier to do. For example, dynamic bones scaling incorrectly after exiting an avatar anchor is no longer an issue.

Some of the work this week has also been more in the background. We have reorganized both our private and public GitHub repositories and published some missing ones. They have all now moved to the Neos-Metaverse organization account instead of the personal Frooxius one.

This is to make the development process more robust with cloud automation, making collaboration with more developers easier and giving the repositories more official status. If you use the NeosPublic or NeosLocale repositories, make sure to update your links to the new ones!



[h2]Friday Q&A livestream[/h2]
If you missed our regular Friday livestream, you can watch the recording here. We did another usual session of Q&A, answering all your questions about Neos, and showcased some of the progress on the desktop mode. This was before the interaction system had been implemented though, so for that we recommend checking out the dedicated section below!
[previewyoutube][/previewyoutube]

[h2]Clicking, Grabbing & Scrolling in desktop mode[/h2]
We have implemented the core of the new interaction system for the new desktop mode, giving you the ability to click, interact with world UIs, grab, and scroll with your mouse wheel. This new system is built on top of Neos’ existing interaction systems that have been used for VR.

Instead of rebuilding the interactions, the desktop mode simulates the inputs that a VR user would make and feeds the system interaction targets, making the inputs easy to use with keyboard and mouse or gamepad.

[previewyoutube][/previewyoutube]

Thanks to this, the interactions are consistent between VR and desktop, with most systems in Neos not needing to distinguish which mode the user is currently in. This simplifies development both on our side and for any content builders. You may also notice that the interaction laser cursor you see in VR is the same one that's now shown in desktop, and it changes based on the context in the same way!

Not all interactions are implemented yet. So far you can click things (e.g. UIs), grab objects and scroll UIs with the mouse wheel. Using tools or context menus isn’t supported yet, but is coming soon.

Because of that, the desktop mode is still considered in heavy development and is hidden behind the F8 key, but this step brings even more usability to it. Our focus has been on building the core interaction mechanisms and the systems that power them.

Some of the latest development notes on the interaction system

One of those is also the hand posing system. When you’re on desktop and you click or grab things, the avatar's hand will be posed automatically to make the interaction look more natural. Currently it’s very rudimentary and robotic-looking, but once we focus on the polish layer, we’ll make it look more natural.

[h2]More movement options in desktop mode and VR[/h2]
One of the new additions to the desktop mode is the ability to crouch, giving you more flexibility, along with improvements to the bindings for both mouse & keyboard and gamepads. Gamepads should now provide better precision for camera look, making it easier to do fine motions, and can be used to turn in VR as well.

We have also added bindings for avatar anchors, so any vehicles and other contraptions using the primary and secondary axes can now be used with keyboard & mouse and gamepad too.

Other issues were fixed as well: for example, VR eye tracking is now ignored when in desktop mode, preventing the eyes from becoming derpy, and the VR -> desktop transition is now skipped when opening/joining a new world while already in desktop.

[h2]Fixing some high priority issues[/h2]
With the desktop mode gaining some more momentum, we have put some time into some of the highly voted GitHub/Patreon issues. Our first pass focused on the ones that were easier and faster to implement/fix.

The problem with dynamic bones scaling incorrectly after the user enters and exits an avatar anchor has finally been fixed. The LogiX node for baking meshes has also been expanded to give explicit control over adding Colliders and Grabbable components to the baked mesh.

For a full list of new features, tweaks and bugfixes, you can always check out our Discord or the Steam patch notes.

[h2]Moving public & internal repositories to organization account[/h2]
Last week we spent some time reorganizing our internal and public code repositories, to ease the onboarding process for new developers, prepare for cloud automation of the build process and make things look a fair bit more official.

As some of you have probably noticed, the public repositories (e.g. NeosPublic and NeosLocale) have moved from the Frooxius account to Neos-Metaverse, along with any of our own open source libraries and forks that we use to build Neos.

We have also published a few libraries and forks that previously weren’t on GitHub at all or were private, for example our QuantityX library that provides unit conversions and formatting (it’s also historically the first library ever written for Neos).

While this doesn’t change much for you in the short term, this is an important step as we grow our team and development process, allowing us to better iterate on Neos’ various dependencies, making the development process more robust and improving collaboration.

[h2]Sneak preview of the upcoming MTC Creator Plaza[/h2]
While the interactions for the MTC avatar lobby are still being built out, we're already putting together models for another of the environments - the Creator Plaza. This one will serve as a tutorial / sandbox for many of the building tools inside Neos VR. Here are some early previews of what it looks like so far, but please note that a lot can still change before they reach the public.



[h2]Community Highlights[/h2]
Greetings and salutations, everyone! It’s that time for the weekly update of good old community content. This week we are focusing on some maps with sweet atmosphere. We also catch up on some of the stuff the folks over at Creator Jam have been up to!

[h3]Ovation by GAWAWA[/h3]
An immaculate map made by GAWAWA. Come into this compostorium and let the divine musings of the music ebb and flow through your bones. From what I’m aware, this world was made as a new home world, and it seems a pretty grand way to start your Neos sessions! So if you ever get the chance, take the time to check out the amp and look at the awe-inspiring view. Thanks GAWAWA for the map!



[h3]Skate Park by Gearbell[/h3]
Get ready to rip your pants, folks! There’s a new skate park in town! Skate Park, made by Gearbell, sets up a nice scene where people can pull out a skateboard provided in the map, skate around and do some sweet flips! So grab a board, grab some friends and go skating! Thanks for the map, Gearbell!



[h3]Avamora Vodica by Lewis Snow[/h3]
A wonderful world between worlds, a place between the “Everything”. A bleak world that strikes a nice balance between void and existence. This place acts as a purgatory between the worlds of NeosVR. Look under your feet and see the energy of the Metaverse ebbing and flowing through the glass you stand upon. Thanks Lewis_Snow for making this wonderful map to meditate and exist between everything! The sounds are really on point too!



[h3]CJ 93: Push my Buttons![/h3]
For this Creator Jam we had a wonderful theme where folks made all kinds of contraptions with buttons! Some even modeled exotic button types, as you can see by someone making an actual “belly” button. Memes and jokes and all kinds of neat button creations in here!



[h3]CJ 92: Whatcha Looking at?[/h3]
Watch out! There’s a lot to look at, or be looked at, in this jam! Creator Jam 92 “Whatcha Looking At?” takes a stab at using the new feature in Neos where your eyes, ears, or third-person view can have a different perspective from your avatar. This is what allows people to have decoupled heads in Neos! People get a little freaky in here!



[h3]CJ 91 MacGuffin Land![/h3]
In this wonderful land of MacGuffin, there are many things to find! Maybe you can find the Temple of Cheese? Creator Jam 91: MacGuffin Land focuses on points of interest for people to discover and interact with. There are some interesting things to find here, so hopefully you find some treasure!



------------------------------------------

Anyway, that's all for this week! As usual, huge thanks for your support, whether it's on Patreon, Twitch, ICO or just by being part of the community and building awesome content! We'll have more news for you next week, so stay tuned!