Official bHaptics integration, Haptic Feedback Framework, new team member
Welcome to another weekly update!
This week we have added official support for bHaptics hardware to Neos, featuring a brand new haptics framework. This system is designed to let you easily enrich your Neos content with different types of haptic feedback and have it work across a variety of different devices.
This week has been just as exciting and creative as the last! For community highlights we have cats, singing, and pots!

[h2]Welcome Ryuvi to the team[/h2]
As many of you have probably already noticed, we had another community member join our team! Say hello to Ryuvi, our new technical artist! Thanks to his help with the new MTC 2.0 and his excellent MMC entry, we have decided to fully onboard him onto the team, where he'll work with Aegis and RueShejn on graphical design and content creation.

[h2]Haptic Volumes and Filters[/h2]
[previewyoutube][/previewyoutube]
At the core of the new support for haptics is a new component called HapticsVolume. You can attach this component to any collider, which will turn that collider into a source of haptic feedback. Each haptic volume can provide a specific type of sensation, like force, temperature, pain and so on, depending on what the hardware supports. Currently only force is supported.
While you can modulate the intensity freely using LogiX, like any other system in Neos, you can also utilize a set of Haptic Filters for more fine-grained control that's not tied to the game's update loop.
For example, the intensity can be modulated based on the time of the initial impact, either per individual haptic point or for the volume impact as a whole. This allows you to model things like impact forces (e.g. when a user gets shot or an explosion happens) by simply attaching a haptic volume to the user alongside other effects like particles, blood or a forcefield, giving the user a strong initial jolt and a potentially low lingering force.
Similarly, you can attach haptic volumes to weapons or environmental pieces. Intensity can be modulated based on relative position, noise and other properties. For example, you could cover a virtual hot tub with a box collider and use simplex noise to simulate the turbulence of the water. All of these fine-grained filters are updated off the main thread at a fixed rate, which should help ensure enough precision to create different effects.
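To make the idea of time-based modulation more concrete, here's a minimal sketch of a filter that gives a strong initial jolt decaying towards a low lingering force, sampled at a fixed rate the way the off-thread filters are. All names and numbers here (ImpactDecayFilter, FIXED_RATE_HZ, the decay curve) are hypothetical illustrations of the concept, not actual Neos components or LogiX nodes.
[code]
import time

FIXED_RATE_HZ = 100  # hypothetical fixed sampling rate, independent of the game's update loop

class ImpactDecayFilter:
    """Strong initial jolt that decays towards a low lingering force."""
    def __init__(self, jolt=1.0, lingering=0.1, half_life=0.25):
        self.jolt = jolt              # intensity right at the moment of impact
        self.lingering = lingering    # intensity it settles to afterwards
        self.half_life = half_life    # seconds for the jolt to halve
        self.impact_time = None

    def on_impact(self, now):
        # Record the time of the initial impact (per haptic point or per volume).
        self.impact_time = now

    def intensity(self, now):
        if self.impact_time is None:
            return 0.0
        elapsed = now - self.impact_time
        decay = 0.5 ** (elapsed / self.half_life)   # exponential falloff over time
        return self.lingering + (self.jolt - self.lingering) * decay

# Fixed-rate sampling loop, analogous to the filters running off the main thread.
f = ImpactDecayFilter()
f.on_impact(time.monotonic())
for _ in range(5):
    print(round(f.intensity(time.monotonic()), 3))
    time.sleep(1.0 / FIXED_RATE_HZ)
[/code]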
Long term we plan to make the fine-grained control over the modulation fully programmable with LogiX, but that requires certain extensions to the system to be done first.
[h2]Haptic Point Samplers[/h2]
The other part of the equation for the haptic subsystem is haptic samplers. These essentially probe the environment for haptic volumes, sample the intensity of the different sensations and relay them to the haptic hardware for interpretation.
Each haptic device (currently the bHaptics vest, forearm sleeves and face cover) is then abstracted into a set of points with a body mapping parametrization. This parametrization is then picked up by point mappers on the avatar, which create temporary haptic point samplers based on the hardware present on your system.
Currently there are 3 parametrization types - torso, head and arms - but more will come in the future. Both new and existing avatars are automatically initialized with these mappers using estimated values. You can adjust them so they fit your avatar and its body shape better.
The torso points are automatically mapped to the nearest bone, making the set of them follow the shape of your spine, rather than staying rigidly flat. When combined with full body tracking, this should provide a good degree of realism.
The benefit of this system is that once calibrated, the avatars will work with different haptic hardware and different combinations, automatically picking up and mapping the available points. All you need to do is specify the dimensions of the avatar's head, torso, arms and other body parts and the system will do the rest, future-proofing the whole system.
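As a rough illustration of the point-mapping idea, the sketch below describes a device's motors in normalized body coordinates and lets an avatar-side mapper scale them by that avatar's own dimensions. The class names and the flat 2D torso mapping are assumptions made purely for this example; the real mappers also snap points to the nearest bone, as described above.
[code]
from dataclasses import dataclass

@dataclass
class DevicePoint:
    u: float   # 0..1 across the torso width (device-defined parametrization)
    v: float   # 0..1 up the torso height

@dataclass
class TorsoMapper:
    width: float    # torso dimensions supplied for this particular avatar
    height: float

    def to_local(self, p: DevicePoint):
        # Scale the normalized device point onto this avatar's torso, centered
        # horizontally; a real mapper would also follow the spine bones.
        return ((p.u - 0.5) * self.width, p.v * self.height)

# A simple 2x2 grid of vest motors mapped onto one avatar's torso.
vest_points = [DevicePoint(u, v) for u in (0.25, 0.75) for v in (0.3, 0.7)]
mapper = TorsoMapper(width=0.4, height=0.6)
for point in vest_points:
    print(mapper.to_local(point))
[/code]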
[h2]bHaptics Integration & Avatar haptics[/h2]
Thanks to the two subsystems above and a devkit kindly provided by bHaptics' developer relations team, we were able to integrate their hardware natively with Neos. If you have the Tactot vest, Tactal face cushion or Tactosy for arms, Neos will automatically pick them up and supply them with haptic sources from any environment or items that were set up with haptic volumes.
For avatars, the support is fully automatic out of the box, assuming they're a full body (IK) avatar. The avatar's body colliders are automatically utilized and locally injected with haptic volumes. By default the hands and head use 5 % intensity, while the rest of the body uses 1 %. This will likely be changed and tweaked as we go; you can play with it yourself by modifying the properties on the HapticsManager at your root in the scene.
You can create custom avatar haptics by attaching an AvatarHapticSourceManager anywhere on your avatar and providing it with a list of the active states of the HapticVolumes placed on your avatar. Thanks to this system you will automatically feel other users - not just their hands, but their whole bodies as well.
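Conceptually, a sampler probes the haptic volumes overlapping each mapped point and combines their intensities before handing them to the hardware. The sketch below shows that idea with simple sphere volumes at the default 5 % and 1 % intensities; the classes and the clamped-sum combination rule are assumptions for illustration only, not Neos internals.
[code]
from dataclasses import dataclass

@dataclass
class SphereHapticVolume:
    center: tuple       # (x, y, z) in world space
    radius: float
    intensity: float    # e.g. 0.05 for hands/head, 0.01 for the rest of the body

    def sample(self, point):
        # Return this volume's intensity if the sampled point lies inside it.
        dx, dy, dz = (p - c for p, c in zip(point, self.center))
        inside = dx * dx + dy * dy + dz * dz <= self.radius ** 2
        return self.intensity if inside else 0.0

def sample_point(point, volumes):
    # Combine contributions from overlapping volumes; clamping the sum to [0, 1]
    # is an assumption made for this sketch.
    return min(1.0, sum(v.sample(point) for v in volumes))

volumes = [
    SphereHapticVolume((0.0, 1.2, 0.0), 0.3, 0.05),   # "hand/head" volume at 5 %
    SphereHapticVolume((0.0, 1.0, 0.0), 0.5, 0.01),   # "body" volume at 1 %
]
print(sample_point((0.0, 1.1, 0.0), volumes))          # point covered by both volumes
[/code]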
[h2]Future Proofing the haptics system[/h2]
A big part of the integration was making the haptics system future proof, so that content created today will keep working with new haptic devices without changes, and providing abstractions so creators don't need to worry about implementation specifics.
We'd like to incorporate more sensations, like simulating temperature, pressure and so on, once hardware for those becomes accessible. We hope that exposing haptics this way will also make it easier for everyone to start experimenting and prototyping with these systems.
What we’ll be doing soon is adapting the controller haptics to this system as well. That way you can utilize this system even if you don’t have any haptic suit and more easily create environmental effects to enhance your creations.
[h2]Community Highlights[/h2]
[h3]Cats in Neos[/h3]
Cats have invaded Neos! They come in many forms: from a BreadCat attached to a tracker, brought to you by Mentalish, to a Kitty Kat Marching Band parading through our worlds thanks to Enverex! On top of all this cattiness we had a Catssss Creator Jam hosted by Medra, where many a feline foe or friend was created that day!

[previewyoutube][/previewyoutube]
[h3]O Mio Babbino Caro opera performance by Neivi[/h3]
This week we also had a VR opera performance by the wonderful Neivi! Here she uses an HTC Vive lip tracker to bring extra expression and presence to her performance!
[previewyoutube][/previewyoutube]
[h3]Pot maker by Lewis[/h3]
Our Lead Audio Designer also made a fun new toy this week! Find out all about it in his video showcasing its wonderful uses. Ever just wanna make a happy little pot?
[previewyoutube][/previewyoutube]
--------------------------------
Anyway, that's everything for this week's highlights! If you'd like to know more, check out the #neos-updates channel on our official Discord to see all the smaller additions, tweaks and bugfixes, as well as many other amazing creations by our community. See you at the next one!
