
2020.11.29.1371 - Official bHaptics support, HapticVolumes & HapticPointSamplers

Official bHaptics support is here, sorry it took so long! A major chunk of the work was implementing a generalized haptics system for Neos, which is built around the concept of Haptic Volumes and Haptic Samplers. Haptic Volumes are objects you can place in your environment or on your items to serve as a source of different haptic sensations.

Haptic Samplers are then placed on your avatar based on what hardware you have available (currently the bHaptics vest, face cover and forearm sleeves are supported) and map the haptics to those devices. This way we ensure long term support for a variety of devices as they come; I'd like to adapt the controller haptics to this system soon as well (maybe in the next build or two).
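
To make that split a bit more concrete, here is a minimal conceptual sketch: a volume only describes a sensation at a point in space, while a sampler on the avatar queries the volumes around it and forwards the result to one point (motor) on whatever device is connected. All type and member names below are illustrative assumptions, not the actual Neos components.

[code]
// Conceptual sketch only - hypothetical types, not the actual Neos components.
using System;
using System.Collections.Generic;
using System.Numerics;

// A source of haptic sensation placed in the world or on an item
interface IHapticVolume
{
    // Sensation strength (0..1) at a world-space point, 0 when outside the volume
    float SampleIntensity(Vector3 worldPoint);
}

// One addressable feedback point (motor) on a physical device
interface IHapticDevice
{
    int PointCount { get; }
    void SetPointIntensity(int pointIndex, float intensity);
}

// Sits on the avatar and bridges world-space sensations to a single device point
class HapticPointSamplerSketch
{
    public Vector3 Position;     // follows the avatar bone it is attached to
    public int DevicePointIndex; // which motor this sampler feeds

    public void Update(IEnumerable<IHapticVolume> overlappingVolumes, IHapticDevice device)
    {
        float total = 0f;
        foreach (var volume in overlappingVolumes)
            total += volume.SampleIntensity(Position); // independent volumes add up
        device.SetPointIntensity(DevicePointIndex, Math.Clamp(total, 0f, 1f));
    }
}
[/code]

Since only the sampler side knows about the hardware, new devices can be supported by adding new point mappers without changing any haptic content in worlds or on items.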

The haptic volumes can also have a bunch of different filters which modulate the intensity based on distance, impact time or functions like noise or sine waves. You can also fully script those and drive them from LogiX.

All full body avatars are automatically set up with haptic parametrization and locally injected with haptic volume sources, so you can feel them if you have the right hardware. This is just the initial release, so tweaks and additions are to come.

A few other tweaks, additions and bugfixes as well.

[h2]New Features:[/h2]
- Implemented a general framework for haptic feedback devices, built on top of the physics system and colliders:

- Added HapticPointSampler, which samples all haptic sources within its radius and relays them to the corresponding point on a haptic device

- Added HapticPointMappers (currently Head, Torso and Arm), which automatically create and bind a set of HapticPointSamplers on an avatar based on the detected haptic feedback hardware
-- All full body avatars (both old and new) are automatically initialized with these components. You can adjust the parametrization afterwards if it's misaligned and save it with the avatar
-- The parametrization is defined in terms of general shape/size/hierarchy; the positioning of individual points on the body depends on the haptic hardware
-- HeadHapticPointMapper is parametrized using head size and offset. Each point is parametrized by its pitch and yaw angle relative to the center of the head, with the nose being the origin
-- TorsoHapticPointMapper is parametrized using a normalized vertical position along the spine (0...1, bottom to top), a horizontal position (-1...1, left to right) and a side (front/back). Each point is mapped to the nearest bone, so the samplers will move with the body (e.g. if you bend, the sampler points will move to match your body shape). You can adjust the front/back offsets and body width, as well as the normalized range on the spine (see the parametrization sketch below)
-- ArmHapticPointMapper is parametrized by a normalized position along the arm (0 at the shoulder, 0.5 at the elbow, 1 at the wrist) and an angular offset around the arm (with the top of the hand being the origin). You can adjust the normalized range of the bone chain, the directional vectors for proper orientation and the radius
- Added HapticVolume, which provides a source of haptics in the environment
-- This needs to be attached to the same slot as a Collider with its type set to HapticTrigger in order to work
-- You can define the type of sensation (currently only Force is interpreted, but this will be expanded later on)
-- The haptic will activate for any HapticPointSampler that intersects this collider
-- Intensity can be driven with LogiX
-- You can also use a set of filters to modulate the intensity based on internal properties of the intersection (this is updated off the main thread at a fixed rate and is thus independent of the framerate). Simply attach one or more filters to the same slot as the HapticVolume. Filters are multiplied together (see the sketch below)
-- Independent HapticVolumes are additive
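
As a rough sketch of how these combination rules could be evaluated for a single sampler point (filters on a volume multiply, separate volumes add); the delegate signature and class below are assumptions for illustration, not the actual components:

[code]
// Illustration of "filters multiply together, independent volumes are additive".
// The delegate signature and class are assumptions, not the real API.
using System;
using System.Collections.Generic;
using System.Numerics;

// A filter returns a 0..1 multiplier based on properties of the intersection
delegate float HapticFilter(Vector3 localPoint, float impactTime);

class FilteredHapticVolumeSketch
{
    public float BaseIntensity = 1f;           // e.g. driven from LogiX
    public List<HapticFilter> Filters = new(); // filters attached on the same slot

    public float Evaluate(Vector3 localPoint, float impactTime)
    {
        float intensity = BaseIntensity;
        foreach (var filter in Filters)
            intensity *= filter(localPoint, impactTime); // filters multiply together
        return intensity;
    }

    // From the sampler's point of view, independent volumes are additive
    // (in practice each volume would transform the point into its own local space)
    public static float Combine(IEnumerable<FilteredHapticVolumeSketch> volumes,
                                Vector3 localPoint, float impactTime)
    {
        float total = 0f;
        foreach (var volume in volumes)
            total += volume.Evaluate(localPoint, impactTime);
        return Math.Clamp(total, 0f, 1f);
    }
}
[/code]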
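
And a rough illustration of the torso parametrization described above, using a simplified straight spine between the hips and shoulders (the actual mapper snaps each point to the nearest bone); all names and parameters here are hypothetical:

[code]
// Hypothetical illustration of the torso point parametrization on a simplified
// straight spine; the real TorsoHapticPointMapper maps points to bones instead.
using System.Numerics;

struct TorsoPointParam
{
    public float SpinePosition; // 0..1, bottom (hips) to top (shoulders)
    public float Horizontal;    // -1..1, left to right
    public bool Front;          // front or back side of the body
}

static class TorsoParametrizationSketch
{
    // bodyWidth, frontOffset and backOffset stand in for the adjustable mapper fields
    public static Vector3 ToLocalPosition(TorsoPointParam p,
                                          Vector3 hips, Vector3 shoulders,
                                          float bodyWidth, float frontOffset, float backOffset)
    {
        Vector3 onSpine = Vector3.Lerp(hips, shoulders, p.SpinePosition);
        float sideways = p.Horizontal * bodyWidth * 0.5f;
        float depth = p.Front ? frontOffset : -backOffset;
        return onSpine + new Vector3(sideways, 0f, depth);
    }
}
[/code]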

- Added AxisDistanceHapticFilter, which modulates haptic intensity based on distance along an axis within the local space of the volume
- Added ImpactTimeHapticFilter, which modulates haptic intensity based on the elapsed time since the initial impact between the haptic point and the haptic volume
-- This can be either the impact time of the individual haptic point or global (any haptic point impacting)
- Added RadialDistanceHapticFilter, which modulates haptic intensity based on the spatial distance from the center of the haptic volume
- Added SimplexNoiseHapticFilter, which modulates haptic intensity using 3D simplex noise based on the local position of the haptic point within the volume
- Added SineHapticFilter, which modulates the haptic intensity with a sine wave driven by the individual/global impact time
-- The sine wave can also be offset based on either the radial distance or the distance along an axis, to create travelling "wave" effects (a sketch follows below)
- Added ValueNoiseHapticFilter, which randomizes the haptic intensity for each sampling instant
-- Note that the rate will depend on the internal update rate of the driver
- Added HapticManager, which ensures that all haptic point mappers are properly mapped based on their priority, and that other users' full body avatars are automatically injected with haptic volumes if AvatarHapticSourceManager is not already present
-- This is automatically attached to the root of the user. You can adjust the intensity of the haptic volumes for hands, head and other body parts as well as visualize the volumes
-- Injected haptic volumes are local-only and won't be seen by other users
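
A small sketch of the travelling wave idea from the SineHapticFilter above: offsetting the sine phase by the point's distance (radial or along an axis) makes the crests move through the volume over time. The function and parameter names are assumptions, not the component's actual fields.

[code]
// Illustration of a sine modulation with a distance-based phase offset,
// producing a wave that travels outward from the volume's center over time.
// Names and parameters are assumptions, not the actual SineHapticFilter fields.
using System;

static class SineWaveSketch
{
    // impactTime: seconds since impact, distance: radial or along-axis distance,
    // frequency: oscillations per second, waveSpeed: how fast the wave travels (> 0)
    public static float Modulate(float impactTime, float distance,
                                 float frequency, float waveSpeed)
    {
        float phase = 2f * MathF.PI * frequency * (impactTime - distance / waveSpeed);
        return 0.5f * (1f + MathF.Sin(phase)); // remapped to a 0..1 multiplier
    }
}
[/code]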

- Added AvatarHapticSourceManager, which allows creating custom haptic volumes on an avatar
-- It's strongly recommended to add the active state of your custom haptic volumes to the list on this component, as it will automatically cull them when the user is too far away
-- The specifics of this component will likely change in the future to give more fine-grained control

- Added native support for the bHaptics Tactsuit (Tactot vest, Tactal face cover (forehead haptics) and Tactosy for Arms) (thanks to hardware provided by @bhaptics_jen; support requested by @Alex the pet peeve avali 🐦, @AlienInArea51 (MR-Alex), @Enverex, @casoliv, @Danyy59 and others in the past)

- Added UserVoiceMode node, which provides the voice mode of a given user (requested by @Raith and @Alex from Alaska)

[h2]Tweaks:[/h2]
- AvatarNameplateVisibilityDriver will now show the nametag when the avatar is not equipped (unless all nametags are hidden), allowing it to be adjusted and tweaked (based on feedback by @Turk)

[h2]Bugfixes:[/h2]
- Fixed incorrect title of the ReflectionProbeWizard window, changing it to "Reflection Probe Wizard" for better clarity (based on feedback by @ProbablePrime)
- Fixed user activity being incorrectly detected in some cases (e.g. when controller tracking is lost or there's a small amount of jitter), causing the auto-away status to not activate
- Fixed hijacking of physical touch interactions to spoof interactions with developer interfaces (reported by @Cyro)