Buffer-based Haptics are broken for Quest 2
Haptic controller vibrations are a crucial tool for creating game immersion. While simple "buzzes" (a constant vibration amplitude for a specified duration) can be serviceable, modern controllers allow developers to play custom waveforms through the controllers, such as sawtooth waves, sine waves, and even AudioClips. This adds texture and nuance to the effect, and is the superior way to play haptics in 2022.

After much trying, it appears to me that buffer-based haptics are fully broken for Quest 2 controllers in the Oculus Integration for Unity. I have tried three ways.

Using the generic Unity XR system:

```csharp
// This would be loaded with samples to create a custom vibration waveform
byte[] samples = new byte[1000];

var handDevice = InputDevices.GetDeviceAtXRNode(hand == Hand.right ? XRNode.RightHand : XRNode.LeftHand);

// Pass the buffer to the controller if its "haptic capabilities" say that it supports buffer-based haptics
if (handDevice.TryGetHapticCapabilities(out var capabilities) && capabilities.supportsBuffer)
    handDevice.SendHapticBuffer(0, samples);
```

With Rift S and Quest 1 Touch controllers, the above code runs successfully. With Quest 2 and Touch 2 controllers, "supportsBuffer" is false in the reported capabilities, and the samples cannot be sent. I know it is incorrect that the Touch 2 controllers do not support the feature, as I have in a few instances been able to send a buffer to Touch 2 controllers using the method below.

Using OVRHaptics:

```csharp
var ovrHapticClip = new OVRHapticsClip(myAudioClipToTurnIntoVibration);
var channel = OVRHaptics.RightChannel;
channel.Queue(ovrHapticClip);
```

The OVRHaptics class has a function for sending a haptic buffer through a "channel" (controller). I can actually get this method to work in a test scene. However, it requires me to switch the OVR plugin to a legacy setting (Oculus/Tools/OpenXR/Switch to Legacy OVRPlugin (with LibOVR and VRAPI backends)). If I am not in this setting, the function does nothing.
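As an aside on the XR snippet above: the post leaves the buffer contents unspecified ("This would be loaded with samples"). One way such a buffer might be filled is sketched below, assuming each byte is an 8-bit amplitude sample where 0 is off and 255 is full intensity (an assumption; the interpretation is not confirmed by the post):

```csharp
// Sketch: fill a haptic sample buffer with a few cycles of a sine wave.
// The 0-255 interpretation of each byte is an assumption.
byte[] samples = new byte[1000];
const int cycles = 8; // number of sine cycles across the whole buffer
for (int i = 0; i < samples.Length; i++)
{
    double phase = 2.0 * Math.PI * cycles * i / samples.Length;
    // Map sin's -1..1 range into 0..255
    samples[i] = (byte)(127.5 * (1.0 + Math.Sin(phase)));
}
```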
In another project, if I set the project to this legacy setting and try to send the haptics buffer, the engine gets stuck in an infinite loop. According to what I can find online, OVRHaptics is slated for deprecation anyway, so it doesn't seem like a good solution.

Using OVRInput:

My understanding is that OVRInput is the modern, sanctioned way of sending haptics to Oculus controllers without going through the generic Unity XR system, and it contains a method for a "simple buzz" (amplitude and duration parameters only) via OVRInput.SetControllerVibration. However, it seems to lack any functionality for sending in a custom buffer, unlike the deprecated OVRHaptics.

I would love any advice regarding ways I can get this feature to work. I figure I'm either wrong about some of my conclusions above, or the feature is fully broken at the moment -- either way, I'd love to know. Thanks in advance for your help!

UPDATE: Getting the Touch haptics to activate? SOLVED
Hi, all. I've been scouring the web all day for Touch haptic examples in Unity, but it's been difficult finding any beginner-level example scripts. Here's roughly what I have in "MyGunScript", which is attached to my "Rifle" game object:

```csharp
public class MyGunScript : MonoBehaviour
{
    OVRHapticsClip myHapticsClip;
    public AudioClip myAudioClip;

    void Start()
    {
        myHapticsClip = new OVRHapticsClip(myAudioClip);
    }

    void Update()
    {
        if (OVRInput.Get(OVRInput.Button.One))
        {
            OVRHaptics.RightChannel.Mix(myHapticsClip);
        }
    }
}
```

Then I dragged a very loud gunshot sound into the public "myAudioClip" field in the script. However, I feel no vibration in either the left or right Touch controller when I pull the trigger during testing. Any clue as to what I'm doing wrong here? Thanks!

Oculus Integration Unity Asset - Haptics not working on Rift S
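In the thread below, the poster notes that the generic UnityEngine.XR path works on Rift S where the OVRInput calls do not. For reference, that working path looks roughly like this (a sketch of the generic XR input API; amplitude is on a 0-1 scale and the 0.5f/0.25f values are arbitrary example choices):

```csharp
using UnityEngine.XR;

// Sketch: send a one-shot haptic impulse through the generic Unity XR input API.
var device = InputDevices.GetDeviceAtXRNode(XRNode.RightHand);
if (device.TryGetHapticCapabilities(out var caps) && caps.supportsImpulse)
{
    // channel 0, half amplitude, for a quarter of a second
    device.SendHapticImpulse(0, 0.5f, 0.25f);
}
```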
I'm trying to use the Oculus Integration Unity asset (version 1.42) to get haptic feedback on the Rift S controllers. In the past I've used OVRHaptics to do this, but I see it has been deprecated now (even though I've tried OVRHaptics, and it doesn't work on Rift S either). So I'm trying to use OVRInput.SetControllerVibration, and I'm getting no haptic feedback at all on either controller.

If I use the built-in UnityEngine.XR InputDevice.SendHapticImpulse, I get haptic feedback on the Rift S just fine, so the haptics are working on the controllers and within Unity. But no matter what I try, I cannot get OVRInput.SetControllerVibration or the OVRHaptics approach working at all on the Rift S.

Am I missing something, or is this just broken somehow in the SDK?

Haptics SDK for Unity not working
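A note on the thread below: HapticClipPlayer in the Haptics SDK wraps native resources, so a common pattern is to dispose players when their component is destroyed, which can also help when the SDK complains about repeated initialization across play sessions. A minimal sketch, with the caveat that the exact enum passed to Play has varied across SDK versions (Controller.Left is assumed here):

```csharp
using Oculus.Haptics;
using UnityEngine;

public class HapticClipTestSketch : MonoBehaviour
{
    [SerializeField] private HapticClip _clip;
    private HapticClipPlayer _player;

    private void Start()
    {
        _player = new HapticClipPlayer(_clip);
        _player.Play(Controller.Left); // enum name assumed; older SDK versions used a different type
    }

    private void OnDestroy()
    {
        // Release the player's native resources when this component goes away.
        _player?.Dispose();
    }
}
```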
I'm trying out the new Haptics SDK for Unity. I want to test it by playing a single haptic clip, but when I use this script I get an error and no haptics (testing in Unity editor 2022.2.13, with Air Link to a Quest 2).

Script:

```csharp
public class OculusHapticClipTest : MonoBehaviour
{
    [SerializeField] private HapticClip _hapticClip;
    private HapticClipPlayer _hapticClipPlayer;

    private void Start()
    {
        _hapticClipPlayer = new HapticClipPlayer(_hapticClip);
        _hapticClipPlayer.Play(HapticInstance.Hand.Left);
    }
}
```

Error:

```
Error: HapticsSDK has already been initialized
UnityEngine.Debug:LogError (object)
Oculus.Haptics.HapticInstance:Initialize () (at ./Packages/com.meta.xr.sdk.haptics/Runtime/HapticInstance.cs:101)
Oculus.Haptics.HapticInstance:MakeClipFromJson (string) (at ./Packages/com.meta.xr.sdk.haptics/Runtime/HapticInstance.cs:127)
Oculus.Haptics.HapticClipPlayer:.ctor (Oculus.Haptics.HapticClip) (at ./Packages/com.meta.xr.sdk.haptics/Runtime/HapticClipPlayer.cs:77)
OculusHapticClipTest:Start () (at Assets/CombatProto/Impact CFX Haptic/OculusHapticClipTest.cs:14)
```

I can't find anything else that may be calling new HapticClipPlayer() anywhere else in the scene. Any tips? Thanks

Unreal 5.3.2 OculusXRInput crash caused by haptics
Hey, I've been having an issue with using haptics. I upgraded from Unreal 5.1 to 5.3.2 and updated the MetaXR integration package to v60.0. After the upgrade, using "Set Haptics By Value" or "Play Curve Haptic Effect" both cause a crash in the editor and on the Meta Quest 2 and 3. When I remove these from my Blueprint, the crash does not occur. Did something change in the haptic functionality? Am I missing something? Sincerely, Raymond

Native Quest haptics / vrapi_SetHapticVibrationBuffer
I am adding haptics to my Native SDK project for Go/Quest. I find in the headers a vrapi_SetHapticVibrationSimple where you pass in 0-1, so that's straightforward. However, I also find a vrapi_SetHapticVibrationBuffer that takes a start time, a number of samples, "true if this is the end of the buffers being sent", and a series of unsigned char samples. How do I use this? This looks very useful, but I am looking hard through the Native Quest documentation and there is no documentation for haptics in the Native API. Things I would like to know:

* Can I use this feature? The only reference I find to haptics buffers on the website is some older documentation for Oculus Touch on desktop, which says the feature is deprecated. But it is used in the VrController sample in the Native SDK, so I assume it is deprecated on desktop only and not on mobile.
* Is there an upper bound on the allowed duration/length of an ovrHapticBuffer?
* What is the exact interpretation of the sample values? 0 is no intensity and 255 is max intensity, I assume?
* What is the memory ownership of the HapticBuffer member of ovrHapticBuffer? Will vrapi retain a pointer to this buffer? May I safely free/alter the buffer immediately after calling vrapi_SetHapticVibrationBuffer?
* Is there a value (such as 0) for BufferTime which will be interpreted as "begin playing immediately"?
* Can I use this feature to set a vibration frequency? If so, how? The Unity and Unreal Quest documentation have "frequency" parameters on the haptics methods for those APIs, but vrapi_SetHapticVibrationSimple does not.
* Should I expect the feature to work on Go? Are there differences in the haptic behavior between Go and Quest?

Getting a crash with OVRHapticsClip in Unity
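One thing worth checking in the situation below: OVRHapticsClip reads raw sample data out of the AudioClip, which requires the clip's audio data to actually be loadable from script. A defensive sketch (the LoadAudioData/loadState guard is a general Unity pattern, not something the Oculus documentation prescribes for this case):

```csharp
using UnityEngine;

// Sketch: guard clip data access before handing an AudioClip to OVRHapticsClip.
OVRHapticsClip myHapticsClip = null;
if (hapClip != null)
{
    hapClip.LoadAudioData();
    if (hapClip.loadState == AudioDataLoadState.Loaded)
    {
        myHapticsClip = new OVRHapticsClip(hapClip, 0);
    }
    else
    {
        Debug.LogWarning("Audio data not loaded; check the clip's import settings (Load Type).");
    }
}
```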
Hey everybody! Not sure what I'm doing wrong (unless it doesn't like the clips I've tried), but I get a crash every time I try to run my game with the line:

```csharp
myHapticsClip = new OVRHapticsClip(hapClip, 0);
```

I have tried a few clips now, but the editor just goes non-responsive when I hit play. Any ideas, or has anyone seen this before? Thanks in advance, Matt K

Meta Haptics Studio Feedback and Question
We have been using Meta Haptics Studio with success. However, when manually editing the .haptic files there seems to be no way to shorten or lengthen the total haptic sound; I can see in the SDK that the length used is the total length (not the selected length). Is there a hidden feature, or is this not possible? As this is a tool feedback forum, these are the feature requests we would love in Meta Haptics Studio:

* Trim the length of the haptic
* Add length to the haptic (even just a length setting)
* Assign relative paths from a project source to each folder to export (we use folders that contain feature sets, not lots of function-based folders, so a footstep haptic would be in the character/haptics folder, not all under one haptics folder)
* Removal of source files (no need for them once the design is good)
* Sub-groups of haptics (we have a lot)
* Export as WAV (for other systems, but allows us to focus on Quest first)

Touch haptics amplitude/intensity - what is the range, and in what units?
Hi - I'm developing an experiment with Quest 2 in Unity that uses haptics with variable intensity. In Unity, we're given a range of 0 to 1, where 1 is the maximum intensity the Touch controller can deliver. I assume this covers the same range as we had on the Rift S, where it was expressed as between 0 and 255.

For my experiment, I really need to know what intensity is being delivered at a given level - e.g. what is the actual intensity (or the amplitude) of vibrations at 0.2, 0.5, etc. on the scale? In other experiments I've read about, the intensity of a vibration is measured in metres per second squared, which I believe ultimately depends on the amplitude of the vibration (in metres). I've searched the Meta documentation but can't find any information anywhere. Can anyone tell me the range either of intensity (m/s^2) or amplitude (m)? Also, is the 0 to 1 scale linear or something else? Alternatively, can anyone point to any documentation that would help, please?
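The only conversion that can be stated with confidence here is the numeric one between the two API scales mentioned above; the physical amplitude or acceleration behind a given value does not appear to be documented and would need to be measured (e.g. with an accelerometer strapped to the controller). A sketch of that numeric mapping, assuming both scales refer linearly to the same commanded value, which is itself unverified:

```csharp
using UnityEngine;

// Sketch: convert between the Rift-era 0-255 byte scale and Unity's 0-1 float scale.
// This says nothing about physical amplitude (m) or acceleration (m/s^2),
// and the linearity of either scale is an assumption.
static float ByteToUnit(byte b) => b / 255f;
static byte UnitToByte(float f) => (byte)Mathf.RoundToInt(Mathf.Clamp01(f) * 255f);
```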