Buffer-based Haptics are broken for Quest 2
Haptic controller vibrations are a crucial tool for creating game immersion. While simple "buzzes" (a constant vibration amplitude for a specified duration) can be serviceable, modern controllers allow developers to play custom waveforms through the controllers, such as sawtooth waves, sine waves, and even AudioClips. This adds texture and nuance to the effect, and is the superior way to play haptics in 2022.

After much trying, it appears to me that buffer-based haptics are fully broken for Quest 2 controllers in the Oculus integration for Unity. I have tried three ways.

Using the generic Unity XR system:

```csharp
byte[] samples = new byte[1000]; // This would be loaded with samples to create a custom vibration waveform
var handDevice = InputDevices.GetDeviceAtXRNode(hand == Hand.right ? XRNode.RightHand : XRNode.LeftHand);

// Pass the buffer to the controller if its "haptic capabilities" say that it supports buffer-based haptics
if (handDevice.TryGetHapticCapabilities(out var capabilities) && capabilities.supportsBuffer)
    handDevice.SendHapticBuffer(0, samples);
```

Using Rift S and Quest 1 Touch controllers, the above code runs successfully. Using Quest 2 and Touch 2 controllers, "supportsBuffer" is false on the capabilities struct, and the samples cannot be sent. I know it is incorrect that the Touch 2 controllers do not support the feature, as I have in a few instances been able to send a buffer to Touch 2 controllers using the method below.

Using OVRHaptics:

```csharp
var ovrHapticClip = new OVRHapticsClip(myAudioClipToTurnIntoVibration);
var channel = OVRHaptics.RightChannel;
channel.Queue(ovrHapticClip);
```

The OVRHaptics class has a function for sending a haptic buffer through a "channel" (controller). I can actually get this method to work in a test scene. However, it requires me to put the OVR plugin in a legacy setting (Oculus/Tools/OpenXR/Switch to Legacy OVRPlugin (with LibOVR and VRAPI backends)). If I am not in this setting, the function does nothing.
In another project, if I set the project to this setting and try to send the haptics buffer, the engine gets stuck in an infinite loop. According to what I can find online, the OVRHaptics class is slated for deprecation anyway, so it doesn't seem like a good solution.

Using OVRInput:

My understanding is that OVRInput is the modern, sanctioned way of sending haptics to Oculus controllers without going through the generic Unity XR system, and it contains a method for a "simple buzz" (frequency and amplitude parameters only) via OVRInput.SetControllerVibration. However, it seems to lack any functionality for sending in a custom buffer, unlike the deprecated OVRHaptics.

I would love any advice regarding ways I can get this feature to work. I figure I'm either wrong about some of my conclusions above, or the feature is fully broken at the moment; either way, I'd love to know. Thanks in advance for your help!
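As a coarse stopgap while buffer haptics are unavailable, one could approximate a waveform by stepping the amplitude passed to OVRInput.SetControllerVibration from a coroutine. This is only a sketch, not true buffer-based haptics: the class and method names below are illustrative, the update granularity is far coarser than a real sample buffer, and it assumes an OVRManager is present in the scene.

```csharp
using System.Collections;
using UnityEngine;

public class HapticFallback : MonoBehaviour
{
    // Approximates a waveform envelope by periodically resetting the vibration amplitude.
    // "amplitudes" holds the envelope samples in [0, 1]; "stepSeconds" is the hold time per sample.
    public IEnumerator PlayEnvelope(float[] amplitudes, float stepSeconds, OVRInput.Controller controller)
    {
        foreach (var amplitude in amplitudes)
        {
            OVRInput.SetControllerVibration(1f, Mathf.Clamp01(amplitude), controller);
            yield return new WaitForSeconds(stepSeconds);
        }
        // Zero out the vibration when the envelope finishes.
        OVRInput.SetControllerVibration(0f, 0f, controller);
    }
}
```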
Touch vibration in unity(C#)?

Hi, I just keep crashing my game trying out different ways to enable Touch vibration. Does anyone know how to do it correctly in Unity 5?

```csharp
void Start()
{
    HapticsClip = new OVRHapticsClip(VibeClip);
}

void RHit()
{
    OVRHaptics.RightChannel.Preempt(HapticsClip);
}
```

In this example, VibeClip is an audio clip less than a second long. This crashes Unity, and so does:

```csharp
OVRHaptics.Channels[1].Mix(new OVRHapticsClip(VibeClip));
```
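The crash pattern above is consistent with OVRHaptics.Config not yet being initialized when the clip is constructed. A defensive sketch, reusing the field names from the snippets above and assuming an OVRManager exists in the scene, is to build the clip lazily and refuse to do so until the config reports a non-zero sample rate:

```csharp
using UnityEngine;

public class HapticOnHit : MonoBehaviour
{
    public AudioClip VibeClip;          // short clip to convert into a vibration waveform
    private OVRHapticsClip HapticsClip; // built lazily, not in Start()

    void RHit()
    {
        // Converting an AudioClip divides by OVRHaptics.Config.SampleRateHz,
        // which misbehaves if the haptics config is still zeroed out.
        if (OVRHaptics.Config.SampleRateHz == 0)
        {
            Debug.LogWarning("OVRHaptics not initialized yet; skipping vibration.");
            return;
        }

        if (HapticsClip == null)
            HapticsClip = new OVRHapticsClip(VibeClip);

        OVRHaptics.RightChannel.Preempt(HapticsClip);
    }
}
```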
SetControllerVibration not working

Hello, I'm trying to use the call below, but for some reason my gamepad (Xbox controller) is not vibrating. Am I doing something wrong?

```csharp
OVRInput.SetControllerVibration(1f, 0.8f, OVRInput.Controller.All);
```

Here's my setup: Unity v2017.2.0f3, Oculus Utilities v1.18.1, OVRPlugin v1.18.1, SDK v1.19.0. Thanks!
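For what it's worth, OVRInput.SetControllerVibration is aimed at the Touch controllers, and it is unclear whether gamepad rumble is routed through it at all. A minimal sketch targeting a Touch controller explicitly, assuming an OVRManager is in the scene so OVRInput is being updated:

```csharp
// Start vibrating the right Touch controller (frequency and amplitude in [0, 1]).
OVRInput.SetControllerVibration(1f, 0.8f, OVRInput.Controller.RTouch);

// The vibration times out on its own after a short period;
// call again with zeros to stop it sooner.
OVRInput.SetControllerVibration(0f, 0f, OVRInput.Controller.RTouch);
```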
The "OVRHapticsClip" can't work; help me write some example code.

If I create an "OVRHapticsClip" from an AudioClip, it crashes Unity; there is an infinite loop in the "while" below:

```csharp
private void InitializeFromAudioFloatTrack(float[] sourceData, double sourceFrequency, int sourceChannelCount, int sourceChannel)
{
    double stepSizePrecise = sourceFrequency / OVRHaptics.Config.SampleRateHz;
    int stepSize = (int)stepSizePrecise;
    double stepSizeError = stepSizePrecise - stepSize;
    double accumulatedStepSizeError = 0.0f;
    int length = sourceData.Length;

    Count = 0;
    Capacity = length / sourceChannelCount / stepSize + 1;
    Samples = new byte[Capacity * OVRHaptics.Config.SampleSizeInBytes];

    int i = sourceChannel % sourceChannelCount;
    while (i < length)
    {
        if (OVRHaptics.Config.SampleSizeInBytes == 1)
        {
            WriteSample((byte)(Mathf.Clamp01(Mathf.Abs(sourceData[i])) * System.Byte.MaxValue)); // TODO support multi-byte samples
        }
        i += stepSize * sourceChannelCount;
        accumulatedStepSizeError += stepSizeError;
        if ((int)accumulatedStepSizeError > 0)
        {
            i += (int)accumulatedStepSizeError * sourceChannelCount;
            accumulatedStepSizeError = accumulatedStepSizeError - (int)accumulatedStepSizeError;
        }
    }
}
```

I found the crux of the problem: "OVRHaptics.Config.SampleRateHz" is zero! In fact, every value in "OVRHaptics.Config" is zero, so I can't get the right values from it. I just want to make the Touch controllers vibrate; can somebody show me how to use the vibration API with some correct example code? "OVRInput.SetControllerVibration" is not what I need, because it can't control the vibration duration. Please.
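Since every field of OVRHaptics.Config reads zero until the plugin has initialized, one workaround sketch is to defer clip creation until the config is populated. This assumes an OVRManager in the scene and the legacy OVRPlugin backend mentioned in the first thread above; the field name is illustrative:

```csharp
using System.Collections;
using UnityEngine;

public class DeferredHaptics : MonoBehaviour
{
    public AudioClip vibeClip; // short clip to convert into a vibration waveform

    IEnumerator Start()
    {
        // OVRHaptics.Config is populated during plugin startup; creating an
        // OVRHapticsClip while SampleRateHz is still zero hangs the conversion loop.
        while (OVRHaptics.Config.SampleRateHz == 0)
            yield return null;

        var hapticsClip = new OVRHapticsClip(vibeClip);
        OVRHaptics.RightChannel.Preempt(hapticsClip);
    }
}
```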