Turning off HMD tracking (all 6 DoF)
Hello there, is there any way to turn off all of the degrees of freedom? I want to use the Oculus SDK or the Unity XR Interaction Toolkit to cast something from a Unity scene, and I do not want it to rotate or move. In other words, I want to turn off HMD tracking completely. Is there any way to achieve this? Thanks.
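One approach (a sketch under assumptions, not an official Oculus recipe): in Unity you can stop the head pose from driving the camera by disabling the TrackedPoseDriver component on the camera, and additionally pinning the camera transform in LateUpdate as a belt-and-braces override. The component name is the standard Unity XR one; the exact rig setup depends on your project.

```csharp
using UnityEngine;
using UnityEngine.InputSystem.XR; // TrackedPoseDriver (Input System variant)

// Attach to the XR camera. Disables the pose driver so neither HMD
// position nor rotation moves the camera, then pins the camera to a
// fixed local pose. Assumes the camera has a TrackedPoseDriver.
public class FreezeHmdTracking : MonoBehaviour
{
    private Vector3 frozenPosition;
    private Quaternion frozenRotation;

    void Start()
    {
        var driver = GetComponent<TrackedPoseDriver>();
        if (driver != null)
            driver.enabled = false; // stop HMD pose updates entirely

        frozenPosition = transform.localPosition;
        frozenRotation = transform.localRotation;
    }

    // LateUpdate runs after any remaining tracking updates, so the
    // override wins even if something else still writes the pose.
    void LateUpdate()
    {
        transform.localPosition = frozenPosition;
        transform.localRotation = frozenRotation;
    }
}
```

Be aware that the compositor still reprojects using the real head pose, so a fully frozen camera can be quite uncomfortable for the wearer.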
App Lab, VRC.Quest.Functional.5, The app appears to only support 3DOF, Using OVRCameraRig

Hi, my app has failed to publish to App Lab with the error: 'The app appears to only support 3DOF. Apps on the Quest platform must support 6DOF and respond to the HMD's positional tracking as well as orientation.' (error: VRC.Quest.Functional.5). I have uploaded three apps to App Lab using exactly the same build from Unity, changing only the cube map renders, and all the other uploads have been fine. I am running Unity 2019.4.13f1 and using the OVRCameraRig because I need per-eye skyboxes for my stereoscopic renders (I have tried the new XR camera, but I need unique per-eye skyboxes). Everything functions well within the app. I only need 3DOF, and I presume my app is technically 6DOF since it was developed for Quest; it is just unnoticeable because the camera only views skyboxes and the only physical 3D geometry is clickable buttons. NOTE: if I turn on position tracking on the camera, the app feels poor, with the buttons moving slightly. Is there another setting I have missed? Is this a recent test that other apps have passed before? Or is this just a random test failure, and should I resubmit with a note saying I only need 3DOF for the client? I have had apps fail before due to icon pixel size and had to add a note saying it was exactly what was required before it passed. Could anyone give some guidance?
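A common workaround (a sketch, with assumed names, not a guaranteed fix for the VRC check): leave the HMD's positional tracking enabled so the app genuinely responds to 6DOF, and instead have the button panel gently follow the head position, so small head translations no longer read as the UI "moving". Here "head" would be the CenterEyeAnchor of the OVRCameraRig.

```csharp
using UnityEngine;

// Attach to the world-locked button panel. The panel tracks the head's
// positional component only; its rotation stays fixed in the world.
public class FollowHeadPosition : MonoBehaviour
{
    public Transform head;       // e.g. CenterEyeAnchor (assumed setup)
    public float smoothing = 5f; // higher = snappier follow

    private Vector3 offset;

    void Start()
    {
        // Remember where the panel sits relative to the head at startup.
        offset = transform.position - head.position;
    }

    void LateUpdate()
    {
        // Chase the head's position; rotation is deliberately untouched.
        Vector3 target = head.position + offset;
        transform.position = Vector3.Lerp(
            transform.position, target, smoothing * Time.deltaTime);
    }
}
```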
Character Camera Constraint not working properly

For as long as I can remember, Unity has required a "Character Camera Constraint" component attached to the "OVRPlayerController" game object to keep the OVRPlayerController following the player camera using the Oculus Quest's 6DOF. Since Oculus Integration 12.0 was released, the Character Camera Constraint component seems to have changed: it now requires several other components to function without throwing errors. Even with all of these components (Capsule Collider, Rigidbody, Simple Capsule with Stick Movement), the OVRPlayerController only rotates with the player camera; the position never changes. Does anyone know how to fix this or get it working, or does Oculus have to patch it?
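Until the component is patched, one manual stand-in (a sketch with assumed transform names, not the official constraint) is to move the player capsule horizontally under the tracked head each frame, then pull the camera rig back by the same amount so the view does not double-move. "cameraRig" and "centerEye" correspond to the OVRCameraRig child and its CenterEyeAnchor.

```csharp
using UnityEngine;

// Attach to the player capsule (the OVRPlayerController root).
public class ManualCameraConstraint : MonoBehaviour
{
    public Transform cameraRig;   // child rig that owns the eye anchors
    public Transform centerEye;   // tracked head transform

    void LateUpdate()
    {
        // Horizontal offset between the capsule and the real head.
        Vector3 delta = centerEye.position - transform.position;
        delta.y = 0f; // height is handled by the capsule, not the head

        // Move the body under the head...
        transform.position += delta;
        // ...and compensate the rig so the camera stays put visually.
        cameraRig.position -= delta;
    }
}
```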
Sensor data with highest frequency

Using the new Input System, I recently discovered that the device reports high-frequency data (around 150 Hz in the editor). This data includes devicePosition and deviceRotation, but also deviceVelocity, deviceAngularVelocity, deviceAcceleration, and deviceAngularAcceleration. Are the velocity and acceleration derived from position and rotation, or are they raw data from the IMU? As a follow-up: if I integrate the acceleration twice, would I get the position, or is there a discrepancy?
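On the follow-up question, a small numerical sketch shows why double-integrating acceleration alone cannot recover position for long: any constant bias b in the accelerometer integrates into a position error of 0.5·b·t², which grows without bound. The bias value below is an assumption for illustration, not a measured Quest figure.

```csharp
using System;

class IntegrationDrift
{
    static void Main()
    {
        double dt = 1.0 / 150.0;   // ~150 Hz sample rate, as in the post
        double bias = 0.01;        // assumed 0.01 m/s^2 accelerometer bias
        double v = 0, x = 0;

        for (int i = 0; i < 150 * 10; i++) // simulate 10 seconds
        {
            double a = 0.0 + bias; // true acceleration is zero; bias remains
            v += a * dt;           // first integration: velocity
            x += v * dt;           // second integration: position
        }

        // 0.5 * 0.01 * 10^2 = 0.5, so the bias alone produces roughly
        // half a metre of position error after only ten seconds.
        Console.WriteLine($"drift after 10 s: {x:F3} m");
    }
}
```

This is why headset tracking fuses the IMU with an absolute positional reference (the cameras) rather than integrating acceleration on its own.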
Quest: Does Unity support surface shader tessellation for Android / OpenGL ES?

I asked this on the Unity3D reddit but haven't got a clear answer with respect to the Quest. I'm not a coder at all (I'm a filmmaker) using Unity to create stereoscopic VR video, and I'm now looking at depth-map-driven video on the Quest for pseudo-6DOF. I have not been able to get a definite answer on whether Unity supports surface tessellation shaders on Android/Quest. I ask because I'd like to convert the shader below (taken from GitHub), which works perfectly on the Quest, into a tessellation-based shader so that, in theory, I can use a lower-triangle-count sphere and still get smoother displacement. Currently I'm using a one-million-triangle sphere with video mapped onto it (top half 2D, bottom half grayscale depth map) for simulated 6DOF VR video. I'll be very grateful for any help and insights. Kind regards.

Shader "PointCloud/Displacer/Spherical_OU"
{
    Properties
    {
        _MainTex ("Texture", 2D) = "black" {}
        _Displacement ("Displacement", float) = 0.1
        _Maximum ("Maximum", float) = 99.0
        _BaselineLength ("Baseline Length", float) = 0.5
        _SphericalAngle ("Spherical Angle", float) = 10.0
        _FocalLength ("Focal Length", float) = 90.0
    }
    SubShader
    {
        Tags { "RenderType"="Opaque" }
        Cull Back
        Lighting Off
        LOD 300

        CGPROGRAM
        #pragma surface surf Lambert vertex:disp nolightmap
        // #pragma target 3.0
        #pragma target 4.6

        sampler2D _MainTex;
        float _Displacement;
        float _BaselineLength;
        float _SphericalAngle;
        float _FocalLength;
        float _Maximum;

        struct Input
        {
            float2 uv_MainTex;
        };

        inline float getDepthSpherical(float d)
        {
            return asin(_BaselineLength * sin(_SphericalAngle)) / asin(d);
        }

        inline float getDepthFlat(float d)
        {
            return (_FocalLength / -100.0) * _BaselineLength / d;
        }

        void disp (inout appdata_full v)
        {
            v.vertex.xyz = v.normal * clamp(
                getDepthSpherical(tex2Dlod(_MainTex,
                    float4(v.texcoord.xy * float2(1, 0.5), 0, 0)).r),
                -_Maximum, 0) * _Displacement;
        }

        void surf (Input IN, inout SurfaceOutput o)
        {
            fixed4 mainTex = tex2D(_MainTex,
                IN.uv_MainTex * float2(1, 0.5) + float2(0, 0.5));
            //o.Emission = mainTex.rgb;
            o.Albedo = mainTex.rgb;
        }
        ENDCG
    }
    FallBack "Diffuse"
}
How should I develop for Go using the Rift?

I'm planning to make a Go program, but it's just so much faster to develop using the Rift. Is there a way to restrict my camera and controllers from 6DOF to 3DOF to match the Go? The Go moves the controller in a special way when rotated: it doesn't just stay in place and rotate around itself. I want to simulate that on the Rift.
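The "special way" the Go moves its controller is an arm model: since the controller is rotation-only, the hand is placed at the end of a virtual forearm that pivots from a fixed elbow offset. Below is a minimal sketch of that idea; the offsets are rough guesses, not Oculus's actual model. To emulate Go behaviour on the Rift, feed it the Touch controller's rotation only and ignore its tracked position.

```csharp
using UnityEngine;

// Attach to the simulated controller object.
public class SimpleArmModel : MonoBehaviour
{
    public Transform head;                                       // HMD transform
    public Vector3 elbowOffset = new Vector3(0.2f, -0.3f, 0.1f); // head -> elbow (guess)
    public Vector3 forearm = new Vector3(0f, 0f, 0.3f);          // elbow -> hand (guess)

    // Call with the controller's orientation each frame.
    public void Apply(Quaternion controllerRotation)
    {
        // The elbow stays fixed relative to the head's yaw only,
        // so looking up or down does not drag the arm around.
        Quaternion headYaw = Quaternion.Euler(0f, head.eulerAngles.y, 0f);
        Vector3 elbow = head.position + headYaw * elbowOffset;

        // The forearm swings with the controller's rotation, which is
        // what makes the hand translate when you only rotate the wrist.
        transform.position = elbow + controllerRotation * forearm;
        transform.rotation = controllerRotation;
    }
}
```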
Educational VR - Need testers for Alpha version (Monkey Walk) Oculus Rift + Touch

Hey there, I'm a solo dev assembling various prototypes into the final version of my first game to be published. I'm focusing my career as a VR developer on educational content; I really see VR as an alternative for all the people who ever daydreamed during class. I'm on a long road toward building my dream app, one that can teach various subjects and be credited in high school or college, but for now I hope you can help me build this demo.

Monkey Walk is an explore-to-learn game. This demo (fruit and articles) places the user in a climbing scenario where they can learn Spanish and German vocabulary. The mission's main goal is to teach a basic understanding of singular articles (like "the" and "a/an") along with some fruits' names and their proper gender. The Spanish learning path includes voice recognition, so you can practice pronunciation from day one.

At the moment, the Alpha version is focused on testing locomotion, comfort, and setting the mood. It is quite simple, and the introduction should take no more than five minutes; it includes a simple speech-recognition game, for Spanish only. Educational content will be tested in the Beta version, along with the second level, where users decide which language they want to focus on, a video lecture explains the lesson, and there is much more vocabulary to learn. I will probably run the Beta test this weekend, but I would love to incorporate some of the feedback you can provide first.

Requires Oculus Rift + Touch, and Windows 10 for speech recognition to work.

- How comfortable was the experience for you?
- Was it easy to climb through the path?
- How hard is it to grab the path?
- How many times did you fall? Was it hard to get back up onto the path?
- How would you describe the mood and ambience?
- How many different fruits did you eat?
- How many fruits could you spawn through speech? (Speech recognition is not a must-answer, as there is no audio guidance on pronunciation; still, you can read the fruits' names and try saying them. Tip: try different pronunciations. Even fluent Spanish speakers have a hard time, especially with banana and orange, as it is a machine and not a human processing your words.)

I live in a small city where I'm the only person I know who uses VR. I have had friends come over and help me make decisions as I test with them, but what I really need is you: people who are used to playing VR and understand what can be improved. I still have MUCH to learn and to improve. If you want to contribute and be included in the Alpha channel, please send your e-mail to AbstraktAwareness@gmail.com, or post it here if you don't mind. I really appreciate your feedback and time, and so will the future VR learners!