Hand Tracking menu ruins gameplay
The first thing people do with hand tracking is look at their hands. The second thing they do is touch their fingers together. Then Quest shuts down the game, because that is the hand gesture Meta chose as an "escape" key. I encourage players to see and feel their hands in the experience because it is so much more enjoyable and immersive; that is literally the entire point of mixed reality. This menu punishes all that fun with a distracting, overly sensitive button that apparently cannot be disabled. But can it be delayed? Ideally, the icon would not appear until the thumb and finger had been touched (and held) together for 2 seconds, and only then become active (similar to holding a controller's menu button down to reset the view). I understand Quest "needs" an escape gesture, but not one that constantly interrupts everything. Is anyone else dealing with this? Has anyone found another solution or workaround?
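For comparison, the proposed pinch-and-hold behavior is easy to express in code. The system menu gesture itself is not exposed to apps, so this is only a minimal Unity sketch of the idea for an in-app menu, assuming the Oculus Integration's OVRHand component; the OpenMenu hook is hypothetical:

using UnityEngine;

// Sketch: only treat a pinch as a "menu" press after it has been held for 2 seconds.
public class DelayedPinchMenu : MonoBehaviour
{
    [SerializeField] private OVRHand hand;          // left or right OVRHand in the rig
    [SerializeField] private float holdSeconds = 2f;

    private float pinchTimer;

    void Update()
    {
        bool pinching = hand.GetFingerIsPinching(OVRHand.HandFinger.Index);
        pinchTimer = pinching ? pinchTimer + Time.deltaTime : 0f;

        if (pinchTimer >= holdSeconds)
        {
            pinchTimer = 0f;
            OpenMenu(); // hypothetical in-app menu hook
        }
    }

    private void OpenMenu()
    {
        Debug.Log("Pinch held long enough; opening in-app menu.");
    }
}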
GPS for another level of experience

It would be great if the next Meta Quest had GPS; it would open up an endless range of development possibilities. You could use the Google Maps API to make video games where the map is the real world. Imagine Pokémon GO in VR, where you see your real environment full of Pokémon, for example.
MRUK not found despite it being created...?

I'm currently using a Quest 3 on v62 (now v63) and Unity 2022.3.10f1, working on a random spawn mechanic in an MR environment where objects can spawn on the ceiling. The feature worked fine when I tested it in Unity Play mode, but once I built it for standalone Quest 3 (or simply hooked it up over Quest Link), the scene could no longer be loaded. Room setup does indicate that I have my tables and walls, but there's no ceiling. I presume the spatial data didn't transfer properly (I did write a script to grant the app permission for spatial data, and I checked Permission Requests On Startup). I have no idea where it all went south. Any ideas?
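For reference, the runtime permission request mentioned above usually looks something like the sketch below, using Unity's Android Permission API and the spatial data permission string Meta documents for v62+ (com.oculus.permission.USE_SCENE, which must also be declared in the AndroidManifest). If the user denies it, scene queries come back empty:

using UnityEngine;
using UnityEngine.Android;

// Sketch: request spatial (scene) data permission before loading MRUK.
public class ScenePermissionRequester : MonoBehaviour
{
    private const string ScenePermission = "com.oculus.permission.USE_SCENE";

    void Start()
    {
        if (!Permission.HasUserAuthorizedPermission(ScenePermission))
        {
            var callbacks = new PermissionCallbacks();
            callbacks.PermissionGranted += _ => Debug.Log("Scene permission granted.");
            callbacks.PermissionDenied += _ => Debug.LogWarning("Scene permission denied; no scene data will load.");
            Permission.RequestUserPermission(ScenePermission, callbacks);
        }
    }
}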
OpenXR error XR_ERROR_SPACE_COMPONENT_NOT_SUPPORTED_FB in build when Mixed Reality scene loaded

Hi, I am getting spammed by the following error after I load my mixed reality scene (room) via OVRSceneManager.cs:

[XRCMD][failure] [XR_ERROR_SPACE_COMPONENT_NOT_SUPPORTED_FB]: xrLocateSpace(*(XrSpace*)space, baseSpace, ToXrTime(GetTimeInSeconds()), &spaceLocation), arvr/projects/integrations/OVRPlugin/Src/Util/CompositorOpenXR.cpp:11311 (arvr/projects/integrations/OVRPlugin/Src/Util/CompositorOpenXR.h:318)

The mixed reality room seems to load fine (mostly) and I am able to localize all anchors, but logcat is being spammed by the error. Not sure it's related, but I am also getting this error often:

AnchorManagerHelpers: The tracked root node uuids are different?! 0efe69dd-c03e-a729-a4b2-3108ac493f34 - 00000000-0000-0000-0000-000000000000

Any ideas how to get rid of the error, please? Thank you.

Unity 2021.3.25f1, Oculus Integration v56.0, OpenXR Plugin 1.7.0, Oculus XR Plugin 3.3.0, Quest 2 and Quest Pro
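The error name suggests something is calling xrLocateSpace on a space that does not support the Locatable component. One possible way to audit this, sketched below under the assumption that a newer Meta XR Core SDK (which exposes the OVRAnchor/OVRLocatable API, unlike the v56 integration listed above) is available; this is an untested sketch, not a confirmed fix:

using System.Collections.Generic;
using UnityEngine;

// Sketch: enumerate labeled scene anchors and check which ones are actually locatable.
public class LocatableAnchorAudit : MonoBehaviour
{
    public async void AuditSceneAnchors()
    {
        var anchors = new List<OVRAnchor>();
        await OVRAnchor.FetchAnchorsAsync<OVRSemanticLabels>(anchors);

        foreach (var anchor in anchors)
        {
            // Not every anchor supports Locatable; those must never be located.
            if (!anchor.TryGetComponent(out OVRLocatable locatable))
            {
                Debug.Log($"{anchor.Uuid}: Locatable not supported; skip locating it.");
                continue;
            }

            // Supported but disabled components also fail to locate until enabled.
            if (!locatable.IsEnabled)
                await locatable.SetEnabledAsync(true);
        }
    }
}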
VR passthrough captions for the hearing impaired, to let you read what people around you are saying

Enhancing Accessibility with Meta Quest 3's Passthrough Captions

Imagine a world where virtual reality isn't just a medium for gaming and entertainment but a powerful tool for accessibility. The Meta Quest 3, with its advanced passthrough capabilities, has the potential to turn this vision into reality by incorporating real-time captions for the hearing impaired.

How It Works: The Meta Quest 3's passthrough technology allows users to see the real world around them while still being immersed in the virtual environment. By integrating real-time speech-to-text technology, the headset could display captions for conversations happening in the user's vicinity. This feature would enable hearing-impaired users to understand and participate in conversations effortlessly while wearing the headset.

Impact on Accessibility: The World Health Organization has estimated that some 466 million people worldwide have disabling hearing loss, a figure projected to exceed 900 million by 2050. For these individuals, daily interactions and communication can be challenging. By offering real-time captions, the Meta Quest 3 could make VR more inclusive, ensuring that hearing-impaired users can enjoy and benefit from VR experiences just as much as anyone else.

Boosting Sales and Market Reach: Meta, as a $1.27 trillion company, stands to gain significantly from such an inclusive feature. Even a modest increase in sales can have a substantial financial impact. Taking the roughly 900 million figure above, capturing just 1% of this group with the new feature would mean 9 million additional users; at the average price of a VR headset, that could translate into billions in additional revenue.

By integrating real-time captioning, the Meta Quest 3 would not only enhance the user experience for a significant portion of the population but also open the door to a vast, untapped market. This accessibility feature would position Meta as a leader in inclusive technology, likely resulting in increased sales, a broader customer base, and a stronger market presence.

In summary, incorporating real-time captions in the Meta Quest 3's passthrough view is not just a step towards greater accessibility; it is a strategic move that could drive significant business growth and reinforce Meta's commitment to innovation and inclusivity.
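As a rough illustration of the app-side piece, here is a minimal Unity sketch of a caption panel that stays in front of the user. The transcription source is left hypothetical (the OnTranscription callback), since the headset does not currently expose a system speech-to-text service:

using TMPro;
using UnityEngine;

// Sketch: show incoming transcription lines on a world-space panel in front of the user.
public class PassthroughCaptions : MonoBehaviour
{
    [SerializeField] private TextMeshPro captionText;  // world-space TMP text
    [SerializeField] private Transform head;           // assign the center-eye camera

    // Hypothetical hook: called by whatever speech-to-text source is available.
    public void OnTranscription(string line)
    {
        captionText.text = line;
    }

    void LateUpdate()
    {
        // Keep the captions about a meter in front of the user, facing them.
        transform.position = head.position + head.forward * 1.0f;
        transform.rotation = Quaternion.LookRotation(transform.position - head.position);
    }
}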
Discovered anchors are placed at (0,0,0)

Hi there, I'm trying to use Meta XR anchors in my current project. I previously used Unreal 5.1 with PCVR, where I could create, save, and reload anchors. I am now trying to do the same in an Unreal 5.4 standalone project. My problem is that when I save an anchor and reload it by its UUID, it spawns at (0,0,0). Using logcat from Android Studio, I found two errors. One seems fairly harmless:

SP:AF:AnchorFramework: getAnchorUuid can't find anchor for handle 4

And the second, which I think is the real problem:

[XRCMD][failure] [XR_ERROR_SPACE_COMPONENT_NOT_ENABLED_FB]: xrLocateSpace(*(XrSpace*)space, baseSpace, ToXrTime(GetTimeInSeconds()), &spaceLocation), arvr/projects/integrations/OVRPlugin/Src/Util/CompositorOpenXR.cpp:13440 (arvr/projects/integrations/OVRPlugin/Src/Util/CompositorOpenXR.h:317)

My blueprint is ugly right now, but here is how I discover and save anchors:
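The NOT_ENABLED error suggests the anchor's pose is being read before the anchor is localized, which is exactly when runtimes report an identity (0,0,0) pose. For reference, the Unity-side flow looks roughly like the sketch below, using OVRSpatialAnchor's unbound-anchor API; as far as I know, the Unreal blueprint nodes mirror the same order of operations (load by UUID, localize, then bind and read the pose):

using System;
using UnityEngine;

// Sketch: reload a saved anchor by UUID and only spawn content once it is localized.
public class AnchorReloader : MonoBehaviour
{
    [SerializeField] private GameObject anchorPrefab;

    public void LoadAnchor(Guid savedUuid)
    {
        var options = new OVRSpatialAnchor.LoadOptions { Uuids = new[] { savedUuid } };

        OVRSpatialAnchor.LoadUnboundAnchors(options, unboundAnchors =>
        {
            if (unboundAnchors == null) return;

            foreach (var unbound in unboundAnchors)
            {
                // Localize first; the pose stays at identity until this succeeds.
                unbound.Localize((anchor, success) =>
                {
                    if (!success)
                    {
                        Debug.LogWarning($"Failed to localize {anchor.Uuid}");
                        return;
                    }

                    // Binding lets the OVRSpatialAnchor drive the transform from here on.
                    var go = Instantiate(anchorPrefab);
                    anchor.BindTo(go.AddComponent<OVRSpatialAnchor>());
                });
            }
        });
    }
}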
How to geometrically align a 3D model to real furniture?

For my video see-through AR application I want a model to be placed automatically on the biggest table surface found. I have achieved the first step: the model is correctly positioned at the table center. However, the orientation is not correct. I want the model to "lie down" on the table, meaning it lies on its back, facing up, while the longest side of the model is oriented along the longest side of the table; see the following pictures. After a very long time trying, I could not figure out how to align the model correctly. If you have any idea, hint, or clue I could try, please let me know.

Used asset for testing: Low Poly Human. I added an Empty (called ModelPlacer), attached the script below, and dragged this asset into the modelPrefab field.

Used Meta XR Building Blocks:
- Camera Rig
- Passthrough
- MR Utility Kit
- Scene Debugger
- Effect Mesh
- Hand Tracking
- Controller Tracking

Technical specifications:
- Unity 2022.3.50f1
- Headset: Meta Quest 3
- Meta MR Utility Kit

Code:

using Meta.XR.MRUtilityKit;
using UnityEngine;

public class ModelPlacer : MonoBehaviour
{
    public GameObject modelPrefab;
    private GameObject modelInstance;

    void Start()
    {
        // Wait for MRUK to finish loading the scene before placing anything.
        MRUK.Instance?.RegisterSceneLoadedCallback(OnSceneLoaded);
    }

    private void OnSceneLoaded()
    {
        SpawnModel();
        AlignModelWithSurface();
    }

    public void SpawnModel()
    {
        Vector3 spawnPosition = new Vector3(0.0f, 1.0f, -1.0f);
        modelInstance = Instantiate(modelPrefab, spawnPosition, Quaternion.identity);
    }

    public void AlignModelWithSurface()
    {
        var largestSurface = MRUK.Instance?.GetCurrentRoom()?.FindLargestSurface(MRUKAnchor.SceneLabels.TABLE);

        if (modelInstance == null)
        {
            Debug.LogWarning("modelInstance is null.");
            return;
        }
        if (largestSurface == null)
        {
            Debug.LogWarning("No surface found.");
            return;
        }

        modelInstance.transform.SetParent(largestSurface.transform);

        // Lay the model on its back, facing up.
        modelInstance.transform.rotation = Quaternion.Euler(-90, 0, 0);

        // Shift the model so its bounds center coincides with the surface center.
        Renderer modelRenderer = modelInstance.GetComponent<Renderer>();
        Vector3 positionOffset = largestSurface.transform.position - modelRenderer.bounds.center;
        modelInstance.transform.position += positionOffset;
    }
}
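One approach that might work for the long-side alignment: compare the table's plane extents with the model's world bounds and yaw the model by the angle between the two long axes. A rough sketch of a method that could be added to ModelPlacer; it assumes MRUKAnchor.PlaneRect holds the table-top extents and that the plane's local X axis spans rect.width, both of which are worth verifying:

// Sketch: rotate modelInstance so its longest side follows the table's longest side.
private void AlignLongAxes(MRUKAnchor table)
{
    Rect rect = table.PlaneRect ?? new Rect(0f, 0f, 1f, 1f);

    // Table's long-edge direction, projected onto the horizontal plane.
    Vector3 tableLongAxis = rect.width >= rect.height
        ? table.transform.right   // plane local X
        : table.transform.up;     // plane local Y, also lying in the table surface
    tableLongAxis.y = 0f;
    tableLongAxis.Normalize();

    // Model's long axis, read off its axis-aligned world bounds.
    Bounds b = modelInstance.GetComponent<Renderer>().bounds;
    Vector3 modelLongAxis = b.size.x >= b.size.z ? Vector3.right : Vector3.forward;

    // Yaw around world up so the two long axes coincide.
    float yaw = Vector3.SignedAngle(modelLongAxis, tableLongAxis, Vector3.up);
    modelInstance.transform.Rotate(0f, yaw, 0f, Space.World);
}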
Built Unity standalone app becomes laggy after enabling passthrough over Air Link

Hi everyone, I have a problem with my built app becoming laggy after enabling passthrough over Air Link; I need the Depth API, so passthrough mode has to be on. When I run the app with passthrough disabled in Air Link, it runs smoothly. Using the Oculus Debug Tool, I can see that compositor latency is very high while passthrough is turned on. Any solutions for this situation? I have connected my PC with an Ethernet cable and use a dedicated 5 GHz router for my Quest 3.
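One thing that might help narrow it down is toggling passthrough at runtime inside the build, so the compositor latency can be compared in-headset without changing the Air Link setting. A minimal sketch, assuming an OVRPassthroughLayer component is already in the scene:

using UnityEngine;

// Sketch: flip passthrough on and off with the A button to A/B the latency.
public class PassthroughToggle : MonoBehaviour
{
    [SerializeField] private OVRPassthroughLayer passthroughLayer;

    void Update()
    {
        if (OVRInput.GetDown(OVRInput.Button.One))
            passthroughLayer.enabled = !passthroughLayer.enabled;
    }
}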