Disabling Physical Space Features - do Spatial Anchors still work?
I'm trying to build a truly boundaryless mixed reality app for a research project, so I've disabled Physical Space Features. That works as expected, but I also need Spatial Anchors, especially for the large environment I'm in. I haven't found any definitive documentation on whether Spatial Anchors work when Physical Space Features is disabled. Logically it sounds like they would be disabled too, but in practice I notice the headset still uses some physical space features, like floor level, regardless of how the option is toggled, so it's possible Spatial Anchors are still working. Has anyone found a definitive answer on this? Thank you!

Hand Tracking menu ruins game play
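Re: the Spatial Anchors question above: one way to get an empirical answer is to create and save an anchor with Physical Space Features disabled, restart the app, and see whether the anchor still localizes. A minimal Unity sketch, assuming the Meta XR Core SDK's OVRSpatialAnchor component (the exact save API and task types vary by SDK version, so treat this as a sketch, not a definitive implementation):

```csharp
using System.Collections;
using UnityEngine;

// Sketch: place a marker, anchor it, and persist it so a later session
// can check whether the anchor still localizes with Physical Space
// Features turned off.
public class AnchorProbe : MonoBehaviour
{
    [SerializeField] private GameObject markerPrefab; // e.g. a small cube

    public IEnumerator PlaceAndSaveAnchor(Vector3 position, Quaternion rotation)
    {
        var marker = Instantiate(markerPrefab, position, rotation);
        var anchor = marker.AddComponent<OVRSpatialAnchor>();

        // The anchor only starts tracking once the runtime reports Created.
        yield return new WaitUntil(() => anchor.Created);

        // Persist the anchor (SaveAnchorAsync in recent SDK versions;
        // older versions exposed Save() with a callback instead).
        var save = anchor.SaveAnchorAsync();
        yield return new WaitUntil(() => save.IsCompleted);
        Debug.Log($"Saved anchor {anchor.Uuid}; try loading it next session.");
    }
}
```

If the saved anchor fails to load or localize after a restart with the setting off, that would suggest anchors do depend on Physical Space Features; if it localizes fine, they are evidently tracked independently.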
The first thing people do with hand tracking is look at their hands. The second thing they do is touch their fingers together. Then Quest shuts down the game, because that's the hand gesture Meta chose as an 'escape' key. I encourage players to see and feel their hands in the experience because it is so much more enjoyable and immersive; that's literally the entire point of mixed reality. This menu punishes all that fun with a distracting, overly sensitive button that apparently cannot be disabled. But can it be delayed? Ideally, the icon would not appear until after touching and holding thumb and finger together for two seconds, then become active (similar to holding the controller's menu button down to reset the view). I understand Quest "needs" an escape gesture, but not one that constantly interrupts everything. Anyone else dealing with this? Found another solution or workaround?

GPS for another level experience
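Re: the menu-gesture complaint above: as far as I know, the system gesture itself cannot be delayed or disabled by an app, but OVRHand exposes an IsSystemGestureInProgress flag, so a game can at least detect it and pause cleanly instead of being disrupted mid-action. A hedged sketch:

```csharp
using UnityEngine;

// Sketch: the OS menu gesture can't be suppressed by an app, but
// OVRHand.IsSystemGestureInProgress lets the game freeze gracefully
// while the gesture (and the resulting menu) is active.
public class SystemGestureWatcher : MonoBehaviour
{
    [SerializeField] private OVRHand leftHand;
    [SerializeField] private OVRHand rightHand;

    private bool paused;

    private void Update()
    {
        bool gestureActive =
            (leftHand != null && leftHand.IsSystemGestureInProgress) ||
            (rightHand != null && rightHand.IsSystemGestureInProgress);

        if (gestureActive && !paused)
        {
            paused = true;
            Time.timeScale = 0f;   // freeze gameplay during the system gesture
        }
        else if (!gestureActive && paused)
        {
            paused = false;
            Time.timeScale = 1f;   // resume when the gesture ends
        }
    }
}
```

This doesn't stop the menu icon from appearing, but it does stop the gesture from wrecking whatever the player was in the middle of doing.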
It would be great if the next Meta Quest had GPS; this would open up an infinite range of development possibilities. You could use the Google Maps API to make video games where the map is the real world. Imagine Pokémon GO in VR, where you see your real environment full of Pokémon, for example.

VR passthrough captions for the hearing impaired, to let you read what people around you are saying
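To make the GPS idea above concrete: if a future headset (or a paired phone) supplied latitude/longitude fixes, a simple equirectangular approximation could map them onto Unity's XZ plane around a chosen origin. A sketch, adequate for neighborhood-scale play areas but not for large distances:

```csharp
using UnityEngine;

// Sketch: convert GPS coordinates to local Unity offsets in meters
// using an equirectangular approximation around a fixed origin.
public static class GeoToLocal
{
    private const double EarthRadius = 6_378_137.0; // meters (WGS84 equatorial)

    // Returns an offset in meters: x = east, z = north.
    public static Vector3 ToLocalMeters(
        double originLatDeg, double originLonDeg,
        double latDeg, double lonDeg)
    {
        double latRad = originLatDeg * Mathf.Deg2Rad;
        double dLat = (latDeg - originLatDeg) * Mathf.Deg2Rad;
        double dLon = (lonDeg - originLonDeg) * Mathf.Deg2Rad;

        double north = dLat * EarthRadius;
        double east = dLon * EarthRadius * System.Math.Cos(latRad);
        return new Vector3((float)east, 0f, (float)north);
    }
}
```

Game objects fetched from a maps API could then be spawned at these local offsets relative to wherever the player started.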
Enhancing Accessibility with Meta Quest 3's Passthrough Captions

Imagine a world where virtual reality isn't just a medium for gaming and entertainment but a powerful tool for accessibility. The Meta Quest 3, with its advanced passthrough capabilities, has the potential to turn this vision into reality by incorporating real-time captions for the hearing impaired.

How It Works: The Meta Quest 3's passthrough technology allows users to see the real world around them while still being immersed in the virtual environment. By integrating real-time speech-to-text technology, the headset could display captions for conversations happening in the user's vicinity. This feature would let hearing-impaired users understand and participate in conversations effortlessly while using the headset.

Impact on Accessibility: According to the World Health Organization, around 10% of the global population, approximately 900 million people, have some degree of hearing impairment. For these individuals, daily interactions and communications can be challenging. By offering real-time captions, the Meta Quest 3 could make VR more inclusive, ensuring that hearing-impaired users can enjoy and benefit from VR experiences just as much as anyone else.

Boosting Sales and Market Reach: Meta, as a $1.27 trillion company, stands to gain significantly from such an inclusive feature. Even a modest increase in sales can have a substantial financial impact. For instance: if the global population is approximately 9 billion, about 900 million people are hearing impaired. Capturing just 1% of this market with the new feature could mean 9 million additional users; given the average price of a VR headset, that could translate into billions in additional revenue. By integrating real-time captioning, the Meta Quest 3 not only enhances the user experience for a significant portion of the population but also opens doors to a vast, untapped market.
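The pipeline described under "How It Works" could be sketched in Unity roughly as follows. Note that ISpeechRecognizer here is a hypothetical stand-in for whatever speech-to-text service would be used (an on-device model or a cloud API); the Quest runtime does not ship one:

```csharp
using UnityEngine;
using TMPro;

// Hypothetical recognizer interface: any speech-to-text backend that
// streams partial transcripts would fit here.
public interface ISpeechRecognizer
{
    event System.Action<string> OnPartialTranscript;
}

// Sketch: show live transcripts as a caption panel floating in front
// of the user while passthrough is active.
public class PassthroughCaptions : MonoBehaviour
{
    [SerializeField] private TextMeshPro captionLabel; // world-space label
    [SerializeField] private Transform head;           // center-eye anchor

    public void Bind(ISpeechRecognizer recognizer)
    {
        recognizer.OnPartialTranscript += text => captionLabel.text = text;
    }

    private void LateUpdate()
    {
        // Keep captions about one meter in front of the user, facing them.
        captionLabel.transform.position = head.position + head.forward * 1.0f;
        captionLabel.transform.rotation =
            Quaternion.LookRotation(captionLabel.transform.position - head.position);
    }
}
```

The interesting engineering problems (microphone access, latency, speaker separation) all live behind the recognizer interface; the display side is straightforward.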
This accessibility feature would position Meta as a leader in inclusive technology, likely resulting in increased sales, a broader customer base, and a stronger market presence. In summary, incorporating real-time captions in the Meta Quest 3's passthrough view is not just a step towards greater accessibility; it is a strategic move that can drive significant business growth and reinforce Meta's commitment to innovation and inclusivity.

Built Unity standalone app becomes laggy after enabling passthrough over Air Link
Hi everyone, I have a problem with my built app becoming laggy after enabling passthrough over Air Link; I need the Depth API and passthrough mode. When I run the app with passthrough disabled in Air Link, it runs smoothly. Using the Oculus Debug Tool, I can see that compositor latency becomes very high when passthrough is turned on. Any solution for this situation? I have connected my PC with an Ethernet cable and use a dedicated 5 GHz router for my Quest 3.

No scene mesh when building an .apk (mixed reality)
Greetings. I'm using the Meta XR All-in-One SDK v69.0.0.0 with the "Scene mesh" building block in a passthrough (i.e. MR) project. Over Quest Link it works properly and generates the scene mesh, but when I build the project for Android as an .apk, the scene mesh stops working and no mesh is generated. How can this be fixed? I need it to work specifically in the .apk. Unity version: 2022.3.44f1. I'm using a Quest 3. Thanks

Oculus Spatial Anchor drifts in the Passthrough space
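Re: the scene mesh missing from the .apk build above: a common cause is the spatial-data permission, which Quest Link does not enforce but on-device builds do. The Android manifest needs a `com.oculus.permission.USE_SCENE` uses-permission entry, and recent Horizon OS versions also require requesting it at runtime before scene data is available. A sketch (verify the exact requirement for your SDK version):

```csharp
using UnityEngine;
using UnityEngine.Android;

// Sketch: request the spatial-data permission at runtime. The manifest
// must also declare:
//   <uses-permission android:name="com.oculus.permission.USE_SCENE" />
public class ScenePermissionRequester : MonoBehaviour
{
    private const string UseScene = "com.oculus.permission.USE_SCENE";

    private void Start()
    {
        if (!Permission.HasUserAuthorizedPermission(UseScene))
        {
            var callbacks = new PermissionCallbacks();
            callbacks.PermissionGranted += _ => Debug.Log("Scene permission granted");
            callbacks.PermissionDenied += _ => Debug.LogWarning("Scene permission denied; no mesh");
            Permission.RequestUserPermission(UseScene, callbacks);
        }
    }
}
```

If the permission turns out not to be the issue, the next thing to check is whether the device has completed a space setup scan for the room at all.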
I'm trying to spawn a cube as a spatial anchor, but after spawning it drifts when I move physically. I add OVRSpatialAnchor after instantiating, like this: _cube.AddComponent<OVRSpatialAnchor>(); Please correct me if I have to check something in the build settings or some Oculus settings in Unity. Thanks guys!

Is it possible to use multiple rooms and move a rigidbody through them?
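Re: the drifting cube above: a frequent cause is relying on the cube's pose before the anchor has actually been created, or having another component (a Rigidbody, or a moving parent transform) fight the anchor's pose updates. A sketch that waits for creation before treating the pose as stable, assuming the standard OVRSpatialAnchor component:

```csharp
using System.Collections;
using UnityEngine;

// Sketch: attach OVRSpatialAnchor and wait until the runtime reports it
// as created; until then, the cube is just an ordinary transform and
// will appear to drift as tracking refines. The anchored object should
// also have no Rigidbody and no moving parent.
public class AnchoredCubeSpawner : MonoBehaviour
{
    [SerializeField] private GameObject cubePrefab;

    public IEnumerator SpawnAnchoredCube(Vector3 position, Quaternion rotation)
    {
        var cube = Instantiate(cubePrefab, position, rotation);
        var anchor = cube.AddComponent<OVRSpatialAnchor>();

        yield return new WaitUntil(() => anchor.Created);
        Debug.Log($"Anchor created and tracking: {anchor.Uuid}");
    }
}
```

Also make sure the cube is spawned at the root of the scene, not under a camera rig or any transform that moves with the player.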
I am working on an MR mini golf game using the MRUK and just saw that you can utilize multiple rooms in MR games. So I was wondering, first: is there anything I have to enable to allow players to travel between rooms while using my application? Secondly, can I move a golf ball with a rigidbody into another room while I am still in the first one? I could use a collider to disable collision between the global mesh, the wall, and the ball to let the ball pass between rooms. Or I could capture the velocity and direction of the ball on a collider and respawn it in the next room with the same velocity and direction, but the second room would still need to be loaded while I am not in it. So what I'm asking is: is it possible to have more than one room active, or at least to make sure the room the golf ball is in stays loaded? (I don't know if the room the player is in needs to be loaded too, but it would be nice if possible.) If you answer, please don't forget to answer my first question as well. Thanks
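As I understand it, MRUK loads every room of the captured space at once rather than streaming them, so the ball's room should stay available even when the player is elsewhere. A sketch of tracking which room the ball is in, assuming recent MRUK versions (method names like IsPositionInRoom should be double-checked against your SDK):

```csharp
using Meta.XR.MRUtilityKit;
using UnityEngine;

// Sketch: poll MRUK's room list each physics step to find which room
// currently contains the golf ball, e.g. to swap collision rules or
// trigger per-room logic when it crosses a wall.
public class BallRoomTracker : MonoBehaviour
{
    [SerializeField] private Rigidbody ball;

    private MRUKRoom lastRoom;

    private void FixedUpdate()
    {
        if (MRUK.Instance == null) return;

        foreach (var room in MRUK.Instance.Rooms)
        {
            if (room.IsPositionInRoom(ball.position))
            {
                if (room != lastRoom)
                {
                    Debug.Log($"Ball entered room {room.name}");
                    lastRoom = room;
                }
                break;
            }
        }
    }
}
```

For the wall itself, either of the two approaches in the question should work on top of this: disable collision with the shared wall's collider while the ball is near it, or teleport the ball to the far side with its captured velocity when it hits the wall.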