Developer Mode - Android Apps - Documentation on interaction methods?
Hi support team, I'm testing traditional Android apps on Meta Quest 3 to cover some of my needs (specifically remote desktop at the moment). I was able to install and run some apps; however, I'm confused about app touch input. Is there documentation explaining how controllers and direct touch work with Android apps? So far I've found this:

B = back button
A = tap where the pointer is
Trigger = tap where the pointer is
2 triggers + move controllers = pinch-zoom

I tried coming closer to the screen and tapping with two fingers at once; this doesn't seem to have any effect. Can you confirm? While testing the NoMachine client app, I managed to right-click (done with a two-finger tap) only once, and I did not understand how I did it. Can you provide some guidance? Can I do a two-finger tap with controllers or hands? What about three fingers, or four? Additionally, could you please allow resizing windows in "Near" view just as they can be resized in "Far" view? Essentially I'm looking for a direct-touchable remote desktop, and the issues mentioned above really limit my options.

Which input system to use with Movement SDK
I'd like to experiment with the Movement SDK. The readme indicates Oculus Integration is a prerequisite, and that package uses the new input system (UnityEngine.InputSystem). The Movement SDK, however, uses the old input system (UnityEngine.Input). Which system should I use?

Memory Error when opening the overlay menu
While monitoring the Android logs and running my Unity Quest game, pressing the Oculus button to open the overlay menu produces this weird memory error: error getting shared memory region, memory type: Controller, failed with Status(-1, EX_SECURITY): 'requested shared memory When this happens, the game stops receiving controller and headset input, so the entire screen freezes, which is incredibly jarring. Any idea how to solve this? I guess it has something to do with Focus Aware and how the Input System works, as it only receives input while active? I'm using Unity 2021.2.7 with OpenXR Plugin v1.3.0 and the "new" Input System.

Is there a way to invoke the keyboard overlay in a VR app not made with Unity or Unreal?
In the documentation, there seems to be a way to open the keyboard overlay for Unreal and Unity development:
https://developer.oculus.com/documentation/unreal/unreal-keyboard-overlays/
https://developer.oculus.com/documentation/unity/unity-keyboard-overlay/
However, there is no indication of how to do the same with a third-party engine or an app written without those two solutions, and I couldn't find anything in the Mobile SDK docs: https://developer.oculus.com/documentation/native/android/mobile-intro/ Is this functionality available? Thanks in advance for your help!

Difference between Consumer and Business Quest?
Hi all, I am currently working on an app for a client, but it seems the APK does not behave the same on Consumer and Business models. I was wondering whether Unity input can be affected. The other possibility that could prevent the Business model from working correctly is its time settings: the app allows the user to go through it only until a certain date. How can I check whether that is the issue? Thank you

OVRInput.Get problem with Button One always returning false (Unity 2019.3, OVRPlugin 1.44.0)
Hello, I'm using ALVR to connect my Quest to my PC, and I have a problem when developing anything with my Oculus Quest in Unity. I found that button One on each controller doesn't seem to work when playing in Unity. When I write: Debug.Log(OVRInput.Get(OVRInput.Button.Any)); it only prints true when pressing anything except the A and X buttons on the controllers. Do you have any fix for that? Thanks
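For the thread above, one way to narrow down whether the virtual mapping layer or the ALVR link is dropping the buttons is to poll both the virtual and raw mappings per controller. This is a minimal Unity C# sketch, not from the thread; it assumes the Oculus Integration package is installed and an OVRManager exists in the scene (OVRManager drives OVRInput.Update), and the class name `ButtonDebug` is hypothetical:

```
using UnityEngine;

// Polls button One through both the virtual and raw OVRInput mappings.
// Button.One resolves to A on the right Touch controller and X on the
// left when queried per-controller.
public class ButtonDebug : MonoBehaviour
{
    void Update()
    {
        // Any mapped button on either controller.
        Debug.Log("Any: " + OVRInput.Get(OVRInput.Button.Any));

        // Virtual mapping, queried per controller.
        bool a = OVRInput.Get(OVRInput.Button.One, OVRInput.Controller.RTouch);
        bool x = OVRInput.Get(OVRInput.Button.One, OVRInput.Controller.LTouch);
        Debug.Log("A: " + a + "  X: " + x);

        // Raw mapping bypasses the virtual layer; if the raw button reads
        // true while the virtual one reads false, the mapping is at fault,
        // and if both read false the input is being lost upstream (e.g. in
        // the streaming link).
        Debug.Log("Raw A: " + OVRInput.Get(OVRInput.RawButton.A));
    }
}
```

If both the virtual and raw reads stay false while other buttons work, the problem is upstream of OVRInput, which with an ALVR connection would point at the streaming layer rather than the Unity project.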