Meta XR Simulator starts only once
Hi, I have a problem with the Meta XR Simulator v60 in Unity 2023.2.7f1 with Meta SDK v60. When I start a Synthetic Environment Server, set the simulator to active, and start the game, everything works as intended. But only exactly once! On every run after that I get these two errors:

XR Management has already initialized an active loader in this scene. Please make sure to stop all subsystems and deinitialize the active loader before initializing a new one.

Failed to set DeveloperMode on Start.

They appear every time until I restart Unity completely (just stopping the server does not help, unfortunately). I think everything should be set up correctly (the plug-in provider is set to Oculus and all the items in the verify checklist are applied), because it runs without problems once. But then it stops, and I have no idea why. Any ideas? [Solved]
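For the loader error above, one workaround is to tear down XR Management explicitly when play stops, so the next run can re-initialize cleanly. A minimal sketch using standard XR Plug-in Management calls (these are not Simulator-specific, and whether this cures the Simulator's once-only behavior is an assumption to verify):

```csharp
using UnityEngine;
using UnityEngine.XR.Management;

// Stops all XR subsystems and deinitializes the active loader when the app
// (or Editor play mode) quits, so a stale loader is not left behind.
// Attach to any object in the startup scene.
public class XrLoaderCleanup : MonoBehaviour
{
    void OnApplicationQuit()
    {
        var manager = XRGeneralSettings.Instance?.Manager;
        if (manager != null && manager.activeLoader != null)
        {
            manager.StopSubsystems();
            manager.DeinitializeLoader();
        }
    }
}
```

In the Editor, `OnApplicationQuit` fires on exiting play mode, which is exactly when the stale loader would otherwise survive into the next run.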
Unity 2021.3.6: ReadPixels screenshot fails under Vulkan

I'm using Unity 2021.3.6 and want to implement a partial-screenshot feature with the following method:

```csharp
// Requires: using System; using System.IO; using UnityEngine;
public static void SaveRenderTexture(RenderTexture rt, string folderName)
{
    var saveDir = Application.persistentDataPath + "/" + folderName + "/";
    if (!Directory.Exists(saveDir))
    {
        Directory.CreateDirectory(saveDir);
    }

    var prev = RenderTexture.active;
    RenderTexture.active = rt;

    // Copy the render texture into a readable Texture2D and encode as PNG.
    var png = new Texture2D(rt.width, rt.height, TextureFormat.ARGB32, false);
    png.ReadPixels(new Rect(0, 0, rt.width, rt.height), 0, 0);
    var bytes = png.EncodeToPNG();

    var pathNow = saveDir + DateTime.Now.Ticks + ".png";
    File.WriteAllBytes(pathNow, bytes);

    Object.DestroyImmediate(png);
    RenderTexture.active = prev;
}
```

1. With Vulkan, in a build on the Quest, the captured image only shows the bottom half (or part) of the screen and the rest is black; in the Editor the feature works correctly.
2. With OpenGLES3 the screenshot is correct, but it breaks the render order of my other models, some of which are not displayed, and Unity warns: "Symmetric Projection is only supported on Quest 2 and QuestPro with Vulkan and Multiview." It seems that in my project Vulkan and OpenGLES3 conflict, or I can only pick one. Do you know the cause or a solution?
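For the Vulkan case above, a commonly suggested alternative is `AsyncGPUReadback`, which avoids the synchronous framebuffer read that `ReadPixels` performs and tends to behave better on tiled mobile GPUs. A sketch, assuming Unity 2021.3+ (the class and method names of this wrapper are illustrative; `AsyncGPUReadback` itself is the standard Unity API):

```csharp
using System.IO;
using UnityEngine;
using UnityEngine.Rendering;

public static class VulkanSafeCapture
{
    // Queues an asynchronous GPU readback of the render texture instead of
    // calling ReadPixels; the callback fires once the data is available.
    public static void SaveRenderTextureAsync(RenderTexture rt, string path)
    {
        AsyncGPUReadback.Request(rt, 0, TextureFormat.RGBA32, request =>
        {
            if (request.hasError)
            {
                Debug.LogError("GPU readback failed");
                return;
            }

            var tex = new Texture2D(rt.width, rt.height, TextureFormat.RGBA32, false);
            tex.LoadRawTextureData(request.GetData<byte>());
            tex.Apply();
            File.WriteAllBytes(path, tex.EncodeToPNG());
            Object.DestroyImmediate(tex);
        });
    }
}
```

Note that the callback arrives a few frames later, so the caller must not assume the PNG exists immediately after the call returns.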
Enable hand and controller tracking at the same time

Hi, I have an Oculus Quest Pro and am working on a Unity project that needs both hand tracking and controller tracking for a physical object, but I can't enable hand and controller tracking at the same time. Is this possible, or is there another way to track a physical object with the Quest?
Meta Interaction SDK: UseInteractable - Requirements to use or any documentation available?

Hi, I'm working with the Meta Interaction SDK (v62.0.0) and trying to get something working with the UseInteractable component (besides the example spray bottle). I've been referencing the spray bottle closely; for my custom object I've written a script that implements IHandGrabUseDelegate and assigned it to UseInteractable's "Hand Use Delegate" field. The object has HandGrabInteractables, etc. (similar to the spray bottle). However, the methods from the interface (BeginUse, EndUse, and ComputeUseStrength) are never called. I'm wondering if I'm missing something (probably am), and whether there is any documentation on the steps / requirements for the Use interaction to work. Thanks for any help!
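For reference, a delegate implementing the three methods named above can be as small as this. The method names come from the post itself; the namespace and the `float` parameter of `ComputeUseStrength` are assumptions to check against the `IHandGrabUseDelegate` definition in your SDK version:

```csharp
using Oculus.Interaction.HandGrab;
using UnityEngine;

// Minimal use delegate modeled on the SDK's spray-bottle sample.
// Assign this component to UseInteractable's "Hand Use Delegate" field.
public class TriggerUseDelegate : MonoBehaviour, IHandGrabUseDelegate
{
    public void BeginUse()
    {
        Debug.Log("Use started");
    }

    public void EndUse()
    {
        Debug.Log("Use ended");
    }

    public float ComputeUseStrength(float strength)
    {
        // Remap the raw finger strength however the interaction needs;
        // passing it through unchanged is the simplest valid implementation.
        return strength;
    }
}
```

If the callbacks still never fire, one thing worth checking in the spray-bottle sample is how the HandGrabInteractable's grab pose treats the "use" finger: if the finger that should drive the use strength is also required for the grab itself, the use pipeline may never engage. That is an observation about the sample, not documented behavior.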
Unity Meta Quest App not getting Room Data of the device and always loads the prefab fallback

Hi there, I am facing an issue with my mixed reality app since I switched from the deprecated Room Mesh Controller to the "new" MR Utility Kit. The app does not receive the room data of my scanned room and instead immediately loads one of the fallback rooms, so the room data inside the app is completely different from my real room. The "funny" thing is that the app still asks whether I want to set up a new room scan or continue with the existing one, and shows me the correct scan I did before; but after I accept the scan, the app still ignores it and loads one of the prefabs.

I am using version 64 of the Meta SDK, because in the newer version 65 the feature is even buggier: it does not even show the required room setup and continues into the app without any prefab room at all, lol. And for those who would say "just deactivate the fallback case with the prefabs inside the MRUK component": I did exactly that, and then the app starts without any room information, so every object I spawn falls through the ground or the walls.

I hope this is enough information; just ask if you need more and I'll try to answer as quickly as I can. See ya, and thank you :3
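A small diagnostic can help tell whether MRUK actually loaded scene data from the device or silently fell back. The API names below (`SceneLoadedEvent`, `GetCurrentRoom`, `Anchors`) are my reading of MRUK around v64; verify them against the package version in your project:

```csharp
using Meta.XR.MRUtilityKit;
using UnityEngine;

// Logs what MRUK ended up with after its scene load completes, so a
// fallback room is distinguishable from real device scene data.
public class RoomDataProbe : MonoBehaviour
{
    void Start()
    {
        MRUK.Instance.SceneLoadedEvent.AddListener(() =>
        {
            var room = MRUK.Instance.GetCurrentRoom();
            if (room != null)
            {
                Debug.Log($"Room loaded with {room.Anchors.Count} anchors");
            }
            else
            {
                Debug.Log("Scene loaded but no current room");
            }
        });
    }
}
```

Also worth checking: recent Quest OS versions gate scene data behind the `com.oculus.permission.USE_SCENE` Android permission. If it is not declared in the manifest and granted at runtime, scene queries return nothing and MRUK falls back to prefabs. I believe this applies from roughly v62 onward, so verify against the MRUK documentation for your version.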
Microphone Not Working After Successful Build with Oculus Voice SDK in Unity

Issue Description: Hello Oculus Community, I've been working on a project that integrates the Oculus Voice SDK into my Unity application. I followed the official documentation and tutorials provided by Oculus (https://developer.oculus.com/documentation/unity/voice-sdk-tutorials-1) to implement voice commands in my VR experience. Development went smoothly, and I successfully built the application. However, after deploying the build to an Oculus device, the application does not seem to recognize or use the microphone at all, so users are unable to use voice commands within the VR experience.

Here are some additional details about my setup:
- Unity version: 2021.3.15f1
- Oculus Integration package version: 2020.3.16
- Oculus device used for testing: Meta Quest 2

Steps I've taken to troubleshoot the problem:
- Checked the microphone permissions on the Oculus device to ensure that the application has access to the microphone.
- Verified that the Oculus Voice SDK is properly integrated into the Unity project and that the required components are set up correctly.
- Rebuilt and redeployed the application multiple times, but the issue persists.

I'm uncertain about what might be causing this problem, and I'm seeking guidance and suggestions from the community on how to resolve it. Has anyone else encountered a similar issue with the Oculus Voice SDK and microphone functionality? If so, how did you address it? Any insights, troubleshooting tips, or possible solutions would be greatly appreciated. Thank you in advance for your help!

Best regards,
Shivam Nimbalkar
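On Quest (an Android platform), `RECORD_AUDIO` must be both declared in the manifest and granted at runtime; a build can succeed while the app silently lacks microphone access. A small sketch that requests the permission on startup, using Unity's standard Android permission API (whether this is the cause in the setup above is an assumption):

```csharp
using UnityEngine;
using UnityEngine.Android;

// Requests the Android microphone permission at runtime on device.
// Permission.Microphone maps to android.permission.RECORD_AUDIO.
public class MicPermission : MonoBehaviour
{
    void Start()
    {
#if UNITY_ANDROID && !UNITY_EDITOR
        if (!Permission.HasUserAuthorizedPermission(Permission.Microphone))
        {
            Permission.RequestUserPermission(Permission.Microphone);
        }
#endif
    }
}
```

If the permission dialog never appears on device, check that `android.permission.RECORD_AUDIO` is present in the merged AndroidManifest of the built APK.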
DropDown Interaction issue

Hi, we are facing an issue where the UI TextMesh Pro dropdown interaction is not working, while other UI elements function correctly. Consider the following scenarios:

Scenario 1: We have a screen-space overlay canvas that is converted into a world-space canvas at runtime. Methods used for UI interaction: ray interaction, and a collider surface for interaction. In this scenario, the dropdown does not work.

Scenario 2: We have a canvas that is world-space from the start, and with the same methods the interaction works fine.

Please help us resolve this issue. FYI, we are using Meta SDK version 65.0.0. Thanks in advance.
I can't start the application with my Meta Avatar created

I created my scene in Unity following the documentation, and everything works, right down to the building blocks. I imported the building-block avatar into the scene and it works when I run it with my Oculus, but I can't bring in the avatar I created in my Meta account. How can I launch the application with my own avatar instead of the default avatar?
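Loading your own avatar (rather than a preset) generally requires a valid App ID, a passing entitlement check, and the logged-in user's ID, which is then handed to the avatar entity. A sketch of the Platform SDK side; the final hookup to the building block's avatar entity depends on your Avatars SDK version, so that step is left as a comment:

```csharp
using Oculus.Platform;
using Oculus.Platform.Models;
using UnityEngine;

// Initializes the Oculus Platform, verifies entitlement, and fetches the
// logged-in user's ID, which identifies whose avatar to load.
public class AvatarUserIdFetcher : MonoBehaviour
{
    void Start()
    {
        Core.AsyncInitialize().OnComplete(_ =>
        {
            Entitlements.IsUserEntitledToApplication().OnComplete(msg =>
            {
                if (msg.IsError)
                {
                    Debug.LogError("Entitlement check failed");
                    return;
                }

                Users.GetLoggedInUser().OnComplete(userMsg =>
                {
                    ulong userId = userMsg.Data.ID;
                    Debug.Log($"Logged-in user id: {userId}");
                    // Hand userId to your avatar entity here so it loads
                    // this account's avatar instead of the default one.
                });
            });
        });
    }
}
```

Note that the default/preset avatar is also what you get when testing with an account that has no avatar configured, or when the app's data-use checkup (User ID / Avatars) has not been approved in the developer dashboard.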
Detecting if joint is covered up using XR Hands

I am using a Meta Quest 2 headset to track my hands with the XR Hands 1.4.1 package. Is it possible to get the indexes of joints that are currently unavailable for tracking (due to partially covered hands, hardware on the hands, etc.)? I tried using XRHandJoint.TryGetPose(), but after the initial hand detection it always returns true with a Pose containing a predicted position and rotation (or an identity Pose filled with zeros if tracking of the whole hand is lost). My plan was to detect the currently covered joint and predict its position from the other joints manually. I know there are events that can be subscribed to when tracking of the whole hand is lost; is there something similar for specific joints? Any suggestions are appreciated.
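One avenue beyond `TryGetPose` is the per-joint tracking flags. My understanding of XR Hands 1.4 is that `XRHandJoint.trackingState` exposes flags such as `HighFidelityPose`; whether the Meta provider actually lowers these for occluded joints is provider-dependent, so treat this as something to verify on device:

```csharp
using UnityEngine.XR.Hands;

// Checks a single joint's tracking flags rather than the bool from
// TryGetPose, which stays true once the hand is tracked at all.
public static class JointOcclusionCheck
{
    public static bool IsJointHighFidelity(XRHand hand, XRHandJointID id)
    {
        XRHandJoint joint = hand.GetJoint(id);
        return (joint.trackingState & XRHandJointTrackingState.HighFidelityPose) != 0;
    }
}
```

If the provider never clears the high-fidelity flag, a fallback is a heuristic: compare each joint's reported position against a skeleton-consistency model (bone lengths, parent-child distances) and flag joints that drift outside tolerance as likely predicted rather than observed.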
Can't play anything on PCVR anymore with the Oculus service enabled

Hi all, I'm a developer and player in VR, and I've recently set up my development environment again after moving. After setting everything up properly, Unity started crashing for some reason. I thought I would check it later and see if I could at least open the executable for my game, also to no avail. From there I tried to open other VR applications, even ones built in Unreal Engine, and they would instantly report "fatal error". Whenever I disable the Oculus service, everything works again and nothing is wrong. Usually I know exactly what to do, but here I'm at a loss. I've done all the steps from the automated responses on these forums and created a support ticket. I'm not sure I've even opened my VR Unity game since the CPU upgrade right before the move, but I've updated everything where I could. The stack trace isn't conclusive, since it's unable to find the address, according to the Unity bug reporter tool. Has anyone else had any experience with this?