How to obtain the user token using the Meta XR Platform SDK
I currently need to link the player's Meta account to our game through the Unity Meta XR Platform SDK for login purposes. After passing the application permission check, I can obtain the player's ID through the Users.GetLoggedInUser() method. However, I cannot obtain the token with the Users.GetAccessToken() method; it always returns an empty result. Could you please tell me whether I have misconfigured something or missed a step?
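
A minimal sketch of the flow described in the question, assuming the standard Oculus.Platform callback pattern (the class name TokenFetcher and the log messages are illustrative; exact message types can differ between SDK versions). An empty Data in the token callback would reproduce the reported problem:

```csharp
using Oculus.Platform;
using Oculus.Platform.Models;
using UnityEngine;

public class TokenFetcher : MonoBehaviour
{
    void Awake()
    {
        // Initialize the Platform SDK; the App ID must be configured in the
        // project's Oculus Platform settings for these calls to succeed.
        Core.AsyncInitialize().OnComplete(OnInitialized);
    }

    void OnInitialized(Message<PlatformInitialize> msg)
    {
        if (msg.IsError) { Debug.LogError(msg.GetError().Message); return; }

        // User ID arrives via the message's Data payload.
        Users.GetLoggedInUser().OnComplete(m =>
        {
            if (!m.IsError) Debug.Log($"User ID: {m.Data.ID}");
        });

        // The access token also arrives asynchronously via Data.
        Users.GetAccessToken().OnComplete(m =>
        {
            if (m.IsError) Debug.LogError(m.GetError().Message);
            else Debug.Log($"Token: {m.Data}");  // empty here matches the symptom in the question
        });
    }
}
```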

Which AR glasses to buy for research with LLMs and raw data

Hello there, I'm writing because a group of PhD and Master's students at my university and I are exploring the development of new mobile or web-based applications that can interface with the Meta Ray-Ban or Aria smart glasses via an SDK. Our goal is to test our own vision-language models (VLMs) by directly accessing the raw data streams from the glasses, specifically video and audio, and providing contextualized responses through the device's built-in speakers using our own LLMs. We are particularly interested in whether it is possible to develop a mobile app, or even get access through a web browser, that can:

- Collect and transmit raw sensor data (video/audio)
- Send processed responses back to the glasses
- Use Bluetooth or an Android mobile app (possibly via XRCore or Unity) as the communication bridge

If this is feasible, could you kindly advise:

- Which smart glasses model(s) you recommend for this type of development (we want to buy a few pairs to start with)
- Which plugins, SDKs, or frameworks would be most suitable

We appreciate your guidance and thank you in advance for your support.

Best regards,
Luis F.

How to export Quill animations from Unity to an .apk?

So I made a VR animation in Quill and exported it as an Alembic file. I imported it into Unity through the Alembic import package, made a Timeline, and if I hit Play I can see the animation playing and behaving as I would expect (I can also see it with my Quest 2 plugged into my PC). The problem is when I try to export the .apk from Unity: I just cannot see that file at all. All the other files that I made in Blender are working, but the Alembic one from Quill is not there. Why does it work in Unity on my PC, but not when I export it as an .apk and install it on my Quest 2?

App Distribution Upload Issues: DevPost

Hi, I posted this earlier in the general forum and discovered this may be a better place for help. I'm attempting to upload a build to my Alpha channel via the Meta Quest Developer Hub, but it keeps failing. Below is the error message:

Error: An unexpected error occurred {"cliErrorType":"UNEXPECTED_ERROR","cliErrorMessage":"File or directory 'D:\\**\\' was not included into executable at compilation stage. Please recompile adding it as asset or script."

I'm assuming this means I need to add the directory and/or file path to the assets within my Unity project, but I'm not finding much on how to actually do so. Please let me know if you have any ideas, or if you've encountered this error and found a solution. Below is a screenshot of the error message. The app is developed for PC and the build is for the Rift S. Once this is all good I plan on making a Quest build as well.

Best regards,

Quest Pro eye tracking vergence

Hi all. I finally got eye tracking to work in my Unity project (after updating my developer account in the phone app). Now I'm rendering the gaze rays from my eyes and testing eye independence and vergence. In all of my testing the vergence is nearly constant, regardless of whether I'm focusing on an object one inch away or many feet away. However, gaze rays should be nearly parallel when focusing on very distant objects, and almost 90 degrees from each other when focusing on very close objects. That doesn't happen at all. The two rays are correct in that they both point in the direction my eyes are looking, but it seems Meta is not using each eye's data independently to determine each eye's rotation, even though that's what the gaze script implies it is doing. My hypothesis is that the algorithm uses something like a cyclops-eye gaze calculation and then points both eyes toward a pre-established focal point on the cyclops ray. If that's the case, it's a problem, because I'm using eye tracking precisely to discriminate between the user focusing on nearby or faraway objects that may be aligned with the cyclops-eye ray. I need the vergence angle for that, but it seems to be constant.
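
For anyone wanting to verify this independently, the vergence angle can be computed directly from the two gaze direction vectors. A self-contained sketch in plain .NET (no Unity types; eye positions and target distances are made-up illustrative numbers):

```csharp
using System;
using System.Numerics;

class VergenceDemo
{
    // Angle in degrees between the two normalized gaze directions.
    static double VergenceAngle(Vector3 leftDir, Vector3 rightDir)
    {
        var l = Vector3.Normalize(leftDir);
        var r = Vector3.Normalize(rightDir);
        double dot = Math.Clamp(Vector3.Dot(l, r), -1f, 1f);
        return Math.Acos(dot) * 180.0 / Math.PI;
    }

    static void Main()
    {
        // Eyes ~64 mm apart (illustrative IPD), converging on a point 0.3 m ahead.
        var leftEye = new Vector3(-0.032f, 0f, 0f);
        var rightEye = new Vector3(0.032f, 0f, 0f);
        var nearTarget = new Vector3(0f, 0f, 0.3f);
        var farTarget = new Vector3(0f, 0f, 10f);

        double nearAngle = VergenceAngle(nearTarget - leftEye, nearTarget - rightEye);
        double farAngle = VergenceAngle(farTarget - leftEye, farTarget - rightEye);

        // Expect roughly 12 degrees at 0.3 m and well under 1 degree at 10 m.
        // A headset reporting constant vergence would make these two come out equal.
        Console.WriteLine($"near: {nearAngle:F1} deg, far: {farAngle:F2} deg");
    }
}
```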

MRUK not found despite it being created...?

I'm currently using a Quest 3 on v62 (now v63) and Unity 2022.3.10f1. I'm working on a random spawn mechanic in an MR environment where objects can spawn on the ceiling. The feature worked fine when I tested it in Unity's Play mode, but once I built it for the standalone Quest 3 (or simply hooked it up over Quest Link), the scene could no longer be loaded. The room setup does indicate that I have my tables and walls, but there's no ceiling. I presume the spatial data didn't transfer properly (I did write a script to grant the Quest 3 permission for spatial data, and I enabled Permission Requests On Startup). I have no idea where it all went south. Any ideas?
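
For reference, a minimal sketch of the kind of runtime permission script mentioned above, assuming the spatial-data permission string com.oculus.permission.USE_SCENE that Meta documents for scene data (the class name and log messages are illustrative):

```csharp
using UnityEngine;
using UnityEngine.Android;

public class ScenePermissionRequest : MonoBehaviour
{
    const string UseScene = "com.oculus.permission.USE_SCENE";

    void Start()
    {
        // Without this permission granted on-device, scene data
        // (walls, tables, ceiling) is not available to the app.
        if (!Permission.HasUserAuthorizedPermission(UseScene))
        {
            var callbacks = new PermissionCallbacks();
            callbacks.PermissionGranted += _ => Debug.Log("Scene permission granted");
            callbacks.PermissionDenied += _ => Debug.LogWarning("Scene permission denied; room data will be unavailable");
            Permission.RequestUserPermission(UseScene, callbacks);
        }
    }
}
```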

No scene mesh when building an .apk

Greetings. I'm using the Meta XR All-in-One SDK v69.0.0.0 and the "Scene mesh" building block in a passthrough (i.e. MR) project. Over Quest Link it works properly and generates the scene mesh, but when I build the project for Android as an .apk, the scene mesh stops working and no mesh is generated. How can this be fixed? I need it to work specifically as an .apk. Unity version: 2022.3.44f1. I'm using a Quest 3. Thanks
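
One thing worth double-checking in a case like this, where a feature works over Link but not in the standalone .apk: the built APK's AndroidManifest.xml needs the scene permission entry (normally injected by the SDK's manifest-update tooling). It should contain a line like:

```xml
<uses-permission android:name="com.oculus.permission.USE_SCENE" />
```

If the entry is missing from the generated manifest, the standalone build cannot access scene data even though the editor session over Link can.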

Oculus Audio SDK and Meta XR Audio SDK geometry-based acoustics

I have been using the Oculus Spatializer for Unity (v47.0) in Unity 2021.3.29f1 to create reverberation based on the geometry of the game environment. There are a number of inconsistencies and behaviors I cannot explain:

1) When a source changes from visible to non-visible (i.e. the direct sound becomes occluded), the direct sound remains. Toggling "Spatialization enabled" resets it to the expected behavior, though after a while the direct component is included again.

2) The attenuation curve differs when reflections are enabled compared to when they are disabled. The former begins as a 1/r relationship but shows a sharp roll-off after around 2.3 m.

The first point in particular means I cannot use the Oculus Spatializer for my project. Therefore, I had a look at using the Meta XR Audio SDK (v55.0) instead, as it is newer and seems to be a replacement for the Oculus Spatializer. However, as far as I can find, it does not include the same geometry-based acoustic rendering. It only seems to include a shoebox room approximation (described here: https://developer.oculus.com/documentation/unity/meta-xr-audio-sdk-unity-room-acoustics/) and is unable to account for occlusion of the direct sound by the scene geometry. Am I missing something? Are there any plans to add geometry-based acoustics to the Meta XR Audio SDK in the future? Otherwise, can you suggest why this behavior may be happening in the Oculus Spatializer?

APK Upload Laggy

I am experiencing a problem with my APK upload from Unity. The APK uploads from Unity just fine, but when I open the game in my headset to test, all movement is delayed and, on top of that, it is laggy. You cannot move anywhere. My game uses the Gorilla Locomotion system. There are no errors in the Console. The file is only about 200 megabytes. My headset is a Meta Quest 2, so it should have enough processing power to run the file; it has run a game of over a gigabyte before, so it should be fine.

Thank you so much for your time, I greatly appreciate it.

-RomanEmpire
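
As a first diagnostic step for this kind of lag, logging frame times on-device helps separate a rendering or script bottleneck from a packaging/upload problem. A minimal sketch, assuming the Quest 2's default 72 Hz refresh rate as the frame budget (the threshold and messages are illustrative):

```csharp
using UnityEngine;

public class FrameTimeLogger : MonoBehaviour
{
    const float BudgetMs = 1000f / 72f;  // ~13.9 ms per frame at 72 Hz

    void Update()
    {
        float frameMs = Time.unscaledDeltaTime * 1000f;
        if (frameMs > BudgetMs * 1.5f)  // log only clear misses to avoid log spam
            Debug.LogWarning($"Slow frame: {frameMs:F1} ms (budget {BudgetMs:F1} ms)");
    }
}
```

If slow-frame warnings appear constantly in the device log, the stutter comes from the app itself (CPU/GPU load or per-frame allocations) rather than from the upload or the APK's size.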