How to obtain the user token using the Meta XR Platform SDK
I need to link the player's Meta account to our game for login purposes, using the Unity Meta XR Platform SDK. After the application permission check passes, I can obtain the player's ID through the Users.GetLoggedInUser() method. However, I cannot obtain an access token with Users.GetAccessToken(); that call always returns an empty result. Could you tell me whether I have misconfigured something or missed a step?
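For reference, a minimal sketch (untested, class and method layout illustrative) of how the Platform SDK calls are typically chained in Unity, with error logging on every callback. If GetAccessToken() keeps coming back empty, the message from GetError() usually points at whatever is missing in the App ID configuration, the entitlement check, or the app's Data Use Checkup approvals:

```csharp
// Sketch only: assumes the Meta XR Platform SDK (Oculus.Platform) is installed and an
// App ID is set in the Oculus Platform settings. Inspect the logged errors on device.
using UnityEngine;
using Oculus.Platform;
using Oculus.Platform.Models;

public class PlatformLogin : MonoBehaviour
{
    void Awake()
    {
        Core.AsyncInitialize().OnComplete(OnInitialized);
    }

    void Update()
    {
        // Pump Platform SDK callbacks every frame.
        Request.RunCallbacks();
    }

    void OnInitialized(Message<PlatformInitialize> msg)
    {
        if (msg.IsError) { Debug.LogError("Init failed: " + msg.GetError().Message); return; }

        Entitlements.IsUserEntitledToApplication().OnComplete(entMsg =>
        {
            if (entMsg.IsError) { Debug.LogError("Entitlement failed: " + entMsg.GetError().Message); return; }

            Users.GetLoggedInUser().OnComplete(userMsg =>
            {
                if (userMsg.IsError) { Debug.LogError("GetLoggedInUser: " + userMsg.GetError().Message); return; }
                Debug.Log("User ID: " + userMsg.Data.ID);
            });

            Users.GetAccessToken().OnComplete(tokenMsg =>
            {
                // An empty token with IsError set usually indicates a missing
                // Data Use Checkup / platform-feature approval for this app.
                if (tokenMsg.IsError) { Debug.LogError("GetAccessToken: " + tokenMsg.GetError().Message); return; }
                Debug.Log("Access token length: " + (tokenMsg.Data?.Length ?? 0));
            });
        });
    }
}
```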
Suggestion for Developing an SDK for Meta Ray-ban Glasses
Hey everyone, I'm curious: will there ever be a developer kit (SDK) available for Meta glasses? I think the glasses are great, but there's not much to do with them right now. So I thought, why not build a community around it? If Meta releases an API or something that other apps could call or interact with indirectly (since it seems there's already a service running in the background), this could enable commands from third-party apps. That way, users could develop their own apps and further customize the experience. For example, say I want to do something customizable, like turning on my smart lights using the "Hey Meta" command. My own app could register one or more commands so that whenever I say "Hey Meta, [do something]", a trigger fires (it could be an HTTP POST or a deeplink) that my app receives and then uses to perform the desired action for that command. And if this API allowed live camera access, like we have in Instagram livestreams, the possibilities would be endless. Think about it, everyone. I really want to build custom experiences with this, so please consider releasing an SDK for Meta glasses. Thanks!
Which AR glasses to buy for research with LLMs and Raw data
Hello there, I'm writing because, together with a group of PhD and Master's students at my university, we are exploring the development of new mobile or web-based applications that can interface with the Meta Ray-Ban or Aria smart glasses via an SDK. Our goal is to test our own vision-language models (VLMs) by directly accessing the raw data streams (specifically video and audio) from the glasses and providing contextualized responses through the device's built-in speakers using our own LLMs. We are particularly interested in whether it is possible to develop a mobile app, or even get access through a web browser, that can:
- Collect and transmit raw sensor data (video/audio)
- Send processed responses back to the glasses
- Use Bluetooth or an Android mobile app (possibly via XRCore or Unity) as the communication bridge
If this is feasible, could you kindly advise:
- Which smart glasses model(s) you recommend for this type of development (we want to buy a few pairs to start with)
- What plugins, SDKs, or frameworks would be most suitable
We appreciate your guidance and thank you in advance for your support. Best regards, Luis F.
Technical Questions for a LBS Game: Disabling System Gestures, Spatial Mapping & Remote Control API
Hello, I'm looking to create a multi-user, large-scale, location-based (offline) game and have a few questions:
1. Is there a way to disable system-level gestures to prevent players from accidentally exiting the application and returning to the home screen during gameplay?
2. Is there a method for scanning a large physical space (approximately 10x10 meters) to generate a persistent and shareable map file?
3. Is it possible to enable or provide some control API? We need an interface that allows a central controller to remotely start and stop the application on all devices, as well as manage the installation and updating of game content.
Subject: Request for Guidance - Innovation Proposal and Strategic Partnership Request
Dear Meta Community/Support Team, My name is AS33, and I am a Strategic Designer and Independent Developer. I am currently researching innovation modules that may be relevant to several Meta teams, including but not limited to Meta Horizon, Generative AI, LLaMA, and Experimental Interface Research. I am seeking guidance on the following:
1. What department, contact, or channel is best for submitting innovation proposals or partnership ideas?
2. Is there a dedicated team within Meta (e.g. Horizon Labs, Research, R&D, Co-Design, etc.) that reviews early-stage concept proposals from external independent authors?
3. Are there any internal innovation or consulting programs (e.g. Co-design Program, Meta Open Research, Meta Quest Creators Hub) that are currently accepting new participants or promising collaborations?
I am particularly interested in hybrid models where I can contribute not as a permanent team member, but as an external signal architect, designer, or creative collaborator. My goal is to explore mutually beneficial options that could include:
- Strategic consulting on symbolic systems, neuro-alignment, or immersive signal architectures
- Early testing collaboration with the Meta Horizon or Generative AI teams
If you can forward this to the appropriate team or share the appropriate contact paths or application portals, I would greatly appreciate your help. With respect and gratitude, AS33
Ping data overlay
Hi. I'm fairly new to development and I'm stuck on a project. I am trying to develop a program that displays network information to the user (ping, jitter, packet loss, etc.). I would like it to run as an overlay over any 3D application in real time, similar to how the OVR Metrics Tool displays headset information. Any documentation or assistance would be greatly appreciated.
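The measurement side (separate from rendering an overlay on top of other apps, which is its own problem) can be prototyped with plain .NET networking inside Unity. A rough sketch: the host, sample count, and class name below are placeholders, and on Quest/Android ICMP may be blocked, in which case a UDP echo against your own server is a common fallback:

```csharp
// Rough sketch: sample round-trip times to a host and derive jitter as the mean
// absolute difference between consecutive RTTs; loss is the fraction of failed pings.
using System.Collections.Generic;
using System.Net.NetworkInformation;

public static class NetworkProbe
{
    public static (double avgPingMs, double jitterMs, double lossPercent) Sample(
        string host = "8.8.8.8", int samples = 10, int timeoutMs = 1000)
    {
        var rtts = new List<long>();
        int lost = 0;
        using (var ping = new Ping())
        {
            for (int i = 0; i < samples; i++)
            {
                PingReply reply = ping.Send(host, timeoutMs);
                if (reply != null && reply.Status == IPStatus.Success)
                    rtts.Add(reply.RoundtripTime);
                else
                    lost++;
            }
        }

        double avg = 0, jitter = 0;
        if (rtts.Count > 0)
        {
            foreach (var r in rtts) avg += r;
            avg /= rtts.Count;
            for (int i = 1; i < rtts.Count; i++)
                jitter += System.Math.Abs(rtts[i] - rtts[i - 1]);
            if (rtts.Count > 1) jitter /= rtts.Count - 1;
        }
        return (avg, jitter, 100.0 * lost / samples);
    }
}
```

Ping.Send blocks, so in Unity you would run this off the main thread (or use SendPingAsync) and feed the results into whatever overlay UI you end up rendering.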
MRUK not found despite it being created...?
I'm currently using a Quest 3 on v62 (now v63) and Unity 2022.3.10f1. I'm working on a random spawn mechanic in an MR environment where objects can spawn on the ceiling. The feature worked fine when I tested it in Play Mode in the Unity Editor, but once I built it for the standalone Quest 3 (or simply hooked it up with Quest Link), the scene could no longer be loaded. The room setup does indicate that I have my tables and walls, but there's no ceiling. I presume the spatial data didn't transfer properly (I did write a script to grant the spatial data permission on the Quest 3, and I enabled Permission Requests On Startup). I have no idea where it all went south. Any ideas?
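Not a fix, but a diagnostic sketch that can help narrow it down: log at runtime whether the spatial-data permission is actually granted and which anchors MRUK reports on device. The permission string and the MRUK member names below are assumptions based on recent MRUK versions and may differ in yours:

```csharp
// Diagnostic sketch (assumed MRUK member names and permission string - verify against
// your MRUK version). Logs the permission state and every anchor in the current room,
// so you can see on-device whether a ceiling anchor ever arrives.
using UnityEngine;
using Meta.XR.MRUtilityKit;
#if UNITY_ANDROID
using UnityEngine.Android;
#endif

public class MrukCeilingCheck : MonoBehaviour
{
    const string ScenePermission = "com.oculus.permission.USE_SCENE";

    void Start()
    {
#if UNITY_ANDROID && !UNITY_EDITOR
        if (!Permission.HasUserAuthorizedPermission(ScenePermission))
        {
            Debug.LogWarning("Spatial data permission not granted yet.");
            Permission.RequestUserPermission(ScenePermission);
        }
#endif
        // Fires once MRUK has loaded the scene model on the device.
        MRUK.Instance.RegisterSceneLoadedCallback(OnSceneLoaded);
    }

    void OnSceneLoaded()
    {
        var room = MRUK.Instance.GetCurrentRoom();
        if (room == null) { Debug.LogError("MRUK loaded but no current room."); return; }

        Debug.Log($"Anchors in room: {room.Anchors.Count}");
        foreach (var anchor in room.Anchors)
            Debug.Log($"  anchor: {anchor.name}");

        Debug.Log(room.CeilingAnchor != null
            ? "Ceiling anchor found."
            : "No ceiling anchor - re-run Space Setup and confirm the ceiling was captured.");
    }
}
```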
usb-c hdmi input capability?
Hi, I'd like to develop a virtual TV app so that people can plug external hardware, such as a games console, into the headset's USB-C port. Can I use existing Android API or Quest SDK code to achieve this, or would Meta need to make system changes to allow the USB-C port to be used that way?