Subject: Request for Guidance - Innovation Proposal and Strategic Partnership
Dear Meta Community/Support Team,

My name is AS33; I am a strategic designer and independent developer. I am currently researching innovation modules that may be relevant to several Meta teams, including but not limited to Meta Horizon, Generative AI, LLaMA, and Experimental Interface Research. I am seeking guidance on the following:

1. Which department, contact, or channel is best for submitting innovation proposals or partnership ideas?
2. Is there a dedicated team within Meta (e.g. Horizon Labs, Research, R&D, Co-Design) that reviews early-stage concept proposals from external independent authors?
3. Are there any internal innovation or consulting programs (e.g. Co-Design Program, Meta Open Research, Meta Quest Creators Hub) currently accepting new participants or collaboration proposals?

I am particularly interested in hybrid models where I can contribute not as a permanent team member but as an external signal architect, designer, or creative collaborator. My goal is to explore mutually beneficial options that could include:

- Strategic consulting on symbolic systems, neuro-alignment, or immersive signal architectures
- Early testing collaboration with the Meta Horizon or Generative AI teams

If you can forward this to the appropriate team, or share the relevant contact paths or application portals, I would greatly appreciate your help.

With respect and gratitude,
AS33
Suggestion for Developing an SDK for Meta Ray-Ban Glasses

Hey everyone, I'm curious: will there ever be a developer kit (SDK) available for Meta glasses? I think the glasses are great, but there's not much to do with them right now. So I thought, why not build a community around them? If Meta releases an API that other apps could call or interact with indirectly (since it seems there's already a service running in the background), it could enable commands from third-party apps. That way, users could develop their own apps and further customize the experience.

For example, say I want to do something customizable, like turning on my smart lights using the "Hey Meta" command. My own app could register one or more commands, so that whenever I say "Hey Meta, [do something]", a trigger (it could be a POST request or a deeplink) is sent to my app, which then performs the desired action for that command. And if this API allowed live camera access, like we have in Instagram livestreams, the possibilities would be endless.

Think about it, everyone. I really want to build custom experiences with this, so please consider releasing an SDK for Meta glasses. Thanks!
How to limit the plane surface to the bounds of the UI actually on screen?

I am using the Oculus Integration SDK (yes, the legacy version, because I currently need to make changes to the Assembly Definitions), and I am making a flat canvas. I have placed a PlaneSurface script on the object holding the Canvas component. I have a sibling object called "Surface" with the `ClippedPlaneSurface` and `BoundsClipper` components on it, and I dragged the Canvas object with `PlaneSurface` into the Clipped Plane Surface component's 'Plane Surface' field. Interaction works just fine ... however ...

The issue is that I now have an infinite plane surface for ray interaction with world-space UI, even though the flat panel is just a rectangle right in front of the player. As a result, I can raycast against an invisible plane even where there is no UI. Can anyone help me make the BoundsClipper component work, or otherwise limit the plane surface to the bounds of the UI actually on screen?
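One direction worth trying, assuming `BoundsClipper` derives its clip box from the transform it sits on (check the Interaction SDK docs for the exact contract): keep the "Surface" object's transform matched to the canvas rect so the clipped region tracks the visible UI. `FitClipperToCanvas` and the `canvasRect` field are hypothetical names for this sketch, not SDK API.

```csharp
using UnityEngine;

// Hedged sketch: keep the "Surface" object (holding ClippedPlaneSurface +
// BoundsClipper) aligned and sized to the canvas RectTransform, on the
// assumption that BoundsClipper reads its box from this transform.
[ExecuteAlways]
public class FitClipperToCanvas : MonoBehaviour
{
    [SerializeField] private RectTransform canvasRect; // the Canvas with PlaneSurface

    private void LateUpdate()
    {
        if (canvasRect == null) return;

        // Sit exactly on the canvas plane.
        transform.SetPositionAndRotation(canvasRect.position, canvasRect.rotation);

        // Convert the rect size (canvas-local units) to world size via lossyScale;
        // a thin Z keeps the box flat against the UI panel.
        Vector2 size = canvasRect.rect.size;
        Vector3 lossy = canvasRect.lossyScale;
        transform.localScale = new Vector3(size.x * lossy.x, size.y * lossy.y, 0.01f);
    }
}
```

If BoundsClipper instead exposes an explicit size field in your SDK version, set that from `canvasRect.rect.size` rather than scaling the transform.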
Meta Avatars SDK (Feedback/Issues)

Do you have any feedback and/or issues regarding the Meta Avatars SDK? Use this thread to discuss; members of the engineering team will be reviewing it!

Read the blog post on the Meta Avatars SDK here: https://developer.oculus.com/blog/meta-avatars-sdk-now-available/
Refer to the Meta Avatars SDK documentation here: https://developer.oculus.com/documentation/unity/meta-avatars-overview/
Ping data overlay

Hi. I'm fairly new to development and I am stuck on a project. I am trying to develop a program that displays network information to the user (ping, jitter, packet loss, etc.). I would like it to run over any 3D application in real time, similar to how the OVR Metrics Tool displays headset information. Any documentation or assistance would be greatly appreciated.
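For the measurement side, here is a minimal plain .NET console sketch (the host, sample count, and the jitter definition as mean absolute difference between consecutive RTTs are my assumptions). Rendering the result over other 3D apps on a headset is a separate problem, typically requiring a compositor overlay layer, which this does not cover.

```csharp
using System;
using System.Collections.Generic;
using System.Linq;
using System.Net.NetworkInformation;

// Sample RTTs to a host and derive the stats an overlay would display:
// average ping, jitter, and packet loss percentage.
class NetworkStats
{
    static void Main()
    {
        const string host = "8.8.8.8"; // assumption: any reachable host works
        const int samples = 10;
        var rtts = new List<long>();
        int lost = 0;

        using var ping = new Ping();
        for (int i = 0; i < samples; i++)
        {
            try
            {
                PingReply reply = ping.Send(host, 1000); // 1 s timeout
                if (reply.Status == IPStatus.Success) rtts.Add(reply.RoundtripTime);
                else lost++;
            }
            catch (PingException) { lost++; }
        }

        double avg = rtts.Count > 0 ? rtts.Average() : 0;
        // Jitter: mean absolute difference between consecutive round-trip times.
        double jitter = rtts.Count > 1
            ? rtts.Zip(rtts.Skip(1), (a, b) => (double)Math.Abs(a - b)).Average()
            : 0;
        double loss = 100.0 * lost / samples;

        Console.WriteLine($"ping {avg:F1} ms, jitter {jitter:F1} ms, loss {loss:F0}%");
    }
}
```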
Prevent Hands from going through table.

Hi, I am using the Unity Movement SDK to control an avatar in my Unity scene. I want the player to sit in front of a table in real life and place their hands on it; in VR, the hands should also rest on a table. I can roughly adjust it so that the hands are on the table. The only problem is that there are always small tracking inaccuracies, and the avatar's hands keep disappearing into the table. Is there a way to prevent the hands from sliding through the table so that they always rest on top of it?
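One simple approach is a post-tracking clamp: after the Movement SDK has written the pose for the frame, push any hand bone that dipped below the table top back up to the surface. This is a generic Unity sketch, not a Movement SDK feature; `handRoot`, `tableTopY`, and the `LateUpdate` timing are assumptions about your rig.

```csharp
using UnityEngine;

// Hedged sketch: clamp a tracked hand (or avatar wrist bone) so it never
// sinks below a known table surface, absorbing small tracking errors.
public class HandTableClamp : MonoBehaviour
{
    [SerializeField] private Transform handRoot;      // wrist/hand bone to correct
    [SerializeField] private float tableTopY = 0.75f; // world-space Y of the table top
    [SerializeField] private float skin = 0.01f;      // small offset to avoid clipping

    private void LateUpdate() // runs after tracking scripts update the pose
    {
        Vector3 p = handRoot.position;
        if (p.y < tableTopY + skin)
            handRoot.position = new Vector3(p.x, tableTopY + skin, p.z);
    }
}
```

A hard clamp like this only handles a flat, horizontal table; for arbitrary furniture you would instead resolve against colliders, at the cost of more setup.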
Achievements feature.

Hello, I am thinking about adding achievements to my game. But from what I see in the documentation, the achievements checker app will be discontinued at the end of the year, and Meta itself does not encourage implementing this feature. Will this functionality be completely removed in the future, or are there other plans? I'm wondering if I should spend time on it; from what I can tell, people like this feature.
getAuthToken not supported when trying to use colocation since Firmware 70 and 71 (Unity)

Hi, my team and I had been using the experimental Colocation Building Block in Unity successfully until recently. Some of our headsets have been updated from firmware v69 to v70 or v71, and since then the colocation feature fails to initialize. When PlatformInit.GetEntitlementInformation gets called, it throws the following error:

Failed to retrieve access token: 1 - getAuthToken not supported

This seems to affect only some of the updated headsets; on another one it works. Headsets on v69 all seem to work. I tried both Unity Netcode for GameObjects and Photon Fusion, and different users on the headsets, all of whom have been added as test users for the app in the correct release channel in the Meta Developer Dashboard; the problem persists on the affected headsets. They have also been factory reset - nothing seems to work. Any help would be appreciated.

Kind regards,
Daniel
MRUK not found despite it being created...?

I'm currently using a Quest 3 on v62 (now v63) and Unity 2022.3.10f1. I'm working on a random spawn mechanic in an MR environment where objects can spawn on the ceiling. The feature worked fine when I tested it in Play Mode in the Unity Editor, but once I built it for the standalone Quest 3 (or simply hooked it up with Quest Link), the scene could no longer be loaded. Room setup does indicate that I have my tables and walls, but there's no ceiling. I presume the spatial data didn't transfer properly (I did write a script to grant the app permission to access spatial data on the Quest 3, and I checked Permission Requests On Startup). I have no idea where it all went south. Any ideas?
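For comparison, a minimal version of the runtime permission check before loading scene data might look like the following. The permission string `com.oculus.permission.USE_SCENE` is what Meta's documentation uses for spatial data access on recent OS versions; verify it against your SDK version, since older builds used a different scheme.

```csharp
using UnityEngine;
using UnityEngine.Android;

// Hedged sketch: request the scene/spatial-data permission at startup so
// MRUK has access to the room's anchors (walls, tables, ceiling).
public class ScenePermission : MonoBehaviour
{
    private const string UseScene = "com.oculus.permission.USE_SCENE";

    private void Start()
    {
        if (!Permission.HasUserAuthorizedPermission(UseScene))
            Permission.RequestUserPermission(UseScene);
    }
}
```

Separately from permissions, a missing ceiling usually means it was never captured: re-running Space Setup on the headset and explicitly adding the ceiling is worth checking before debugging the app side.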
Pre-Existing Kotlin Android App - Unable to upload without Oculus SDK

Hi, I have an existing Android application written in Kotlin which I wish to upload to the Quest Store. I have built and run the application on a headset locally, and it all works perfectly. Upon uploading to the store, I get the error:

* Oculus SDK not found or older than 1.0.

The app does not include the Oculus SDK or anything similar, as it is not needed: it does not take advantage of any headset-specific features and is just a standard 2D app. Is bundling the SDK required to upload to the store? If so, is there documentation on how to do so with an existing Kotlin app? The documentation is largely about apps containing native code and doesn't provide clear guidance on this use case. I've considered grabbing the OpenXR samples and smashing them together with the existing app. Is that a reasonable approach? It seems over the top, especially for an app that won't require the SDK (for now).

Thanks,
Nathan