How can I request the source of Horizon OS?
Hello, I am a college student doing a research project on app testing on Quest. Is it possible to request the source code of Horizon OS, so that I can access lower-level or kernel information on the device and set up a testing sandbox directly at the system level? Thanks.
Meta Developer Hub preventing indie developers from uploading?

Hello! I am a strong believer in VR as a creative medium, and I have spent the last several years developing my first Quest game! I picked up quite a following online of people who have been following the journey, with millions of views. I also recruited over a dozen creators to be voice actors in the game, so counting all their combined followers, many millions of people are paying attention; these creators have been hyping the game up too.

The game is done, but the Quest Developer Portal is refusing to even upload the submission. It says: "The package name is already in use for another application. Every application must have a unique package name." The thing is, the package name is not already in use, and I get that error even when renaming the package to a long, random string of letters. I also get the error: "Your manifest includes the following permissions restricted by Oculus. Please remove these permissions if they are not needed by your application." This is despite following Meta's AndroidManifest creation instructions to the letter: the manifest does not have those permissions enabled. Furthermore, it was Meta's own Unity tools that enabled those permissions to begin with.

I'm in a tight spot now: so many people have been following this journey and have seen video demos of the game, and they are wondering why it is not available despite being finished. If there is no resolution, I will soon have to explain to them that Meta is preventing indie developers like myself, and presumably others, from sharing work they spent years on. Any help is appreciated, thank you!
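For the restricted-permissions error, one common approach is to strip the offending permissions at manifest-merge time with a manifest override. This is a hedged sketch: `com.oculus.permission.RESTRICTED_EXAMPLE` is a placeholder, so substitute the exact permission names listed in the portal's rejection message.

```xml
<!-- Assets/Plugins/Android/AndroidManifest.xml -->
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
          xmlns:tools="http://schemas.android.com/tools">
    <!-- tools:node="remove" deletes the permission from the final merged
         manifest even if a plugin or build tool injects it at build time. -->
    <uses-permission android:name="android.permission.READ_EXTERNAL_STORAGE"
                     tools:node="remove" />
    <uses-permission android:name="com.oculus.permission.RESTRICTED_EXAMPLE"
                     tools:node="remove" />
</manifest>
```

Note the `xmlns:tools` namespace declaration is required for `tools:node` to work; after editing, inspect the merged manifest inside the built APK to confirm the permissions are actually gone.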
Quest 3 color profile

Hi there, I was looking into what the default color profile on Quest 3 is and couldn't find a definitive answer. Is it Rift CV1, Rec.709, or Rec.2020? All I found in Meta's actual documentation was this color management guide, but it looks outdated.

Longer story: I'm a VR content creator trying to understand which color profile I should use to best match the color profile of the Quest 3 displays. We work with a Canon R5 C and record 8K 60 fps in RAW LT, so to preserve the best image quality this question is very important to us. Thanks
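For the Unity side of this, the Meta XR Core SDK lets an app declare the color space its content was authored in, and the compositor then remaps it to the panel's native gamut. A hedged sketch follows; the `colorGamut` property and enum names are from OVRManager in the Core SDK, so verify them against the SDK version you actually ship with.

```csharp
using UnityEngine;

// Hedged sketch (Meta XR Core SDK): declare the color space the source
// footage was graded in; the compositor maps it to the panel gamut.
// Rec.709 is a reasonable choice for video graded on a standard sRGB/HDTV
// pipeline; Rec_2020, P3, Rift_CV1, and Quest are among the other options.
public class ColorSpaceSetup : MonoBehaviour
{
    void Awake()
    {
        OVRManager.instance.colorGamut = OVRManager.ColorSpace.Rec_709;
    }
}
```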
How to properly Align Players/OVRCameraRigs to Shared Spatial Anchors?

I have a Netcode project where I'm using spatial anchors to build a co-location style of game. I am using the Meta Core SDK and the provided OVR Camera Rig (OVR Manager, OVR Camera Rig), and I am trying to align the player in the camera rig to the spatial anchor. I have managed to synchronize the anchor location: both players see the anchor in the same real-life physical location. What I'm struggling with is aligning the joining players to the anchor so their characters overlap with their real-life bodies.

The flow of logic so far:
1. Spawn the anchor at (0, 0, 0) for the host.
2. Save it and share it with players in the lobby.
3. Re-share it with joining players.
4. Bind the unbound anchor for client players, and wait for it to be created and localized. (Up to this point everything is fine; both players see the anchor, but are not aligned to it yet.)
5. Align the client player to the anchor (this is where everything aligns wrongly).

The alignment code is basically the same as the one found in the sample packages: I take in an OVRSpatialAnchor, set the OVRCameraRig position and rotation to zero, then set the OVRCameraRig position to anchorTransform.InverseTransformPoint(Vector3.zero). The problem is that after running the alignment code, the "zero zero" ends up far outside the playable space, wrong by almost exactly 5 meters on x and z. And if I try to manually correct for this with something like cameraRig.position += new Vector3(-5, 0, -5), it just becomes more wrong. Below you'll find my alignment code. I know it's something in this code, because the anchor itself is always physically in the same spot for all players; only the aligning is wrong. I constantly compare against both sample projects, and I cannot find any differences that should affect this.
Based on these samples:
Unity-Discover/Packages/com.meta.xr.sdk.colocation/Anchors/AlignmentAnchorManager.cs at main · oculus-samples/Unity-Discover (github.com)
Unity-SharedSpatialAnchors/Assets/SharedSpatialAnchors/Scripts/AlignPlayer.cs at main · oculus-samples/Unity-SharedSpatialAnchors (github.com)

My alignment code (with debugging included; I trigger the alignment manually with a button bind so I can see what happens more easily):

```csharp
using System;
using System.Collections;
using UnityEngine;

// A script to bind local player positions to the networked player.
public class LocalPlayer : MonoBehaviour
{
    public static LocalPlayer Instance;

    [Header("Colocation Alignment")]
    [SerializeField] Transform _cameraRigTransform;
    [SerializeField] Transform _playerHandsTransform;

    private Coroutine _alignmentCoroutine;
    private OVRSpatialAnchor _currentAlignment;

    public Action OnAfterAlignment;

    private void Awake()
    {
        Instance = this;
    }

    private void Update()
    {
        if (OVRInput.GetDown(OVRInput.RawButton.A))
            AlignPlayerToColocation();
    }

    public static void AlignPlayerToColocation()
    {
        if (CoLocationAnchorManager.Singleton.LoadedAnchor == null)
        {
            Logger.Log("No colocation anchor");
            return;
        }
        Instance?.AlignPlayerToAnchor(CoLocationAnchorManager.Singleton.LoadedAnchor);
    }

    public void AlignPlayerToAnchor(OVRSpatialAnchor anchor)
    {
        Debug.Log("AlignmentAnchorManager: Called AlignPlayerToAnchor");
        if (_alignmentCoroutine != null)
        {
            StopCoroutine(_alignmentCoroutine);
            _alignmentCoroutine = null;
        }
        _alignmentCoroutine = StartCoroutine(AlignmentCoroutine(anchor, 2));
    }

    private IEnumerator AlignmentCoroutine(OVRSpatialAnchor anchor, int alignmentCount)
    {
        Debug.Log("PlayerAlignment: called AlignmentCoroutine");

        while (!anchor.Created)
            yield return null;
        while (!anchor.Localized)
            yield return null;

        Logger.Log("BEFORE ALIGN [" + alignmentCount + "]: Player Transform : " + _cameraRigTransform.position);
        Logger.Log("BEFORE ALIGN [" + alignmentCount + "]: Anchor Transform : " + anchor.transform.position);

        while (alignmentCount > 0)
        {
            if (_currentAlignment != null)
            {
                // Reset the rig to the origin first.
                _cameraRigTransform.position = Vector3.zero;
                _cameraRigTransform.eulerAngles = Vector3.zero;

                Logger.Log("ALIGN [" + alignmentCount + "]: Player Transform : " + _cameraRigTransform.position);
                Logger.Log("ALIGN [" + alignmentCount + "]: Anchor Transform : " + anchor.transform.position);

                // Wait one frame for the anchor to move.
                yield return null;
            }

            var anchorTransform = anchor.transform;
            if (_cameraRigTransform != null)
            {
                // Set the rig position to the inverse of the anchor transform position
                // (relative to its new position); this effectively gets the anchor's
                // position relative to the world origin.
                _cameraRigTransform.position = anchorTransform.InverseTransformPoint(Vector3.zero);

                Logger.Log("ALIGN Inverse [" + alignmentCount + "]: Player Transform : " + _cameraRigTransform.position);
                Logger.Log("ALIGN Inverse [" + alignmentCount + "]: Anchor Transform : " + anchor.transform.position);

                _cameraRigTransform.eulerAngles = new Vector3(0, -anchorTransform.eulerAngles.y, 0);
            }
            else
            {
                Logger.Log("ALIGN: CameraRigTransform is invalid");
            }

            if (_playerHandsTransform != null)
            {
                _playerHandsTransform.localPosition = -_cameraRigTransform.position;
                _playerHandsTransform.localEulerAngles = -_cameraRigTransform.eulerAngles;
            }

            _currentAlignment = anchor;
            alignmentCount--;
            yield return new WaitForEndOfFrame();
        }

        Logger.Log("ALIGN FINAL [" + alignmentCount + "]: Player Transform : " + _cameraRigTransform.position);
        Logger.Log("ALIGN FINAL [" + alignmentCount + "]: Anchor Transform : " + anchor.transform.position);
        Debug.Log("PlayerAlignment: Finished Alignment!");
        OnAfterAlignment?.Invoke();
    }
}
```
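One way to narrow down where an offset like the one above comes from is to check the invariant the samples rely on: after a successful alignment pass, the anchor's world pose should coincide with the Unity world origin. A hedged debugging sketch:

```csharp
using UnityEngine;

// Hedged debugging sketch: after alignment the anchor should sit at (or
// very near) the world origin with ~0 yaw. If it is instead offset by a
// constant (e.g. ~5 m on x/z), the transform being moved is probably not
// the true root of the tracking space; a parent object or an offset inside
// the OVRCameraRig hierarchy is contributing its own translation.
public static class AlignmentChecks
{
    public static void ReportAnchorError(OVRSpatialAnchor anchor)
    {
        Vector3 positionError = anchor.transform.position; // expect ~ (0, 0, 0)
        float yawError = anchor.transform.eulerAngles.y;   // expect ~ 0 (or ~360)
        Debug.Log($"Anchor offset after align: {positionError}, yaw: {yawError}");
    }
}
```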
Export space mesh scanned on Quest 3?

Hey, to cut to the chase: I want to scan my entire house, then export the space so I can load it into a 3D modeling tool. The idea is to take that low-poly version of my house interior and build my own 3D model around it (which would be super simple walls, etc.). I could go around my house measuring and make the model from scratch, but having the scan to trace from would be way too good. I don't have an iPhone to do LiDAR scanning, but even then I wouldn't want the images or a huge, complex file; what the space scan produces is insanely perfect. I don't know if I'm asking a question or... does anyone have any advice for me? Maybe the community could advise me on how I'd go about making an app that lets us do this? Maybe we can do this already by finding the file? I couldn't find anything that seemed like it would be the file for the space.
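Assuming you can get the scanned room into Unity as a Mesh (e.g. via Meta's Scene API with the room/global mesh enabled, which requires the user to grant spatial-data permission; that access path is an assumption here, not something the scan file exposes directly), dumping it to a Wavefront OBJ is straightforward and any modeling tool can open the result:

```csharp
using System.IO;
using System.Text;
using UnityEngine;

// Hedged sketch: exports any Unity Mesh (e.g. the room mesh delivered by
// Meta's Scene API) to an OBJ file under persistentDataPath. Pull it off
// the headset afterwards with:
//   adb pull /sdcard/Android/data/<your.package.name>/files/room.obj
public static class ObjExporter
{
    public static void Export(Mesh mesh, string fileName = "room.obj")
    {
        var sb = new StringBuilder();
        foreach (Vector3 v in mesh.vertices)
            // Negate x to convert Unity's left-handed space to OBJ's right-handed one.
            sb.AppendLine($"v {-v.x} {v.y} {v.z}");
        int[] t = mesh.triangles;
        for (int i = 0; i < t.Length; i += 3)
            // OBJ indices are 1-based; winding is swapped to match the mirrored x axis.
            sb.AppendLine($"f {t[i] + 1} {t[i + 2] + 1} {t[i + 1] + 1}");
        File.WriteAllText(Path.Combine(Application.persistentDataPath, fileName), sb.ToString());
    }
}
```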
Unity Quest 3 Game or App Support

Dear Meta Community, I am writing to seek guidance and assistance regarding the development of a Unity VR program for my project, which focuses on a VR rehabilitation system. The project involves creating three distinct environments within Unity for users to navigate as part of their rehabilitation process.

We have successfully developed these environments, but we are now facing challenges in integrating them into a cohesive VR program. Specifically, we need to create a Unity program that incorporates all three environments seamlessly. This program should include a user interface with settings to adjust parameters such as speed and difficulty, as well as a game menu that allows users to select their desired environment.

However, our group currently lacks the hardware needed to fully test and optimize the VR experience. While we have a computer capable of running Oculus Link with the Quest 3 headset, it must be wired in, which poses limitations for our treadmill-based setup. We are unsure whether developing an app for the Oculus platform is the best approach for our project, given our hardware constraints and the need for compatibility with treadmill usage.

We would greatly appreciate any insights or suggestions you may have regarding alternative solutions or workarounds that could help us achieve our objectives. Thank you very much for your time and assistance. We are eager to hear your recommendations and look forward to your response. Please do not just refer me to a page. Best regards,
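The "three environments plus a settings menu" structure described above usually reduces to three Unity scenes and one shared settings object. A minimal hedged sketch; the scene names and parameter ranges are placeholders, and the scenes must be added to Build Settings first:

```csharp
using UnityEngine;
using UnityEngine.SceneManagement;

// Shared settings survive scene loads because they live in a static class.
public static class RehabSettings
{
    public static float TreadmillSpeed = 1.0f; // m/s, adjusted from the UI
    public static int Difficulty = 1;          // 1 = easiest
}

// Hedged sketch of a menu controller: wire these methods to UI Button
// OnClick and Slider OnValueChanged events in a menu scene.
public class EnvironmentMenu : MonoBehaviour
{
    public void LoadForest() => SceneManager.LoadScene("ForestEnvironment");
    public void LoadCity()   => SceneManager.LoadScene("CityEnvironment");
    public void LoadBeach()  => SceneManager.LoadScene("BeachEnvironment");

    public void SetSpeed(float value)      => RehabSettings.TreadmillSpeed = value;
    public void SetDifficulty(float value) => RehabSettings.Difficulty = (int)value;
}
```

Building this as a standalone Quest APK (rather than running over a wired Link) would also sidestep the cable constraint around the treadmill.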
Hello everyone and especially the people who are in charge of the technical development of the Quest 3, We're developing a non-commercial application for visually impaired people. The goal is to obtain the image representation in the quest 3 as the real image of the visually impaired person. Through this application that person can show and and let his employer experience what his real visual limitations are. The app is almost done but there is one big hurdle we can't cross. We can't figure out how we can create a visual acuity filter for the quest 3, one that is adjustable to the degree of the person in question. We don't have access to camera software or have the possibility to do in post-processing. We desperately need some help and advice. This application has a noble purpose, your help will be a great help to thousands of visually impaired people around the word !!! Many thanks in advance, Bart.🙏1.7KViews4likes3Commentscombine external camera with oculus quest 3
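One coarse way to approximate reduced acuity without any camera access or post-processing stack is to shrink the rendered viewport, which softens the whole image. This is a hedged sketch using Unity's built-in XR settings; the mapping from a clinical acuity value to a scale factor is a crude placeholder, not a validated model:

```csharp
using UnityEngine;
using UnityEngine.XR;

// Hedged sketch: renderViewportScale (0..1) is designed for dynamic runtime
// changes and effectively blurs the image by rendering fewer pixels and
// upscaling. eyeTextureResolutionScale is a heavier startup-time alternative.
public class AcuitySimulator : MonoBehaviour
{
    [Range(0.05f, 1f)]
    public float acuityScale = 1f; // 1 = normal vision, lower = blurrier

    void Update()
    {
        XRSettings.renderViewportScale = acuityScale;
    }
}
```

For field-specific losses (central scotoma, tunnel vision), an overlay quad with a masked material in front of each eye camera could complement this global blur.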
Combine external camera with Oculus Quest 3

We know that Meta does not allow third parties to access the camera data. However, I would like to develop an application that lets me combine real objects with the virtual world. I want to develop my application for Oculus Quest 3 in Unity, and I would like to use an external camera to recognize QR codes or specific real objects and show their virtual representation in the virtual world. How can I do this?
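Assuming frames from the external camera can be delivered to Unity as a Texture2D (e.g. streamed over the network to the headset, or from a UVC webcam on a PC build), a third-party decoder such as ZXing.Net can handle the QR recognition. A hedged sketch; ZXing.Net is not part of the Meta SDK and must be imported separately:

```csharp
using UnityEngine;
using ZXing; // third-party: ZXing.Net

// Hedged sketch: decodes a QR code from any Texture2D holding a camera
// frame. Quest itself blocks direct passthrough-camera access, so the
// frames must come from an external camera you control.
public class QrScanner : MonoBehaviour
{
    private readonly BarcodeReader _reader = new BarcodeReader();

    public string TryDecode(Texture2D frame)
    {
        Result result = _reader.Decode(frame.GetPixels32(), frame.width, frame.height);
        return result?.Text; // null when no code was found in the frame
    }
}
```

Mapping the decoded code to a virtual object then requires knowing the external camera's pose relative to the headset's tracking space, which is the harder part of this setup.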
Meta Quest 3, UE5.2 Oculus: Which versions of Android SDK/NDK, build tools, command-line tools, CMake?

Does anyone have documentation on the correct versions of the Android SDK, Android NDK, Android build-tools, Android command-line tools, and CMake to use when deploying from the Unreal Engine 5.2 Oculus branch to Meta Quest 3? What about Unreal-specific settings? Anything I should know there too? Any help is appreciated, thanks in advance. (Solved)
FBT for future Meta headsets

So I was doing some thinking. I know that the Quest 3 has upper-body tracking, and I might have some form of a solution for camera-based lower-body tracking using the headset itself (for future headsets). What I was thinking was that the front of the headset could have a small camera angled downwards, and the headset would have some form of Elite Strap with a camera tilting down at the back. With Meta getting into AI, they could use AI to fill in the blind spots and stitch the two cameras together. I have made an example here using the Quest 3's proportions. Just the feet would be tracked, and the rest would be AI IK. As you can see, the blind spots really aren't an issue except under the hips, and if one were to sit, it would use a form of the advanced AI IK that was shown off before. This mostly means the advanced AI IK they showed off would be mostly running the show, but the cameras would heavily aid it.