WEB technology for VR (React) + Raspberry Pi (Node (JavaScript) + Python Back-End)
Hello everyone, I'm not sure if this is the right place to ask or to get guidance; I'm fresh meat here with Oculus. I was wondering if there are any codebases or examples for using web technology for VR (React) + Raspberry Pi (Node (JavaScript) + Python back-end) on the Meta Quest 2. I saw a lot of code for C# and other native languages. Maybe there is some sort of wrapper for it, e.g. like Cordova is for hybrid mobile apps? I'm a bit in limbo now about where to start (I started checking UnityEngine etc., but I want a very lean solution). I work with AI technology where the visual side is limited, and the only reason it is used is to show the camera view with overlays on top, as with AR. Maybe someone can guide me, or is interested to know more about what we are doing and can help?

P.S. I would be happy to open a public repo for collaborators where we can work on this if more people want to build it. It might even be possible to make a mini-framework for it. Key features to focus on in the roadmap:

- Initial setup
- Coding standards, configurations
- Docker
- NestJS framework (ideal for starting with JS; has nice Swagger integration)
- CI/CD environment (optional)
- Transfer learning (AI / machine learning)
- OpenCV, TensorFlow
- Offline support (when needed)

How to export Quill animations from Unity to an .apk?
So I made a VR animation in Quill and exported it as an Alembic file. I imported it into Unity through the Alembic import package, made a timeline, and if I hit Play I can see the animation playing and behaving as I would expect (I can also see it with my Quest 2 plugged into my PC). The problem is when I try to export the .apk from Unity: I just cannot see that file at all. All the other assets that I made in Blender are working, but the Alembic one from Quill is not there. Why? Why does it work in Unity, on my PC, but not when I export it as an .apk and install it on my Quest 2?

Quest not loading updated Addressables from remote server but loads cached version
Despite my efforts, I'm encountering an issue with my Quest app where it fails to reflect updated content from Addressables after the initial run. Upon investigation, it seems the app is pulling assets from a local cache even after I've updated the content on my server. To test this, I deleted the catalogs and bundles from the server and reran the app, confirming my suspicion: the assets still loaded, indicating the Addressables system was indeed relying on a local cache. I am building via Addressable Groups -> Build -> "Update a Previous Build".

Here's what I've already tried:

- Ensuring 'Build Remote Catalog' is enabled in the settings.
- Disabling 'Use Asset Bundle Cache' in the group settings.
- Confirming that the group settings allow changes post-release.

Despite these steps, the app persists in loading from the local cache rather than fetching the updated content. Any insights or suggestions on resolving this issue would be greatly appreciated.

P.S. With the same configuration and project, everything works fine on Android and iOS devices; this seems to be a Quest-specific issue.

Cannot change/edit Immersive Image Layer Assets
Hey everyone, we have a new VR game coming out, and for this launch we wanted to update our assets within the metadata. For some strange reason we cannot find the current Immersive Image assets, nor can we change or upload new assets specifically for the Immersive Image Layers. Does anyone know how to solve this issue? We have created a new App Submission and tried everything, to no avail.

How to Hide Controller Models?
Hi, I am a developer and am wondering how to disable the controller models shown when using the Building Blocks in Unity. Can I use the Controller Tracking Building Block but hide the Quest controllers? I'm also using Networked Avatars and want to pick things up with the avatar's hands, without seeing the controllers in the avatar's hands. Is this possible? It seems that documentation is really lacking when it comes to developing for the Quest. Hell, there's one small page on Building Blocks (https://developers.meta.com/horizon/documentation/unity/bb-overview) that isn't much of a deep dive at all. Am I missing something? Is there a page with more detailed documentation? After spending quite some time on these forums, Google, and YouTube, it seems like a lot of others have questions too. Thanks.

[MajorBug] Acceleration returns 0 for all Meta Quest headsets, on Unity and Unreal Engine
Hi,

Whatever the Meta Quest version (1, 2 or 3), and whatever the node or device, on Unreal Engine 5.4.4 and on Unity 2021 through 6000, the requested acceleration is (0, 0, 0) and the requested angular acceleration is (0, 0, 0).

In Unity, these (0, 0, 0) vectors are obtained using TryGetFeatureValue(CommonUsages.deviceAcceleration, out deviceAcceleration) and TryGetFeatureValue(CommonUsages.deviceAngularAcceleration, out deviceAngularAcceleration) (but also using OpenXR). In Unreal Engine, they are obtained using GetRawDeviceData in a Blueprint. The bug has 100% repeatability and is very easy to reproduce.

Consequently:

- There is no way to get the acceleration for a Meta Quest.
- The problem is neither Unity nor Unreal Engine nor OpenXR --> it has to be a Meta bug.

Velocity estimation is working worse than it was when entering the tracking FOV
Hi,

The black-box driver estimating the controller position and velocity (using, it seems, a Kalman filter combined with inertial filtering) obtains degraded results compared to what it obtained before. More precisely, if the controller comes from outside the FOV of the tracking cameras and enters this FOV, the velocity is completely irrelevant (sometimes pointing rearward while the controller is moved forward, for one or even several frames). This is easy to reproduce by printing the angle between the velocity and the player camera direction, and by moving the controller (slowly or quickly) from behind the head to in front of the head.

Throwing objects was already a real pain in VR before (example: throwing objects in SuperHot VR): either the object fell on the floor, or the throwing direction was wrong by up to 90°. Now it is worse, since the error is random up to 180°. You would have two ways of solving this:

1) Going back to the previous velocity and position engine, and stop modifying it (how can we expect a released game to react consistently otherwise?)
2) Exposing the raw data (really raw, not already filtered), enabling us to do the job as we want it to be done (I have been a scientist in image processing for 25 years).

Even better, you could do both.

Best regards
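Until raw data is exposed, one workaround for the last two threads (zeroed acceleration, unreliable velocity near the FOV edge) is to differentiate buffered pose samples yourself. The sketch below is plain JavaScript so it is runnable as-is; the same finite-difference-plus-smoothing logic ports directly to C# or a Blueprint. Everything here is illustrative, not an official SDK API: the `PoseDifferentiator` name, the smoothing constant, and the per-frame `update` contract are all assumptions.

```javascript
// Illustrative sketch (not an official API): estimate controller velocity
// and acceleration from buffered position samples via finite differences,
// with optional exponential smoothing to damp single-frame spikes.
class PoseDifferentiator {
  constructor(smoothing = 0.5) {
    this.smoothing = smoothing;   // 0 = raw differences, closer to 1 = heavier smoothing
    this.prev = null;             // previous sample: { pos: [x,y,z], t }
    this.velocity = [0, 0, 0];
    this.acceleration = [0, 0, 0];
  }

  // Call once per tracked frame with position [x, y, z] (metres) and time t (seconds).
  update(pos, t) {
    if (this.prev) {
      const dt = t - this.prev.t;
      if (dt > 0) {
        // Raw finite differences against the previous sample / previous velocity.
        const rawVel = pos.map((p, i) => (p - this.prev.pos[i]) / dt);
        const rawAcc = rawVel.map((v, i) => (v - this.velocity[i]) / dt);
        // Exponential smoothing: blend new raw estimate with the running one.
        const a = this.smoothing;
        this.velocity = rawVel.map((v, i) => a * this.velocity[i] + (1 - a) * v);
        this.acceleration = rawAcc.map((v, i) => a * this.acceleration[i] + (1 - a) * v);
      }
    }
    this.prev = { pos: [...pos], t };
    return { velocity: this.velocity, acceleration: this.acceleration };
  }
}
```

A throw handler would then read `velocity` at release time instead of the SDK's filtered value; a real implementation would additionally discard the first few samples after the controller re-enters the tracking FOV, which is exactly where the report above sees the worst spikes.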