The Meta Quest 3 Has Incredible Potential, But Meta Keeps Holding It Back
I’ve been in the Meta Quest ecosystem for years, starting with the Quest 2 I bought in the U.S. for about $250 on Black Friday. Later I upgraded to the Meta Quest 3 here in Europe, buying the 512GB model at full retail price from Coolblue for close to €700 with no discounts. For that investment I expected a polished, next-generation VR experience. Instead, the device feels restricted in ways that make no sense for its price or its potential.

To bring friends into VR with me, I gave my Quest 2 to a friend so we could play together. He didn’t enjoy it and passed it to his brother, and now I’m the one trying to convince the brother to use it. I then bought the same friend a Meta Quest 3S, hoping a newer model would change his mind, but he lost interest after a few months and gave it back to me. That says a lot about how empty the ecosystem feels. If Horizon Worlds had more depth, better tools, stronger communities, and easier ways for creators to flourish, people wouldn’t walk away so fast.

The biggest problem with the Quest 3 is how creator-unfriendly it is. Streaming to YouTube requires workarounds, third-party apps, and unnecessary steps. Streaming to Facebook is the only direct option, yet very few people use Facebook for live content anymore. The strangest part is that Meta owns Instagram, yet there is still no way to stream directly to Instagram from inside the headset. There’s no simple option for TikTok either, even though VR content performs extremely well there. If Meta wants VR to grow, it needs to empower creators, not limit them. Right now, creators have to fight the system just to show people what VR can do.

Inside Horizon Worlds, the gaps become even clearer. VRChat already allows avatar streaming, virtual selfie cameras, expressive tools, and full creative freedom. Horizon Worlds should be leading the industry, not lagging behind it. Instead, it often feels limited, closed off, and inconsistent.
Many sessions are filled with trolls, children, and chaotic interactions that make the platform frustrating for adults who bought the device to relax, socialize, or create. Meta needs stronger moderation tools, age controls, and better systems to keep Horizon enjoyable for adults.

Productivity is another area that needs improvement. I work remotely and wanted to use the Quest for work tasks, but Meta Workrooms and Meta Remote Desktop feel restricted. I had to buy Virtual Desktop just to get proper functionality. A third-party app should not outperform Meta’s official version on Meta’s own hardware; it shows how unfinished the ecosystem still is.

Even accessories fall short. I bought the Meta Pen (the Logitech stylus collaboration) expecting a deeper creative experience, but many apps don’t display the pen correctly and instead show the standard controller. That breaks immersion and makes it feel like the pen was added to the lineup before developers were ready to support it. The overall user experience feels inconsistent.

Avatar consistency is another issue. Some apps show the updated avatars while others use older versions, which breaks the feeling of a connected metaverse. If Meta wants a unified VR identity system, avatars need to be consistent across all apps, not left to chance.

One of the biggest things I want to warn buyers about is the replacement process. My original Meta Quest 3 had a strap loop break, so I sent it in expecting a repair. Instead, Meta replaced the device. Normally that would sound positive, but the replacement was not equal in quality. My original Quest 3 had a very clear, sharp screen; every replacement I received was noticeably blurrier, almost like a downgrade. It felt like Meta was sending lower-value refurbished units instead of matching the premium device I originally purchased. That should not happen to customers who pay full price for a flagship headset.
Meta keeps focusing on building the “next headset” while ignoring the problems with the one it already sold to millions of people. The Quest 3 has incredible hardware and could be the strongest VR device on the market, but Meta needs to unlock its potential: improve streaming, open up creator tools, unify avatars, fix Horizon Worlds moderation, push affordable Quest 2 inventory to grow the user base, improve Workrooms, properly support the Meta Pen, and make sure replacement devices match the original quality.

I’ve invested time, money, and belief in this platform. I’ve bought multiple headsets for myself, for friends, and for their family members, and even then the ecosystem is not strong enough to hold their interest. That’s not a hardware problem; it’s an ecosystem problem. Meta can fix this if it prioritizes the users who already believe in its vision. The Quest 3 could be incredible, but Meta needs to stop limiting it and start listening.
Integrating Meta Avatars with Application Spacewarp

I am working on an application that requires Application Spacewarp and the Meta Avatars 2 SDK. In testing, I have determined that the Meta Avatar shaders are not generating motion vectors for Application Spacewarp to use when rendering its synthesized frames. The rendering artifact that clued me into the issue was seeing the opaque Meta Avatars stuttering; according to Meta's Application Spacewarp sample repo (https://github.com/oculus-samples/unity-appspacewarp), that is the classic sign of a shader that does not generate motion vectors.

Has anyone worked with Application Spacewarp and the Meta Avatars SDK 2 who could advise me on how to address this issue? Any and all help would be greatly appreciated!

One possible initial lead: the Meta Avatars SDK 2 has a Recommended folder with an "app_specific" subfolder containing two files, app-declarations and app_functions. These seem to exist so that app-specific functionality can be added to the Meta Avatar shader code. I am going to experiment with them, but I don't have much experience doing this kind of thing, so if the right people find this, any guidance on that point would also be appreciated!
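Before patching the shader files, it may help to confirm what the avatar materials actually expose. This is a rough diagnostic sketch of my own (the component name is made up, not part of either SDK): it dumps every pass on a renderer's materials so you can check whether a motion-vector pass exists at all.

```csharp
using UnityEngine;

// Hypothetical diagnostic helper: attach to an object that has a Renderer
// using the avatar material and check the Console for a motion-vector pass.
public class ShaderPassProbe : MonoBehaviour
{
    void Start()
    {
        foreach (var mat in GetComponent<Renderer>().sharedMaterials)
        {
            // Material.passCount / GetPassName are standard Unity APIs.
            for (int i = 0; i < mat.passCount; i++)
            {
                Debug.Log($"{mat.shader.name} pass {i}: {mat.GetPassName(i)}");
            }
        }
    }
}
```

If no motion-vector-style pass shows up, that would line up with the stuttering described above.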
Meta Avatar Joint Transforms Are In Incorrect Position After Calling LoadUser() Again.

I think I've found a bug in Meta Avatars SDK version 17.2. After calling LoadUser() on an avatar entity that has already been loaded (which you might do to update an avatar's look), the transforms Joint Head, Joint LeftHandWrist, and Joint RightHandWrist stop tracking the correct positions and simply sit at 0, 0, 0. Here are the steps to reproduce it:

1. In a blank scene, add an AvatarSdkManagerHorizon object and an empty GameObject with a SampleAvatarEntity component.
2. Set the SampleAvatarEntity's BodyTracking input to the AvatarSdkManagerHorizon's SampleInputManager.
3. Add some code to SampleAvatarEntity so you can call LoadUser() at runtime.
4. Ensure UseStandalonePlatform is checked in your OculusPlatformSettings so that your own avatar loads.
5. Connect your headset with Quest Link and run the scene to let your avatar load.
6. In the hierarchy, note that Joint Head is in the correct place.
7. Now manually call LoadUser() and see that Joint Head is no longer in the correct place.
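For step 3, this is the sort of snippet I mean; paste it into SampleAvatarEntity.cs (if the sample class already defines Update in your SDK version, fold the body into it instead):

```csharp
// Repro helper for step 3: press R in Play mode to trigger a second
// LoadUser() on an already-loaded entity. Uses only what SampleAvatarEntity
// already has access to (LoadUser is a member of the entity base class).
private void Update()
{
    if (Input.GetKeyDown(KeyCode.R))
    {
        Debug.Log("Calling LoadUser() again...");
        LoadUser(); // Joint Head and the wrist joints sit at (0,0,0) after this
    }
}
```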
Meta Avatar Critical Joint Transforms go to zero position after calling LoadUser()

Our application depends upon attaching objects to each avatar's hands and head. Unfortunately, this means we can't use LoadUser() to update an avatar's appearance when the user changes it, because there seems to be a bug where the critical joint transforms fall to the local zero position when LoadUser() is called, so we no longer know where the user's wrists and head are. To reproduce: load an avatar, then call LoadUser() on the same entity again and watch the critical joint transforms snap to the origin.
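The direction I'm exploring as a workaround is to stop holding references to the old joint transforms and instead re-resolve the critical joints and re-parent attachments after every reload. A minimal sketch follows; I have not verified the accessor name, so treat GetSkeletonTransform and the enum spellings as assumptions to check against your SDK version:

```csharp
using Oculus.Avatar2;
using UnityEngine;

// Hedged workaround sketch: re-resolve critical joints after a reload instead
// of caching their Transforms. GetSkeletonTransform and the joint enum names
// are assumptions based on the 17.x sources; verify before relying on this.
public class AvatarAttachmentRebinder : MonoBehaviour
{
    public OvrAvatarEntity entity;
    public Transform headProp; // e.g. a hat attached to the head joint

    // Call this once the reloaded avatar has finished loading.
    public void RebindAfterReload()
    {
        Transform head = entity.GetSkeletonTransform(CAPI.ovrAvatar2JointType.Head);
        if (head != null)
        {
            headProp.SetParent(head, worldPositionStays: false);
        }
    }
}
```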
Meta Avatar Multiple precompiled assemblies error in Oculus Integration v44 (new issues in 2022)

I have a Unity project on version 2021.3.11f1, using Oculus Integration version 44 from the Unity Asset Store. I used the integration to enable SpaceWarp, and the next step for my project was to import Meta Avatars, but I ran into an issue that seems to have been covered here before. When bringing in Meta Avatars SDK version 17.2, I get precompiled assembly errors.

A previous post on the Meta forums had a solution, which was to exclude the Newtonsoft.Json.dll that ships with the Avatars package, but that was with Oculus Integration v35 and it seems to no longer work (https://forums.oculusvr.com/t5/Get-Help/Meta-Avatar-Error-Multiple-precompiled-assemblies/td-p/929944). When I attempt that solution with the newest versions of everything, I have to restart Unity in safe mode and get three errors: the namespace 'OvrAvatarTrackingSkeleton' could not be found in two scripts, and 'OvrSpan<>' could not be found in a third.

To my knowledge I am using the newest version of each package here, so I can't understand how they would not work with each other. If it helps, I only have the Oculus Integration Spatializer and VR components installed from the Unity package.
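For anyone else hitting this: the usual first step is to locate every copy of Newtonsoft.Json.dll in the project, since Unity already ships its own via the com.unity.nuget.newtonsoft-json package and a second copy under Assets/ causes exactly this class of error. The snippet builds a tiny simulated tree so it runs anywhere; in a real project you would just run the `find` from the project root, and the Assets/Oculus/Avatar2/Plugins path is only a guess at where the Avatars package drops its DLL:

```shell
# Simulated project tree so the example is self-contained; the path below is
# hypothetical. In a real Unity project, run the find from the project root.
mkdir -p demo/Assets/Oculus/Avatar2/Plugins
touch demo/Assets/Oculus/Avatar2/Plugins/Newtonsoft.Json.dll

# Any hit under Assets/ is a candidate to delete or exclude from the build.
find demo -name 'Newtonsoft.Json.dll'
```

Once you know every location, you can delete the duplicate or untick all platforms on it in the Inspector, then reimport.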
Optimizing avatar instantiation. High cost of creating & uploading textures to GPU.

Hello, I am trying to optimize instantiating avatars at runtime using Unity 2021.3.5f1, with GPU skinning enabled for improved runtime performance. Currently, when I create a new avatar, the frame time spikes for a few frames while textures are created. Three textures are created over three frames, by these functions:

- OvrAvatarGpuSkinnedPrimitive.CreateMorphTargetSourceTex
- OvrAvatarGpuSkinnedPrimitive.CreateJointsTex
- OvrAvatarGpuSkinnedPrimitive.CreateNeutralPoseTex

I have tried reducing the texture precision to reduce the memory created, but this has little effect. I also looked into pooling the textures, but their dimensions depend entirely on the specific avatar being loaded, so that was not possible since I do not know the avatars ahead of time. Does anyone have any insights or tips on how I can reduce or mitigate this cost? Thank you
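One mitigation I'm considering, in case it helps anyone else: since the three texture builds land on consecutive frames, staggering avatar spawns keeps two avatars' spikes from stacking in the same frame. This doesn't shrink the cost, only amortizes it. A sketch, where the prefab stands in for whatever instantiates your avatar entity:

```csharp
using System.Collections;
using UnityEngine;

// Sketch: spawn avatars a few frames apart so each one's three texture-build
// frames finish before the next avatar starts. Start it with
// StartCoroutine(SpawnStaggered(n)).
public class AvatarSpawnQueue : MonoBehaviour
{
    public GameObject avatarPrefab; // placeholder for your avatar entity prefab

    public IEnumerator SpawnStaggered(int count, int framesBetween = 5)
    {
        for (int i = 0; i < count; i++)
        {
            Instantiate(avatarPrefab); // the texture-creation spike happens here
            for (int f = 0; f < framesBetween; f++)
                yield return null; // give the GPU uploads room before the next spawn
        }
    }
}
```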
How to detect if a user doesn't have a Meta Avatar or if they have been assigned a default avatar?

I am working with the Meta Avatars SDK and can't find any way to detect whether a user doesn't already have a Meta Avatar, or whether they have simply been assigned a default one. I can let the user open the avatar customisation menu via a deeplink call, but I have no way to know whether I should first tell a new user to create an avatar if they don't have one.
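What I ended up trying was the has-avatar query on the avatar manager. Treat the exact member and enum names as assumptions from the SDK version I was on (17.x) and verify them against yours:

```csharp
using System.Threading.Tasks;
using Oculus.Avatar2;
using UnityEngine;

// Sketch, API names unverified for newer SDKs: ask the avatar backend whether
// this user has ever configured an avatar. A "no" is the cue to show the
// customisation deeplink prompt before loading a default avatar.
public class AvatarPresenceCheck : MonoBehaviour
{
    public async Task<bool> UserHasAvatar(ulong userId)
    {
        var result = await OvrAvatarManager.Instance.UserHasAvatarAsync(userId);
        return result == OvrAvatarManager.HasAvatarRequestResultCode.HasAvatar;
    }
}
```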
Photon Voice And Meta Avatars 2

I'm wondering how any of you have managed to resolve a conflict between how Photon Voice and the new Meta Avatars access the microphone. When the PhotonVoiceNetwork is initialised, it eventually calls Microphone.Start() in order to get at the microphone's input. In doing so it creates a new AudioClip, so the clip that the Meta Avatar's LipSyncInput was depending on no longer carries the audio coming through the microphone. If I then call LipSyncInput.StartMicrophone(), it also eventually calls Microphone.Start(), which creates a new AudioClip, and now the clip that Photon's Recorder was depending on no longer carries the microphone audio. In short, I can get either lip syncing from my mic input or Photon using my mic input, but not both at the same time.
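The approach I'm experimenting with is to let exactly one component own Microphone.Start() (the lip sync side) and switch Photon to a factory input that reads from that shared clip instead of opening the microphone itself. Photon Voice 2 does support custom inputs via Recorder.InputSourceType.Factory and IAudioReader, but the interface members and the buffer plumbing below are a sketch from memory, not verified code:

```csharp
using Photon.Voice;
using UnityEngine;

// Sketch: a Photon Voice reader that pulls samples out of the AudioClip that
// LipSyncInput's Microphone.Start() is already recording into, so only one
// Microphone.Start() ever runs. Ring-buffer edge cases (wrap-around, reader
// falling behind the mic write head) are deliberately left out for brevity.
public class SharedMicReader : IAudioReader<float>
{
    private readonly AudioClip micClip; // the clip LipSyncInput records into
    private int readPos;

    public SharedMicReader(AudioClip clip) { micClip = clip; }

    public int SamplingRate => micClip.frequency;
    public int Channels => micClip.channels;
    public string Error => null;

    public bool Read(float[] buffer)
    {
        // Copy the next chunk of samples out of the looping mic clip.
        micClip.GetData(buffer, readPos);
        readPos = (readPos + buffer.Length / Channels) % micClip.samples;
        return true;
    }

    public void Dispose() { }
}
```

Hooking it up would look roughly like `recorder.SourceType = Recorder.InputSourceType.Factory; recorder.InputFactory = () => new SharedMicReader(lipSyncClip);`, again with the caveat that I haven't shipped this.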