Looking for 3D CAD files of Meta Quest 3 Controller.
Hi everyone, I'm Felix from Haply Robotics. We specialize in creating haptic force feedback technology that brings a tangible sense of touch to VR experiences. Our work is all about pushing the boundaries of what's possible in virtual reality, and you can see some of our projects at Haply.co.

We've had great success integrating our tech with the Meta Quest 2 controllers by designing an adapter that connects the controller to our robot. It was pretty annoying to do a good job of this from the mesh models alone. We are now trying to do the same for the Meta Quest 3. To make this happen, we need surface or solid models of the Quest 3 controllers: anything usable in CAD, so ideally not meshes. This would significantly accelerate our design process and ensure seamless compatibility with the latest Quest technology.

I'm reaching out to this community in hopes of connecting with someone from Meta who can assist us, or with someone who could share those files. If anyone can point me in the right direction or provide the contact of someone who can help, it would be greatly appreciated. Thanks for reading, and I'm looking forward to any guidance or connections you can offer!

Felix

Quest 3 POW level stuck at 3, Passthrough + Depth API
So I have this small Quest 3 app, basically a penalty shooter, using Passthrough + Depth API + shadows + Vulkan (required for Depth). For the 3D elements I have the goal, goalkeeper gloves on the user's controllers, a mascot character that shoots, and behind him a simple "screen" on a simple steel structure; this screen just has a couple of textures to update the game progress. I also have a couple of sound effects.

When I launch it on the Quest 3, the OVR settings show the GPU and CPU power levels always at 3. Also, the scene starts with the Depth API off, and there I get 72 fps; if I turn on soft occlusion it drops to 30-50 fps. What I don't get is that I have another VR game with way more 3D elements, textures, interactions, and a rig on the user, and its CPU power level is at 5 and GPU at 4. Why am I not getting a higher power level on my little mixed reality game?

Here are the settings I'm using on the MR game; on the VR one they're pretty much the same. I will send this MR game to App Lab and I'm worried it could get rejected on performance. If any Meta staff are reading this: the game starts with Depth off, and there's a toggle on the start menu to enable it. Would this help with the app review? Thanks!
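One way to watch the clock levels at runtime, outside the OVR Metrics Tool overlay, is the per-second VrApi stats line in logcat (`adb logcat -s VrApi`), which includes a `CPU./GPU=x/y` field. Below is a minimal sketch of pulling those two levels out of such a line; the sample string and its values are illustrative, not captured from the app described above:

```shell
# Illustrative VrApi stats line of the kind `adb logcat -s VrApi` prints once
# per second (the values here are made up for the example)
line='FPS=72/72,Prd=32ms,Tear=0,Early=0,Stale=0,VSnc=1,Lat=-1,Fov=0,CPU4/GPU=3/3,1478/490MHz'
# Extract the CPU and GPU clock levels from the CPUn/GPU=x/y field
echo "$line" | sed -n 's/.*CPU[0-9]*\/GPU=\([0-9]*\)\/\([0-9]*\).*/CPU level: \1, GPU level: \2/p'
```

Running this against a live `adb logcat -s VrApi` stream instead of the sample string would let you log how the levels change when the Depth API toggle is flipped.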
Gaze/Head Tracking Combined w/ Limited Hand Tracking or Gamepad Controller (Ex. Apple Vision Pro UI)

This implementation could drastically increase the ease of use of the Quest 3 UI and bring it much closer to the usability of the Apple Vision Pro UI for a fraction of the cost, while still allowing the flexibility of trackable controllers, conventional hand tracking, and the use of gamepad controllers or other Bluetooth devices such as keyboards and mice. This post is directed at other developers who may have ideas to suggest as a solution, or who may be interested in incorporating some of these ideas into their projects.

Related post: https://communityforums.atmeta.com/t5/Get-Help/How-to-reposition-or-resize-windows-with-a-gamepad-controller/m-p/1190574#M300771 (How to reposition or resize windows with a gamepad controller? (Xbox One Controller))

"I love the gaze cursor feature when using a controller. Lets me quickly take actions when I don't want to use the Quest controllers. One thing bugs me though. I'm unable to drag using the gaze cursor. If I hold the A button on an Xbox controller it will rapidly select an item instead of select and hold on to it. Are there any tricks around this?"

This is directly copied from a reddit post by Independent_Fill_570 a month ago, and it hasn't received any responses yet. https://www.reddit.com/r/MetaQuestVR/comments/1byjora/is_it_possible_to_resize_windows_with_a_contro...

I'm having the same issue. I love the ability to use gaze tracking for the cursor, but it restricts the ability to resize windows, reposition windows, and long-click (it only rapidly repeats single clicks instead), so selecting text longer than a single word is also a problem. Gaze control seems to be the best substitute for the Apple Vision Pro's eye-tracking cursor. Is there any way of using gaze control to guide the cursor, but a hand-tracking gesture to select or click, without it engaging as conventional hand tracking while gaze tracking is enabled?

I've spent many hours now searching the internet, looking into potential sideloading options, and even poring over the Oculus PC source code, but I haven't really found anyone talking about this except for the one unanswered reddit post I've linked to. The Xbox One controller has basically the same buttons as the Quest 3 controllers, minus the tracking, and with gaze tracking it would be wonderful to have the controller buttons mapped properly; I can't seem to find a way to remap the gamepad keybinds without running it through Steam and a PC link. I'd like to be able to do this natively on the Quest 3 standalone. Hand tracking used only for selecting or clicking would also be great, but even just having the buttons mapped properly, so that pressing A clicks and holding A holds the click until released, would fix the issue. I am aware that pressing the select and menu buttons together toggles gaze tracking off and enables very limited use of the Xbox controller buttons, such as with a keyboard, but that's not what I'm asking about here. Thanks in advance to anyone who has any helpful information to provide on this issue. If gaze tracking were combined with limited hand-tracking gestures like making a closed fist, I feel the quality of this product could more easily rival the user interface of the Apple Vision Pro.

How to make the hand model follow the controller and animate the hand model when you press a button
I'm currently developing a game using Meta XR Core SDK 63. The game is operated only with the controllers; I assume hand tracking and the Meta XR Interaction SDK will not be used. However, I don't want to display the controller model in-game. Instead, I want to display a hand model, and have the hand model animate when a controller button is pressed.

Back with Oculus Integration there were CustomHandLeft/Right prefabs. By placing CustomHand under HandAnchor, I was able to display the hand model without displaying the controller model, and when I pressed a controller button, the hand model animated. If there is a way to do this with the Meta XR Core SDK, I would like to know. I could not find the CustomHandLeft/Right prefabs in Meta XR Core SDK 63. Also, OVRCustomHandPrefab_L/R, which has a similar name, did not play the animation when pressing the controller button.

Suggestion for Quest Control Enhancement
I have been thinking about ways to enhance the user experience of controlling the Quest, and I have a suggestion that I believe could make interactions more intuitive and accessible. My suggestion involves implementing a new method of controlling the Quest without the use of controllers. Instead of relying solely on hand tracking, which can sometimes be challenging to control and click with accurately, I propose using a dot in the middle of the screen that moves in response to head movements. Users would point at any desired location on the screen and initiate a click by touching their fingers together.

This approach simplifies navigation and interaction by leveraging natural head movements for cursor control and intuitive finger gestures for clicking. It not only streamlines the user experience but also opens up accessibility options for those who may have difficulty using traditional controllers. I believe that implementing this feature could greatly enhance the Quest platform and provide users with a more immersive and user-friendly experience.

Development of Metaverse Controller
I have developed a metaverse controller that simultaneously controls spatial computing devices like the Quest and is compatible with controllers on all platforms, available in two types: portable and board. It features an innovative input method in which a touch confirms finger position and a press executes a function, adapting virtual interfaces according to program requirements. With accuracy, versatility, and scalability, along with an ergonomic design that minimizes body movement, it will be a cornerstone of metaverse expansion, offering developers high productivity in virtual environments and users diverse content across platforms. Please email me at woogle554@naver.com, and I will send you the introduction video of the metaverse controller and two related patents.

Best regards,
Woo Yeol Jung

Programmatically get controller battery levels and status
Hi,

Is there a way to programmatically get a Quest controller's battery level, preferably with a native Android library/SDK? We would like to be able to monitor battery levels in our XR platform. With ADB it is possible to retrieve this info, but there is no way to execute ADB commands from a non-system app. E.g.:

adb shell "dumpsys OVRRemoteService | grep Paired"
Paired device: <redacted>, Type: Right, Firmware: 1.9.0, ImuModel: ICM42686, Battery: 60%, Status: Disabled, ExternalStatus: DISABLED, TrackingStatus: POSITION, BrightnessLevel: GOOD
Paired device: <redacted>, Type: Left, Firmware: 1.9.0, ImuModel: ICM42686, Battery: 60%, Status: Disabled, ExternalStatus: DISABLED, TrackingStatus: POSITION, BrightnessLevel: GOOD
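For scripting around the ADB route above from a host machine that can run adb (this does not solve the on-device, non-system-app limitation), the battery percentages can be pulled out of the dumpsys text with standard tools. A minimal sketch, where the sample line mirrors the redacted output above and the parsing is an assumption about that exact text format:

```shell
# Sample "Paired device" line as shown above (on a host you would instead feed
# in the live output of: adb shell "dumpsys OVRRemoteService | grep Paired")
sample='Paired device: <redacted>, Type: Right, Firmware: 1.9.0, ImuModel: ICM42686, Battery: 60%, Status: Disabled'
# Extract the controller side and its battery percentage
echo "$sample" | sed -n 's/.*Type: \([A-Za-z]*\),.*Battery: \([0-9]*\)%.*/\1 controller battery: \2%/p'
```

Piping the real two-line dumpsys output through the same sed expression would report both the Left and Right controllers, one per line.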