Recent Discussions
Is Android App Development Cost-Effective Compared to iOS?
Android app development is often more cost-effective than iOS development. It offers free and open-source tools like Android Studio, flexible development options, and lower app submission fees on the Google Play Store. While testing can take longer due to the wide range of devices, the overall development cost is usually lower, especially for businesses targeting a broader, global audience. Also read: Android App Development is Shaping the Future of Mobile Tech.

Posted by marines.949355, 16 days ago
Request for supporting dynamic loading of OpenCL

I am developing a native app that utilizes OpenCL, Vulkan, and OpenXR, but I've encountered a significant roadblock: `libOpenCL.so` cannot be dynamically loaded with dlopen because it is not included in `/etc/public.libraries.txt`. While it's technically possible to extract the library using `adb pull /vendor/lib/libOpenCL.so` and bundle it within the APK, this is not a scalable solution for supporting a diverse range of devices, even within the Meta ecosystem. I understand this limitation may be due to security considerations, but it's worth noting that `libvulkan.so` is already accessible. Restricting developers to Vulkan forces additional effort to convert OpenCL kernels into SPIR-V, which seems unnecessary, IMHO. Providing support for `libOpenCL.so` would significantly enhance compatibility and streamline development. This is especially important given that many deep learning frameworks rely on dynamic loading of the OpenCL library through stub code.

Posted by waaaa_minjaeeeeee, 28 days ago
Suggested Idea for Enhancement - Map your followed businesses

Hello, what about being able to map the businesses that you follow on Instagram onto Google Maps? Right now, if I want to do that, I have to bookmark each business manually in Google Maps. It would be nice to dynamically see the places you follow on a map.

Posted by solajosee, 2 months ago
Need Help Recovering Hacked Instagram Account – "Request Help from Friends" Option Missing

Hello Meta Support Team and Community,

I'm reaching out for help with my hacked Instagram account. Instagram used to offer a recovery feature where we could request help from two trusted friends. I have used this option successfully in the past: I could send a request to selected friends and, once they accepted, I was able to reset my password and recover the account.

Now, when I try to recover my account and tap "Try another way," the trusted-friends option no longer appears; instead, it opens the Chrome browser, where I don't see any option to request help from friends. At first I thought this might be an issue with my phone, but I have tried the same process on more than 10 different phones (friends and family) and the same thing happens on all of them, so it seems to be a broader issue, not just my device.

I am unable to access my account and I believe it was compromised. I have tried the usual recovery steps via email and phone number, but nothing is working for me. Can someone from the support team guide me on:

- Why is the "Request help from friends" option missing?
- How can I get that feature back, or recover my account through other methods?

I would really appreciate any assistance. Thank you in advance.

Best regards,
[Shiva]

Posted by heysinchuuu.2025, 2 months ago
MediaProjection breaks input & hand tracking by triggering FocusPlaceholderActivity (v76)

Device: Meta Quest 2 / Meta Quest 3
OS Version: v76

We are using the Android MediaProjection API for screen mirroring in our content.

Problem 1. After starting MediaProjection for screen mirroring in a normal Android activity, the system sometimes spawns com.oculus.vrshell.FocusPlaceholderActivity. This does not happen every time, but when it does, the FocusPlaceholderActivity takes over the input system, causing the original activity to lose input focus. After that, hand tracking completely breaks.

Captured log at runtime:

05-14 17:14:42.990 3151 3151 I wm_on_create_called: [com.oculus.vrshell.FocusPlaceholderActivity, performCreate]
05-14 17:14:42.991 3151 3151 I wm_on_start_called: [com.oculus.vrshell.FocusPlaceholderActivity, handleStartActivity]

Shortly after, hand tracking remapping fails:

05-14 17:14:42.992 6519 6537 W MemoryBrokerClient: Hands: failed to remap HAND_TRACKER_RIGHT: getSharedMemoryFileDescriptor (Status(-1, EX_SECURITY): 'missing runtime permission(s)')
05-14 17:14:42.993 6519 6537 W MemoryBrokerClient: Hands: failed to remap HAND_TRACKER_LEFT: getSharedMemoryFileDescriptor (Status(-1, EX_SECURITY): 'missing runtime permission(s)')
...
05-14 17:14:43.015 1394 1593 I input_focus: [Focus leaving 1bcbe39 [packageName]/com.mline.activity.MainActivity (server),reason=setFocusedWindow]: system_server
05-14 17:14:43.016 1394 1593 I input_focus: [Focus entering a84fc08 com.oculus.vrshell/com.oculus.vrshell.FocusPlaceholderActivity (server),reason=setFocusedWindow]: system_server
...
05-14 17:14:43.018 1849 1849 I ClientMgr: ClientMgr::SetFocusedClient: Updating focus state for all clients: com.oculus.vrruntimeservice
05-14 17:14:43.018 1849 1849 I ClientMgr: ClientMgr::SetFocusedClient: Updating Focus State: 0 for Client: [packageName]:6519, Client DisplayId: 0: com.oculus.vrruntimeservice
05-14 17:14:43.018 1849 1849 I ClientMgr: ClientMgr::SetFocusedClient: Updating Focus State: 0 for Client: com.oculus.vrguardianservice:3956, Client DisplayId: -1: com.oculus.vrruntimeservice
05-14 17:14:43.018 1849 1849 I ClientMgr: ClientMgr::SetFocusedClient: Updating Focus State: 1 for Client: com.oculus.vrshell:3151, Client DisplayId: -1: com.oculus.vrruntimeservice
05-14 17:14:43.018 1849 1849 I ClientMgr: ClientMgr::SetFocusedClient: Updating Focus State: 0 for Client: /system_ext/bin/mrsystemservice:1167, Client DisplayId: 0: com.oculus.vrruntimeservice
05-14 17:14:43.018 1849 1849 I ClientMgr: SetFocusedPackageName - packageName com.oculus.vrshell processName com.oculus.vrshell clientId 3508380481 clientPid 3151: com.oculus.vrruntimeservice

After this:

- Custom hand meshes from our content no longer render.
- The default translucent Oculus hands appear, but do not respond to gestures.
- Hand tracking does not recover automatically, even after returning to the original activity or the default lobby.

Problem 2. On first launch, MediaProjection works correctly after the user grants permission. However, if the device goes to sleep (power button pressed), wakes up (power button pressed again), and MediaProjection is restarted, problem 1 occurs in roughly 20% of cases.

When these problems appear, clearing the VR shell via adb (adb shell pm clear com.oculus.vrshell) restores hand tracking and the input system.

I'm looking for a reliable way to resolve this issue. Ideally, we would appreciate it if Meta could fix this behavior at the system level, especially the unexpected input-focus hijacking by FocusPlaceholderActivity.
Thanks.

Posted by M-LINESTUDIO, 2 months ago
Llama 4

Dear Meta AI, I recently started using Meta AI, but Llama 4 does not seem to handle conversational flow the way it used to. It keeps giving me short answers, so I have to ask again and again, and it also replies in a list format. Can you change that? The short, list-style responses are confusing; I would prefer longer answers so I do not have to keep re-asking.

Posted by ethan.647297, 2 months ago
Need Help with Quest Logic and Triggers for Object Interaction

Hi everyone! I'm new to scripting and game development and just getting started with learning the basics. 😊 I'm currently working on a simple quest-style game where the player needs to find a specific object and place it on a table. Once the object is correctly placed, a "congratulations" message should appear. I've tried using collision objects and working with triggers, but I'm still running into errors and the logic isn't working quite right. Could someone please guide me on how to approach this kind of interaction? Any tips or examples would be greatly appreciated! Thanks in advance! 🙏

Posted by Elletta, 3 months ago
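For anyone wanting a concrete starting point, below is a minimal, untested sketch of the trigger-based approach, assuming this is for Meta Horizon Worlds (the TypeScript API used elsewhere in this forum). The component name, the `questItem` prop, and the use of `CodeBlockEvents.OnEntityEnterTrigger` with an `id` comparison are illustrative assumptions based on the usual trigger-gizmo workflow, not a verified solution, so check them against the current API documentation.

```typescript
import * as hz from 'horizon/core';

// Hypothetical sketch: attach this to a trigger gizmo placed on the table and
// wire the quest item to the `questItem` prop in the editor.
class QuestTableTrigger extends hz.Component<typeof QuestTableTrigger> {
  static propsDefinition = {
    // The specific object the player must place on the table.
    questItem: { type: hz.PropTypes.Entity },
  };

  start() {
    // Assumed event: fires when any entity enters this trigger volume.
    this.connectCodeBlockEvent(
      this.entity,
      hz.CodeBlockEvents.OnEntityEnterTrigger,
      (enteredBy: hz.Entity) => {
        const target = this.props.questItem;
        // Compare entity ids so only the quest item completes the quest.
        if (target && enteredBy.id === target.id) {
          console.log('Quest item placed on the table. Congratulations!');
          // From here, enable a text gizmo or show a popup with the message.
        }
      }
    );
  }
}
hz.Component.register(QuestTableTrigger);
```

If the "congratulations" message should appear in-world rather than in the console, the same callback is where a text gizmo or popup could be enabled.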
Can't teleport player to a location when pressing a UI button in Meta Horizon Worlds desktop editor

I've been trying for the past week to find a way to teleport my player to a spawn point gizmo whenever they press a button on my custom UI screen. I have two scripts, GameManage2 and MainMenuUI. MainMenuUI is supposed to send a broadcast event to GameManage2, which is then supposed to send my player to the spawnPointGizmo I set up. I made sure the script's properties point specifically at that spawnPointGizmo, but it still gives this error: "[Respawn] parameter 0 expected a PlayerID but got a None". Code snippets for both the MainMenuUI and GameManage2 scripts are below.

=================== Main Menu UI ===================

initializeUI() {
    let myLogo: any = this.props.Logo
    let myLogoSource: ui.ImageSource = ui.ImageSource.fromTextureAsset(myLogo)
    this.bndLogoSource.set(myLogoSource)

    return ui.View({
        children: [
            ui.Text({
                text: 'Savor Sim',
                style: {
                    top: -300,
                    fontSize: 100,
                    color: 'black',
                    fontWeight: 'bold',
                    textAlign: 'center',
                    borderRadius: 20,
                }
            }),
            MyPrompt({
                /* Below, the onClickYes object that is passed to the MyPrompt() function is
                   actually a simple arrow (inline) function, which calls out to the public
                   function doYes(). That function writes a simple message to the console.
                   These two references are passed to the MyPrompt function, which references
                   the onClickYes property (a function) for the Yes button and the onClickNo
                   property (a function) for the No button. */
                onClickYes: () => {
                    doYes();
                    this.sendLocalBroadcastEvent(teleportPlayer, { player: hz.Player });
                },
                onClickNo: () => {
                    doNo();
                },
            }),

=================== Game Manage 2 ===================

class GameManage2 extends hz.Component<typeof GameManage2> {
    static propsDefinition = {
        spawnLocation: { type: hz.PropTypes.Entity },
        // player: { type: hz.Player },
    };

    private player1 = hz.Player;

    start() {
        /** Fires any time a user joins the world */
        this.connectCodeBlockEvent(
            this.entity,
            hz.CodeBlockEvents.OnPlayerEnterWorld,
            (player1: hz.Player) => {
                console.log('Player entered world:', player1);
                return player1;
            }
        );

        this.connectLocalBroadcastEvent(teleportPlayer, (data: { player: hz.Player }) => {
            this.doTeleport(data.player);
        });
    }

    doTeleport(player: hz.Player): void {
        const playerId = player.id;
        // console.log(`Teleporting player with ID: ${playerId}`);
        if (!player) {
            console.error('Player is undefined. Cannot teleport.');
            return;
        }
        (this.props.spawnLocation as any).as(hz.SpawnPointGizmo).teleportPlayer(player);
        console.log('teleported player');
    }
}
hz.Component.register(GameManage2);

Posted by kyle174, 3 months ago
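A likely cause of the "expected a PlayerID but got a None" error is that the event payload contains the hz.Player class itself ({ player: hz.Player }) rather than an actual player instance. Below is a minimal, untested sketch of one way to forward the real player who pressed the button; it assumes the horizon/ui Pressable's onClick callback receives the interacting hz.Player (worth verifying against the current API docs), and the event and component names here are placeholders rather than drop-in replacements for the poster's code.

```typescript
import * as hz from 'horizon/core';
import * as ui from 'horizon/ui';

// Placeholder event: the payload carries a concrete hz.Player instance, not the class.
const teleportPlayer = new hz.LocalEvent<{ player: hz.Player }>('teleportPlayer');

class MainMenuButton extends ui.UIComponent<typeof MainMenuButton> {
  static propsDefinition = {};

  initializeUI() {
    return ui.View({
      children: [
        ui.Pressable({
          children: ui.Text({ text: 'Start', style: { fontSize: 48, color: 'black' } }),
          // Assumption: onClick is called with the player who pressed the button.
          onClick: (player: hz.Player) => {
            // Forward the actual player instance in the payload.
            this.sendLocalBroadcastEvent(teleportPlayer, { player });
          },
        }),
      ],
    });
  }
}
ui.UIComponent.register(MainMenuButton);
```

With a real player in the payload, the receiving handler in GameManage2 would get a usable value in data.player and could pass it straight to teleportPlayer() on the spawn point gizmo, as in the original doTeleport().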
3rd party peripheral device support through 2D app

Hello, I've developed a VR locomotion device and it would be awesome to be able to use it to send joystick commands to games running on the Meta Quest. I've looked into OpenXR and SDK support on the Meta Quest, but I don't believe it's possible through those methods. I then looked into seamless multitasking and 2D apps and wondered if I could develop a 2D app that receives input from the peripheral, transforms it into joystick input, and then sends that input to the game. It would also be great if the 2D app could get the position and orientation of the headset and send that to the peripheral. After looking into it, I don't believe what I want to do is currently possible with multitasking and 2D apps, but I wanted to post here to see if I am wrong. My understanding is that 2D apps can only receive input, not send it, and that they only receive input while focused (when the cursor is pointing at them). It also appears a 2D app does not have a way of receiving headset position and orientation. Is my understanding correct?

Posted by alex.f.26, 3 months ago