Accessibility Feature Request: Conversation Focus Mode for Ray-Ban Meta Display Glasses
Hi everyone! I’m a Ray-Ban Meta display glasses user who is hard of hearing and wears hearing aids daily. I’d love to see a conversation focus mode added that prioritizes voices directly in front of the wearer and reduces background noise. In busy environments, this would make a big difference for hearing-aid users and others who rely on clearer speech in real time. If this type of accessibility feature is ever developed, I would absolutely love the ability to have it added to my glasses and would be happy to provide feedback or participate in any beta or user-testing opportunities. I’ve also submitted this through support channels, but wanted to share here in case the team is gathering feedback.
Passthrough randomly disables during runtime

UE 5.5.4, Meta XR plugin v78. I have an APK that I sideload onto several headsets. The app works fine while connected to my laptop for development; never an issue. The packaged APK works fine on the headsets during testing and seems solid. But then I'll launch the app and hand out 10+ headsets to customers, and after they walk around for a while, some of the headsets lose passthrough. The app is still running and tracking; you can see the virtual objects, but they sit on a field of black and you can't see any of the surroundings. Double-tapping the side of the headset simply pauses the app and takes me to the menu. I can't get passthrough working again once this happens. I added blueprint code that every 10 seconds checks whether the passthrough object is null and, if so, initializes another passthrough, but that didn't help. I've fired up 10 headsets, let them run, and walked around the room holding several of them, grabbing them all over (in case the customers are pressing some button), but I can't get the passthrough to stop working, which is why I don't have a log file to attach. I'm at a loss as to what could be happening and how to reproduce the issue. Any help or suggestions would be appreciated.
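For what it's worth, the 10-second null-check the blueprint performs can be sketched engine-agnostically. Everything below (PassthroughWatchdog, the reinit supplier) is hypothetical plain-Java stand-in logic, not Meta XR plugin API:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Supplier;

// Hypothetical watchdog: re-runs an init supplier until it yields a non-null
// handle, recording each attempt so a log survives for post-mortem analysis.
final class PassthroughWatchdog {
    private final Supplier<Object> reinit;   // stands in for the "create passthrough" step
    private final List<String> log = new ArrayList<>();
    private Object handle;

    PassthroughWatchdog(Supplier<Object> reinit) {
        this.reinit = reinit;
    }

    /** Called on every tick (e.g. every 10 s). Returns true if passthrough is alive. */
    boolean tick(long nowMillis) {
        if (handle != null) return true;     // nothing to do while the handle looks healthy
        handle = reinit.get();
        log.add(nowMillis + (handle == null ? " reinit FAILED" : " reinit ok"));
        return handle != null;
    }

    List<String> getLog() {
        return log;
    }
}
```

Logging each reinit attempt would at least leave a trail to attach the next time the failure reproduces. Note also that if the passthrough object stays non-null while the underlying layer has died, a null check alone will never fire, which could explain why the blueprint fix didn't help.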
3D Raycast from hands... where is it coming from?

I have a Unity Mixed Reality project and added the hands, and real hands work, which is cool. I tossed in a zombie so I can try to get him to follow me, or shoot him (finger-gun special). Now I want to fix up the raycast from the hands/controllers so I can interact with game objects from afar, but I'm not even sure where that ray is coming from, so I can't find the script. My change would be to have the ray extend 50m and return a bunch of "hits" with target classes:

1. GameObject: a yellow disc [put an explosion effect on that zombie, if it's hitting the mesh]
2. Interactable: a blue disc appears [press trigger to activate the object]
3. Something from the 3D depth [depth raycast]: an orange disc appears [put a bullet hole on that]
4. A scene object [floor/wall]: a green [grounded] disc appears (note that this may not be the final terminus; if there are "more models" outside the window or wall, or maybe you're picking something on the other side of a table, code has to see if you can shoot through it)

All discs flat against the object and usable in code (you might be able to fire a laser through a window, but it won't go through a wall; code will decide whether that works). But I don't know where to look. The ray from the hands does #3, but I don't know where in the Asset Tree it's coming from; finding it would probably also tell me how to make those discs (is it a Gizmo, or a GameObject?). I figure I can add #1/#2 from the cubes, though I haven't quite figured those out yet either, and #4 via EnvironmentalRayCast [ERC], though I might have to iterate on that one because I don't see a "give me all the hits" option in the ERC.

Questions:
a) Where is this 3D ray coming from in the asset tree, so I can learn from it?
b) Is there a good way to scale the discs so they're always ~20px in diameter, no matter how far away they are?
c) It looks like I need to change the shader on my zombie, but I'm not getting the terminology. It occludes fine (eventually I want the IRL table to occlude it), but I need to say "oh, the user picked the bellybutton, spawn an explosion effect in his gut..." And how do you change shaders anyway? I can change materials from the editor, but...?
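On question (b): a reticle keeps a constant apparent size if its world-space scale grows linearly with distance from the camera. A minimal sketch of the math in plain Java (names are mine, not from any SDK; in Unity you would apply the result to `transform.localScale` each frame, e.g. in `LateUpdate`):

```java
// Constant apparent size: scale a world-space disc so it covers roughly the
// same number of pixels regardless of how far away the ray hit is.
final class ReticleScale {
    /**
     * @param distance    metres from camera to the ray hit
     * @param vFovDegrees vertical field of view of the camera
     * @param desiredPx   target on-screen diameter in pixels
     * @param screenPx    vertical resolution in pixels
     * @return world-space diameter in metres
     */
    static double worldDiameter(double distance, double vFovDegrees,
                                double desiredPx, double screenPx) {
        double halfFov = Math.toRadians(vFovDegrees) / 2.0;
        // World-space height visible at this distance, then take the pixel fraction.
        double worldHeight = 2.0 * distance * Math.tan(halfFov);
        return worldHeight * (desiredPx / screenPx);
    }
}
```

Because the result is linear in distance, a disc twice as far away simply gets twice the world-space diameter.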
Meta interaction SDK: picked up items going through objects with collision.

I'm a newbie in VR and mixed reality development, and I'm trying to create a game in Unreal Engine. I've encountered an issue: if I grab an item and move it through, e.g., a virtual table with a collision mesh, it doesn't react to the collision at all. I figured out that the Interaction SDK turns physics off in its ISDKGrabTransformerComponent, but I changed that and the item still goes through, just with a bit of stuttering and teleporting. I think the problem is with the implementation, because it snaps the grabbed object to my hand position, which I totally understand. My question is: is there a way to bypass this? Maybe something like: if the grabbed item collides with an object, it keeps that position and stays there until the overlap ends? I'm still not sure how to implement this. If there is some kind of tutorial or anything, let me know; I need it for my school project.
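The "stay where it collided until the overlap ends" idea in the question can be reduced to a small per-frame rule: accept the hand-snapped position only while it is collision-free, otherwise keep the last valid one. A hypothetical sketch, where GrabClamp and the overlap predicate stand in for whatever overlap/sweep test the engine provides:

```java
import java.util.function.Predicate;

// Hypothetical sketch of "park the grabbed item at its last valid pose":
// each frame, the grab transformer proposes the hand-snapped position; if
// that position overlaps a blocking collider, keep the previous valid one.
final class GrabClamp {
    private final Predicate<double[]> overlaps; // stands in for a physics overlap query
    private double[] lastValid;

    GrabClamp(Predicate<double[]> overlaps, double[] initial) {
        this.overlaps = overlaps;
        this.lastValid = initial;
    }

    /** Returns the position the grabbed object should actually take this frame. */
    double[] resolve(double[] proposed) {
        if (!overlaps.test(proposed)) {
            lastValid = proposed;   // free space: follow the hand
        }
        return lastValid;           // blocked: stay parked until the hand clears
    }
}
```

In Unreal this would live in the grab transformer's per-tick update, with the predicate implemented as an overlap or sweep test against blocking geometry.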
Meta XR Audio plugin for Unreal Engine improvements

Hello. The Meta XR Audio plugin for Unreal Engine 5.6 (v83), like earlier versions, stores metaxraudio64.dll in Plugins\MetaXRAudio\Binaries\Win64. Storing DLLs there is not the "Unreal way", and keeping a Binaries folder of precompiled DLLs in the repo can cause trouble with project cleanup, builds, compilation, CI/CD, etc. It would be much better not to have a Binaries folder in the repo at all; all precompiled libraries should live inside the plugin's source tree. I did not find a GitHub repo where I could open a PR with the changes I would like you to review.

The basic idea: remove Plugins\MetaXRAudio\Binaries\Win64\metaxraudio64.dll, move it to something like Plugins\MetaXRAudio\Source\ThirdParty\MetaXRAudio\Win64, and have it copied at build time to Plugins\MetaXRAudio\Binaries\Win64 using built-in UE tools. You could add a dedicated module to copy this DLL, or at least make a small fix in Plugins\MetaXRAudio\Source\MetaXRAudio\MetaXRAudio.Build.cs. At line 64, replace the original code:

```csharp
if (CurrentPlatform == UnrealTargetPlatform.Win64)
{
    // Automatically copy DLL to packaged builds
    RuntimeDependencies.Add(System.IO.Path.Combine(PluginDirectory, "Binaries", "Win64", "metaxraudio64.dll"));
    AddEngineThirdPartyPrivateStaticDependencies(TargetRules, "DX11Audio");
}
```

with this code:

```csharp
if (CurrentPlatform == UnrealTargetPlatform.Win64)
{
    string DLLName = "metaxraudio64.dll";
    string SourceDllPath = System.IO.Path.Combine(PluginDirectory, "Source", "ThirdParty", "MetaXRAudio", "Win64", DLLName);
    string TargetDllPath = System.IO.Path.Combine(PluginDirectory, "Binaries", "Win64", DLLName);

    // Add DLL to linker delay-load list with FULL SOURCE PATH
    //PublicDelayLoadDLLs.Add(SourceDllPath);

    // RuntimeDependencies to copy DLL to PLUGIN binaries
    RuntimeDependencies.Add(TargetDllPath, SourceDllPath);

    // Add to receipt for BuildGraph awareness
    //AdditionalPropertiesForReceipt.Add("BuildProduct", SourceDllPath);

    // Third-party dependencies
    AddEngineThirdPartyPrivateStaticDependencies(TargetRules, "DX11Audio");
}
```

This copies metaxraudio64.dll from Plugins\MetaXRAudio\Source\ThirdParty\MetaXRAudio\Win64 to Plugins\MetaXRAudio\Binaries\Win64 on every build. I checked it on a local PC, in CI/CD, and with UnrealGameSync (UGS), so it covers all the common cases. This is a small fix I'm asking you to make, and it will spare developers a lot of headaches.

P.S. In the same or a similar way, you could move the headers and .so libraries out of Plugins\MetaXRAudio\Source\MetaXRAudio\Private\LibMetaXRAudio, which is also not a good place to store them and likewise not the "Unreal way". You could move everything to Plugins\MetaXRAudio\Source\ThirdParty\MetaXRAudio\{platform_name}. Please think about it.
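For readers unfamiliar with UnrealBuildTool, `RuntimeDependencies.Add(Target, Source)` amounts to a stale-checked copy performed at build time. As a standalone illustration of that step only (the paths are examples; this is not UBT code):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardCopyOption;

// Stale-checked copy, roughly what RuntimeDependencies.Add(target, source)
// arranges at build time: copy only when the target is missing or older.
final class DllStager {
    /** Returns true if the file was copied, false if the target was up to date. */
    static boolean stage(Path source, Path target) throws IOException {
        if (Files.exists(target)
                && Files.getLastModifiedTime(target)
                        .compareTo(Files.getLastModifiedTime(source)) >= 0) {
            return false; // up to date, nothing to do
        }
        Files.createDirectories(target.getParent());
        Files.copy(source, target, StandardCopyOption.REPLACE_EXISTING);
        return true;
    }
}
```

The point of the pattern is that the Binaries folder becomes a pure build product: it can be deleted at any time and is reconstructed from ThirdParty on the next build.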
[Quest 3] Accessing Camera via Android Camera2 API - Process Succeeds but Image is Black

Hello everyone, I'm developing a native Android application (integrated with Unreal Engine) for the Meta Quest 3. My goal is to programmatically capture a still image from the passthrough camera using the standard Android Camera2 API.

First, I want to emphasize that the standard, system-level passthrough feature works perfectly in the headset. To be clear, I already have passthrough enabled and working correctly inside my Unreal app: I can see the live camera feed, which confirms the basic functionality is there. My issue is specifically with capturing a still image programmatically. While my code executes without errors and saves a file, the resulting image is always completely black.

I'm aware that this was a known limitation on older OS versions. However, I was under the impression this was resolved starting with software v67, based on the official documentation here: https://developers.meta.com/horizon/documentation/native/android/pca-native-documentation. Given that my headset is on a much newer version, I'm struggling to understand what I'm still missing.

Here's a summary of my setup and what I've confirmed:

Android Manifest & Permissions: My AndroidManifest.xml is configured according to the documentation and includes all necessary features and permissions. I can confirm through logs that the user grants the runtime permissions (CAMERA and HEADSET_CAMERA) successfully. Here are the relevant entries from my manifest:

```xml
<uses-permission android:name="android.permission.CAMERA" />
<uses-permission android:name="horizonos.permission.HEADSET_CAMERA"/>
<uses-feature android:name="android.hardware.camera2.any" android:required="true"/>
<uses-feature android:name="com.oculus.feature.PASSTHROUGH" android:required="true"/>
```

Camera Discovery: I am using the standard Camera2 API to find the correct passthrough camera device.
This part of the code works perfectly: I successfully identify the camera ID by checking CameraCharacteristics for the Meta-specific metadata key (com.meta.extra_metadata.camera_source).

The Problem: The Black Image. My code successfully opens the camera, creates a capture session, captures an image, and saves a JPEG file. The entire process completes without any exceptions, and my native function receives a "success" code. However, the saved JPEG file is always completely black. My system version is v79.1032, so per my understanding of https://developers.meta.com/horizon/documentation/native/android/pca-native-documentation it should work.

Here is my Java class, which is called to save the image; the hello method is invoked to capture the image:

```java
package com.jaszczurcompany.camerahelper;

import android.Manifest;
import android.app.Activity;
import android.content.ContentValues;
import android.content.pm.PackageManager;
import android.graphics.ImageFormat;
import android.graphics.SurfaceTexture;
import android.hardware.camera2.*;
import android.hardware.camera2.params.StreamConfigurationMap;
import android.media.Image;
import android.media.ImageReader;
import android.os.Build;
import android.os.Environment;
import android.os.Handler;
import android.os.HandlerThread;
import android.provider.MediaStore;
import android.util.Log;
import android.util.Size;
import android.view.Surface;
import androidx.annotation.NonNull;
import androidx.core.app.ActivityCompat;
import androidx.core.content.ContextCompat;
import java.io.OutputStream;
import java.nio.ByteBuffer;
import java.text.SimpleDateFormat;
import java.util.Arrays;
import java.util.Collections;
import java.util.Comparator;
import java.util.Date;
import java.util.Locale;
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.Semaphore;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicInteger;

public class CameraHelper {
    private static final String TAG = "FooTag"; // Changed TAG to match your logs

    public static final int SUCCESS = 0;
    // ... (remaining result codes unchanged)
    public static final int ERROR_NO_CAMERA_FEATURE = -2;
    public static final int ERROR_MISSING_PERMISSIONS = -3;
    public static final int ERROR_CAMERA_MANAGER_UNAVAILABLE = -4;
    public static final int ERROR_NO_PASSTHROUGH_CAMERA_FOUND = -5;
    public static final int ERROR_CAMERA_ACCESS_DENIED = -6;
    public static final int ERROR_CAMERA_SESSION_FAILED = -7;
    public static final int ERROR_CAPTURE_FAILED = -8;
    public static final int ERROR_IMAGE_SAVE_FAILED = -9;
    public static final int ERROR_TIMEOUT = -10;
    public static final int ERROR_INTERRUPTED = -11;
    public static final int ERROR_UNSUPPORTED_CONFIGURATION = -12;

    private static final String META_PASSTHROUGH_CAMERA_KEY_NAME = "com.meta.extra_metadata.camera_source";
    private static final byte META_PASSTHROUGH_CAMERA_VALUE = 1;

    private final Activity activity;
    private CameraManager cameraManager;
    private CameraDevice cameraDevice;
    private CameraCaptureSession captureSession;
    private ImageReader imageReader;
    private Handler backgroundHandler;
    private HandlerThread backgroundThread;
    private SurfaceTexture dummyPreviewTexture;
    private Surface dummyPreviewSurface;
    private final CountDownLatch operationCompleteLatch = new CountDownLatch(1);
    private final AtomicInteger resultCode = new AtomicInteger(SUCCESS);
    private final Semaphore cameraOpenCloseLock = new Semaphore(1);

    public CameraHelper(@NonNull Activity activity) {
        this.activity = activity;
    }

    public int hello() {
        Log.d(TAG, "Starting hello() sequence using Camera2 API (Two-Session approach).");
        // ... (initial checks are the same)
        if (!activity.getPackageManager().hasSystemFeature(PackageManager.FEATURE_CAMERA_ANY)) return ERROR_NO_CAMERA_FEATURE;
        cameraManager = (CameraManager) activity.getSystemService(Activity.CAMERA_SERVICE);
        if (cameraManager == null) return ERROR_CAMERA_MANAGER_UNAVAILABLE;
        if (!checkPermissions()) return ERROR_MISSING_PERMISSIONS;
        try {
            startBackgroundThread();
            if (!cameraOpenCloseLock.tryAcquire(5, TimeUnit.SECONDS)) return ERROR_TIMEOUT;
            String cameraId = findPassthroughCameraId(cameraManager);
            if (cameraId == null) {
                setResult(ERROR_NO_PASSTHROUGH_CAMERA_FOUND);
            } else {
                openCamera(cameraId);
            }
            if (!operationCompleteLatch.await(15, TimeUnit.SECONDS)) setResult(ERROR_TIMEOUT);
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
            setResult(ERROR_INTERRUPTED);
        } catch (CameraAccessException e) {
            setResult(ERROR_CAMERA_ACCESS_DENIED);
        } finally {
            cleanup();
        }
        Log.i(TAG, "hello() returned " + resultCode.get());
        return resultCode.get();
    }

    // Renamed so it isn't confused with requesting permissions
    private boolean checkPermissions() {
        String[] requiredPermissions = {Manifest.permission.CAMERA, "horizonos.permission.HEADSET_CAMERA"};
        return Arrays.stream(requiredPermissions)
                .allMatch(p -> ContextCompat.checkSelfPermission(activity, p) == PackageManager.PERMISSION_GRANTED);
    }

    private String findPassthroughCameraId(CameraManager manager) throws CameraAccessException {
        // ... (unchanged)
        final CameraCharacteristics.Key<Byte> metaCameraSourceKey =
                new CameraCharacteristics.Key<>(META_PASSTHROUGH_CAMERA_KEY_NAME, Byte.class);
        for (String cameraId : manager.getCameraIdList()) {
            CameraCharacteristics characteristics = manager.getCameraCharacteristics(cameraId);
            Byte cameraSourceValue = characteristics.get(metaCameraSourceKey);
            if (cameraSourceValue != null && cameraSourceValue == META_PASSTHROUGH_CAMERA_VALUE) return cameraId;
        }
        return null;
    }

    private void openCamera(String cameraId) throws CameraAccessException {
        // ... (imageReader and dummyPreviewSurface configuration unchanged)
        CameraCharacteristics characteristics = cameraManager.getCameraCharacteristics(cameraId);
        StreamConfigurationMap map = characteristics.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);
        if (map == null) {
            setResult(ERROR_UNSUPPORTED_CONFIGURATION);
            return;
        }
        Size largestSize = Arrays.stream(map.getOutputSizes(ImageFormat.JPEG))
                .max(Comparator.comparing(s -> (long) s.getWidth() * s.getHeight()))
                .orElse(new Size(1280, 720));
        Log.d(TAG, "Selected JPEG size: " + largestSize);
        imageReader = ImageReader.newInstance(largestSize.getWidth(), largestSize.getHeight(), ImageFormat.JPEG, 1);
        imageReader.setOnImageAvailableListener(this::onImageAvailable, backgroundHandler);
        dummyPreviewTexture = new SurfaceTexture(10);
        dummyPreviewTexture.setDefaultBufferSize(640, 480);
        dummyPreviewSurface = new Surface(dummyPreviewTexture);
        if (ActivityCompat.checkSelfPermission(activity, Manifest.permission.CAMERA) != PackageManager.PERMISSION_GRANTED) return;
        cameraManager.openCamera(cameraId, cameraStateCallback, backgroundHandler);
    }

    private final CameraDevice.StateCallback cameraStateCallback = new CameraDevice.StateCallback() {
        @Override
        public void onOpened(@NonNull CameraDevice camera) {
            cameraOpenCloseLock.release();
            cameraDevice = camera;
            createPreviewSession(); // START WITH THE PREVIEW SESSION
        }

        @Override
        public void onDisconnected(@NonNull CameraDevice camera) { /* ... unchanged ... */
            cameraOpenCloseLock.release();
            setResult(ERROR_CAMERA_ACCESS_DENIED);
            camera.close();
        }

        @Override
        public void onError(@NonNull CameraDevice camera, int error) { /* ... unchanged ... */
            cameraOpenCloseLock.release();
            setResult(ERROR_CAMERA_ACCESS_DENIED);
            camera.close();
        }
    };

    /** STEP 1: Create and start a session ONLY for the preview, to warm up the sensor. */
    private void createPreviewSession() {
        try {
            CaptureRequest.Builder previewRequestBuilder = cameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
            previewRequestBuilder.addTarget(dummyPreviewSurface);
            cameraDevice.createCaptureSession(Collections.singletonList(dummyPreviewSurface),
                    new CameraCaptureSession.StateCallback() {
                        @Override
                        public void onConfigured(@NonNull CameraCaptureSession session) {
                            captureSession = session;
                            try {
                                session.setRepeatingRequest(previewRequestBuilder.build(), null, backgroundHandler);
                                Log.d(TAG, "Preview started for warm-up. Waiting 1s...");
                                backgroundHandler.postDelayed(() -> {
                                    // After the delay, stop the preview and start the capture
                                    stopPreviewAndStartCapture();
                                }, 10000);
                            } catch (CameraAccessException e) {
                                setResult(ERROR_CAPTURE_FAILED);
                            }
                        }

                        @Override
                        public void onConfigureFailed(@NonNull CameraCaptureSession session) {
                            setResult(ERROR_CAMERA_SESSION_FAILED);
                        }
                    }, backgroundHandler);
        } catch (CameraAccessException e) {
            setResult(ERROR_CAMERA_SESSION_FAILED);
        }
    }

    /** STEP 2: Stop the preview session and create a new session to take the picture. */
    private void stopPreviewAndStartCapture() {
        try {
            captureSession.stopRepeating();
            captureSession.close(); // Close the old session
            Log.d(TAG, "Preview stopped. Creating capture session...");
            // Create a new session ONLY for the ImageReader
            cameraDevice.createCaptureSession(Collections.singletonList(imageReader.getSurface()),
                    new CameraCaptureSession.StateCallback() {
                        @Override
                        public void onConfigured(@NonNull CameraCaptureSession session) {
                            captureSession = session;
                            captureImage(); // Take the picture using the new session
                        }

                        @Override
                        public void onConfigureFailed(@NonNull CameraCaptureSession session) {
                            setResult(ERROR_CAMERA_SESSION_FAILED);
                        }
                    }, backgroundHandler);
        } catch (CameraAccessException e) {
            setResult(ERROR_CAPTURE_FAILED);
        }
    }

    /** STEP 3: Take a single picture. */
    private void captureImage() {
        try {
            Log.d(TAG, "Capturing image with dedicated session...");
            CaptureRequest.Builder captureBuilder = cameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_STILL_CAPTURE);
            captureBuilder.addTarget(imageReader.getSurface());
            captureSession.capture(captureBuilder.build(), null, backgroundHandler);
        } catch (CameraAccessException e) {
            setResult(ERROR_CAPTURE_FAILED);
        }
    }

    private void onImageAvailable(ImageReader reader) {
        // ... (unchanged)
        try (Image image = reader.acquireLatestImage()) {
            if (image == null) return;
            ByteBuffer buffer = image.getPlanes()[0].getBuffer();
            byte[] bytes = new byte[buffer.remaining()];
            buffer.get(bytes);
            saveImage(bytes);
        } catch (Exception e) {
            setResult(ERROR_IMAGE_SAVE_FAILED);
        }
    }

    private void saveImage(byte[] bytes) {
        // ... (unchanged)
        String fileName = "IMG_" + new SimpleDateFormat("yyyyMMdd_HHmmss", Locale.US).format(new Date()) + ".jpg";
        ContentValues values = new ContentValues();
        values.put(MediaStore.MediaColumns.DISPLAY_NAME, fileName);
        values.put(MediaStore.MediaColumns.MIME_TYPE, "image/jpeg");
        if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.Q) {
            values.put(MediaStore.MediaColumns.RELATIVE_PATH, Environment.DIRECTORY_DOWNLOADS);
        }
        android.net.Uri uri = activity.getContentResolver().insert(MediaStore.Downloads.EXTERNAL_CONTENT_URI, values);
        if (uri == null) {
            setResult(ERROR_IMAGE_SAVE_FAILED);
            return;
        }
        try (OutputStream os = activity.getContentResolver().openOutputStream(uri)) {
            if (os == null) throw new java.io.IOException("Output stream is null.");
            os.write(bytes);
            setResult(SUCCESS);
        } catch (java.io.IOException e) {
            setResult(ERROR_IMAGE_SAVE_FAILED);
        }
    }

    private void setResult(int code) {
        // ... (unchanged)
        if (resultCode.compareAndSet(SUCCESS, code)) {
            operationCompleteLatch.countDown();
        }
    }

    // ... (start/stop background thread and cleanup methods are the same, no major changes)
    private void startBackgroundThread() { /* ... unchanged ... */
        backgroundThread = new HandlerThread("CameraBackground");
        backgroundThread.start();
        backgroundHandler = new Handler(backgroundThread.getLooper());
    }

    private void stopBackgroundThread() { /* ... unchanged ... */
        if (backgroundThread != null) {
            backgroundThread.quitSafely();
            try {
                backgroundThread.join(1000);
            } catch (InterruptedException e) { /* ignore */ }
            backgroundThread = null;
            backgroundHandler = null;
        }
    }

    private void cleanup() {
        try {
            if (!cameraOpenCloseLock.tryAcquire(2500, TimeUnit.MILLISECONDS)) Log.e(TAG, "Cleanup timeout.");
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
        try {
            if (captureSession != null) { captureSession.close(); captureSession = null; }
            if (cameraDevice != null) { cameraDevice.close(); cameraDevice = null; }
            if (imageReader != null) { imageReader.close(); imageReader = null; }
            if (dummyPreviewSurface != null) { dummyPreviewSurface.release(); dummyPreviewSurface = null; }
            if (dummyPreviewTexture != null) { dummyPreviewTexture.release(); dummyPreviewTexture = null; }
        } finally {
            cameraOpenCloseLock.release();
            stopBackgroundThread();
        }
    }
}
```

Earlier, when I created a similar thread ([Quest 3] Accessing Camera via Android Camera2), I was advised to increase the delay after the onConfigured callback, which happens here:

```java
session.setRepeatingRequest(previewRequestBuilder.build(), null, backgroundHandler);
Log.d(TAG, "Preview started for warm-up. Waiting 1s...");
backgroundHandler.postDelayed(() -> {
    // After the delay, stop the preview and start the capture
    stopPreviewAndStartCapture();
}, 10000);
```

Additionally, in the previous thread there was a suggestion to retry fetching the image, but every subsequent one is still black. From logcat I can see that the capture succeeded, but the output image is still black. Many thanks to VirtuallyARealDinosaur for the previous replies.
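Not an answer to the root cause, but one way to make a retry loop meaningful is to detect the black frame in code before saving, rather than inspecting files afterwards. A minimal sketch over raw 8-bit luma values (a JPEG would need decoding first, e.g. via BitmapFactory.decodeByteArray; the threshold value is an assumption):

```java
// Detect an (almost) all-black frame so the capture can be retried instead of
// saved. Operates on raw 8-bit luma; a JPEG must be decoded to pixels first.
final class BlackFrameCheck {
    /** Returns true if mean brightness is below the threshold (0..255 scale). */
    static boolean isBlack(byte[] luma, double threshold) {
        if (luma.length == 0) return true;
        long sum = 0;
        for (byte b : luma) sum += b & 0xFF; // treat bytes as unsigned
        return (sum / (double) luma.length) < threshold;
    }
}
```

Hooked into onImageAvailable, a check like this could trigger another capture (or log the mean luma) instead of silently writing a black JPEG.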
Passthrough Camera Access in Unreal Engine

I am on the latest version of the Meta fork of UE5 (5.6.1-v83) and I'm trying to access passthrough camera images, but I cannot for the life of me manage to do it. I think it only works in a standalone build on the Quest (I have a Quest 3), right? Passthrough works in the MR Sample project (I'm having some priority/stencil issues, but that's not the point). I tried the `ConstructTexture2D` node from the passthrough camera access subsystem and also the `getARTexture` node, but neither works. I can't find any simple guide or documentation on this. Help, please.
VR Preview does not work when MetaXR plugin is enabled in Unreal Engine

This issue seems to have started after updating Meta Horizon Link to version 83.0.0.333.349. At least with version 83.0.0.311.349, developer-specific features (such as passthrough and hand tracking) did not work, but VR Preview itself was functioning correctly.

Current behavior: When I press VR Preview, the Unreal Engine preview window enters a non-VR state (the camera is fixed at floor level and does not move), and nothing happens on the Meta Quest headset connected via Air Link. On the headset side, the Link waiting screen (the gray environment) continues to be displayed indefinitely.

Is anyone else experiencing the same issue? Any information or insights would be greatly appreciated.

Environment:
Unreal Engine 5.5
Meta XR Plugin: 78
Meta Horizon Link: 83.0.0.333.349
Meta Quest 3, Unreal 5.7: Need help with trying to get Opacity to work.

Hi, is there a way to get opacity to work on the Meta Quest 3? The simplest and most obvious approach in UE 5.7 refuses to work on the VR glasses, and I was wondering whether there is some workaround. I've looked everywhere for clues on how to get it working, but googling around hasn't yielded any results. I'd be most grateful for any advice and help I can get.