[Quest 3] Accessing Camera via Android Camera2 API - Process Succeeds but Image is Black
Hello everyone,

I'm developing a native Android application (integrated with Unreal Engine) for the Meta Quest 3. My goal is to programmatically capture a still image from the passthrough camera using the standard Android Camera2 API.

First, I want to emphasize that the standard, system-level Passthrough feature works perfectly in the headset. To be clear, I already have Passthrough enabled and working correctly inside my Unreal app: I can see the live camera feed, which confirms the basic functionality is there. My issue is specifically with capturing a still image programmatically. While my code executes without errors and saves a file, the resulting image is always completely black.

I'm aware that this was a known limitation on older OS versions. However, I was under the impression this was resolved starting with software v67, based on the official documentation here: https://developers.meta.com/horizon/documentation/native/android/pca-native-documentation. Given that my headset is on a much newer version, I'm struggling to understand what I'm still missing.

Here's a summary of my setup and what I've confirmed:

Android Manifest & Permissions: My AndroidManifest.xml is configured according to the documentation and includes all necessary features and permissions. I can confirm through logs that the user grants the runtime permissions (CAMERA and HEADSET_CAMERA) successfully. Here are the relevant entries from my manifest:

```xml
<uses-permission android:name="android.permission.CAMERA" />
<uses-permission android:name="horizonos.permission.HEADSET_CAMERA"/>
<uses-feature android:name="android.hardware.camera2.any" android:required="true"/>
<uses-feature android:name="com.oculus.feature.PASSTHROUGH" android:required="true"/>
```

Camera Discovery: I am using the standard Camera2 API to find the correct passthrough camera device.
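In isolation, the discovery step is just a linear scan for the Meta vendor tag. Here is a minimal pure-JVM sketch of that matching logic; a plain `Map<String, Byte>` stands in for `CameraManager`/`CameraCharacteristics`, which only exist on-device, so this is an illustration of the lookup rather than runnable Android code:

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class PassthroughCameraLookup {
    static final String META_KEY = "com.meta.extra_metadata.camera_source";
    static final byte PASSTHROUGH = 1;

    // Stand-in for iterating CameraManager.getCameraIdList() and reading the
    // Meta vendor tag (META_KEY) from each camera's CameraCharacteristics.
    static String findPassthroughCameraId(Map<String, Byte> cameraSourceById) {
        for (Map.Entry<String, Byte> e : cameraSourceById.entrySet()) {
            Byte source = e.getValue();
            // Cameras without the vendor tag report null; skip them.
            if (source != null && source == PASSTHROUGH) return e.getKey();
        }
        return null;
    }

    public static void main(String[] args) {
        Map<String, Byte> cameras = new LinkedHashMap<>();
        cameras.put("0", (byte) 0);      // hypothetical non-passthrough source
        cameras.put("50", PASSTHROUGH);  // hypothetical passthrough camera ID
        System.out.println(findPassthroughCameraId(cameras)); // prints: 50
    }
}
```

On the headset, the same loop runs over `manager.getCameraIdList()`, reading the `com.meta.extra_metadata.camera_source` key from each camera's characteristics.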
This part of the code works perfectly: I successfully identify the camera ID by checking CameraCharacteristics for the Meta-specific metadata key (com.meta.extra_metadata.camera_source).

The Problem: The Black Image

My code successfully opens the camera, creates a capture session, captures an image, and saves a JPEG file. The entire process completes without any exceptions, and my native function receives a "success" code. However, the saved JPEG file is always completely black. My system version is v79.1032, so per my understanding of https://developers.meta.com/horizon/documentation/native/android/pca-native-documentation, it should work.

Below is my Java class; its hello() method is invoked to capture and save the image:

```java
package com.jaszczurcompany.camerahelper;

import android.Manifest;
import android.app.Activity;
import android.content.ContentValues;
import android.content.pm.PackageManager;
import android.graphics.ImageFormat;
import android.graphics.SurfaceTexture;
import android.hardware.camera2.*;
import android.hardware.camera2.params.StreamConfigurationMap;
import android.media.Image;
import android.media.ImageReader;
import android.os.Build;
import android.os.Environment;
import android.os.Handler;
import android.os.HandlerThread;
import android.provider.MediaStore;
import android.util.Log;
import android.util.Size;
import android.view.Surface;
import androidx.annotation.NonNull;
import androidx.core.app.ActivityCompat;
import androidx.core.content.ContextCompat;
import java.io.OutputStream;
import java.nio.ByteBuffer;
import java.text.SimpleDateFormat;
import java.util.Arrays;
import java.util.Collections;
import java.util.Comparator;
import java.util.Date;
import java.util.Locale;
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.Semaphore;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicInteger;

public class CameraHelper {
    private static final String TAG = "FooTag"; // TAG changed to match your logs

    public static final int SUCCESS = 0;
    // ... (remaining result codes unchanged)
    public static final int ERROR_NO_CAMERA_FEATURE = -2;
    public static final int ERROR_MISSING_PERMISSIONS = -3;
    public static final int ERROR_CAMERA_MANAGER_UNAVAILABLE = -4;
    public static final int ERROR_NO_PASSTHROUGH_CAMERA_FOUND = -5;
    public static final int ERROR_CAMERA_ACCESS_DENIED = -6;
    public static final int ERROR_CAMERA_SESSION_FAILED = -7;
    public static final int ERROR_CAPTURE_FAILED = -8;
    public static final int ERROR_IMAGE_SAVE_FAILED = -9;
    public static final int ERROR_TIMEOUT = -10;
    public static final int ERROR_INTERRUPTED = -11;
    public static final int ERROR_UNSUPPORTED_CONFIGURATION = -12;

    private static final String META_PASSTHROUGH_CAMERA_KEY_NAME = "com.meta.extra_metadata.camera_source";
    private static final byte META_PASSTHROUGH_CAMERA_VALUE = 1;

    private final Activity activity;
    private CameraManager cameraManager;
    private CameraDevice cameraDevice;
    private CameraCaptureSession captureSession;
    private ImageReader imageReader;
    private Handler backgroundHandler;
    private HandlerThread backgroundThread;
    private SurfaceTexture dummyPreviewTexture;
    private Surface dummyPreviewSurface;
    private final CountDownLatch operationCompleteLatch = new CountDownLatch(1);
    private final AtomicInteger resultCode = new AtomicInteger(SUCCESS);
    private final Semaphore cameraOpenCloseLock = new Semaphore(1);

    public CameraHelper(@NonNull Activity activity) {
        this.activity = activity;
    }

    public int hello() {
        Log.d(TAG, "Starting hello() sequence using Camera2 API (Two-Session approach).");
        // ... (initial checks are the same)
        if (!activity.getPackageManager().hasSystemFeature(PackageManager.FEATURE_CAMERA_ANY))
            return ERROR_NO_CAMERA_FEATURE;
        cameraManager = (CameraManager) activity.getSystemService(Activity.CAMERA_SERVICE);
        if (cameraManager == null) return ERROR_CAMERA_MANAGER_UNAVAILABLE;
        if (!checkPermissions()) return ERROR_MISSING_PERMISSIONS;
        try {
            startBackgroundThread();
            if (!cameraOpenCloseLock.tryAcquire(5, TimeUnit.SECONDS)) return ERROR_TIMEOUT;
            String cameraId = findPassthroughCameraId(cameraManager);
            if (cameraId == null) {
                setResult(ERROR_NO_PASSTHROUGH_CAMERA_FOUND);
            } else {
                openCamera(cameraId);
            }
            if (!operationCompleteLatch.await(15, TimeUnit.SECONDS)) setResult(ERROR_TIMEOUT);
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
            setResult(ERROR_INTERRUPTED);
        } catch (CameraAccessException e) {
            setResult(ERROR_CAMERA_ACCESS_DENIED);
        } finally {
            cleanup();
        }
        Log.i(TAG, "hello() returned " + resultCode.get());
        return resultCode.get();
    }

    // Renamed so it is not confused with requesting the permissions
    private boolean checkPermissions() {
        String[] requiredPermissions = {Manifest.permission.CAMERA, "horizonos.permission.HEADSET_CAMERA"};
        return Arrays.stream(requiredPermissions)
                .allMatch(p -> ContextCompat.checkSelfPermission(activity, p) == PackageManager.PERMISSION_GRANTED);
    }

    private String findPassthroughCameraId(CameraManager manager) throws CameraAccessException {
        // ... (unchanged)
        final CameraCharacteristics.Key<Byte> metaCameraSourceKey =
                new CameraCharacteristics.Key<>(META_PASSTHROUGH_CAMERA_KEY_NAME, Byte.class);
        for (String cameraId : manager.getCameraIdList()) {
            CameraCharacteristics characteristics = manager.getCameraCharacteristics(cameraId);
            Byte cameraSourceValue = characteristics.get(metaCameraSourceKey);
            if (cameraSourceValue != null && cameraSourceValue == META_PASSTHROUGH_CAMERA_VALUE) return cameraId;
        }
        return null;
    }

    private void openCamera(String cameraId) throws CameraAccessException {
        // ... (imageReader and dummyPreviewSurface configuration unchanged)
        CameraCharacteristics characteristics = cameraManager.getCameraCharacteristics(cameraId);
        StreamConfigurationMap map = characteristics.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);
        if (map == null) {
            setResult(ERROR_UNSUPPORTED_CONFIGURATION);
            return;
        }
        Size largestSize = Arrays.stream(map.getOutputSizes(ImageFormat.JPEG))
                .max(Comparator.comparing(s -> (long) s.getWidth() * s.getHeight()))
                .orElse(new Size(1280, 720));
        Log.d(TAG, "Selected JPEG size: " + largestSize);
        imageReader = ImageReader.newInstance(largestSize.getWidth(), largestSize.getHeight(), ImageFormat.JPEG, 1);
        imageReader.setOnImageAvailableListener(this::onImageAvailable, backgroundHandler);
        dummyPreviewTexture = new SurfaceTexture(10);
        dummyPreviewTexture.setDefaultBufferSize(640, 480);
        dummyPreviewSurface = new Surface(dummyPreviewTexture);
        if (ActivityCompat.checkSelfPermission(activity, Manifest.permission.CAMERA) != PackageManager.PERMISSION_GRANTED)
            return;
        cameraManager.openCamera(cameraId, cameraStateCallback, backgroundHandler);
    }

    private final CameraDevice.StateCallback cameraStateCallback = new CameraDevice.StateCallback() {
        @Override
        public void onOpened(@NonNull CameraDevice camera) {
            cameraOpenCloseLock.release();
            cameraDevice = camera;
            createPreviewSession(); // START WITH THE PREVIEW SESSION
        }

        @Override
        public void onDisconnected(@NonNull CameraDevice camera) { /* ... unchanged ... */
            cameraOpenCloseLock.release();
            setResult(ERROR_CAMERA_ACCESS_DENIED);
            camera.close();
        }

        @Override
        public void onError(@NonNull CameraDevice camera, int error) { /* ... unchanged ... */
            cameraOpenCloseLock.release();
            setResult(ERROR_CAMERA_ACCESS_DENIED);
            camera.close();
        }
    };

    /** STEP 1: Create and run a session ONLY for the preview, to warm up the sensor. */
    private void createPreviewSession() {
        try {
            CaptureRequest.Builder previewRequestBuilder =
                    cameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
            previewRequestBuilder.addTarget(dummyPreviewSurface);
            cameraDevice.createCaptureSession(Collections.singletonList(dummyPreviewSurface),
                    new CameraCaptureSession.StateCallback() {
                        @Override
                        public void onConfigured(@NonNull CameraCaptureSession session) {
                            captureSession = session;
                            try {
                                session.setRepeatingRequest(previewRequestBuilder.build(), null, backgroundHandler);
                                Log.d(TAG, "Preview started for warm-up. Waiting 10s...");
                                backgroundHandler.postDelayed(() -> {
                                    // After the delay, stop the preview and start the capture
                                    stopPreviewAndStartCapture();
                                }, 10000);
                            } catch (CameraAccessException e) {
                                setResult(ERROR_CAPTURE_FAILED);
                            }
                        }

                        @Override
                        public void onConfigureFailed(@NonNull CameraCaptureSession session) {
                            setResult(ERROR_CAMERA_SESSION_FAILED);
                        }
                    }, backgroundHandler);
        } catch (CameraAccessException e) {
            setResult(ERROR_CAMERA_SESSION_FAILED);
        }
    }

    /** STEP 2: Stop the preview session and create a new session to take the photo. */
    private void stopPreviewAndStartCapture() {
        try {
            captureSession.stopRepeating();
            captureSession.close(); // Close the old session
            Log.d(TAG, "Preview stopped. Creating capture session...");
            // Create a new session ONLY for the ImageReader
            cameraDevice.createCaptureSession(Collections.singletonList(imageReader.getSurface()),
                    new CameraCaptureSession.StateCallback() {
                        @Override
                        public void onConfigured(@NonNull CameraCaptureSession session) {
                            captureSession = session;
                            captureImage(); // Take the photo using the new session
                        }

                        @Override
                        public void onConfigureFailed(@NonNull CameraCaptureSession session) {
                            setResult(ERROR_CAMERA_SESSION_FAILED);
                        }
                    }, backgroundHandler);
        } catch (CameraAccessException e) {
            setResult(ERROR_CAPTURE_FAILED);
        }
    }

    /** STEP 3: Take a single photo. */
    private void captureImage() {
        try {
            Log.d(TAG, "Capturing image with dedicated session...");
            CaptureRequest.Builder captureBuilder =
                    cameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_STILL_CAPTURE);
            captureBuilder.addTarget(imageReader.getSurface());
            captureSession.capture(captureBuilder.build(), null, backgroundHandler);
        } catch (CameraAccessException e) {
            setResult(ERROR_CAPTURE_FAILED);
        }
    }

    private void onImageAvailable(ImageReader reader) {
        // ... (unchanged)
        try (Image image = reader.acquireLatestImage()) {
            if (image == null) return;
            ByteBuffer buffer = image.getPlanes()[0].getBuffer();
            byte[] bytes = new byte[buffer.remaining()];
            buffer.get(bytes);
            saveImage(bytes);
        } catch (Exception e) {
            setResult(ERROR_IMAGE_SAVE_FAILED);
        }
    }

    private void saveImage(byte[] bytes) {
        // ... (unchanged)
        String fileName = "IMG_" + new SimpleDateFormat("yyyyMMdd_HHmmss", Locale.US).format(new Date()) + ".jpg";
        ContentValues values = new ContentValues();
        values.put(MediaStore.MediaColumns.DISPLAY_NAME, fileName);
        values.put(MediaStore.MediaColumns.MIME_TYPE, "image/jpeg");
        if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.Q) {
            values.put(MediaStore.MediaColumns.RELATIVE_PATH, Environment.DIRECTORY_DOWNLOADS);
        }
        android.net.Uri uri = activity.getContentResolver().insert(MediaStore.Downloads.EXTERNAL_CONTENT_URI, values);
        if (uri == null) {
            setResult(ERROR_IMAGE_SAVE_FAILED);
            return;
        }
        try (OutputStream os = activity.getContentResolver().openOutputStream(uri)) {
            if (os == null) throw new java.io.IOException("Output stream is null.");
            os.write(bytes);
            setResult(SUCCESS);
        } catch (java.io.IOException e) {
            setResult(ERROR_IMAGE_SAVE_FAILED);
        }
    }

    private void setResult(int code) {
        // ... (unchanged)
        if (resultCode.compareAndSet(SUCCESS, code)) {
            operationCompleteLatch.countDown();
        }
    }

    // ... (start/stop background thread and cleanup methods are the same, no major changes)
    private void startBackgroundThread() { /* ... unchanged ... */
        backgroundThread = new HandlerThread("CameraBackground");
        backgroundThread.start();
        backgroundHandler = new Handler(backgroundThread.getLooper());
    }

    private void stopBackgroundThread() { /* ... unchanged ... */
        if (backgroundThread != null) {
            backgroundThread.quitSafely();
            try {
                backgroundThread.join(1000);
            } catch (InterruptedException e) { /* ignore */ }
            backgroundThread = null;
            backgroundHandler = null;
        }
    }

    private void cleanup() {
        try {
            if (!cameraOpenCloseLock.tryAcquire(2500, TimeUnit.MILLISECONDS)) Log.e(TAG, "Cleanup timeout.");
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
        try {
            if (captureSession != null) { captureSession.close(); captureSession = null; }
            if (cameraDevice != null) { cameraDevice.close(); cameraDevice = null; }
            if (imageReader != null) { imageReader.close(); imageReader = null; }
            if (dummyPreviewSurface != null) { dummyPreviewSurface.release(); dummyPreviewSurface = null; }
            if (dummyPreviewTexture != null) { dummyPreviewTexture.release(); dummyPreviewTexture = null; }
        } finally {
            cameraOpenCloseLock.release();
            stopBackgroundThread();
        }
    }
}
```

Earlier, when I created a similar thread ([Quest 3] Accessing Camera via Android Camera2), I was advised to increase the delay after the onConfigured callback; that delay is already in place here:

```java
session.setRepeatingRequest(previewRequestBuilder.build(), null, backgroundHandler);
Log.d(TAG, "Preview started for warm-up. Waiting 10s...");
backgroundHandler.postDelayed(() -> {
    // After the delay, stop the preview and start the capture
    stopPreviewAndStartCapture();
}, 10000);
```

Additionally, in the previous thread there was a suggestion to retry fetching the image, but every subsequent image is still black. Logcat shows the capture succeeded, yet the output image is still black. Many thanks to VirtuallyARealDinosaur for the previous replies.

Passthrough Camera Access in Unreal Engine
I am on the latest version of the Meta Fork of UE5 (5.6.1-v83) and I'm trying to access passthrough camera images, but I cannot for the life of me manage to do that. I think it only works in a standalone build on the Quest (I have a Quest 3), right? Passthrough works in the MR Sample project (I'm having some priority/stencil issues, but that's not the point). I tried the `ConstructTexture2D` node from the passthrough camera access subsystem and also the `getARTexture` node, but nothing works. I can't find any simple guide or documentation on this. Help, please.

VR Preview does not work when MetaXR plugin is enabled in Unreal Engine
This issue seems to have started after updating Meta Horizon Link to version 83.0.0.333.349. At least with version 83.0.0.311.349, developer-specific features (such as passthrough and hand tracking) did not work, but VR Preview itself was functioning correctly.

Current behavior: When I press VR Preview, the Unreal Engine preview window enters a non-VR state (the camera is fixed at floor level and does not move), and nothing happens on the Meta Quest headset connected via Air Link. On the headset side, the Link waiting screen (the gray environment) continues to be displayed indefinitely.

Is anyone else experiencing the same issue? Any information or insights would be greatly appreciated.

Environment:
- Unreal Engine 5.5
- Meta XR Plugin: 78
- Meta Horizon Link: 83.0.0.333.349

Meta Quest 3, Unreal 5.7: Need help with trying to get Opacity to work.
Hi, is there a way to get Opacity to work on the Meta Quest 3? The simplest and most obvious way in UE5.7 refuses to work on the VR headset, and I was wondering if there is some workaround for this. I've looked everywhere for clues on how to get it working, but googling around hasn't yielded any results. I'd be most grateful for any advice and help I can get.

View unreal engine projects on MetaQuest 3
Hi! Not sure if this is the right place to ask, but I am new to both Unreal Engine and the Meta Quest, and I could use some help. I have a project in UE5.6 that contains this Fab scene: https://www.fab.com/listings/bc699e8b-59ff-4be9-9c48-c88a0448957f and I would like to view it immersively on the Quest. So far I have been able to package the project and install it on the Quest, but when I try to open it there, it always either hangs on a loading screen or I just see passthrough; the content/scene never actually appears. Does anyone have any recommendations for how best to view an Unreal Engine project on the Quest? Do I need to make any changes to the scene itself in order for it to work on the Quest?

MQDH can't see Link and Quest3
I installed:
- Meta Horizon Link version 83.0.0.330.349
- MQDH 6.3.0

The HMD is in developer mode. Link CAN see the HMD and pair with it. MQDH can't see the HMD or the Link. ADB is installed and working. "Set Up New Device" throws a Bluetooth requirement, but I don't have Bluetooth on my standalone PC. How do I connect the HMD to MQDH?

Extremely long loading times for VR PREVIEW in Unreal Engine after updating Meta Quest Link to v85?
Hey guys! Has anyone else experienced extremely long loading times for VR Preview in Unreal Engine 5.5.1 after updating Meta Quest Link to v85? I literally have to wait 3 minutes, when it used to load in a few seconds.

Unreal 5.6.1 Interaction Examples SDK
I'm just getting started. I'm interested in looking at the Unreal version of the Interaction Samples SDK, especially for Unreal 5.6 or 5.7. I got the plug-in(s) working for 5.6, but I can't get the Samples working for 5.6. I'd greatly appreciate help from anyone who has the Samples working. Also, is there a timeline for 5.7 integration?

Here is what I've done: https://github.com/oculus-samples/Unreal-InteractionSDK-Sample

I added:
- the Meta XR Plugin
- the Meta XR Interaction SDK plugin

The OcculusInteractionSamples plugin doesn't seem to want to run with UE 5.6.1. I've tried a rebuild; then I tried generating VS project files and recompiling, but that didn't work either. Thanks for any help!

Conflicting Information in the Horizon OS SBC (Shader Binary Cache) Documentation?
In the documentation regarding building a shader binary cache per platform (link), the documentation states:

"Using this feature, once one user starts the app and manually builds the SBC, all other users with the same device and software (Horizon OS, graphics driver, and app) will be able to avoid the shader generation process by downloading a copy of a pre-computed SBC."

However, later on the same page, it states there is an automation in place to launch the apps and perform scripted prewarming logic if requested:

"The system automatically identifies and processes Oculus OS builds and app versions that require shader cache assets. It generates and uploads these assets to the store backend and automatically installs them during an app install or update."

Does this feature support both of those setups? If I am not scripting any custom warmup logic, will shader binary caches still be shared between users with identical setups? I.e., if I simply play the release candidate on the target OS version/hardware, will my SBC be automatically uploaded, or are SBCs only distributed when a scripted warmup sequence is present? Few details are provided regarding SBCs from other users being uploaded, so I'm curious whether this is an inaccuracy or not.

Thanks, excited to see features like this in Horizon OS. Very important for the first-time user experience.

Weird graphic glitch on Quest 3 / 3s
I encountered this graphic glitch on Quest 3 / 3s... but the same app works flawlessly on Quest 1 and 2! It seems to appear on translucent materials only. I tried all the possible fixes (custom depth, disable fog, disable depth fade, ...) but nothing has worked so far. Suggestions? Ideas? It seems to happen in screen space and not everywhere: on that "water" material, and randomly around opaque meshes against translucent ones and/or translucent against opaque. Built with UE4.25 (Oculus branch) back in 2021, and I won't refactor it on 4.26+ / 5.x ;)