Grabbed object with custom pose "wiggles" when moving controller or hands rapidly back and forth
I posted an issue at https://github.com/oculus-samples/Unreal-InteractionSDK-Sample/issues/16 but am adding it here for visibility as well. I found that if I grab an object with a custom pose and move the controller back and forth quickly, the grabbed object doesn't seem to track my virtual hand quite right. It looks like the object "wiggles", or jiggles/lags behind the hand. This only happens with the pose mode set to "Snap Object to Pose". I tracked it down to CalculateInteractorSnapTransform() in OculusInteraction\Private\Interaction\Grabbable\IsdkGrabTransformerComponent.cpp, but I'm not sure of the best way to fix it. I notice that some of the other branches in UIsdkGrabTransformerComponent::UpdateTransform() end up calling TransformTarget->SetRelativeTransform(TargetRelativeTransform);, which doesn't exhibit the issue. Hand tracking also seems to have the same issue. I asked Gemini for a solution and it suggested this code:

```cpp
FTransform UIsdkGrabTransformerComponent::CalculateInteractorSnapTransform(
    FTransform& TargetTransformIn, FTransform& InteractorTransformIn)
{
    const FTransform OffsetInverse = InteractorSnapOffset.Inverse();
    const FTransform FinalTransform = OffsetInverse * InteractorTransformIn;
    return FinalTransform;
}
```

It did fix the problem, but I'm not sure what other purposes the original code had, as it was more complex. Thanks!
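For anyone wanting to sanity-check why the simplified version stops the wiggle, here is a minimal, self-contained sketch of the same transform math using a 2D stand-in for FTransform (the struct, function names, and values are illustrative, not ISDK code). Because the object pose is just a fixed offset composed with the current interactor pose, the object's per-frame delta exactly equals the hand's delta; there is no term that depends on last frame's result, so nothing can lag or oscillate:

```cpp
#include <cmath>

// Minimal 2D stand-in for FTransform: a rotation angle plus a translation.
struct Rigid2D {
    double Angle = 0.0;       // rotation in radians
    double X = 0.0, Y = 0.0;  // translation

    Rigid2D Inverse() const {
        double c = std::cos(-Angle), s = std::sin(-Angle);
        return { -Angle, -(c * X - s * Y), -(s * X + c * Y) };
    }
};

// Compose(A, B): apply A first, then B (mirrors FTransform's operator* convention).
inline Rigid2D Compose(const Rigid2D& A, const Rigid2D& B) {
    double c = std::cos(B.Angle), s = std::sin(B.Angle);
    return { A.Angle + B.Angle, c * A.X - s * A.Y + B.X, s * A.X + c * A.Y + B.Y };
}

// The simplified snap: object pose = SnapOffset^-1 * interactor pose.
// The object is rigidly locked to the interactor, so whatever delta the hand
// moves in a frame, the object moves by exactly the same delta.
inline Rigid2D SnapObjectToInteractor(const Rigid2D& SnapOffset, const Rigid2D& Interactor) {
    return Compose(SnapOffset.Inverse(), Interactor);
}
```

Moving a hypothetical hand 10 units in one frame moves the snapped object 10 units in the same frame, regardless of the offset, which is consistent with the jiggle disappearing once the extra logic is removed.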
Issues with Oculus Debugger for VS Code

Hello all, I've been trying to get debug attachment to a running process on the headset for two days now. I normally use Rider, but I installed and set up VS Code to use the Oculus Debugger. However, I was not able to attach to a running instance on the headset. I get an error where the Oculus plugin doesn't download the LLDB files. Here is the log:

```
Create debug adapter with configuration: {"name":"Oculus (Development) Attach","request":"attach","type":"fb-lldb","android":{"application":{"package":"com.example.project","activity":"com.epicgames.unreal.GameActivity"},"lldbConfig":{"librarySearchPaths":["\\\"C:/PROJECT/Binaries/Android/Symbols_arm64\\\""],"lldbPreTargetCreateCommands":["command script import \\\"C:/PROJECT/Engine/Extras/LLDBDataFormatters/UEDataFormatters_2ByteChars.py\\\""],"lldbPostTargetCreateCommands":["process handle --pass true --stop false --notify true SIGILL"]}},"__configurationTarget":5,"terminateCommands":[]}
C:\Users\User\.oculus_debugger_runtime Exists!
C:\Users\User\.oculus_debugger_runtime Exists!
Previous download directory exists, removing C:\Users\User\.oculus_debugger_runtime\lldb
Previous download directory exists, removing C:\Users\User\.oculus_debugger_runtime\debugcito
Exception: Error: Can't download LLDB: invalid content length
Stack trace: {Error: Can't download LLDB: invalid content length
    at c:\Users\User\.vscode\extensions\oculus.oculus-vscode-debugger-1.0.8\dist\extension.js:17:39876
    at new Promise (<anonymous>)
    at c:\Users\User\.vscode\extensions\oculus.oculus-vscode-debugger-1.0.8\dist\extension.js:17:39747
    at process.processTicksAndRejections (node:internal/process/task_queues:105:5)
    at async Ee._downloadWithProgress (c:\Users\User\.vscode\extensions\oculus.oculus-vscode-debugger-1.0.8\dist\extension.js:17:39572)
    at async Ee._downloadAndInstallWithProgress (c:\Users\User\.vscode\extensions\oculus.oculus-vscode-debugger-1.0.8\dist\extension.js:17:39295)}
Exception: Error: Can't download Oculus Runtime: invalid content length
Stack trace: {Error: Can't download Oculus Runtime: invalid content length
    at c:\Users\User\.vscode\extensions\oculus.oculus-vscode-debugger-1.0.8\dist\extension.js:17:39876
    at new Promise (<anonymous>)
    at c:\Users\User\.vscode\extensions\oculus.oculus-vscode-debugger-1.0.8\dist\extension.js:17:39747
    at process.processTicksAndRejections (node:internal/process/task_queues:105:5)
    at async Ee._downloadWithProgress (c:\Users\User\.vscode\extensions\oculus.oculus-vscode-debugger-1.0.8\dist\extension.js:17:39572)
    at async Ee._downloadAndInstallWithProgress (c:\Users\User\.vscode\extensions\oculus.oculus-vscode-debugger-1.0.8\dist\extension.js:17:39295)}
```

After this I gave up on the extension, so I tried installing lldb-server manually onto the headset. I was able to get the server running and connect to it from VS Code using my own launch commands. I got pretty close, but I wasn't able to properly attach with the permissions of the running game. Instead it was attaching by default to a 0 stub process.
If anyone can figure out how I can actually hit breakpoints in C++ on the process running on the headset, I'd appreciate it. I've verified that the APK I sideloaded is debuggable, and everything else seems fine. Setting up a working system like this seems like undocumented territory.
v77 Grab Hand Pose Translation/Rotation

Hi, I'm using v77 of the Interaction SDK in Unreal 5.5.4 for hand tracking, and am having issues with a HandGrabPose not following the grab location/rotation when using rotation and location transformers. I am using an object set up with multiple grabbable pieces. My blueprint hierarchy is essentially:

Mesh
-- GrabbableComponent1
-- SubMesh
---- GrabbableComponent2
------ HandGrabPose
GrabTransformer1 (for main mesh, with free transformer)
GrabTransformer2 (for sub mesh, with translate constraint applied)

The actual interactions of the meshes work without issue. I can grab the main mesh, and grab the submesh to move it with its translation applied. My HandGrabPose component has the following properties set:

The initial posing is fine and the hand will move to the intended pose position, but as the submesh is moved, the hand pose stays in the same location rather than following the submesh as I would like. Looking at the source for hand poses, I can see that the hand pose override is set via a call in OculusInteractionPrebuilts/Rig/IsdkHandPoseRigModifier to HandMeshComponent->SetHandPoseOverride once the grab state changes to Selected. But as this is only called upon the initial grab, the required RootPoseOffset does not get updated upon translation/rotation of the grabbed component. This is not an issue for components with only one grabbable, where the entire object is moved and thus the offset remains constant; but in my case with sub-objects, the pose offset obviously does change, yet the hand remains in its initial location. Has anyone run into a similar issue and could recommend a solution or workaround? I'd be happy to provide more information on my setup if needed. Thanks!
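One workaround I can think of (untested; the SetHandPoseOverride signature below is a guess based on the call site in IsdkHandPoseRigModifier, so treat every name here as hypothetical): while the grab is Selected, recompute the offset from the moving HandGrabPose each tick and re-apply the override, instead of relying on the one-shot call at selection time:

```cpp
// Hypothetical per-tick refresh; HandGrabPose, HandMeshComponent, PoseData and
// the SetHandPoseOverride signature are assumptions, not verified ISDK API.
void UMyHandPoseFollower::TickComponent(float DeltaTime, ELevelTick TickType,
                                        FActorComponentTickFunction* ThisTickFunction)
{
    Super::TickComponent(DeltaTime, TickType, ThisTickFunction);
    if (!bGrabSelected)
    {
        return;
    }

    // HandGrabPose is attached to the sub-mesh, so its component-to-world
    // transform already follows the translate-constrained movement.
    const FTransform PoseWorld = HandGrabPose->GetComponentTransform();

    // Rebuild the root pose offset relative to the tracked hand each frame,
    // mirroring what IsdkHandPoseRigModifier computes once on Select.
    const FTransform RootPoseOffset =
        PoseWorld.GetRelativeTransform(HandMeshComponent->GetComponentTransform());

    HandMeshComponent->SetHandPoseOverride(PoseData, RootPoseOffset); // hypothetical signature
}
```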
[Quest 3] Accessing Camera via Android Camera2 API - Process Succeeds but Image is Black

Hello everyone, I'm developing a native Android application (integrated with Unreal Engine) for the Meta Quest 3. My goal is to programmatically capture a still image from the passthrough camera using the standard Android Camera2 API. First, I want to emphasize that the standard, system-level Passthrough feature works perfectly in the headset. To be clear, I already have Passthrough enabled and working correctly inside my Unreal app. I can see the live camera feed, which confirms the basic functionality is there. My issue is specifically with capturing a still image programmatically. While my code executes without errors and saves a file, the resulting image is always completely black. I'm aware that this was a known limitation on older OS versions. However, I was under the impression this was resolved starting with software v67, based on the official documentation here: https://developers.meta.com/horizon/documentation/native/android/pca-native-documentation Given that my headset is on a much newer version, I'm struggling to understand what I'm still missing.

Here's a summary of my setup and what I've confirmed:

Android Manifest & Permissions: My AndroidManifest.xml is configured according to the documentation and includes all necessary features and permissions. I can confirm through logs that the user grants the runtime permissions (CAMERA and HEADSET_CAMERA) successfully. Here are the relevant entries from my manifest:

```xml
<uses-permission android:name="android.permission.CAMERA" />
<uses-permission android:name="horizonos.permission.HEADSET_CAMERA"/>
<uses-feature android:name="android.hardware.camera2.any" android:required="true"/>
<uses-feature android:name="com.oculus.feature.PASSTHROUGH" android:required="true"/>
```

Camera Discovery: I am using the standard Camera2 API to find the correct passthrough camera device.
This part of the code works perfectly: I successfully identify the camera ID by checking CameraCharacteristics for the Meta-specific metadata key (com.meta.extra_metadata.camera_source).

The Problem: The Black Image

My code successfully opens the camera, creates a capture session, captures an image, and saves a JPEG file. The entire process completes without any exceptions, and my native function receives a "success" code. However, the saved JPEG file is always completely black. My system version is v79.1032, so per my understanding of https://developers.meta.com/horizon/documentation/native/android/pca-native-documentation it should work.

Here is my Java class, which is called to save the image; the hello method is invoked to capture the image:

```java
package com.jaszczurcompany.camerahelper;

import android.Manifest;
import android.app.Activity;
import android.content.ContentValues;
import android.content.pm.PackageManager;
import android.graphics.ImageFormat;
import android.graphics.SurfaceTexture;
import android.hardware.camera2.*;
import android.hardware.camera2.params.StreamConfigurationMap;
import android.media.Image;
import android.media.ImageReader;
import android.os.Build;
import android.os.Environment;
import android.os.Handler;
import android.os.HandlerThread;
import android.provider.MediaStore;
import android.util.Log;
import android.util.Size;
import android.view.Surface;

import androidx.annotation.NonNull;
import androidx.core.app.ActivityCompat;
import androidx.core.content.ContextCompat;

import java.io.OutputStream;
import java.nio.ByteBuffer;
import java.text.SimpleDateFormat;
import java.util.Arrays;
import java.util.Collections;
import java.util.Comparator;
import java.util.Date;
import java.util.Locale;
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.Semaphore;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicInteger;

public class CameraHelper {

    private static final String TAG = "FooTag"; // TAG changed to match the logs

    public static final int SUCCESS = 0;
    public static final int ERROR_NO_CAMERA_FEATURE = -2;
    public static final int ERROR_MISSING_PERMISSIONS = -3;
    public static final int ERROR_CAMERA_MANAGER_UNAVAILABLE = -4;
    public static final int ERROR_NO_PASSTHROUGH_CAMERA_FOUND = -5;
    public static final int ERROR_CAMERA_ACCESS_DENIED = -6;
    public static final int ERROR_CAMERA_SESSION_FAILED = -7;
    public static final int ERROR_CAPTURE_FAILED = -8;
    public static final int ERROR_IMAGE_SAVE_FAILED = -9;
    public static final int ERROR_TIMEOUT = -10;
    public static final int ERROR_INTERRUPTED = -11;
    public static final int ERROR_UNSUPPORTED_CONFIGURATION = -12;

    private static final String META_PASSTHROUGH_CAMERA_KEY_NAME = "com.meta.extra_metadata.camera_source";
    private static final byte META_PASSTHROUGH_CAMERA_VALUE = 1;

    private final Activity activity;
    private CameraManager cameraManager;
    private CameraDevice cameraDevice;
    private CameraCaptureSession captureSession;
    private ImageReader imageReader;
    private Handler backgroundHandler;
    private HandlerThread backgroundThread;
    private SurfaceTexture dummyPreviewTexture;
    private Surface dummyPreviewSurface;

    private final CountDownLatch operationCompleteLatch = new CountDownLatch(1);
    private final AtomicInteger resultCode = new AtomicInteger(SUCCESS);
    private final Semaphore cameraOpenCloseLock = new Semaphore(1);

    public CameraHelper(@NonNull Activity activity) {
        this.activity = activity;
    }

    public int hello() {
        Log.d(TAG, "Starting hello() sequence using Camera2 API (Two-Session approach).");
        if (!activity.getPackageManager().hasSystemFeature(PackageManager.FEATURE_CAMERA_ANY))
            return ERROR_NO_CAMERA_FEATURE;
        cameraManager = (CameraManager) activity.getSystemService(Activity.CAMERA_SERVICE);
        if (cameraManager == null) return ERROR_CAMERA_MANAGER_UNAVAILABLE;
        if (!checkPermissions()) return ERROR_MISSING_PERMISSIONS;
        try {
            startBackgroundThread();
            if (!cameraOpenCloseLock.tryAcquire(5, TimeUnit.SECONDS)) return ERROR_TIMEOUT;
            String cameraId = findPassthroughCameraId(cameraManager);
            if (cameraId == null) {
                setResult(ERROR_NO_PASSTHROUGH_CAMERA_FOUND);
            } else {
                openCamera(cameraId);
            }
            if (!operationCompleteLatch.await(15, TimeUnit.SECONDS)) setResult(ERROR_TIMEOUT);
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
            setResult(ERROR_INTERRUPTED);
        } catch (CameraAccessException e) {
            setResult(ERROR_CAMERA_ACCESS_DENIED);
        } finally {
            cleanup();
        }
        Log.i(TAG, "hello() returned " + resultCode.get());
        return resultCode.get();
    }

    // Renamed so it is not confused with requesting permissions
    private boolean checkPermissions() {
        String[] requiredPermissions = {Manifest.permission.CAMERA, "horizonos.permission.HEADSET_CAMERA"};
        return Arrays.stream(requiredPermissions)
                .allMatch(p -> ContextCompat.checkSelfPermission(activity, p) == PackageManager.PERMISSION_GRANTED);
    }

    private String findPassthroughCameraId(CameraManager manager) throws CameraAccessException {
        final CameraCharacteristics.Key<Byte> metaCameraSourceKey =
                new CameraCharacteristics.Key<>(META_PASSTHROUGH_CAMERA_KEY_NAME, Byte.class);
        for (String cameraId : manager.getCameraIdList()) {
            CameraCharacteristics characteristics = manager.getCameraCharacteristics(cameraId);
            Byte cameraSourceValue = characteristics.get(metaCameraSourceKey);
            if (cameraSourceValue != null && cameraSourceValue == META_PASSTHROUGH_CAMERA_VALUE)
                return cameraId;
        }
        return null;
    }

    private void openCamera(String cameraId) throws CameraAccessException {
        CameraCharacteristics characteristics = cameraManager.getCameraCharacteristics(cameraId);
        StreamConfigurationMap map = characteristics.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);
        if (map == null) {
            setResult(ERROR_UNSUPPORTED_CONFIGURATION);
            return;
        }
        Size largestSize = Arrays.stream(map.getOutputSizes(ImageFormat.JPEG))
                .max(Comparator.comparing(s -> (long) s.getWidth() * s.getHeight()))
                .orElse(new Size(1280, 720));
        Log.d(TAG, "Selected JPEG size: " + largestSize);
        imageReader = ImageReader.newInstance(largestSize.getWidth(), largestSize.getHeight(), ImageFormat.JPEG, 1);
        imageReader.setOnImageAvailableListener(this::onImageAvailable, backgroundHandler);
        dummyPreviewTexture = new SurfaceTexture(10);
        dummyPreviewTexture.setDefaultBufferSize(640, 480);
        dummyPreviewSurface = new Surface(dummyPreviewTexture);
        if (ActivityCompat.checkSelfPermission(activity, Manifest.permission.CAMERA)
                != PackageManager.PERMISSION_GRANTED) return;
        cameraManager.openCamera(cameraId, cameraStateCallback, backgroundHandler);
    }

    private final CameraDevice.StateCallback cameraStateCallback = new CameraDevice.StateCallback() {
        @Override
        public void onOpened(@NonNull CameraDevice camera) {
            cameraOpenCloseLock.release();
            cameraDevice = camera;
            createPreviewSession(); // START WITH THE PREVIEW SESSION
        }

        @Override
        public void onDisconnected(@NonNull CameraDevice camera) {
            cameraOpenCloseLock.release();
            setResult(ERROR_CAMERA_ACCESS_DENIED);
            camera.close();
        }

        @Override
        public void onError(@NonNull CameraDevice camera, int error) {
            cameraOpenCloseLock.release();
            setResult(ERROR_CAMERA_ACCESS_DENIED);
            camera.close();
        }
    };

    /** STEP 1: Create and start a session ONLY for the preview, to warm up the sensor. */
    private void createPreviewSession() {
        try {
            CaptureRequest.Builder previewRequestBuilder =
                    cameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
            previewRequestBuilder.addTarget(dummyPreviewSurface);
            cameraDevice.createCaptureSession(Collections.singletonList(dummyPreviewSurface),
                    new CameraCaptureSession.StateCallback() {
                        @Override
                        public void onConfigured(@NonNull CameraCaptureSession session) {
                            captureSession = session;
                            try {
                                session.setRepeatingRequest(previewRequestBuilder.build(), null, backgroundHandler);
                                Log.d(TAG, "Preview started for warm-up. Waiting 1s...");
                                backgroundHandler.postDelayed(() -> {
                                    // After the delay, stop the preview and start the capture
                                    stopPreviewAndStartCapture();
                                }, 1000);
                            } catch (CameraAccessException e) {
                                setResult(ERROR_CAPTURE_FAILED);
                            }
                        }

                        @Override
                        public void onConfigureFailed(@NonNull CameraCaptureSession session) {
                            setResult(ERROR_CAMERA_SESSION_FAILED);
                        }
                    }, backgroundHandler);
        } catch (CameraAccessException e) {
            setResult(ERROR_CAMERA_SESSION_FAILED);
        }
    }

    /** STEP 2: Stop the preview session and create a new session to take the photo. */
    private void stopPreviewAndStartCapture() {
        try {
            captureSession.stopRepeating();
            captureSession.close(); // Close the old session
            Log.d(TAG, "Preview stopped. Creating capture session...");
            // Create a new session ONLY for the ImageReader
            cameraDevice.createCaptureSession(Collections.singletonList(imageReader.getSurface()),
                    new CameraCaptureSession.StateCallback() {
                        @Override
                        public void onConfigured(@NonNull CameraCaptureSession session) {
                            captureSession = session;
                            captureImage(); // Take the photo using the new session
                        }

                        @Override
                        public void onConfigureFailed(@NonNull CameraCaptureSession session) {
                            setResult(ERROR_CAMERA_SESSION_FAILED);
                        }
                    }, backgroundHandler);
        } catch (CameraAccessException e) {
            setResult(ERROR_CAPTURE_FAILED);
        }
    }

    /** STEP 3: Take a single photo. */
    private void captureImage() {
        try {
            Log.d(TAG, "Capturing image with dedicated session...");
            CaptureRequest.Builder captureBuilder =
                    cameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_STILL_CAPTURE);
            captureBuilder.addTarget(imageReader.getSurface());
            captureSession.capture(captureBuilder.build(), null, backgroundHandler);
        } catch (CameraAccessException e) {
            setResult(ERROR_CAPTURE_FAILED);
        }
    }

    private void onImageAvailable(ImageReader reader) {
        try (Image image = reader.acquireLatestImage()) {
            if (image == null) return;
            ByteBuffer buffer = image.getPlanes()[0].getBuffer();
            byte[] bytes = new byte[buffer.remaining()];
            buffer.get(bytes);
            saveImage(bytes);
        } catch (Exception e) {
            setResult(ERROR_IMAGE_SAVE_FAILED);
        }
    }

    private void saveImage(byte[] bytes) {
        String fileName = "IMG_" + new SimpleDateFormat("yyyyMMdd_HHmmss", Locale.US).format(new Date()) + ".jpg";
        ContentValues values = new ContentValues();
        values.put(MediaStore.MediaColumns.DISPLAY_NAME, fileName);
        values.put(MediaStore.MediaColumns.MIME_TYPE, "image/jpeg");
        if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.Q) {
            values.put(MediaStore.MediaColumns.RELATIVE_PATH, Environment.DIRECTORY_DOWNLOADS);
        }
        android.net.Uri uri = activity.getContentResolver().insert(MediaStore.Downloads.EXTERNAL_CONTENT_URI, values);
        if (uri == null) {
            setResult(ERROR_IMAGE_SAVE_FAILED);
            return;
        }
        try (OutputStream os = activity.getContentResolver().openOutputStream(uri)) {
            if (os == null) throw new java.io.IOException("Output stream is null.");
            os.write(bytes);
            setResult(SUCCESS);
        } catch (java.io.IOException e) {
            setResult(ERROR_IMAGE_SAVE_FAILED);
        }
    }

    private void setResult(int code) {
        if (resultCode.compareAndSet(SUCCESS, code)) {
            operationCompleteLatch.countDown();
        }
    }

    private void startBackgroundThread() {
        backgroundThread = new HandlerThread("CameraBackground");
        backgroundThread.start();
        backgroundHandler = new Handler(backgroundThread.getLooper());
    }

    private void stopBackgroundThread() {
        if (backgroundThread != null) {
            backgroundThread.quitSafely();
            try {
                backgroundThread.join(1000);
            } catch (InterruptedException e) {
                /* ignore */
            }
            backgroundThread = null;
            backgroundHandler = null;
        }
    }

    private void cleanup() {
        try {
            if (!cameraOpenCloseLock.tryAcquire(2500, TimeUnit.MILLISECONDS)) Log.e(TAG, "Cleanup timeout.");
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
        try {
            if (captureSession != null) { captureSession.close(); captureSession = null; }
            if (cameraDevice != null) { cameraDevice.close(); cameraDevice = null; }
            if (imageReader != null) { imageReader.close(); imageReader = null; }
            if (dummyPreviewSurface != null) { dummyPreviewSurface.release(); dummyPreviewSurface = null; }
            if (dummyPreviewTexture != null) { dummyPreviewTexture.release(); dummyPreviewTexture = null; }
        } finally {
            cameraOpenCloseLock.release();
            stopBackgroundThread();
        }
    }
}
```

From logcat I see that it succeeded, but the output image is still black.
VRC.Quest.Functional.9 - How to map Oculus Home Button

Hi! I am in the process of App Review, and my app just failed due to VRC.Quest.Functional.9: the user's forward orientation is not reset when the Oculus Home button is long-pressed. But the documentation at Oculus Controller Input Mapping does not have a reference to the Oculus Home button. How do I get an event so I can execute my forward-orientation function (which my game already supports, just on another binding)?
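In case it helps anyone hitting the same VRC failure: as far as I know there is no bindable controller input event for the Home button itself. Long-pressing it triggers a system recenter, and Unreal surfaces that through FCoreDelegates::VRHeadsetRecenter. A rough sketch (AMyVRPawn, HandleRecenter, and ResetForwardOrientation are placeholder names for your own pawn and your existing reset function):

```cpp
#include "Misc/CoreDelegates.h"

void AMyVRPawn::BeginPlay()
{
    Super::BeginPlay();
    // Broadcast by the engine when the runtime performs a recenter,
    // e.g. after the user long-presses the Oculus/Meta Home button.
    FCoreDelegates::VRHeadsetRecenter.AddUObject(this, &AMyVRPawn::HandleRecenter);
}

void AMyVRPawn::HandleRecenter()
{
    ResetForwardOrientation(); // the forward-orientation reset the game already has
}
```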
Unreal manifest issues

I'm trying to upload my APK through the Meta Developer Hub and I'm getting these errors:

- The focus aware manifest meta-data tag must be set with required="true". More information can be found here: https://developer.oculus.com/distribute/vrc-quest-input-4/
- Missing <category android:name="com.oculus.intent.category.*"> in AndroidManifest.xml.
- The <uses-feature android:name="android.hardware.vr.headtracking" android:version="1" android:required="true" /> attribute is required for this platform in AndroidManifest.xml.

I have looked into the project settings, and inside the Meta XR plugin I have Focus Aware checked, but it's greyed out and not being pushed to the manifest. I also don't understand why the headtracking feature is no longer being pushed to the manifest, as it should be. I am on the new Oculus 5.2 branch and the newest Meta plugin; any help or ideas would be greatly appreciated.
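For reference, the manifest entries those three errors are asking about look roughly like this (a sketch based on the error text and the linked VRC page, not a guaranteed-complete manifest; the activity name matches UE5's default and may differ in your project). If the plugin isn't injecting them, they can be added through the project's extra Android manifest settings:

```xml
<uses-feature android:name="android.hardware.vr.headtracking"
              android:version="1" android:required="true" />
<application>
    <meta-data android:name="com.oculus.vr.focusaware" android:value="true" />
    <activity android:name="com.epicgames.unreal.GameActivity">
        <intent-filter>
            <action android:name="android.intent.action.MAIN" />
            <category android:name="com.oculus.intent.category.VR" />
            <category android:name="android.intent.category.LAUNCHER" />
        </intent-filter>
    </activity>
</application>
```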
Proper way to replicate Oculus Quest hand tracking

Hello, I am on Unreal 4.26 (launcher version) and I am trying to replicate hand tracking for a multiplayer experience. My pawn has a body and two motion controllers, which are being replicated nicely. Now I have added the Oculus hand components for the left and right hands, and it works only for my locally controlled pawn. The other players can't see my gestures; the only thing being replicated is the hands' position, but not the animations/gestures. Checking "Component Replicates" didn't help at all. Any ideas? My pawn:
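I don't have a confirmed answer, but the usual pattern (sketched below with hypothetical class and function names) is that the hand component only animates on the owning client from local tracking data, so "Component Replicates" can never carry the finger animation. You have to sample the bone rotations yourself on the owning client, ship them to the server, and drive a plain skeletal hand mesh on remote machines:

```cpp
// Hedged excerpt from a hypothetical pawn class; ServerUpdateHandPose would be
// an unreliable Server RPC, and LeftHand a USkeletalMeshComponent-derived hand.
UPROPERTY(ReplicatedUsing = OnRep_HandBoneRotations)
TArray<FRotator> HandBoneRotations;

void AMyVRPawn::Tick(float DeltaSeconds)
{
    Super::Tick(DeltaSeconds);
    if (IsLocallyControlled())
    {
        // Sample the locally tracked bone rotations each frame (or throttled
        // to ~10-20 Hz to save bandwidth) and send them to the server.
        TArray<FRotator> Rotations;
        for (int32 Bone = 0; Bone < LeftHand->GetNumBones(); ++Bone)
        {
            Rotations.Add(LeftHand->GetBoneRotationByName(
                LeftHand->GetBoneName(Bone), EBoneSpaces::ComponentSpace));
        }
        ServerUpdateHandPose(Rotations);
    }
}

void AMyVRPawn::OnRep_HandBoneRotations()
{
    // On remote clients, feed the received rotations into the AnimInstance of
    // an ordinary skeletal hand mesh (not the tracked hand component).
}
```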
How To Implement A "Swinging Arm" or "Running in Place" Locomotion System

Hi, I want to use UE4 to build a "jogging in place" locomotion system. Could anyone please let me know where to start learning how to do this? There is a "swinging arm" locomotion blueprint example provided in the UE4 Oculus fork. I was thinking of using that to expand on, but I can't find the actual source code, so I don't know how they implemented it. Mainly, how did they deal with capturing the physical motions of the controller/HMD, such as acceleration? I'm very familiar with coding, and I bought a few UE4 C++ courses on Udemy geared towards VR development, such as the following: https://www.udemy.com/course/unrealvr/ But all of these tutorials simply demonstrate how to implement a teleportation system and nothing else. Could someone provide me with some insight on how to implement this system in UE4? Thanks.
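As a starting point for the capture side: the core of a swinging-arm system is just turning per-frame controller positions into a smoothed speed signal. Here is a small self-contained sketch (the class name and smoothing constant are my own assumptions, not the Oculus fork's code). In UE4 you would call Update() from Tick with each motion controller's world location and DeltaSeconds, average the left and right results, and feed that into AddMovementInput along the HMD's flattened forward vector:

```cpp
#include <cmath>

// Minimal stand-in for FVector.
struct Vec3 { double X, Y, Z; };

// Estimates a controller's speed from per-frame position deltas and smooths it
// with an exponential moving average to damp tracking jitter.
class SwingSpeedEstimator {
public:
    explicit SwingSpeedEstimator(double SmoothingAlpha = 0.2)
        : Alpha(SmoothingAlpha) {}

    // Call once per frame with the controller's world position and the frame's
    // delta time. Returns the smoothed controller speed (units per second).
    double Update(const Vec3& Pos, double DeltaTime) {
        if (bHasPrev && DeltaTime > 0.0) {
            double dx = Pos.X - Prev.X, dy = Pos.Y - Prev.Y, dz = Pos.Z - Prev.Z;
            double InstantSpeed = std::sqrt(dx * dx + dy * dy + dz * dz) / DeltaTime;
            // EMA: higher Alpha reacts faster, lower Alpha is steadier.
            Smoothed = Alpha * InstantSpeed + (1.0 - Alpha) * Smoothed;
        }
        Prev = Pos;
        bHasPrev = true;
        return Smoothed;
    }

private:
    double Alpha;
    double Smoothed = 0.0;
    Vec3 Prev{};
    bool bHasPrev = false;
};
```

Mapping the smoothed speed through a deadzone and a gain curve before applying movement input keeps small idle hand motion from creeping the player forward.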
Spatial Anchor on Unreal - Samples and how to do it

I managed to install the sample and took a look at it. I understand the idea, but no matter what I tried, I can't start a project and develop spatial anchors from scratch. I would like to start a simple project from zero to understand it better. I have an actor with a grab component (from the VR template) and would like to create a spatial anchor on it. This way, a person could grab it and move it around. After they let go of the object, it would be saved, and someone else could find the actor in the place it was left in another session. Any idea how to implement a simple spatial anchor? No menus, hover material, etc. (I found the sample a little too complex for understanding the principles).