When I attach the video to the OVRCameraController, the video stream keeps stopping and starting; it seems it doesn't update quickly enough.
The project's Time setting is 0.166666, as the manual says, and I tried several requested frame rates:
WebCamTexture webcamTexture = new WebCamTexture(1280, 720, 25);
WebCamTexture webcamTexture = new WebCamTexture(1280, 720, 30);
WebCamTexture webcamTexture = new WebCamTexture(1280, 720, 60);
But I get the same result with all of them.
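For reference, here is a minimal sketch of how I am hooking the texture up (the quad/material is just whatever object the script sits on, and requestedFPS is only a hint to Unity, so the driver may actually pick something else):

using UnityEngine;

// Minimal test: push the device camera into a WebCamTexture and show it on this object's material.
public class WebcamTest : MonoBehaviour
{
    WebCamTexture webcamTexture;

    void Start()
    {
        webcamTexture = new WebCamTexture(1280, 720, 30);
        GetComponent<Renderer>().material.mainTexture = webcamTexture;
        webcamTexture.Play();

        // Log what we actually got, not what we asked for.
        Debug.Log("WebCamTexture size: " + webcamTexture.width + "x" + webcamTexture.height);
    }

    void OnDisable()
    {
        if (webcamTexture != null)
            webcamTexture.Stop();
    }
}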
Could you explain the procedure for accessing the "Passthrough Camera" in Unity with the same quality and behaviour as the built-in option you get on the headset when you press the back button?
If you could share such a Unity scene with the community, I think it would be great. I tested all the demos, but I don't see anything about this.
There should not be anything unique about displaying the camera in VR versus doing it in a normal Android app, so any help or code you find for general Unity/Android development should apply here.
Hi Cybereality, thank you for the answer. I tried more things, but always with the same result. The passthrough camera in the back-button options scene, is it a Unity scene? Is it native Java?
How is it coded?
Now I'm trying to use the Vuforia plugin, which has direct access to the camera on Android and renders into a RenderTexture. I see that it works in the editor, but it doesn't appear in the APK at runtime. I think there is a problem in the manifest.. I'm continuing to investigate..
I have been working with the Mobile SDK and Vuforia as well.
You are right, Oculus does not like Vuforia.
I had to make my own stereo camera rig with Vuforia ARCamera and 2 regular cams.
To get a barrel distortion of the phone camera background view, I use a shader with some math that distorts the background plane mesh at runtime. I rotate this panel for the proper aspect-ratio distortion. I use two background cameras, one with a viewport from 0.0 to 0.5 for the left eye and one from 0.5 to 1.0 for the right, just like my two ARCamera children.
Each of the ARCamera's left and right child cameras has an FOV of about 77.5 degrees.
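Roughly, the background rig setup looks like the sketch below. The names and most values are from memory (only the 77.5 FOV is the number I actually use), so treat it as an outline rather than my exact code:

using UnityEngine;

// Sketch of the side-by-side background rig: two cameras, each rendering the distorted
// background plane into one half of the screen, mirroring the two ARCamera eye cameras.
public class StereoBackgroundRig : MonoBehaviour
{
    public Camera leftBackgroundCam;
    public Camera rightBackgroundCam;

    void Start()
    {
        // Left eye gets the left half of the screen, right eye the right half.
        leftBackgroundCam.rect  = new Rect(0.0f, 0.0f, 0.5f, 1.0f);
        rightBackgroundCam.rect = new Rect(0.5f, 0.0f, 0.5f, 1.0f);

        // Match the FOV of the ARCamera's left/right child cameras (~77.5 degrees).
        leftBackgroundCam.fieldOfView  = 77.5f;
        rightBackgroundCam.fieldOfView = 77.5f;

        // Make sure the background renders behind everything else.
        leftBackgroundCam.depth  = -2;
        rightBackgroundCam.depth = -2;
    }
}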
Compile it, install and load it on the Gear VR: everything runs well, and the augmented object stays on the target.
The problem is that the background is not as fast as the "PassThrough" camera mode; it has a lot of motion blur when you look around. Looking at a CRT TV, I see bands scrolling on the screen with the "PassThrough" camera, which tells me the "PassThrough" camera uses a different refresh rate. When I look at the same CRT TV in my app there are no bands, which tells me it is running at a different frequency than the superior "PassThrough" camera. So I am where you are, Alvaro!
The VrLib.jar file is where the "PassThrough" class resides. I have tried to call it like this from a C# script:
/////////////////////////////////////////////////////////////////////////
using System;
using System.Collections.Generic;
using System.Runtime.InteropServices;
using UnityEngine;

public class JNIcall : IDisposable
{
    private static JNIcall _instance;
    public static JNIcall Instance
    {
        get
        {
            if (_instance == null)
                _instance = new JNIcall();
            return _instance;
        }
    }

    private AndroidJavaClass cls_jni = new AndroidJavaClass("com.oculusvr.vrlib.PassThroughCamera");

    public void Share()
    {
        using (AndroidJavaClass cls_UnityPlayer = new AndroidJavaClass("com.unity3d.player.UnityPlayer"))
        using (AndroidJavaObject activity = cls_UnityPlayer.GetStatic<AndroidJavaObject>("currentActivity"))
        {
            // This is as far as I get -- I don't know what to call on the
            // PassThroughCamera class from here.
        }
    }

    public void Dispose()
    {
        cls_jni.Dispose();
    }
}
I really have no idea how to call this in Unity3D, but I am trying. In ovr_mobile_sdk_20150106.zip\VRLib\src\com\oculusvr\vrlib,
the uncompiled Java source for the PassThroughCamera is all there.
I think the functions we need in order to get the texture are in here.
If we can find somebody who can explain a workflow, then we can have usable access to the PassThrough camera instead of just OVRPluginEvent.Issue(RenderEventType.PlatformUI), which brings up the menu. But once you add Vuforia, even that stops working :cry:
public class PassThroughCamera implements android.graphics.SurfaceTexture.OnFrameAvailableListener {
public static final String TAG = "VrCamera";
SurfaceTexture cameraTexture;
Camera camera;
boolean gotFirstFrame;
boolean previewStarted;
long appPtr = 0; // this must be cached for the onFrameAvailable callback :(
boolean hackVerticalFov = false; // 60 fps preview forces 16:9 aspect, but doesn't report it correctly
long startPreviewTime;

public native SurfaceTexture nativeGetCameraSurfaceTexture(long appPtr);
public native void nativeSetCameraFov(long appPtr, float fovHorizontal, float fovVertical);
public PassThroughCamera() { Log.d( TAG, "new PassThroughCamera()" ); }
public void enableCameraPreview( long appPtr_ ) {
    //Log.d( TAG, "enableCameraPreview appPtr is " + Long.toHexString( appPtr ) + " : " + Long.toHexString( appPtr_ ) );
    if ( BuildConfig.DEBUG && ( appPtr != appPtr_ ) && ( appPtr != 0 ) ) {
        //Log.d( TAG, "enableCameraPreview: appPtr changed!" );
        assert false; // if this asserts then the wrong instance is being called
    }
}

public void disableCameraPreview( long appPtr_ ) {
    if ( BuildConfig.DEBUG && ( appPtr != appPtr_ ) && ( appPtr != 0 ) ) {
        //Log.d( TAG, "disableCameraPreview: appPtr changed!" );
        assert false; // if this asserts then the wrong instance is being called
    }
    appPtr = appPtr_;
    if ( previewStarted ) {
        stopCameraPreview(appPtr_);
    }
}
public void onFrameAvailable(SurfaceTexture surfaceTexture) {
    //Log.d( TAG, "onFrameAvailable" );
    if (BuildConfig.DEBUG && (appPtr == 0)) {
        //Log.d( TAG, "onFrameAvailable: appPtr is NULL!" );
        assert false; // if this asserts, then init was not called first
    }
    if (gotFirstFrame) {
        // We might want to stop the updates now that we have a frame.
        return;
    }
    if ( camera == null ) {
        // camera was turned off before it was displayed
        return;
    }
    gotFirstFrame = true;

    // Now that there is an image ready, tell native code to display it
    Camera.Parameters parms = camera.getParameters();
    float fovH = parms.getHorizontalViewAngle();
    float fovV = parms.getVerticalViewAngle();
public void startCameraPreview(long appPtr_) {
    startPreviewTime = System.nanoTime();
    //Log.d( TAG, "startCameraPreview appPtr_ is " + Long.toHexString( appPtr ) + " : " + Long.toHexString( appPtr_ ) );
    if ( BuildConfig.DEBUG && ( appPtr != appPtr_ ) && ( appPtr != 0 ) ) {
        //Log.d( TAG, "startCameraPreview: appPtr changed!" );
        assert false; // if this asserts then the wrong instance is being called
    }

    // If we haven't set up the surface / surfaceTexture yet, do it now.
    if (cameraTexture == null) {
        //Log.d( TAG, "cameraTexture = null" );
        cameraTexture = nativeGetCameraSurfaceTexture(appPtr_);
        if (cameraTexture == null) {
            Log.e(TAG, "nativeGetCameraSurfaceTexture returned NULL");
            return; // not set up yet
        }
    }
    cameraTexture.setOnFrameAvailableListener( this );

        // for 120fps
        parms.set("fast-fps-mode", 2); // 2 for 120fps
        parms.setPreviewFpsRange(120000, 120000);

        // for 60fps
        //parms.set("fast-fps-mode", 1); // 1 for 60fps
        //parms.setPreviewFpsRange(60000, 60000);

        // for 30fps
        //parms.set("fast-fps-mode", 0); // 0 for 30fps
        //parms.setPreviewFpsRange(30000, 30000);

        // for auto focus
        parms.set("focus-mode", "continuous-video");
    } else {
        // not support vr mode
    }

    Camera.Size preferredSize = parms.getPreferredPreviewSizeForVideo();
    Log.v(TAG, "preferredSize: " + preferredSize.width + " x " + preferredSize.height );

    List<Integer> formats = parms.getSupportedPreviewFormats();
    for (int i = 0; i < formats.size(); i++) {
        Log.v(TAG, "preview format: " + formats.get(i) );
    }

    // YV12 format, documented YUV format exposed to software
    // parms.setPreviewFormat( 842094169 );

    // set the preview size to something small
    List<Camera.Size> previewSizes = parms.getSupportedPreviewSizes();
    for (int i = 0; i < previewSizes.size(); i++) {
        Log.v(TAG, "preview size: " + previewSizes.get(i).width + "," + previewSizes.get(i).height);
    }

    // Camera seems to be 4:3 internally
    // parms.setPreviewSize(800, 600);
    // parms.setPreviewSize(960, 720);
    parms.setPreviewSize(800, 480);
    // parms.setPreviewSize(1024, 576);
    // parms.setPreviewSize(320,240);

    // set the preview fps to maximum
    List<int[]> fpsRanges = parms.getSupportedPreviewFpsRange();
    for (int i = 0; i < fpsRanges.size(); i++) {
        Log.v(TAG, "fps range: " + fpsRanges.get(i)[0] + "," + fpsRanges.get(i)[1]);
    }
public void stopCameraPreview(long appPtr_) {
    if ( BuildConfig.DEBUG && ( appPtr != appPtr_ ) && ( appPtr != 0 ) ) {
        //Log.d( TAG, "stopCameraPreview: appPtr changed!" );
        assert false; // if this asserts then the wrong instance is being called
    }
    previewStarted = false;
    nativeSetCameraFov( appPtr_, 0.0f, 0.0f );
    if ( cameraTexture != null ) {
        cameraTexture.setOnFrameAvailableListener( null );
    }
    if ( camera != null ) {
        Log.v(TAG, "camera.stopPreview");
        camera.stopPreview();
        camera.release();
        camera = null;
    }
}
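For what it's worth, the only Unity-side call pattern I can think of is something like the sketch below. It is completely untested, and the big unknown is the appPtr the Java methods expect; I don't know how to get the native app pointer from a Unity script, so this is only meant to show the shape of the JNI call, not a working solution.
/////////////////////////////////////////////////////////////////////////
using UnityEngine;

// Untested sketch: try to instantiate the VrLib PassThroughCamera from Unity and enable the preview.
public class PassThroughTest : MonoBehaviour
{
    void Start()
    {
        using (AndroidJavaClass unityPlayer = new AndroidJavaClass("com.unity3d.player.UnityPlayer"))
        using (AndroidJavaObject activity = unityPlayer.GetStatic<AndroidJavaObject>("currentActivity"))
        using (AndroidJavaObject passThrough = new AndroidJavaObject("com.oculusvr.vrlib.PassThroughCamera"))
        {
            long appPtr = 0; // placeholder -- the real native app pointer would be needed here
            passThrough.Call("enableCameraPreview", appPtr);
        }
    }
}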
I tested and tested and tested, and the WebCamTexture only "works fine" at 640x480 resolution at 60 fps.
The problem is that the image appears much bigger than in the passthrough camera video.
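One idea I want to try is sizing the quad from the camera's field of view instead of scaling it by eye. Something like the sketch below, where the 60-degree FOV, the distance and the aspect are just placeholder guesses, not values I have confirmed for the device:

using UnityEngine;

// Sketch: size the video quad so it covers roughly the camera's field of view,
// instead of an arbitrary scale that makes the image look too big.
public class FitQuadToFov : MonoBehaviour
{
    public Transform videoQuad;          // the quad showing the WebCamTexture
    public float distance = 2.0f;        // how far in front of the head the quad sits
    public float verticalFovDeg = 60.0f; // assumed vertical FOV of the phone camera
    public float aspect = 16f / 9f;

    void Start()
    {
        // A quad of this height at this distance spans exactly verticalFovDeg vertically.
        float height = 2.0f * distance * Mathf.Tan(verticalFovDeg * 0.5f * Mathf.Deg2Rad);
        videoQuad.localPosition = new Vector3(0f, 0f, distance);
        videoQuad.localScale = new Vector3(height * aspect, height, 1f);
    }
}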
Testing every scene of the demos, I found the MoviePlayer sample in the Unity SDK examples.
There, a Java class is accessed natively to play a movie on a plane with a texture. It "could be the same", but calling PassThroughCamera.class from VRLib instead. I have to test this...
I tried to access the PassThroughCamera directly:
I added this to the manifest:
<activity
    android:name="com.oculusvr.vrlib.PassThroughCamera"
    android:theme="@android:style/Theme.Black.NoTitleBar.Fullscreen"
    android:launchMode="singleTask"
    android:screenOrientation="landscape"
    android:configChanges="screenSize|orientation|keyboardHidden|keyboard">
</activity>
And I created this function (similar to MoviePlayerSample.cs):
Hi guys, I know this topic is more than two years old and the software has greatly evolved since then, but have you been able to make any progress on it?
Vuforia is very efficient for AR but no longer supports Gear VR. Having clear access to the passthrough camera could help bring mixed reality to the Gear VR.