Android API for video recording plugin in Unity
Hello, I'm currently working on a project that needs to record what the user is seeing in the Oculus Quest 2. The recording should be triggered by a UI button that the user can press whenever they want. I didn't find any dedicated asset for this purpose, but I found a solution for Android, specifically a screen recorder: LINK. I tested it on my smartphone and everything works fine. I also tested it on the Oculus Quest: the build has no errors and the application doesn't crash while using the headset. The problem is that the recorded videos are empty!

The code responsible for setting up the video and starting the recording is this:

```csharp
private void Start()
{
    DontDestroyOnLoad(gameObject);
#if UNITY_ANDROID && !UNITY_EDITOR
    using (AndroidJavaClass unityClass = new AndroidJavaClass("com.unity3d.player.UnityPlayer"))
    {
        androidRecorder = unityClass.GetStatic<AndroidJavaObject>("currentActivity");
        // Custom save folder: Movies/Tee. By default it uses Movies/AndroidUtils.
        androidRecorder.Call("setUpSaveFolder", "Tee");
        int width = (int)(Screen.width > SCREEN_WIDTH ? SCREEN_WIDTH : Screen.width);
        int height = Screen.width > SCREEN_WIDTH ? (int)(Screen.height * SCREEN_WIDTH / Screen.width) : Screen.height;
        int bitrate = (int)(1f * width * height / 100 * 240 * 7);
        int fps = 30;
        bool audioEnable = true;
        // Manually sets the video recording settings.
        // Comment this block out to use the default settings.
        androidRecorder.Call("setupVideo", width, height, bitrate, fps, audioEnable);
    }
#endif
}
```

When the user presses the record button, the script calls this method:

```csharp
#region Android Recorder
public void StartRecording()
{
#if UNITY_ANDROID && !UNITY_EDITOR
    // RECORD_AUDIO is declared in the plugin manifest, but we need to request it manually.
    if (!AndroidUtils.IsPermitted(AndroidPermission.RECORD_AUDIO))
    {
        AndroidUtils.RequestPermission(AndroidPermission.RECORD_AUDIO);
        onAllowCallback = () => { androidRecorder.Call("startRecording"); };
        onDenyCallback = () => { ShowToast("Need RECORD_AUDIO permission to record voice"); };
        onDenyAndNeverAskAgainCallback = () => { ShowToast("Need RECORD_AUDIO permission to record voice"); };
    }
    else
        androidRecorder.Call("startRecording");
#endif
}
```

I'm stepping through this flow to find out why the videos come out empty. Could I modify this script to adapt it to the Oculus "screen"? Thanks.

Access to internal storage files
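One possible explanation for the empty recordings, offered as an assumption rather than a confirmed diagnosis: Android screen recorders of this kind capture the activity's normal display surface, but on Quest the rendered frames are submitted to the VR compositor and never drawn to that surface, so the recorder captures a blank screen. A workaround sketch is to render an extra monoscopic pass from the tracked camera into a RenderTexture shown on an ordinary UI surface, so the recorder has real pixels to capture. `vrCamera` and `mirrorTarget` here are placeholders, not part of the plugin's API.

```csharp
using UnityEngine;

// Sketch: mirror the VR camera into a RenderTexture that is displayed on a
// regular Android surface (e.g. a screen-space RawImage), so a surface-based
// screen recorder has something to capture. Field names are assumptions.
public class VRMirror : MonoBehaviour
{
    public Camera vrCamera;            // the tracked eye/center camera
    public RenderTexture mirrorTarget; // e.g. 1280x720, assigned to a RawImage

    void LateUpdate()
    {
        // Render one extra monoscopic pass into the mirror texture.
        var previous = vrCamera.targetTexture;
        vrCamera.targetTexture = mirrorTarget;
        vrCamera.Render();
        vrCamera.targetTexture = previous;
    }
}
```

You would display `mirrorTarget` on a screen-space `RawImage` so it ends up on the surface the recorder sees; note the extra render pass has a performance cost on mobile hardware.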
Hello, I'm developing an application that should record the VR experience to a video and store it in the Oculus "gallery". The script I'm using in Unity was developed for Android devices, and the path it uses is this one:

```csharp
System.IO.Directory.CreateDirectory(Application.persistentDataPath + GALLERY_PATH);
```

where GALLERY_PATH = "/../../../../DCIM/VideoRecorders". My questions are: Is this path recognized by the Oculus Quest? Can I access the Oculus gallery while using the headset in developer mode? Thanks.

Required Assets Workflow - Unity / Oculus Quest
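In case it helps while debugging, here is a small sketch (untested on Quest, so treat it as an assumption) that resolves the relative path to an absolute one and logs it, so you can confirm with `adb shell ls` where the files actually land on the device:

```csharp
using System.IO;
using UnityEngine;

// Sketch: resolve the "../.." gallery path to an absolute path and log it,
// so the real on-device location can be checked over adb.
public static class GalleryPath
{
    const string GALLERY_PATH = "/../../../../DCIM/VideoRecorders";

    public static string Resolve()
    {
        string raw = Application.persistentDataPath + GALLERY_PATH;
        string full = Path.GetFullPath(raw); // collapses the ".." segments
        Directory.CreateDirectory(full);
        Debug.Log("Recording folder: " + full);
        return full;
    }
}
```

If the resolved path lands outside the app's sandbox, the app may also need external-storage permission for the directory creation to succeed.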
Hi, is there more detailed documentation on how to use the "Required Assets" workflow in Unity for Oculus Quest? The docs only cover it briefly in "unity/ps-assets/" (can't post links yet, unfortunately). I want to use a large 8 GB video file as a "Required Asset" and play it in the Unity Video Player. How does this work together with the Unity AssetBundle system? How do I exclude the original video from the .apk? How do I reference the video from the "Required Asset" file in the Video Player? I first want to test this locally, not through the Oculus Store, by pushing the .apk with ADB. Any help would be very much appreciated!

Can Oculus on PC not play 60fps videos smoothly?
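Not an authoritative answer, but a minimal sketch of the playback side, under the assumption that the required-asset video ends up as a plain file on the device: point the Video Player at a URL instead of a `VideoClip`, which also keeps the file out of the .apk. The path below is hypothetical; for local testing you would push the file there with `adb push` and substitute the real location.

```csharp
using UnityEngine;
using UnityEngine.Video;

// Sketch: play a large video that ships outside the .apk by URL.
// The on-device path is an assumption for illustration only.
public class ExternalVideo : MonoBehaviour
{
    public VideoPlayer player;

    void Start()
    {
        string path = "/sdcard/Android/obb/com.example.app/bigvideo.mp4"; // hypothetical
        player.source = VideoSource.Url;
        player.url = "file://" + path;
        player.prepareCompleted += _ => player.Play();
        player.Prepare();
    }
}
```

A URL source streams from disk rather than loading the clip into memory, which matters for an 8 GB file.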
I'm building a Unity app that plays back 180-degree top/bottom stereoscopic 1920x3840 VR video at 60 fps. I get very smooth playback on Go and Quest with H.265 @ 31 Mbps using Unity's default Video Player. I can't, however, get smooth playback on PC. I'm attempting it with the following codecs:

- H.264 @ 75, 50, 25, and 12 Mbps
- H.265 @ 31 Mbps
- VP9 @ 22 and 12 Mbps

I've tried the following players:

- Unity Video Player
- Universal Media Player

With these players I'm unable to determine whether any hardware acceleration is taking place. The PC I'm testing on has an RTX 2080 and an i5 8500 running Windows 10, and I'm testing with an Oculus Rift S and a Quest via Link. All videos are somewhat choppy, even the lower-quality ones. To get smooth playback on the Android devices, I had to set the device's refresh rate to 60 Hz, matching the video's frame rate, to eliminate stuttering. It seems, however, that you cannot change the refresh rate on Rift, Rift S, or Quest Link, which run at 90, 80, and 72 Hz respectively. Is there any way to sync the refresh rate with the video fps on PC, or is there something else I'm supposed to be doing for smooth playback? How does other software achieve it?

Expansion File Bundled Scene Not Loading Video (Go)
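One diagnostic step that might help narrow this down: Unity's Video Player can report dropped frames, which distinguishes a decoder that can't keep up from judder caused by the 60 fps video beating against the headset's 72/80/90 Hz refresh. A minimal sketch:

```csharp
using UnityEngine;
using UnityEngine.Video;

// Diagnostic sketch: log decoder drops to tell decode stalls apart from
// refresh-rate judder.
public class PlaybackProbe : MonoBehaviour
{
    public VideoPlayer player;

    void Start()
    {
        player.skipOnDrop = true; // resynchronize instead of slowing playback down
        player.frameDropped += p => Debug.LogWarning("Dropped at frame " + p.frame);
    }
}
```

If the log stays quiet while playback still looks choppy, the problem is probably the refresh-rate mismatch rather than decoding throughput.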
Hello, I recently used this tutorial from Gabor to try to get around the APK size limitation by creating an asset bundle and turning it into an expansion file. My app is essentially an interactive video player with a 3.65 GB video in a single scene. I removed the scene from the build list, turned it into an asset-bundle .obb, and created a new scene that loads the asset bundle and starts the loaded scene. This works as far as loading the scene and the interface, except the video doesn't play. Using Unity 2018.3.6f1 and Oculus SDK 1.34. Any help would be greatly appreciated, thanks!

Oculus Go: How to make trigger pause & play a 2D video on viewing screen placed within a VR scene?
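A guess at the cause, stated as an assumption: the Video Player often cannot stream a `VideoClip` that is packed inside an AssetBundle on Android, even though scenes and UI load fine from the bundle. One workaround sketch is to ship the video as a raw file alongside the .obb and, after the bundled scene loads, rebind its Video Player to that file by URL. The paths and names below are hypothetical.

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.SceneManagement;
using UnityEngine.Video;

// Sketch: load the bundled scene, then swap its Video Player from a packed
// VideoClip to a raw file next to the .obb. Paths are assumptions.
public class BundledSceneVideoFix : MonoBehaviour
{
    IEnumerator Start()
    {
        // Load the scene that was packed into the expansion-file asset bundle.
        var bundle = AssetBundle.LoadFromFile("/sdcard/Android/obb/com.example.app/main.obb"); // hypothetical
        yield return SceneManager.LoadSceneAsync(bundle.GetAllScenePaths()[0]);

        // Rebind the player to a raw video file instead of the bundled VideoClip.
        var player = FindObjectOfType<VideoPlayer>();
        player.source = VideoSource.Url;
        player.url = "file:///sdcard/Android/obb/com.example.app/movie.mp4"; // hypothetical
        player.Play();
    }
}
```

This keeps the scene in the bundle but moves only the video outside it, so the 3.65 GB clip is streamed from disk.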
Designer learning development here; I need some help navigating Unity for an Oculus Go project. I couldn't find an answer in the forums, so apologies if this is redundant. I have an environment built out with a rectangular viewing screen floating in it. I'm trying to make it so I can use the Oculus Go trigger to pause and play a video on that floating screen. I'm not familiar enough with the Video Player component or Oculus Go input to figure this out. I've spent a bunch of time on YouTube and trying to find any specific documentation, and I just haven't cracked it yet. Any and all tips or direction are super welcome, thank you!

compositor layer shape feature request: halfsphere shape using for 180 video in unity
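A minimal sketch of one way to wire this up, assuming the Oculus Integration package is in the project (for `OVRInput`) and that this script sits on the floating screen object, which has a Video Player component with your clip assigned:

```csharp
using UnityEngine;
using UnityEngine.Video;

// Sketch: toggle the video on the floating screen with the Go controller trigger.
// Assumes Oculus Integration (OVRInput) and a VideoPlayer on the same object.
public class TriggerToggle : MonoBehaviour
{
    VideoPlayer player;

    void Start()
    {
        player = GetComponent<VideoPlayer>();
    }

    void Update()
    {
        if (OVRInput.GetDown(OVRInput.Button.PrimaryIndexTrigger))
        {
            if (player.isPlaying) player.Pause();
            else player.Play();
        }
    }
}
```

If you want the toggle to work only while pointing at the screen, you would add a raycast from the controller before handling the press; the version above reacts to the trigger anywhere.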
My half-sphere overlay shape feature request comes from playing 180 video. After a few tests playing 360 video, I found that to play 180 video with a full-sphere shape, I must double the Texture2D in the horizontal direction and render the extra pixels in clear color to make the 180 video work. So if I play a normal 1000 x 1000 180 video, I have to create a 2000 x 1000 texture and copy the texture like this:

```csharp
TextureSize = new OVRPlugin.Sizei()
{
    w = (textures[0].width * 2),
    h = (textures[0].height)
};

// Just copy src into half of dst's width and its full height.
Texture et = layerTextures[eyeId].swapChain[stage];
Graphics.CopyTexture(textures[0], 0, 0, 0, 0, textures[0].width, textures[0].height, et, 0, 0, 0, 0);
```

Is there a way to render 180 video in an overlay using just the video texture's own size, without doubling its width or height or otherwise killing performance?

OVROverlay render depth conflict with Pixvana video player
Hello! I'm running into an issue using OVROverlay with Gear VR. The OVROverlay works fine at first, as expected, but after playing a video in Pixvana at runtime and going back, the OVROverlay video is rendered in the correct place but seemingly at the wrong depth: it looks closer than the plane actually is. The Pixvana player appears to access Oculus render textures and other internals, so I'm wondering if there's a conflict somehow. Looking at OVROverlay, it's not quite clear to me how it determines what depth (distance) to render the texture at. Is it possible to reset this distance or force it to update to the correct position of the plane it's being rendered on? Any ideas what the problem might be? Thanks. Peace!

What Resolution, FPS, Bitrate of video 360 can be used in Gear VR / Rift?
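Not a confirmed fix, but two things that might be worth trying, sketched here under the assumption that the overlay's perceived depth comes from its GameObject transform (which OVROverlay reads when compositing the layer): pin the overlay's transform to the quad every frame, and toggle the component after returning from Pixvana so the layer is re-submitted from fresh state. Names are placeholders.

```csharp
using System.Collections;
using UnityEngine;

// Sketch: keep the overlay locked to the quad it should appear on, and
// re-submit its layer after another player has touched the compositor state.
public class OverlayDepthReset : MonoBehaviour
{
    public OVROverlay overlay;   // the overlay showing the video
    public Transform videoQuad;  // the plane the video should appear on

    void LateUpdate()
    {
        // Pin the overlay to the quad in case something moved or rescaled it.
        overlay.transform.SetPositionAndRotation(videoQuad.position, videoQuad.rotation);
        overlay.transform.localScale = videoQuad.localScale;
    }

    // Call this after returning from the Pixvana player.
    public IEnumerator ResubmitLayer()
    {
        overlay.enabled = false;
        yield return null; // give the compositor a frame to drop the stale layer
        overlay.enabled = true;
    }
}
```

If the depth still looks wrong after this, the conflict is more likely inside whatever render-texture state Pixvana leaves behind than in the overlay's own transform.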
Hello friends! I need your help and advice. My game/application uses 360 videos. These clips were captured in Unity and then converted into full 360 videos with Adobe Premiere Pro CC. What settings should the video files have so that they look equally good on Gear VR and Rift? I made the 360 clips with the following parameters:

- 4K resolution (3840x2160)
- 60 frames per second
- 60 Mbps bitrate

They look good on the computer but stutter in Unity. Are there any recommendations from Oculus for 360 videos?

Highest quality 360 video in Rift and Gear VR created with Unity, what is possible, what do we know?
Honestly, I am confused about the state of 360 video (let's talk monoscopic for the sake of this discussion). We use Rift CV1 at live events and Gear VR for field use with our clients. So far we have ridden the novelty VR wave very well (pharma and education), but lately we are seeing pushback and concerns, especially about video quality. We currently develop all our solutions in Unity, using AVPro to project a 4K equirectangular video file onto a sphere; in the past we have also successfully used Easy Movie Texture. Video plays back fine with both solutions, but the fact remains that I am zooming into a 4K video with less than 720p of quality remaining in front of my eyes. I am talking especially about text within videos, which always looks blurry and pixelated. Here is the thing: I have by now seen better quality in players like Little Star's and in some demos at OC3. So what is the next step? Cubemaps and adaptive dynamic streaming were both mentioned at OC3 again, but it is very hard to find anything about them here or on the web in general. I am currently planning a big 360 shoot for a project kicking off next month, and I am unsure whether we need more than 4K source material to support adaptive approaches.

There are two particular scenarios I am concerned about, and I am interested in how others are solving them:

A: There is a very interesting discussion on this forum where the author describes a penguin scene and notes that we could technically use video only for the moving segments of a scene. So let's say we treat the scene as a cubemap and use only one face as a 4K video segment, with the rest as still frames; this could of course also be done on a sphere with transparency shaders. Has anybody done this successfully, and what camera would you use for such a setup, given that a Gear 360 cannot deliver more than 4K in total?

B: More importantly, we have moving scenes, e.g. driving a convertible car. These scenes don't easily qualify for solution A, since there is movement everywhere. This is where the adaptive ideas mentioned above come to mind: showing selective parts of the videos based on the user's head position, which would require many video versions and a good prediction algorithm.

I want to keep this discussion about quality, not about streaming size or application size. What have you done, and what is available to "normal" people, or even via licensing models, at this moment? Just to throw it out there: are we stuck with Unity here? I hope not, but I want to keep this discussion open. I want to thank everyone who found the time to read through this long post, and I hope we can get a discussion going, even if you just shoot me a bunch of links. I pledge to keep this post updated with the findings and approaches we implement to solve the quality issue, to push VR quality forward and strengthen our sales pitch.
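To make approach A concrete, here is a tiny sketch of the hybrid setup, with all asset names as assumptions: a still equirectangular skybox carries the static parts of the scene, while a Video Player drives only a sphere-section mesh UV-mapped over the moving sector.

```csharp
using UnityEngine;
using UnityEngine.Video;

// Sketch of approach A: static panorama everywhere, video only on the sector
// that actually moves. Assumes a skybox material holding the still frame and a
// sphere-section mesh carrying the VideoPlayer's render target.
public class HybridPanorama : MonoBehaviour
{
    public Material stillSkybox;     // equirectangular still of the full scene
    public VideoPlayer sectorVideo;  // plays only the moving segment

    void Start()
    {
        RenderSettings.skybox = stillSkybox;
        sectorVideo.isLooping = true;
        sectorVideo.Play();
    }
}
```

The hard part is not the code but the capture: the still and the video sector must be shot from the same nodal point and color-matched, or the seam between them gives the trick away.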