Maximum 360 video size in Unity
I'm having trouble finding the maximum package size for an app published to Oculus Go. I have seen discussion of a 4GB limit, with increases possible through downloadable content. The reason I ask is that we are creating an app in Unity3D with a large amount of stereo 360 video. We will have approximately 30GB of content and I would like to package it all in one build; is that possible?

Even if it isn't possible through a store release, can we do this locally by sideloading? We plan to use a cloud service for downloadable content in the future, but that has a cost implication, so for our minimum viable product we would simply sideload content while on trial. Is it possible to sideload extra content in this way, or to sideload a larger APK containing the 360 video files? Thanks a lot.
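A common way to test this kind of setup without the Store is to push the video files onto the headset with adb and point Unity's VideoPlayer at the absolute file path, so the 30GB of footage never has to live inside the APK at all. Below is a minimal sketch of that approach; the folder name Videos, the file name ski360.mp4, and the adb push target are hypothetical placeholders to adapt to your project.

    using System.IO;
    using UnityEngine;
    using UnityEngine.Video;

    // Plays a 360 video that was sideloaded onto the headset, e.g. with:
    //   adb push ski360.mp4 /sdcard/Android/data/<your.package.name>/files/Videos/
    // The folder and file names here are hypothetical placeholders.
    public class SideloadedVideoLoader : MonoBehaviour
    {
        [SerializeField] private VideoPlayer videoPlayer;
        [SerializeField] private string fileName = "ski360.mp4";

        private void Start()
        {
            // Application.persistentDataPath maps to the app's external files
            // directory on Android, so it survives APK updates and is writable via adb.
            string path = Path.Combine(Application.persistentDataPath, "Videos", fileName);

            if (!File.Exists(path))
            {
                Debug.LogWarning($"Sideloaded video not found at {path}");
                return;
            }

            // Play from a file path instead of packaging the clip in the APK.
            videoPlayer.source = VideoSource.Url;
            videoPlayer.url = path;
            videoPlayer.Play();
        }
    }

With this approach only the player code ships in the APK, and the content can be swapped or extended on the device without rebuilding.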
360 video Unity Recorder meeting Oculus TV specs
Hi, I have a VR scene in Unity, i.e. a 3D world, and I filmed it in 360 with the Unity Recorder. However, it says the video can't be played because the file is corrupt. Here is an image. I opened the dev tools in my Firefox browser and looked at the console, and it logged the following error twice:

    Cross-Origin Request Blocked: The Same Origin Policy disallows reading the remote resource at https://graph.oculus.com/video_upload_transfer?access_token=OCAeORB8DCsxiGxUczXa05PeloWUMXGT2NICUgqeue0ZCcacAEb56H7qxzVVGVS26WKpLAxLYGjHZCch1k7ZAvW2WLR73GAEZD&fb_dtsg=NAcOO-_emESTlaTRl2xKpc2OK51nwOU1xp6xBm__Dl33k5JS2FtzcwQ%3A35%3A1656116547&jazoest=25434&lsd=zEE2tPqN9vYzJjVQ4lzgO5. (Reason: CORS request did not succeed). Status code: (null).

I am concerned that my Unity 360 recording settings did not meet the Oculus media specs: https://creator.oculus.com/media-studio/documentation/video-spec/ I am pasting my Unity Recorder settings here as well. Thank you very much to anyone who takes the time to help me with this issue, I am deeply grateful! Thank you, Ozymandias
Required Assets Workflow - Unity / Oculus Quest
Hi, is there more detailed documentation on how to use the "Required Assets" workflow in Unity for Oculus Quest? The docs only cover it briefly in "unity/ps-assets/" (can't post links yet, unfortunately). I want to ship a large 8GB video file as a "Required Asset" and use it in the Unity Video Player. How does this work together with the Unity AssetBundle system? How do I exclude the original video from the .apk? How do I reference the video from the "Required Asset" file in the Video Player? I first want to test this locally, not through the Oculus Store, and push the .apk with ADB. Any help would be very much appreciated!
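For the local-testing half of this question, one option is to keep the clip entirely outside the Unity Assets folder (so it is never packed into the .apk), push it to the device with adb, and have the player probe a couple of known locations at runtime. This is only a stand-in for the real Required Assets delivery, not the Platform SDK mechanism itself; the file name main_video_8gb.mp4 and the candidate directories below are assumptions, and reading from shared storage such as /sdcard/Movies may require the external storage permission in your manifest.

    using System.IO;
    using UnityEngine;
    using UnityEngine.Video;

    // Sketch for local ADB testing: the 8GB file lives outside the project and is
    // pushed to the headset separately, so the .apk stays small.
    public class RequiredAssetVideo : MonoBehaviour
    {
        [SerializeField] private VideoPlayer videoPlayer;
        [SerializeField] private string assetFileName = "main_video_8gb.mp4";

        private void Start()
        {
            // Candidate locations: the app's own external files directory (writable
            // via adb without extra permissions) and a shared /sdcard folder.
            string[] candidates =
            {
                Path.Combine(Application.persistentDataPath, assetFileName),
                "/sdcard/Movies/" + assetFileName
            };

            foreach (string path in candidates)
            {
                if (File.Exists(path))
                {
                    videoPlayer.source = VideoSource.Url;
                    videoPlayer.url = path;   // streamed from disk, not from an AssetBundle
                    videoPlayer.Play();
                    return;
                }
            }

            Debug.LogError("Required video asset not found in any candidate location.");
        }
    }

Because the VideoPlayer streams from a URL/file path here, the clip does not need to go through the AssetBundle system at all; AssetBundles would only be needed if the video had to be loaded as a Unity VideoClip asset.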
Android MediaCodec
Hello everyone. In our program, HLS 360 video data is decoded using Android MediaCodec and rendered in Unity. We use MediaCodec to decode the H264 stream, but decoding is very slow on Oculus Quest, and MediaCodec stops decoding after a short while. The following log appears in logcat. It is output every time data is sent with MediaCodec's queueInputBuffer (the gpuaddr changes every time):

    2019-08-19 11:51:33.251 685-22216/? E/Adreno-C2D: <c2dgsl_unmap_user_mem:1124>: Invalid input (gpuaddr=0x2b1a000) error=0
    2019-08-19 11:51:33.251 685-22216/? E/C2DColorConvert: c2dUnMapAddr failed: status 3 gpuaddr 02b1a000
    2019-08-19 11:51:33.251 685-22216/? E/C2DColorConvert: unmapping GPU address failed
    2019-08-19 11:51:33.251 685-22216/? E/OMX-VDEC-1080P: Failed color conversion 0

After decoding for a few seconds, the log changes to the following:

    2019-08-19 11:51:36.036 685-22223/? W/Adreno-GSL: <gsl_ldd_control:549>: ioctl fd 13 code 0xc0200948 (IOCTL_KGSL_GPUOBJ_IMPORT) failed: errno 12 Out of memory
    2019-08-19 11:51:36.036 685-22223/? E/Adreno-C2D: <c2dgsl_map_user_mem:1095>: Error while gsl_memory_map_ext_fd(mem_fd=34, hostptr=0xe1f32000, len=1413120, offset=0, flags=3 ) error=-4
    2019-08-19 11:51:36.036 685-22223/? E/C2DColorConvert: c2dMapAddr failed: status 3 fd 34 ptr 0xe1f32000 len 1413120 flags 3

It looks as though GPU memory allocation and deallocation inside the decoder is failing, but the cause is unknown. The source code of the part that uses MediaCodec is below (partially omitted).

Decoder initialization (I tried setting SPS and PPS, and setting other keys on the MediaFormat, but the behaviour did not change):

    void initDecoder(final byte[] data) throws IOException {
        MediaFormat mediaFormat;
        mediaFormat = MediaFormat.createVideoFormat(mime, viewWidth, viewHeight);
        mediaFormat.setInteger(MediaFormat.KEY_MAX_INPUT_SIZE, viewWidth * viewHeight);
        try {
            codec.configure(mediaFormat, null, null, 0);
            codec.start();
        } catch (Exception e) {
        }
    }

When frame data is received from the stream, it is passed to the decoder:

    boolean decodeFrame(final int iTimestamp, byte[] data) {
        if (doInit) {
            try {
                initDecoder(data);
            } catch (IOException e) {
                e.printStackTrace();
            }
            doInit = false;
            //flgs = MediaCodec.BUFFER_FLAG_CODEC_CONFIG;
            flgs = MediaCodec.BUFFER_FLAG_KEY_FRAME;
        } else {
            flgs = MediaCodec.BUFFER_FLAG_PARTIAL_FRAME; // Tried other flags, but the result did not change.
        }
        try {
            final int inputBufIndex = codec.dequeueInputBuffer(1000 * 100);
            if (inputBufIndex >= 0) {
                ByteBuffer inputBuf = codec.getInputBuffer(inputBufIndex);
                inputBuf.put(data);
                long presentationTimeUs = iTimestamp * 1000;
                codec.queueInputBuffer(
                        inputBufIndex,
                        0, // offset
                        data.length,
                        presentationTimeUs,
                        flgs
                );
                return true;
            }
        } catch (Exception e) {
            return false;
        }
        return false;
    }

This issue does not happen on other Android devices. The same program works fine on Oculus Go; only Oculus Quest has this problem. Is this a bug specific to Oculus Quest?
360 Capture SDK will not capture World Canvas UI Element
Hi all, I was trying out the new 360 Capture SDK from Oculus (from this URL). It works fine, except that it won't capture any world-space canvas UI elements when doing a surround capture. Does anyone know why this is, and how I can fix it? Thanks
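One thing worth ruling out first (an assumption on my part, since the capture rig isn't shown here) is that the camera the capture SDK renders with simply isn't drawing the layer your world-space canvas sits on. A minimal sketch of forcing the canvas layer into a capture camera's culling mask is below; captureCamera is a placeholder for whatever camera your 360 capture setup actually uses.

    using UnityEngine;

    // Sketch: make sure a world-space canvas is on a layer the capture camera renders.
    // "captureCamera" is a placeholder for the camera the 360 Capture SDK uses.
    public class CaptureCullingFix : MonoBehaviour
    {
        [SerializeField] private Camera captureCamera;
        [SerializeField] private Canvas worldCanvas;

        private void Start()
        {
            // World-space canvases are rendered like ordinary scene geometry,
            // so their layer must be included in the camera's culling mask.
            int canvasLayerBit = 1 << worldCanvas.gameObject.layer;

            if ((captureCamera.cullingMask & canvasLayerBit) == 0)
            {
                Debug.LogWarning("Capture camera was not rendering the canvas layer; adding it.");
                captureCamera.cullingMask |= canvasLayerBit;
            }

            // A world-space canvas also needs an event camera for correct sorting and raycasts.
            if (worldCanvas.renderMode == RenderMode.WorldSpace && worldCanvas.worldCamera == null)
            {
                worldCanvas.worldCamera = captureCamera;
            }
        }
    }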
NativeVideoPlayer SetPlaybackSpeed crashes app
Hi. I'm building an app for the Oculus Quest and am basing it on the Stereo180Video sample scene. However, when I try to set the playback speed of the video to anything other than 1, the video lags horribly for a few seconds and then the app crashes. I'm using NativeVideoPlayer.SetPlaybackSpeed(2f); to set the speed. The ExoPlayer docs say the video may be laggy if the app is running as a debug build instead of a release build, so I've made sure to uncheck the "Development Build" checkbox in Unity's Build Settings, and I've also added a signing key in the Android publishing settings. Is there anything else I need to do to make the app a release build? Does anyone know of anything else that may be causing this issue, or how to debug it? Thanks!
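One way to narrow this down (an assumption on my part, not something from the sample) is to play the same clip at 2x with Unity's built-in VideoPlayer, which exposes playbackSpeed directly. If that plays cleanly, the crash is most likely in the native ExoPlayer plugin path rather than in the video file or the build configuration. A minimal sketch:

    using UnityEngine;
    using UnityEngine.Video;

    // Sketch: play the same clip with Unity's built-in VideoPlayer at 2x speed,
    // purely as a comparison point against the NativeVideoPlayer/ExoPlayer path.
    public class PlaybackSpeedTest : MonoBehaviour
    {
        [SerializeField] private VideoPlayer videoPlayer;   // assign in the Inspector
        [SerializeField] private float speed = 2f;

        private void Start()
        {
            videoPlayer.prepareCompleted += OnPrepared;
            videoPlayer.Prepare();
        }

        private void OnPrepared(VideoPlayer source)
        {
            // Change the speed only after the player is prepared, so the
            // speed change never races against initial buffering.
            source.playbackSpeed = speed;
            source.Play();
        }
    }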
StereoLayer Cubemap stereo working on Quest, not on Rift
Hello there, I have successfully used cubemap stereo layers on the Oculus Quest and they work perfectly. But when I try to use cubemap stereo layers on the Rift, it shows a small rectangle with a portion of the cubemap and everything around it is black. Oculus says that VR Composition Layers do work on the Rift, but I haven't been able to make it work. Does anyone know what might be wrong with my project? I am using Unreal Engine 4.22, and I have also tried 4.24 with no success. Thank you!
360 Content on the Overlay Layer
Note to all developers looking to use the overlay layer to render 360 content: there are issues in the Oculus overlay implementation that make 360 content unviable there. The overlay layer renders straight to the eye buffer (screen) and needs special care to do so. The edges of an overlay layer tend to have a thin dark outline, which is usually not a problem with a quad or cylinder because it can easily be masked by darker backgrounds.

To render 360 content on the overlay layer you need to use an equirect layer. The equirect overlay shape renders 360 video and images clear and sharp, but there is a serious issue that makes it unusable: the edges of the overlay layer still produce a small black line, which ends up on the back side of the sphere. Looking forward is fine, but once you turn around, the black line is always at the back of the video. The line is very thin but extremely noticeable. Strangely, it is tiny at the poles of the sphere and thickest at the equator, though still very thin.

I've reproduced this with 4K mono video, stereo video, and 360 images. It happens with AVPro, Unity's video player, and still image textures. I've primarily tested with the latest Oculus Utilities 12/1.44 and Unity 2018.4.9 and 2019.2.3. I also reproduced it on older versions of Oculus Utilities with similar results; early versions produced an even thicker line.

I am not the only developer who has come across this issue. I would post links to these discussions, however I "have to be around for a little while longer before you can post links"; they can be found by searching "overlay" in this forum. John Carmack also replied to a Twitter thread about this on July 10th, 2018, but his responses did not address the real issue. In addition, I have privately spoken to other developers who hit exactly the same problem and ended up ditching the overlay layer altogether to render 360 content properly. I was met with the same fate, and I have concluded that it is not possible to properly render 360 content on the Oculus overlay layer.

Other interesting tidbits I've found about the equirect overlay layers: Oculus offers an example scene with an equirect overlay layer, but it is only a 180 video, so the entire back half of the sphere is black and the black line is not visible. I inserted my own 360 content into that demo scene and switched its custom video player script from 180 to 360, and of course the black line was there. Oculus does not actually provide any demo of 360 content rendering on the overlay layer, only the much less used 180 video, and I have personally never seen a single instance of 360 content rendering on the overlay layer without the black line (within Unity). I believe Oculus did not ship a 360 demo scene because they know it doesn't work. The part I find troubling is that the Oculus Go/Quest overlay documentation states, "Equirect layers are single textures that are wrapped into a sphere and projected to surround the user's view. Most commonly used for 360/180 video playback". I took that to mean it was proven and tested to work for 360 content.
If it wasn't obvious from the length and detail of this post, I've wasted many hours trying to get this working because the documentation states that it works. I hope any future developers find this post before they go down the same path I did. If what I've found is correct, I also hope Oculus removes the mention of 360 video from their overlay documentation, as it simply is not accurate. If someone can prove me wrong, please do.
High Resolution Video in Quest App
Hello, I am building an application for Oculus Quest using the Unity engine. In one scene I play a high-resolution 360 video (5760x2880): I render it to a RenderTexture of the same size and then use that material as the skybox (in the Lighting settings). Yet the resolution of the video decreases. I wonder if there is another way to include high-resolution videos in the app without compromising their resolution in the built app.
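One thing worth checking (an assumption, since the import and quality settings aren't shown here) is that the clip isn't being transcoded or downsized on import and that the RenderTexture really matches the source resolution, since the skybox path will show any loss immediately. A minimal sketch of wiring the VideoPlayer to a full-size RenderTexture and a Skybox/Panoramic material at runtime, assuming that shader's "_MainTex" property:

    using UnityEngine;
    using UnityEngine.Video;

    // Sketch: play a 5760x2880 360 clip into a matching RenderTexture and feed the skybox.
    // Assumes the skybox material uses the Skybox/Panoramic shader.
    public class SkyboxVideo360 : MonoBehaviour
    {
        [SerializeField] private VideoPlayer videoPlayer;
        [SerializeField] private Material panoramicSkybox;   // a Skybox/Panoramic material

        private RenderTexture target;

        private void Start()
        {
            // Match the source resolution exactly so nothing is scaled down on the way.
            target = new RenderTexture(5760, 2880, 0);
            target.useMipMap = false;

            videoPlayer.renderMode = VideoRenderMode.RenderTexture;
            videoPlayer.targetTexture = target;

            panoramicSkybox.SetTexture("_MainTex", target);
            RenderSettings.skybox = panoramicSkybox;

            videoPlayer.Play();
        }

        private void OnDestroy()
        {
            if (target != null)
            {
                target.Release();
            }
        }
    }

If the image is still soft, it may also be worth confirming that the clip itself survives import untouched (transcoding disabled in the clip's import settings) and that the device's video decoder can actually handle that resolution.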
Can I use OVROverlay to get better 360 video performance?
Hello, I am developing an Oculus Go app in Unity that is a 360 video player. So far I have been using a VideoPlayer that renders to a RenderTexture, and a sphere with the Skybox/Panoramic shader that uses the same RenderTexture. It works great for most videos, but some videos with a lot of motion "flicker". I have a video of a skier going down a hill, and every time he passes objects close to the camera the object appears twice. I took a photo of one of the lenses in the Go: as you can see, there are clearly two sticks where there should only be one, and when you actually look into the headset the effect is even more obvious.

I have tested with a 4096x2048 video, but it doesn't seem to matter whether I use a higher or lower resolution; I think it is more a limit of the way I render the video to the sphere. Now to the question: I started reading about OVROverlay today, and from the documentation it sounded like it could be used to render video to a sphere with better performance. I tried to set up a scene with an OVROverlay and a video player, but I couldn't get it to work on the Oculus Go. I used the SampleFramework 180-degree video player as a starting point, but I couldn't get that sample code to run on my Go at all. Has anyone used an OVROverlay to render 360 videos?
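For what it's worth, here is a minimal sketch of how an equirect OVROverlay is typically wired up to a VideoPlayer's RenderTexture. The field names (currentOverlayShape, OverlayShape.Equirect, isDynamic, textures) are from my reading of the Oculus Utilities for Unity and may differ between versions, so treat this as a starting point rather than a verified recipe; note also the earlier "360 Content on the Overlay Layer" post above, which reports a visible seam at the back of equirect overlays.

    using UnityEngine;
    using UnityEngine.Video;

    // Sketch: feed a VideoPlayer's RenderTexture into an OVROverlay configured as an
    // equirect layer. Field names are assumptions based on the Oculus Utilities package.
    public class EquirectOverlayVideo : MonoBehaviour
    {
        [SerializeField] private VideoPlayer videoPlayer;
        [SerializeField] private OVROverlay overlay;          // OVROverlay component in the scene
        [SerializeField] private RenderTexture videoTexture;  // the texture the VideoPlayer renders into

        private void Start()
        {
            videoPlayer.renderMode = VideoRenderMode.RenderTexture;
            videoPlayer.targetTexture = videoTexture;

            // Equirect shape wraps the texture into a full sphere around the viewer.
            overlay.currentOverlayShape = OVROverlay.OverlayShape.Equirect;
            overlay.isDynamic = true;   // the texture contents change every frame
            overlay.textures = new Texture[] { videoTexture, videoTexture }; // same texture for both eyes (mono)

            videoPlayer.Play();
        }
    }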