Forum Discussion
yarongh · Explorer · 9 years ago
Stereoscopic 360 video player in Unity?
Hi,
I was just wondering - what would be the best way to load a stereo 3D movie file (.mp4 file with the left camera video at the top and the right camera video underneath it) into Unity and then export it to be playable in a HMD such as the Gear VR?
Are there any drawbacks, like limited resolution or other side effects I should be aware of?
Thanks!
19 Replies
- vrdaveb (Oculus Staff): Try EasyMovieTexture. The most important thing is to use hardware decoding, which it does. You will probably need to use 2K or higher video for acceptable quality, and the video will render to a sphere or skybox that follows the user's head position.
https://www.assetstore.unity3d.com/en/#!/content/10032
- yarongh (Explorer): Would it render it out as stereo if I give it an over/under .mp4 file like I mentioned?
- vrdaveb (Oculus Staff): Yes, EasyMovieTexture should be able to do that. It just has to select the rect of the movie for each eye and then use Unity's "target eye" camera property to show that content only to that eye.
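To make the per-eye rect idea concrete, here is a rough Unity C# sketch (this is not EasyMovieTexture's actual API; the component, field names, and layer setup are assumptions for illustration):

```csharp
using UnityEngine;

// Sketch: split an over/under stereo video frame across the two eyes.
// Assumes videoTexture is the decoded movie texture from a video plugin,
// and leftSphere/rightSphere are inward-facing spheres on separate layers.
public class OverUnderStereo : MonoBehaviour
{
    public Texture videoTexture;   // full over/under frame (left view on top)
    public Renderer leftSphere;    // sphere visible only to the left eye
    public Renderer rightSphere;   // sphere visible only to the right eye

    void Start()
    {
        // Top half of the frame -> left eye.
        leftSphere.material.mainTexture = videoTexture;
        leftSphere.material.mainTextureScale = new Vector2(1f, 0.5f);
        leftSphere.material.mainTextureOffset = new Vector2(0f, 0.5f);

        // Bottom half of the frame -> right eye.
        rightSphere.material.mainTexture = videoTexture;
        rightSphere.material.mainTextureScale = new Vector2(1f, 0.5f);
        rightSphere.material.mainTextureOffset = new Vector2(0f, 0f);

        // Each sphere is then restricted to one eye, e.g. by putting each
        // on its own layer and using two cameras whose Camera.stereoTargetEye
        // is set to StereoTargetEyeMask.Left / StereoTargetEyeMask.Right.
    }
}
```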
- yaronghExplorerOkay.
And would it be able to play a 4K 60fps video smoothly on the Gear VR?
Because if I use Unity, I would have to "render" the video into the .apk, no?
Would it be better if I place the .mp4 file on the phone's SD card and load it from there? Is there a way of doing that using Unity?
- vrdaveb (Oculus Staff): 4K is supported by EasyMovieTexture, but I haven't seen 4K @ 60Hz content on Gear VR. It definitely supports 4K x 4K at 30Hz, and head tracking will still apply at 60Hz. I would recommend trying that first.
- vrdaveb (Oculus Staff): The SD card ought to work, but it's significantly slower than main storage. Try including the video in your APK if you can. EasyMovieTexture will load the video, decode it, and put it on a texture for Unity to render with head tracking.
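One way to support both options is to bundle the video in StreamingAssets (which is packed into the APK) and fall back to external storage if a copy exists there. A minimal sketch, assuming the movie plugin accepts a plain path or a jar: URL (the paths and file names here are illustrative):

```csharp
using System.IO;
using UnityEngine;

// Sketch: resolve a video path, preferring an SD card copy if present,
// otherwise falling back to the copy bundled inside the APK.
public static class VideoPath
{
    public static string Resolve(string fileName)
    {
        // On Android, StreamingAssets lives inside the APK (a jar: URL),
        // which some movie plugins can read directly by file name.
        string inApk = Path.Combine(Application.streamingAssetsPath, fileName);

        // Hypothetical external-storage location, e.g. /sdcard/Movies/.
        string onSdCard = "/sdcard/Movies/" + fileName;

        // File.Exists cannot check inside the APK, so only the SD card
        // copy is probed; otherwise use the bundled path.
        return File.Exists(onSdCard) ? onSdCard : inApk;
    }
}
```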
- yaronghExplorerOkay, thanks!
I have a couple of last questions:
1) For both cases (using Unity generally, and EasyMovieTexture in particular) - is it possible to map a video onto part of the sphere and images onto the other parts? For example, display the video in the middle with one image above it and another below it, so it only has to render 30 fps for the video portion of the screen rather than all of it.
2) Is it possible to create a player interface in Unity (with buttons like play, pause, fast forward...) that you would be able to control with the Gear VR?
3) How would EasyMovieTexture decode the video? Would there be any loss of quality?
As you might have guessed, I'm not too familiar with Unity; I'm just trying to get my head around the general workflow.
- vrdaveb (Oculus Staff): > is it possible to map a video for some part of the sphere, and then an image at the other parts?
If part of the video never changes, it should not consume much extra data. But no, you would have to copy the video texture and the static textures into one larger "atlas" texture before rendering with it. Or you could make separate spheres for a static texture and the video texture.
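A minimal sketch of the atlas idea, assuming the device supports Unity's Graphics.CopyTexture with compatible texture formats (the sizes, band layout, and field names are illustrative, not from EasyMovieTexture):

```csharp
using UnityEngine;

// Sketch: combine a per-frame video texture with static top/bottom images
// into one atlas texture, so a single sphere material samples all three.
public class VideoAtlas : MonoBehaviour
{
    public Texture2D topImage;     // static image above the video band
    public Texture2D bottomImage;  // static image below the video band
    public Texture videoTex;       // updated each frame by the video plugin
    public Renderer sphere;        // inward-facing sphere with atlas UVs

    RenderTexture atlas;

    void Start()
    {
        atlas = new RenderTexture(2048, 2048, 0);
        sphere.material.mainTexture = atlas;

        // The static images only need to be copied once.
        Graphics.CopyTexture(topImage, 0, 0, 0, 0, topImage.width, topImage.height,
                             atlas, 0, 0, 0, 1536);
        Graphics.CopyTexture(bottomImage, 0, 0, 0, 0, bottomImage.width, bottomImage.height,
                             atlas, 0, 0, 0, 0);
    }

    void Update()
    {
        // Only the video band is refreshed every frame (assumes a
        // 2048x512 video frame; CopyTexture requires matching formats).
        Graphics.CopyTexture(videoTex, 0, 0, 0, 0, 2048, 512,
                             atlas, 0, 0, 0, 768);
    }
}
```

The per-frame cost is then just one region copy rather than a full-screen video update, which is the saving the question was after.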
> Is it possible to create a player interface in Unity
Yes, EasyMovieTexture comes with samples showing you how to do a seek bar and other things.
> How would EasyMovieTexture decode the video?
I believe it uses MediaPlayer to hardware-decode the video to a SurfaceTexture, which is then bound by Unity. Whether it loses quality depends on how you configure it; in general, the quality is very good. Most of the loss occurs inside Unity, where it samples the texture, renders to the eye buffers, and then samples the eye buffers to render to the display panel. In a future release, we plan to add support for VRAPI's high-quality cubemap layer, which would sample the video directly when rendering to the panel. For traditional 2D movies, you can already use OVROverlay.cs to do this.
- yarongh (Explorer): When you said "make separate spheres for a static texture and the video texture" - how would I then combine them into one sphere?
Are there any advantages to using Unity to display stereoscopic video over using the sample player code that comes with the Oculus Mobile SDK?
- vrdaveb (Oculus Staff): > how would I then combine them into one sphere?
This isn't really a great approach. You would have to make separate spheres, put them in the same position, and use a transparent shader. Or make custom meshes that only include portions of the sphere. Or use multiple materials on a single custom mesh.
> Are there any advantages for using Unity to display stereoscopic video over using the sample player code that comes with Oculus Mobile SDK?
Not unless you need to render complex geometry in addition to your video. I would recommend using the sample if you can.