Forum Discussion
RoryLaneLutter
11 years ago · Protege
Getting Started: Casual Advice, and a question about the demo
Hi, new developer here.
I've worked primarily in mobile games for the past 5 years as an artist/technical artist.
I just got the DK2 kit a couple weeks ago, but only had the opportunity to set it up last weekend.
I've installed everything, and I can run the demos and get full featured VR, and I'm amazed about the possibilities.
I'm reading through the guides now, and I've built some of the samples from the SDK. When I've done that, I'm going to start working on a project in Unity.
I just wanted to check in and say hello to the board, talk to some developers about some of the questions I have, and basically join this community.
My first question is whether there's any sort of capability in the Win_OculusUnityDemoScene to disable the SDK distortion rendering, so I could see what the scene would look like if the screen were projecting the original flat renderings.
I'm still just sort of wrapping my head around some of the higher-level concepts of how the OVR works, and I'm very curious what it would be like to be able to switch between the before and after of that system.
My goal is to create a compelling interactive prototype by GDC, so I don't have much time to get up to speed. If anyone wants to share some of the things they did to figure out how to develop on this platform, I'd love to hear your opinions.
Also, I'd be interested in brainstorming ideas, and I'm open to a collaborative project if anyone wanted to partner up for something more substantial. Either way, I find it helps me think to talk to other people about this sort of stuff.
7 Replies
- cybereality (Grand Champion): Hey, welcome.
I don't think disabling the distortion rendering will really prove anything.
If you are interested, you can press 'G' on the OculusWorldDemo and it will bring up a grid.
I would also read through the docs if you haven't already.
https://developer.oculus.com/documentation/
- RoryLaneLutter (Protege):
"cybereality" wrote:
I don't think disabling the distortion rendering will really prove anything.
I'm still at the "wrap my head around" stage, and it was sort of an epiphany for me that the grid displayed by pressing "G" was actually a distorted grid that was inversely distorted again by the lenses to appear straight. Mind=blown.
Still, while I know that's happening, I can't force my brain to see it when I'm looking through the HMD. My brain sees a parallel grid, and I can't shoehorn my perception into seeing a warped grid that's been counter-warped back to appear straight, if you understand what I'm saying. I feel that if I could turn off the SDK distortion rendering for a second, and see the original lens-distorted view without it being corrected, it would help me understand exactly what the effect of the lenses was. I'd like to see what the lenses see before you guys fixed it :P
It's not a bug or anything, but I was wondering if that functionality might be hidden in there somewhere.
And yes, I am reading the documentation. I'm mostly though the Developer Guide, then I'm going to read the Oculus Unity Integration Guide, and finally the Best Practices Guide. At that point, I'm going to be on the lookout for Unity sample scenes that help me understand how to build my own applications.
Do you know of a bare bones Unity scene that successfully sets up orientation and position data from the HMD?
- cybereality (Grand Champion):
- RedDreadMorgan (Honored Guest):
"RoryLaneLutter" wrote:
I'm still at the "wrap my head around" stage, and it was sort of an epiphany for me that the grid displayed by pressing "G" was actually a distorted grid that was inversely distorted again by the lenses to appear straight. Mind=blown.
Still, while I know that's happening, I can't force my brain to see that when I'm looking through the HMD.
The ELI5 short answer is there is a screen inches from your eyes that needs to be in focus and magnified. To achieve this, the light at the edge of the screen must be bent way more than the light 'straight ahead'. Fortunately, light bends/changes speed when it transitions between different materials, like the plastic in the lenses and the two air gaps.
Different wavelengths bend differently, leading to the 'chromatic aberration' at the edges. If you take your spare lenses out and look at the image cybereality posted, you'll see the 'rainbows' as your eye moves to the edge. The SDK 'pre-bends' and 'pre-aberrates' the colors to unwind what the lenses do.
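To make that concrete, here's a toy sketch (definitely not the actual Oculus SDK code) of the idea: sample coordinates are scaled outward by a radial polynomial, and each color channel gets a slightly different scale so the lens's chromatic aberration cancels out. The `k` and `chroma` values below are illustrative guesses, not real SDK constants.

```python
def pre_distort(x, y, k=(1.0, 0.22, 0.24), chroma=(0.994, 1.0, 1.008)):
    """Map a screen point (center = 0,0) to per-channel sample coordinates.

    Returns ((rx, ry), (gx, gy), (bx, by)). The farther a point is from
    center, the farther out it samples, which squeezes the rendered image
    toward the middle (barrel distortion) -- the inverse of the lens warp.
    """
    r2 = x * x + y * y
    scale = k[0] + k[1] * r2 + k[2] * r2 * r2
    return tuple((x * scale * c, y * scale * c) for c in chroma)

red, green, blue = pre_distort(0.5, 0.0)
# blue samples slightly farther out than red, countering the color fringes
```

The center of the image is left untouched; only the edges get pulled, which is why the grid looks increasingly warped toward the rim of the lens.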
I understand your desire to learn more about the inner workings; however, you don't really need to care about the math or lenses at all. This knowledge won't help you make better content, if that's what you are going for here.
IMHO, your time is better spent making new content; consider that you just mentioned you'd like something done by GDC. That's a very short time frame.
The best thing to do is dive in and look at your content with the HMD, and only care about making something compelling. Be sure to follow the guidelines for content; we don't need another first-person-shooter barf machine. There is plenty of other content available, so you can see what is being done and be inspired.
- RoryLaneLutter (Protege):
"cybereality" wrote:
The Unity integration package comes with a Tuscany demo you can import.
You can see the effects of the lenses by just opening a grid image in a web-browser or whatever.
Ah, of course, the DK1 mode. Thanks.
- RoryLaneLutter (Protege): Thanks for your explanation and your advice.
Also, for the first time, I do finally get it.
While I kind of got how the system worked mechanically, only after reading your explanation did I finally grasp why: it's hard to focus on something that's only an inch or two in front of your eye, an obvious point I'd overlooked. The lens bends the light so you can focus at such a short distance, and the rendering distortion just compensates for the necessary evil of needing a lens to place an LED screen that close to your eye.
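That "focus at a short distance" point can be put in numbers with the thin-lens equation; the focal length and screen distance below are illustrative guesses, not actual DK2 specs.

```python
# Thin-lens equation: 1/f = 1/d_o + 1/d_i. Solve for the image distance
# given a focal length f and object (screen) distance d_o, both in cm.
# A negative result means a virtual image on the same side as the screen,
# i.e. where the eye perceives it. Numbers are illustrative, not DK2 specs.

def image_distance(f_cm, object_cm):
    return 1.0 / (1.0 / f_cm - 1.0 / object_cm)

# Screen 4 cm from the eye, lens with a 5 cm focal length:
d_i = image_distance(5.0, 4.0)
print(round(d_i, 3))  # -20.0: the screen appears roughly 20 cm away
```

So the eye relaxes as if looking at something much farther away than the physical panel, which is the whole reason the lens (and thus the counter-distortion) is needed.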
Thank you.
"RedDreadMorgan" wrote:
This knowledge won't help you make better content if that's what you are going for here.
It's how my brain works. I have to understand. Anyway, thanks to some of your help, I finally do. You're right that I don't have much time, and that means I'm trying to learn this stuff very quickly. It was also pretty clear that simulation sickness would be a real challenge, and I'm going to read the rest of the best practices guidelines immediately.
The question really is what sort of project I could do that would really work.
So far, my favorite has been "Night Time Terror" http://vr-bits.com/#nighttimeterror, because it's really the only game I've tried where I get zero simulation sickness.
It's funny... simulation sickness is like the equal opposite of motion sickness. With motion sickness, you get nauseous because you feel movement but what you see is stationary, say, in the bottom of a boat or in the back of a car without a good view out the window. With simulation sickness, you see movement, but your body is stationary. Unfortunately, they feel pretty similar too.
That's challenged some of my initial ideas about the types of applications I thought would be the most interesting before I'd received the kit.
Anyway, thanks for your advice.
- RedDreadMorgan (Honored Guest): It comes down to this: never move the 'camera' or position of the camera unless the SDK tells you to, i.e. the user turns or moves their head.
Lucky's Tale moves the camera slowly along, and there might be a threshold here somewhere, but slow is the keyword.
In general, mapping any sort of 'first person run-around-and-explore or shoot' experience will make a lot of people barf.
Adding a cockpit or similar can help; there is also Altspace with its teleport, which I found to not cause upset.
Using a joystick to move the camera (positionally or rotationally) makes a lot of people instantly queasy, so I'd recommend avoiding this.
Have I mentioned how you shouldn't move the camera? :)
Ok, off my soap box, good luck in your efforts.
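The advice above boils down to one rule, which could be sketched engine-agnostically like this; the function names, pose format, and speed cap are all invented for illustration (a real Unity/OVR integration would use the SDK's own camera rig instead).

```python
import math

# Hypothetical sketch of the "don't move the camera yourself" rule:
# head-tracking offsets are applied verbatim, while any artificial
# (gameplay/joystick) motion is clamped to a slow crawl.

SLOW_SPEED = 0.5  # meters/second cap on artificial motion (a guess)

def update_camera(camera_pos, hmd_offset, desired_motion, dt):
    """Return the new camera position for this frame.

    camera_pos:     current world position of the player rig
    hmd_offset:     positional offset from head tracking (applied as-is)
    desired_motion: movement requested by gameplay this frame
    dt:             frame time in seconds
    """
    dist = math.sqrt(sum(c * c for c in desired_motion))
    max_step = SLOW_SPEED * dt
    if dist > max_step:
        # clamp artificial motion; never scale or fight the tracked offset
        desired_motion = tuple(c * max_step / dist for c in desired_motion)
    return tuple(p + m + h
                 for p, m, h in zip(camera_pos, desired_motion, hmd_offset))
```

For example, a joystick push requesting 10 m of motion in one 16 ms frame would be clamped to 8 mm, while the head-tracking offset passes through untouched.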