What are you trying out?

johnc
Honored Guest
I would like to hear what everyone is trying with the SDK -- there are a lot of cases where a little bit of guidance from us may save a lot of frustration on your part.
6 REPLIES

seeme
Honored Guest
Hello,

I'm currently adding Moonlight support to one of our in-house mobile engines. Due to time constraints, I'm still using a slightly tweaked version of the pixel shader we used with DK1.

I haven't had time to dig into the provided framework yet, but as our in-house demos become available, I hope I'll have the time to improve on this basic implementation.

johnc
Honored Guest
When you say "the pixel shader we used with DK1" do you mean for drawing the environment, or doing the final distortion to the screen?

This probably needs to be made clear in the docs, but you really can't make an acceptable VR app on Android going through the conventional drawing and swapbuffers path, because the latency from triple buffering will just kill you. Draw into the eye buffers however you want, but let VrLib timewarp to the front buffer synced with the raster scan. If you can't integrate that with your engine, let me know the details, and I'll figure out what we need to do.
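Very roughly, the intended frame flow looks like the sketch below. This is only an illustration of the structure, not the literal VrLib API; the eyeFbo/eyeTexture variables and the RenderScene and SubmitToTimeWarp calls are placeholder names, and the real entry points are in the SDK headers.

    // Sketch of the eye-buffer + timewarp frame flow. All names here
    // are placeholders, not the actual VrLib entry points.
    for (;;) {
        // Render each eye into its own offscreen buffer, exactly as
        // the engine normally would.
        for (int eye = 0; eye < 2; eye++) {
            glBindFramebuffer(GL_FRAMEBUFFER, eyeFbo[eye]);
            RenderScene(eye);
        }

        // Do NOT call eglSwapBuffers(); that pushes the frame through
        // the compositor's triple-buffered queue and adds frames of
        // latency. Instead, hand the eye textures to VrLib so timewarp
        // can re-project them with the latest head pose and scan them
        // out to the front buffer in sync with the raster.
        SubmitToTimeWarp(eyeTexture[0], eyeTexture[1]); // placeholder
    }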

"seeme" wrote:
Hello,

I'm currently adding Moonlight support in one of our mobile in house engine. I'm still using the pixel shader we used with DK1 in a slightly tweaked version due to time constraints.

I didn't have the time to go in depth in the provided framework yet, but as in house demos will be available, I hope I'll have the time to improve on this basic implementation.

sfaok
Protege
I'm attempting an underwater experience based on a demo called Ocean Rift that I released last summer. The redesign is basically a set of snow-globe-like scenes (coral reef, shipwreck, cave, dolphins, prehistoric, etc.) that you can teleport between. I'm using procedural animation for the big animals.

The biggest challenge I've run into so far is that a lot of underwater effects are alpha-based: swaying grass, floating particles, bubbles, volumetric fog, dancing light shafts; even the fins on schooling fish are alpha cutout. I'm currently experimenting with vertex-shader and polygonal alternatives.
Developer of Ocean Rift. Follow me on Twitter @sfaok

johnc
Honored Guest
If you are stuck with alpha cutout, GL_SAMPLE_ALPHA_TO_COVERAGE will help a bit with the 2x MSAA we recommend. If you are thresholding at 0.5, you would want your fragment shader to expand the passing 0.5 - 1.0 range to 0.0 - 1.0 in alpha for the coverage conversion.
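A minimal fragment shader sketch of that remap is below (GLSL ES; uTexture and vTexCoord are stand-in names, and the surface is assumed to be a 2x MSAA eye buffer with GL_SAMPLE_ALPHA_TO_COVERAGE enabled via glEnable):

    // Alpha-cutout fragment shader adapted for alpha-to-coverage.
    // Rather than discarding below the 0.5 threshold, remap the
    // surviving 0.5..1.0 alpha range to 0.0..1.0 so the coverage
    // conversion can dither partial coverage across the MSAA samples.
    precision mediump float;

    uniform sampler2D uTexture;   // stand-in name for the cutout texture
    varying vec2 vTexCoord;       // stand-in name for the interpolated UV

    void main() {
        vec4 color = texture2D(uTexture, vTexCoord);
        // 0.0 at the old 0.5 threshold, 1.0 at fully opaque.
        float coverage = clamp((color.a - 0.5) * 2.0, 0.0, 1.0);
        gl_FragColor = vec4(color.rgb, coverage);
    }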

It would certainly look a lot better if you can even roughly sort and fully blend the geometry.
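For reference, a rough sort-and-blend pass might look like the following (plain GL ES state; DrawSortedTransparents is a hypothetical engine call, and the sort is assumed to be a coarse per-object, back-to-front distance sort):

    // Rough blending pass for the transparent effects, run after the
    // opaque geometry. Depth TEST stays on so effects still clip
    // against the world; depth WRITE goes off so roughly-sorted
    // transparents don't incorrectly occlude each other.
    glEnable(GL_BLEND);
    glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
    glEnable(GL_DEPTH_TEST);
    glDepthMask(GL_FALSE);
    DrawSortedTransparents();   // hypothetical: draws far-to-near
    glDepthMask(GL_TRUE);
    glDisable(GL_BLEND);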

seeme
Honored Guest
"johnc" wrote:
When you say "the pixel shader we used with DK1" do you mean for drawing the environment, or doing the final distortion to the screen?

This probably needs to be made clear in the docs, but you really can't make an acceptable VR app on Android going through the conventional drawing and swapbuffers path, because the latency from triple buffering will just kill you. Draw into the eye buffers however you want, but let VrLib timewarp to the front buffer synced with the raster scan. If you can't integrate that with your engine, let me know the details, and I'll figure out what we need to do.

"seeme" wrote:
Hello,

I'm currently adding Moonlight support in one of our mobile in house engine. I'm still using the pixel shader we used with DK1 in a slightly tweaked version due to time constraints.

I didn't have the time to go in depth in the provided framework yet, but as in house demos will be available, I hope I'll have the time to improve on this basic implementation.


I'm aware of that issue, and now that I have a working sandbox, I'm diving into the implementation you provide with the SDK.

I'm having issues getting access to the logs, as the USB port is used by the headset (due to company policy, debugging over Wi-Fi is not an option), and the USB port on the side seems to carry only power.

On another note, it seems that the optical specs of the headset are not available through the C API. I guess that's part of the idea behind the timewarp you mentioned.

ahowland
Honored Guest
Debugging is definitely hard without adb over Wi-Fi. If that's a no-go, you'll just have to clear your log, run your tests, then pull the new entries over with adb logcat afterwards. It's the worst, I know.
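Concretely, that workflow is just two standard adb invocations around each test run:

    # With the device on USB, clear the log buffer before the run.
    adb logcat -c

    # Unplug, run the test in the headset, plug back in, then dump
    # everything the buffer accumulated in the meantime.
    adb logcat -d > run.log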

Also, that spare power-only USB jack is most likely going away in future dev kits. Apparently, due to (rare, possibly unconfirmed) reports of charging phones occasionally catching fire, there was concern over letting people charge a phone while it's on their face.