Oculus 0.6.0.0 SDK & Runtime Alpha Release (RC1)

cybereality
Grand Champion
The new 0.6.0.0 RC1 Oculus SDK is now available to the private alpha tester group.

This release is only stable on Direct Mode, so please make sure to be using that for testing.

Please download and provide any feedback as this is critical to fixing bugs before a public release.

oculus_runtime_sdk_0.6.0.0_win_RC1.exe:
https://s3.amazonaws.com/static.oculus. ... in_RC1.exe

ovr_sdk_all_src_0.6.0.0_RC1.zip:
https://s3.amazonaws.com/static.oculus. ... .0_RC1.zip

ovr_sdk_win_0.6.0.0_RC1.zip:
https://s3.amazonaws.com/static.oculus. ... .0_RC1.zip

ovr_unity_0.6.0.0_lib_RC1.zip:
https://s3.amazonaws.com/static.oculus. ... ib_RC1.zip
AMD Ryzen 7 1800X | MSI X370 Titanium | G.Skill 16GB DDR4 3200 | EVGA SuperNOVA 1000 | Corsair Hydro H110i Gigabyte RX Vega 64 x2 | Samsung 960 Evo M.2 500GB | Seagate FireCuda SSHD 2TB | Phanteks ENTHOO EVOLV
32 REPLIES

jherico
Adventurer
"Tojiro" wrote:
I know that Valve has a method for that, but I also don't believe they're using Timewarp. Timewarp makes it feasible that any given point on the texture may be sampled, depending on the app's latency and how vigorous the user's head motion is, which I'm pretty sure prevents this stencil method from being used.


Yes, that's true, but even now timewarp can produce black borders at the edges if your head motions are fast and unpredictable enough. All the stencil mechanism would do is change the shape of those black borders.

Actually, it occurs to me now that a stencil on the texture wouldn't really do any good (you could apply one during rendering to reduce your own load, but providing it to the SDK wouldn't really reduce their load during distortion). On the other hand, if you could provide the SDK with your stencil mesh at the time the distortion mesh is created, they'd be able to reduce the total amount of work done there.

And yes, this idea is based on the Valve GDC discussion of reducing pixels via stenciling.
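As a back-of-envelope sense of the potential win (my own numbers, not from the Valve talk or the SDK): if the visible lens area is approximated as an ellipse inscribed in the eye texture, the fraction of pixels a stencil could cull works out to 1 - pi/4, about 21.5%, regardless of resolution.

```python
import math

# Rough estimate of the distortion-pass pixel savings from a
# hidden-area stencil mesh. Assumption (mine, not from the thread):
# the visible area is an ellipse inscribed in the eye texture.
def stencil_savings(width, height):
    """Fraction of eye-texture pixels outside the inscribed ellipse."""
    total = width * height
    visible = math.pi * (width / 2.0) * (height / 2.0)  # ellipse area
    return (total - visible) / total

# The estimate is 1 - pi/4 for any resolution; the texture size
# below is illustrative, not a real SDK value.
print(f"{stencil_savings(1182, 1464):.1%}")  # -> 21.5%
```

Valve's actual mesh hugs the lens boundary more tightly than a plain ellipse, so the real savings would differ, but the order of magnitude is a fifth of the eye texture.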

I still think it's a mistake to remove all access to the distortion information. Granted, most developers will never want to use it, but some will want to experiment with pre-distortion mechanisms, even if it precludes the use of timewarp. Otherwise I don't see the point of ovrLayerType_Direct.
Brad Davis - Developer for High Fidelity Co-author of Oculus Rift in Action

brantlew
Adventurer
"jherico" wrote:
Otherwise I don't see the point of ovrLayerType_Direct.


There are some specialized use cases for it. It works great as a diagnostic or debug layer, for instance, if you need unwarped screen-space text outside of the visible FOV. The Config Util also uses this layer to display undistorted IPD measurement lines.

Tojiro
Protege
Just noticed that the Oculus Developers Guide is still instructing you to call ovrHmd_AttachToWindow in the Rendering Setup Outline. Probably want to correct that.

jherico
Adventurer
OK, here's a theory. Hiding the distortion mesh from the API means that OpenVR can never encapsulate the Oculus SDK past version 0.5.x, at least not while using their own compositor.

On the one hand, yes, it means that Valve won't be able to automatically incorporate new Oculus SDK features without doing any work. On the other hand, it's likely to fracture the community by forcing developers to choose between "best Oculus support" and "best Vive support" if they don't want to go down the path of supporting both OpenVR and the Oculus API.

But it's a bad decision.

For one thing, as soon as the OpenVR API was announced, several people who were depending on my Java Oculus SDK bindings asked me whether they should simply switch development to OpenVR, since it was available for cross-HMD development. I pointed out that the OpenVR API didn't seem to support timewarp, so for the best experience on an Oculus headset, writing directly to the native SDK was still the way to go. I've also said as much in a reddit thread on the OpenVR release.

But the response I got was that if it reduces the development burden on them, supporting multiple headsets with one API was preferable, even if the end result is a poorer user experience on the Rift.

Along with the fact that the lack of access to the distortion parameters is going to hurt experimenters, this just seems like cutting off your nose to spite your face. I think the apparent animosity between Oculus and Valve is causing Oculus to dig its own grave.

Yes, your compositor is better than Valve's on your headset, right now (at least for OpenGL), but I wouldn't expect that to last.

owenwp
Expert Protege
Attached logs and profile as requested.

jherico
Adventurer
One thing I noticed, until I was able to re-disable the HSW, was that it appeared to be rendered either incorrectly or at infinite distance. If it's rendered at infinite distance but superimposed over a scene with depth information in it, it can cause some discomfort as the eyes try to deal with the depth cue disparity between the scene over which it's being rendered and the distance at which it appears to be rendered.
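To put a rough number on that disparity: an overlay composited at optical infinity demands essentially zero vergence, so the conflict is approximately the vergence angle of whatever scene geometry sits behind it. A minimal sketch, assuming a typical 64 mm IPD (my figure, not from the SDK):

```python
import math

def vergence_deg(ipd_m, distance_m):
    """Vergence angle (degrees) when both eyes fixate a point at distance_m."""
    return math.degrees(2 * math.atan((ipd_m / 2) / distance_m))

ipd = 0.064  # 64 mm, a typical adult IPD (assumed)
# An overlay at infinite distance demands ~0 degrees of vergence,
# so the mismatch equals the scene point's vergence angle.
for d in (0.5, 1.0, 2.0):
    print(f"scene at {d} m: ~{vergence_deg(ipd, d):.1f} deg vergence mismatch")
```

For scene objects at arm's length this works out to several degrees of mismatch, which is plenty for the eyes to notice while trying to fuse the overlay and the scene at once.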

Tojiro
Protege
"jherico" wrote:
One thing I noticed, until I was able to re-disable the HSW, was that it appeared to be rendered either incorrectly or at infinite distance. If it's rendered at infinite distance but superimposed over a scene with depth information in it, it can cause some discomfort as the eyes try to deal with the depth cue disparity between the scene over which it's being rendered and the distance at which it appears to be rendered.


I've noticed the same thing, but it seems inconsistent. Sometimes my eyes can resolve the HSW very easily, but other times I have a lot of trouble focusing and the text appears doubled up unless I stare at it really hard. I noticed that it seems most "out-of-focus" at the top and bottom of the message, though, so I had assumed it was a problem with me not consistently hitting the focal sweet spot as I pulled the headset off and on repeatedly. Maybe not.

DeanOfTheDriver
Protege
"owenwp" wrote:
Direct mode is not working for me on my laptop with the demo scene in the config utility.


Optimus is not supported in either extended or direct mode. It adds significant latency in either configuration, and given how cross-adapter resources are managed, it is additionally problematic for direct mode. We haven't completely ruled out laptop support, but it's not our focus at the moment.

cybereality
Grand Champion
@owenwp: Were the previous runtimes (0.5.0.1 or 0.4.4) working on that setup?

owenwp
Expert Protege
0.5 and all previous versions work perfectly in direct mode, with no added latency compared to my beefy work PC as long as the app can hit 75 Hz. I have been using it occasionally for VR development for quite a while, and it has been my demo machine since the DK2 launch.