Forum Discussion

🚨 This forum is archived and read-only. To submit a forum post, please visit our new Developer Forum. 🚨

11 Replies

  • suyash
    Honored Guest

    Hi mouse_bear, I read the blog and am quite excited to try the Presence Platform. How do I use these APIs in native Android and Unity apps, please? Are they part of the SDK? Is there any technical documentation I can refer to in order to learn more about the Presence Platform? Thanks

  • Hi, I have some feedback.

    I'm very disappointed that v34 was released before the SDK was.

    It broke my Passthrough-enabled app. Now I have angry people who bought my game and can't play it anymore.

    I can't fix the issue because the SDK didn't release at the same time.

    Please try to avoid that in the future, because as an indie dev it really hurts my chances of surviving!

    Regards,

    Caz

    • JeffNik
      MVP

      It probably wasn't a great idea on your part to charge people for an app that used features that were not released for publication. "Experimental" means that the API will probably change or could even be dropped. I'm not trying to sound harsh, but Oculus wasn't the cause of your issue.

  • The features seem excellent and a great start to an MR platform. I would request image tracking, and maybe in the future a method for assigning objects to expand object tracking, plus face and body detection. I'm sure there is a long list of features in the works; I'm just voicing my vote. UNREAL SUPPORT PLEASE.

  • We need a ton more Unreal support. We NEED to be able to RUN UNREAL IN THE BACKGROUND.

    I can't release my game until I can. Please help!

  • I am looking for online guidance on how to set up the Hands SDK and the Presence Platform. I am a bit confused: it seems I must use the XR Plugin in order to enable OpenXR support, which is necessary for Passthrough, but the Hands SDK lacks basic support when using OpenXR. Is there a solution? Thanks.

    • jamesehenderson
      Retired Support

      We identified an issue with the OpenXR hands and are landing a fix with v35. Please let me know if updating to the v35 OS+SDK does not resolve your issue!

      • ralphVR
        Protege

        Thanks for the update. I will give it a go and report back if v35 does not resolve my issue.

  • Currently, with v40 of the Unity Integration, there's an issue with the reticles only being displayed in one eye in all of the "DistanceGrab" examples. This happens when using Multiview with Vulkan as the Graphics API.

     

    It's trivial to add support for multiview to the "PolylineUnlit" shader:

     

    /************************************************************************************
    Copyright : Copyright (c) Facebook Technologies, LLC and its affiliates. All rights reserved.
    
    Your use of this SDK or tool is subject to the Oculus SDK License Agreement, available at
    https://developer.oculus.com/licenses/oculussdk/
    
    Unless required by applicable law or agreed to in writing, the Utilities SDK distributed
    under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF
    ANY KIND, either express or implied. See the License for the specific language governing
    permissions and limitations under the License.
    ************************************************************************************/
    
    Shader "Custom/PolylineUnlit" {
    
        Properties { }
    
        SubShader
        {
    
            Tags { "Queue"="Geometry" }
    
            Pass
            {
                CGPROGRAM
                #pragma vertex vert
                #pragma fragment frag
                #pragma multi_compile_fwdbase nolightmap nodirlightmap nodynlightmap novertexlight
                #pragma target 5.0
                #include "UnityCG.cginc"
                #include "./CubePointToSegment.cginc"
                #include "../ThirdParty/Shaders/CapsuleRayIntersect.cginc"
    
                #if SHADER_TARGET >= 45
                StructuredBuffer<float4> _PositionBuffer;
                StructuredBuffer<float4> _ColorBuffer;
                #endif
    
                struct v2f {
                    float4 pos : SV_POSITION;
                    sample float3 worldPos : TEXCOORD0;
                    float4 p0 : TEXCOORD1;
                    float4 p1 : TEXCOORD2;
                    float4 col0 : TEXCOORD3;
                    float4 col1 : TEXCOORD4;
                    UNITY_VERTEX_OUTPUT_STEREO
                };
    
                float _Scale;
                float4x4 _LocalToWorld;
    
                v2f vert(appdata_full v, uint instanceID : SV_InstanceID)
                {
                    #if SHADER_TARGET >= 45
                        float4 p0 = _PositionBuffer[instanceID * 2];
                        float4 p1 = _PositionBuffer[instanceID * 2 + 1];
                        float4 col0 = _ColorBuffer[instanceID * 2];
                        float4 col1 = _ColorBuffer[instanceID * 2 + 1];
                    #else
                        float4 p0 = 0;
                        float4 p1 = 0;
                        float4 col0 = 0;
                        float4 col1 = 0;
                    #endif
    
                    v2f o;
    
                    UNITY_SETUP_INSTANCE_ID(v);
                    UNITY_INITIALIZE_OUTPUT(v2f, o);
                    UNITY_INITIALIZE_VERTEX_OUTPUT_STEREO(o);
    
                    float3 localPos = orientCubePointToSegmentWithWidth(v.vertex.xyz, p0.xyz, p1.xyz, p0.w, p0.w);
                    float3 worldPos = mul(_LocalToWorld, float4(localPos, 1.0)).xyz;
    
                    // Apply MVP matrix to model
                    o.pos = mul(UNITY_MATRIX_VP, float4(worldPos, 1.0));
                    o.worldPos = worldPos;
                    o.p0 = float4(mul(_LocalToWorld, float4(p0.xyz, 1.0)).xyz, p0.w);
                    o.p1 = float4(mul(_LocalToWorld, float4(p1.xyz, 1.0)).xyz, p1.w);
                    o.col0 = col0;
                    o.col1 = col1;
    
                    return o;
                }
    
                fixed4 frag (v2f i, out float out_depth : SV_Depth) : SV_Target
                {
                    float3 rayDir = normalize(i.worldPos - _WorldSpaceCameraPos.xyz);
                    float dist = capIntersect(_WorldSpaceCameraPos.xyz, rayDir, i.p0, i.p1,
                                            i.p0.w/2.0f * _Scale); // capsule radius taken from p0.w, scaled
                    clip(dist);
    
                    // calculate world space hit position
                    float3 hitPos = _WorldSpaceCameraPos.xyz + rayDir * dist;
    
                    // set output depth
                    float4 clipPos = UnityWorldToClipPos(hitPos);
    
                    out_depth =  clipPos.z / clipPos.w;
    
                    #if !defined(UNITY_REVERSED_Z)
                    out_depth = out_depth * 0.5 + 0.5;
                    #endif
    
                    float3 vec = i.p1.xyz - i.p0.xyz;
                    float dotvecvec = dot(vec, vec);
                    float t = 0.0f;
                    if(abs(dotvecvec) > 0.0f)
                    {
                        float3 toHit = hitPos - i.p0.xyz;
                        t = dot(toHit, vec)/dotvecvec;
                    }
                    return float4(lerp(i.col0, i.col1, t).rgb, 1.0f);
                }
                ENDCG
            }
    
        }
    }

    However, the shader is written in such a way that the dashed lines won't match up between the two eyes, leading to a stereo disparity issue. Any ideas on how to fix that?
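
    One possible direction, assuming the mismatch comes from the fragment math depending on the per-eye `_WorldSpaceCameraPos`: keep the per-eye ray for the visibility and depth test, but derive the color/dash parameter `t` from a single, stereo-invariant viewpoint. The sketch below uses a hypothetical `_CenterEyePos` uniform (not part of the SDK; it would have to be set each frame from C# via `material.SetVector("_CenterEyePos", centerEyeTransform.position)`) and shows a modified `frag` replacing the original. This is a sketch of the idea, not a tested fix:

    ```hlsl
    // Hypothetical uniform: the center-eye (head) position in world space,
    // supplied from script so that both eyes see the same value.
    float3 _CenterEyePos;

    fixed4 frag (v2f i, out float out_depth : SV_Depth) : SV_Target
    {
        // Per-eye ray: still used for the capsule hit test and depth output,
        // so per-eye parallax and occlusion remain correct.
        float3 rayDir = normalize(i.worldPos - _WorldSpaceCameraPos.xyz);
        float dist = capIntersect(_WorldSpaceCameraPos.xyz, rayDir, i.p0, i.p1,
                                  i.p0.w / 2.0f * _Scale);
        clip(dist);

        float3 hitPos = _WorldSpaceCameraPos.xyz + rayDir * dist;
        float4 clipPos = UnityWorldToClipPos(hitPos);
        out_depth = clipPos.z / clipPos.w;
        #if !defined(UNITY_REVERSED_Z)
        out_depth = out_depth * 0.5 + 0.5;
        #endif

        // Stereo-invariant parameterization: intersect again from the center
        // eye, so both eyes compute the same t along the segment and the
        // dash/color pattern lines up. max() guards the rare case where the
        // center ray grazes past the capsule and the intersection misses.
        float3 centerRayDir = normalize(i.worldPos - _CenterEyePos);
        float centerDist = capIntersect(_CenterEyePos, centerRayDir, i.p0, i.p1,
                                        i.p0.w / 2.0f * _Scale);
        float3 centerHit = _CenterEyePos + centerRayDir * max(centerDist, 0.0f);

        float3 vec = i.p1.xyz - i.p0.xyz;
        float dotvecvec = dot(vec, vec);
        float t = 0.0f;
        if (abs(dotvecvec) > 0.0f)
        {
            t = dot(centerHit - i.p0.xyz, vec) / dotvecvec;
        }
        return float4(lerp(i.col0, i.col1, t).rgb, 1.0f);
    }
    ```

    The second `capIntersect` call costs extra ALU per fragment; if that matters, a cheaper variant could project `i.worldPos` itself onto the segment instead of re-intersecting.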