Virtual Keyboard, Interaction SDK and OVRInteraction not Working

knewk1
Honored Guest

Hi,

Using Unity and Quest 3. I need to use the Virtual Keyboard with OVRInteraction, but the keyboard does not work with OVR Interaction SDK components attached to the OVRCameraRig. I followed the docs precisely but no luck. Does anyone have suggestions?

Working with the native Virtual Keyboard sample OVRCameraRig (see attached screenshot).

Not working after adding OVRInteraction and OVRHands and following the docs (see attached screenshot).


bilgekagan
Protege

As far as I can see, it looks like you are experimenting with the sample provided by Meta. Are you using the updated Meta XR All-in-One SDK? If so, you can use Building Blocks, which appear when you click the Meta icon in the bottom right; on the Building Blocks page you can add the Virtual Keyboard easily. If not, check the OVRVirtualKeyboard script attached to the keyboard and make sure the Hand Left and Hand Right references are correctly assigned.
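
If it helps, here is a small throwaway debug script for checking the hand side of that setup at runtime. It is only a sketch using standard Unity and OVRHand calls; the class name KeyboardHandSanityCheck is made up for this example.

using UnityEngine;

// Hypothetical debugging helper (not part of the SDK): verifies that the scene
// actually contains the OVRHand components the OVRVirtualKeyboard hand
// references are supposed to point at.
public class KeyboardHandSanityCheck : MonoBehaviour
{
    private void Start()
    {
        var hands = FindObjectsOfType<OVRHand>();
        if (hands.Length < 2)
        {
            Debug.LogWarning(
                $"Expected two OVRHand components in the scene, found {hands.Length}. " +
                "Check the Hand Left / Hand Right fields on the OVRVirtualKeyboard component.");
        }
        foreach (var hand in hands)
        {
            Debug.Log($"OVRHand found on '{hand.name}'");
        }
    }
}

Drop it on any GameObject in the scene and check the Console on play.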

knewk1
Honored Guest

Thanks for the help. The problem appears to be with Quest Link: it doesn't work when running in the Unity editor over Quest Link, but it does work when built to the device. The latest Quest Link update makes things worse, with the Virtual Keyboard not appearing in front of the OVRCameraRig. Please retest the Quest Link software with the OVR Virtual Keyboard sample, and connect the OVR Virtual Keyboard to the OVR Interaction SDK hands.

Big_Flex
Meta Employee

Hi @knewk1, the Virtual Keyboard out of the box accepts an OVRHand binding, but with the ISDK running you won't have an OVRHand that is updated. Here is a sample script that shows how an ISDK "synthetic hand visual" can be used to interact with the Virtual Keyboard.

using System;
using System.Collections.Generic;
using Oculus.Interaction;
using Oculus.Interaction.Input;
using UnityEngine;
/// <summary>
/// Handles Interaction SDK input for the virtual keyboard
/// </summary>
public class VirtualKeyboardISDKHandler : MonoBehaviour
{
    [SerializeField]
    private OVRVirtualKeyboard _virtualKeyboard;

    [SerializeField]
    private SyntheticHand[] _hands;

    // Keep a handler per hand so the same delegate can be unsubscribed later;
    // must be initialized before Start subscribes.
    private Dictionary<SyntheticHand, Action> _handlers = new Dictionary<SyntheticHand, Action>();

    private void Start()
    {
        foreach (var hand in _hands)
        {
            if (!hand)
            {
                continue;
            }
            _handlers[hand] = () =>
            {
                OnHandUpdated(hand);
            };
            hand.WhenHandUpdated += _handlers[hand];
        }
    }

    private void OnDestroy()
    {
        foreach (var hand in _hands)
        {
            if (!hand)
            {
                continue;
            }
            hand.WhenHandUpdated -= _handlers[hand];
        }
    }

    private void OnHandUpdated(SyntheticHand hand)
    {
        if (!hand.IsTrackedDataValid)
        {
            return;
        }
        hand.GetJointPose(HandJointId.HandIndexTip, out var indexTipPose);

        // Look up the rendered hand visual so its wrist root transform can be
        // passed along with the input.
        var handVisual = hand.GetComponentInChildren<HandVisual>();

        // Forward the fingertip position, handedness and pinch state to the
        // keyboard as direct input.
        _virtualKeyboard.SendVirtualKeyboardDirectInput(
            indexTipPose.position,
            (hand.Handedness == Handedness.Right)
                ? OVRVirtualKeyboard.InputSource.HandRight
                : OVRVirtualKeyboard.InputSource.HandLeft,
            hand.GetIndexFingerIsPinching(),
            handVisual?.GetTransformByHandJointId(HandJointId.HandWristRoot)
        );
    }
}
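
To wire this up (a sketch, assuming the standard OVRCameraRig + OVRInteraction layout): add the script to any GameObject in the scene, assign the OVRVirtualKeyboard reference, and drag the left and right SyntheticHand components from the OVRInteraction hand hierarchy into the _hands array.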


I'm running into similar problems.

The Virtual Keyboard sample has a weird reverse setup, with a script on the OVRVirtualKeyboard GameObject that handles the InputField. Does this mean I need to create a Virtual Keyboard for EVERY InputField I have? I sure hope not.

So I hope someone can help me with the simple, basic, and most common situation that 99% of developers will be using (a rough sketch of what I'm after follows the list):

  • A Unity canvas with a bunch of InputFields (first name, last name, username, password, that kind of thing)
  • Using controllers, with the Interaction SDK. No need for hand-tracking
  • When I click on an InputField the virtual keyboard shows up
  • The text I type is handled by the Listener that I linked to InputField.OnEndEdit
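
A rough sketch of that flow, where SharedKeyboardBinding and SetTargetInputField are made-up placeholders for whatever script actually commits keyboard text into an InputField (the point being that one keyboard instance is shared and only retargeted when focus changes):

using UnityEngine;
using UnityEngine.EventSystems;
using UnityEngine.UI;

// Placeholder for the component that actually commits keyboard text into an
// InputField (in the sample, the script on the OVRVirtualKeyboard object).
// Substitute your real binding here.
public abstract class SharedKeyboardBinding : MonoBehaviour
{
    public abstract void SetTargetInputField(InputField field);
}

// Rough sketch: watches the EventSystem selection and points a single shared
// keyboard binding at whichever InputField currently has focus.
public class KeyboardFocusRouter : MonoBehaviour
{
    [SerializeField]
    private SharedKeyboardBinding _keyboardBinding;

    private GameObject _lastSelected;

    private void Update()
    {
        var selected = EventSystem.current != null
            ? EventSystem.current.currentSelectedGameObject
            : null;
        if (selected == _lastSelected)
        {
            return;
        }
        _lastSelected = selected;

        // When a new InputField gains focus, retarget the shared keyboard;
        // text committed by the keyboard then reaches that field's onEndEdit
        // listeners as usual.
        var inputField = selected != null ? selected.GetComponent<InputField>() : null;
        if (inputField != null)
        {
            _keyboardBinding.SetTargetInputField(inputField);
        }
    }
}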

@bilgekagan When I go to Oculus->Tools->Building Blocks, I see only one block, Experimental. I know I've seen more in another project; do you have any idea why I'm missing them? I am on All-in-One SDK version 65.0.0.0. I also get a lot of errors in the console:
NullReferenceException: Object reference not set to an instance of an object
Meta.XR.BuildingBlocks.Editor.Utils.CollectPackageDependencies (Meta.XR.BuildingBlocks.Editor.BlockData blockData, System.Collections.Generic.HashSet`1[T] set)

Thanks

Anton111111
Expert Protege

When I tried OVRKeyboard I had another issue: the rays intersect the keyboard in the wrong place, and no hover cursor is shown on the keyboard. Does anyone know why?

I'm having a similar issue. The rays intersect, but they're offset from the visual indicator for the raycast.

So if I aim with the right controller, the ray is on the Q key but intersects/activates the W key. It's about 50% off, so if I'm on the left side of the Q key it works, but the center of the Q key registers as a hit on the W key. Very strange.

Using the out-of-the-box Building Blocks. Poke with controllers works, as does hands.

Did you ever find a solution?

I managed to find only one solution: throw this keyboard in the trash and use the system one with a fix from here https://communityforums.atmeta.com/t5/Unity-Development/System-keyboard-not-opening-every-time-in-Me...