Oculus Avatar Hands with Controller example
Hi, I need to teach a player how to control a project I am creating, and it would help a lot if I could use the Oculus Avatar hands with the controllers in them, because they react when you place your fingers on each button and move the thumbstick. They can be found in the Avatar Samples. Does anyone know how I can migrate these into my project, or where I can find a tutorial on doing something similar?

Unreal bugs/errors with Rift/Go/Quest and their status
I've made a list of current bugs/errors encountered in Unreal when developing for Rift / Quest / Go. Some can be somewhat fixed or avoided; some are still very present. I encourage people to add to the list or bring their solutions. It would also be great to have input from @NinjaGaijin and @Ross_Beef on a possible ETA or resolution. ;) This is up to date with the latest Unreal (4.22.2) Oculus integration found on GitHub, dated June 7. Here it is:

1. Retrieve Oculus ID and Verify Entitlement fail on Go/Quest
   a. STATUS: can be fixed by changing a line in OculusIdentityCallbackProxy.cpp. See the end of the list for the complete solution.
   b. Reference: https://developer.oculus.com/bugs/bug/2343978258981741/
2. Entitlement fail on Quest if no access to the Quest API
   a. STATUS: can be fixed by commenting out a line in OculusMobile_APL.xml. See the end of the list for the complete solution.
3. PlayDynamicForceFeedback has errors when nativizing assets
   a. STATUS: can be fixed by changing \Engine\Source\Runtime\Engine\Classes\GameFramework\PlayerController.h line 1053 from private to public, because 'PlayDynamicForceFeedback' is a private member of 'APlayerController', for reasons unknown.
   b. Reference: https://answers.unrealengine.com/questions/831405/playdynamicforcefeedback-node-in-420-fails-to-cook.html
4. Oculus Go/Quest: no VoIP and/or no LipSync with Oculus Avatars
   a. STATUS: NOT FIXED
   b. Reference: https://developer.oculus.com/bugs/bug/471102320355137/
   c. Background info: Android only allows access to the microphone from a single process. This wasn't an issue when networking avatars previously, as the mic input wasn't being used. But with the Expressive update, the mic specifically needs to be run through the OVRLipSync plugin to generate blend shapes and drive the mouth shapes. Trying to hook up the mic to both VoIP and LipSync therefore causes an inevitable race condition, and the loser gets a bunch of zeros. So there's either no networked audio or no blend shapes.
5. Multiplayer Oculus Avatar is broken with the official Oculus Avatar plugin
   a. STATUS: NOT FIXED
   b. Use our in-house patched plugin/template, or use Photon Engine.
6. Oculus Rift / Rift S: enabling Splash in Project Settings / Oculus Settings crashes when loading the next level
   a. STATUS: NOT FIXED
   b. Use blueprints to Set and Show Splash Screen instead.
7. Stereo layers are translucent in ES3.1 (Oculus Go/Quest) / Vulkan (Oculus Quest)
   a. STATUS: NOT FIXED
8. Performance issues when a render target is added to the spectator screen (VR) in a packaged game
   a. STATUS: FIXED IN 4.23
   b. Reference: https://issues.unrealengine.com/issue/UE-70352
   c. In 4.22, don't use spectator screens, or use them only in the editor.
9. Cannot access Oculus Quest platform features
   a. STATUS: you have to be greenlit by Oculus before doing so; Oculus for Business users will have access to the Business Suite in Q3-Q4 2019.
10. A stereo layer rendering a widget cannot be destroyed
   a. STATUS: NOT FIXED
   b. Reference: https://answers.unrealengine.com/questions/829084/stereo-layer-isnt-destroyed-on-end-play.html
   c. Avoid using stereo layers if you need to destroy them.
11. Quest: Oculus Avatars and Oculus Audio use 32-bit Android libraries (armeabi-v7a); no arm64 libraries are available
   a. STATUS: NOT FIXED
   b. Use the 32-bit libraries on Quest if using Oculus Audio and/or Avatars.

Oculus Avatars in a multiplayer session
Some months ago you released the Oculus Avatar SDK for Unreal. With it you also released a sample project which shows how to spawn local and remote avatars (classes ALocalAvatar and ARemoteAvatar). The same project does a sort of packet recording for replicating local movements to the remote avatars. What we noticed is that everything is kept local: the structure which holds the recorded packets is opaque, and this is actually unusable for a real multiplayer application, since all the local avatar's movements are stored in a local structure and all "remote" avatars read from it and replay the movements on themselves. Since we would like to use Oculus Avatars, what we are thinking of doing is tracking the hand and head transforms and replicating them from one player to another: a sort of custom packet recording without opaque structures. Before doing that, is there a "right way" of using your ALocalAvatar and ARemoteAvatar to replicate movements in a real multiplayer application? If not, have you scheduled anything for the near future which could help us with this?
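For anyone attempting the custom approach described in the question (replicating head and hand transforms yourself instead of relying on the SDK's opaque packet recording), the core of it is serializing a handful of transforms into a compact packet each tick and applying them on the remote side. Below is a hedged, engine-free sketch in plain C++; in a real Unreal project you would use a replicated USTRUCT or an RPC instead of raw buffers, and the type and field names here (AvatarPosePacket, Serialize, Deserialize) are illustrative, not part of the Oculus SDK.

```cpp
#include <cassert>
#include <cstdint>
#include <cstring>
#include <vector>

// Minimal pose types; in Unreal these would be FVector / FQuat.
struct Vec3 { float x, y, z; };
struct Quat { float x, y, z, w; };
struct Pose { Vec3 position; Quat rotation; };

// One packet per tick: head plus both hands.
// Illustrative layout, not the SDK's internal format.
struct AvatarPosePacket {
    Pose head;
    Pose leftHand;
    Pose rightHand;
};

// Flatten the packet into a byte buffer for sending (21 floats = 84 bytes).
// All members are floats, so the struct is trivially copyable with no padding.
std::vector<uint8_t> Serialize(const AvatarPosePacket& p) {
    std::vector<uint8_t> buf(sizeof(AvatarPosePacket));
    std::memcpy(buf.data(), &p, sizeof(AvatarPosePacket));
    return buf;
}

// Rebuild the packet on the receiving side.
AvatarPosePacket Deserialize(const std::vector<uint8_t>& buf) {
    assert(buf.size() == sizeof(AvatarPosePacket));
    AvatarPosePacket p{};
    std::memcpy(&p, buf.data(), sizeof(AvatarPosePacket));
    return p;
}
```

On the receiving side you would typically interpolate between the last two received packets rather than snapping the remote avatar to each new pose, and quantizing the floats (e.g. to 16 bits) is a common way to cut bandwidth; both are design choices on top of this sketch, not requirements.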