The phalanges are driving me mad!
Hi all, I'm trying to add a simple sphere collider at a hand joint: the index tip. I thought this would be easy: in the documentation I found the accessor function gethandtransform(), so I grabbed the GameObject of that transform and tried to add the sphere collider... no luck. Next I tried to create a GameObject at that point, set its parent to the gethandtransform result, set its position, etc., and still no luck. Then I spotted in the hierarchy that the right hand object is disabled. How can I achieve what I want to do?
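One reply-style sketch for the question above: the hand GameObjects stay disabled until hand tracking actually starts, so a collider added in Start() on a still-disabled object never takes effect. A minimal workaround is to wait for the skeleton to initialize and then attach the collider to the bone transform. This assumes the Oculus Integration's OVRSkeleton/OVRBone API; member names may differ between SDK versions.

```csharp
using System.Linq;
using UnityEngine;

// Waits until the OVRSkeleton has initialized (the hand objects are disabled
// until tracking starts), then attaches a trigger SphereCollider to the
// index-tip bone. Sketch only; adjust names to your Oculus Integration version.
public class IndexTipCollider : MonoBehaviour
{
    [SerializeField] private OVRSkeleton skeleton; // e.g. the right-hand skeleton

    private bool attached;

    private void Update()
    {
        if (attached || skeleton == null || !skeleton.IsInitialized)
            return;

        OVRBone tip = skeleton.Bones
            .FirstOrDefault(b => b.Id == OVRSkeleton.BoneId.Hand_IndexTip);
        if (tip == null)
            return;

        SphereCollider col = tip.Transform.gameObject.AddComponent<SphereCollider>();
        col.radius = 0.01f;   // roughly fingertip-sized
        col.isTrigger = true; // avoid physics forces pushing the tracked bone
        attached = true;
    }
}
```

Polling in Update() is deliberate here: there is no guaranteed callback for "tracking became available" across all SDK versions, and the check is cheap once `attached` is set.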
Firewall rule sets

Hi, I am currently developing a multiplayer VR experience in Unity with the Oculus SDK integration. Our company has strict firewall settings, and I need to give our IT department the IP addresses, ports, and protocols for new rule sets. My current issue is that I cannot use the Oculus Store or friends list, and the SDK is not able to load custom avatars by ID from the Oculus servers. We have already implemented some rules: we set the IP range 31.13.84.xxx (TCP/TLS) and 69.171.250.48, but we still seem to be missing something. My app only loads the default avatar, not the customization. Everything works on a normal Internet connection, so it's definitely our IT firewall. It would be great to get an exhaustive list of IP addresses and ports to open that I can send to our IT department so we can properly use all Oculus services. Best

Custom Oculus Avatar Mesh on Unity
I am using the following documentation to load a personalized avatar into my Unity project: https://developer.oculus.com/documentation/avatarsdk/latest/concepts/avatars-sdk-unity/ We have followed steps 1-4 of this documentation, including creating an Oculus app ID and pasting it into both OculusAvatars and OculusPlatform. I am running into two issues:
1. "PlatformManager.cs" already exists in the project (it is included in the OvrAvatar folder).
2. When I use a unique name like "PlatformManager1.cs" and continue with the remaining steps, my personalized avatar does not load; the standard blue avatar remains. The only console error I get is "Unrecognized message type: 78859427".
Am I doing something wrong? Is there an updated guide for including a personalized avatar in Unity? Please help!

Oculus Avatars hands animations on a full body avatar?
I've implemented a full-body avatar using Final IK in my prototype, and while the results are beyond my expectations so far, it's missing hand animations. So I was wondering: is there a way to use the excellent Oculus Avatar hand animations on a standard humanoid rig?

Oculus Avatar SDK for Unity
Hi. I'm trying to figure out how the Oculus Avatar SDK works. For testing I'm using an Oculus Quest. The problem is that I can't get the user ID, which I need in order to load the avatar of the user's own Oculus account. Here is the relevant part of the code:

    Oculus.Platform.Users.GetLoggedInUser().OnComplete(OnGetLoggedIdUser);

    private void OnGetLoggedIdUser(Message<User> message)
    {
        if (!message.IsError)
        {
            Debug.Log("========ID of User========" + message.Data.ID.ToString());
        }
    }

But the message has an error, which says "The user isn't signed in or their account state wasn't in a recoverable state." What should I do to fix this?
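A common cause of this error is querying the Platform SDK before it has finished initializing, or before the entitlement check has passed. The sketch below orders the calls explicitly; the API names follow the Oculus Platform SDK for Unity and may differ between versions, so treat this as a starting point rather than a definitive fix.

```csharp
using Oculus.Platform;
using Oculus.Platform.Models;
using UnityEngine;

// Initialize the Platform SDK, verify entitlement, and only then request
// the logged-in user. Each step waits for the previous one to complete.
public class PlatformLogin : MonoBehaviour
{
    private void Awake()
    {
        Core.AsyncInitialize().OnComplete(OnInitialized);
    }

    private void OnInitialized(Message<PlatformInitialize> msg)
    {
        if (msg.IsError) { Debug.LogError(msg.GetError().Message); return; }
        Entitlements.IsUserEntitledToApplication().OnComplete(OnEntitled);
    }

    private void OnEntitled(Message msg)
    {
        if (msg.IsError) { Debug.LogError("Not entitled: " + msg.GetError().Message); return; }
        Users.GetLoggedInUser().OnComplete(OnGetLoggedInUser);
    }

    private void OnGetLoggedInUser(Message<User> msg)
    {
        if (!msg.IsError)
            Debug.Log("Logged-in user ID: " + msg.Data.ID);
    }
}
```

Also worth checking: the headset must be signed in to an account that is entitled to the app (for development builds, the account usually needs to be registered as a developer or tester of the app).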
libovravatar.dll (version 1.2.3.0) Causing unlogged crashes

Context info (from OVRManager's console log): Unity v2018.2.21f1, Oculus Utilities v1.37.0, OVRPlugin v1.37.0, SDK v1.38.0. (Yesterday the SDK version was also 1.37.0; I'm not quite sure how that updated, but the crashes have happened with both versions.) The crashes are not logged in Unity's output log, nor are there any hints as far as I could tell. I used the OculusLogGatherer tool to finally find what was causing the crash; I'll paste the event logs below.

The Unity application we're developing is a crane simulation in which we teach the player how to give specific hand gestures/motions to signal a crane. To help with that, we use the Oculus Avatar system to record a developer giving the correct signals, then play the recording back at runtime for the player to mimic. This project has been in development for two years, and we have updated our Unity version and the Oculus SDK and plugin a few times over that span. The signal recordings were taken a long time ago, and we have updated our project since then.

So far, the crashes have not occurred predictably: we can sit and watch the recordings play back for several minutes, starting and stopping as expected. Then, in the middle of one of the recordings, the editor (or build) will crash. I haven't been able to establish a clear pattern, but it might be related to starting a new recording soon after ending a previous one; that is a low-confidence guess. The crashes occur both on our development machines and on Valve's test machines, which is unfortunately how we first noticed them. I'm guessing our QA team flew through the tutorial so fast that the problem didn't show up.
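For context, the recording side described above is typically implemented the way the RemoteLoopback sample does it: subscribe to the local avatar's packet event and serialize each packet with a small header. The sketch below is modeled on that sample; member names (`RecordPackets`, `PacketRecorded`, `PacketEventArgs`) follow Avatar SDK 1.x and may differ in other versions.

```csharp
using System;
using System.IO;
using UnityEngine;
using Oculus.Avatar;

// Records avatar packets to a file as: sequence (int32), size (int32), bytes.
// This is the layout the playback code later in this post expects to read.
public class AvatarRecorder : MonoBehaviour
{
    [SerializeField] private OvrAvatar localAvatar;
    private BinaryWriter writer;
    private int sequence;

    private void Start()
    {
        writer = new BinaryWriter(File.Open("recording.bytes", FileMode.Create));
        localAvatar.RecordPackets = true;               // ask the SDK to emit packets
        localAvatar.PacketRecorded += OnPacketRecorded; // one event per avatar frame
    }

    private void OnPacketRecorded(object sender, OvrAvatar.PacketEventArgs args)
    {
        // Serialize the native packet into a managed byte array.
        int size = (int)CAPI.ovrAvatarPacket_GetSize(args.Packet.ovrNativePacket);
        byte[] data = new byte[size];
        CAPI.ovrAvatarPacket_Write(args.Packet.ovrNativePacket, (UInt32)size, data);

        writer.Write(sequence++);
        writer.Write(size);
        writer.Write(data);
    }

    private void OnDestroy()
    {
        if (writer != null) writer.Close();
    }
}
```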
Here are the event logs capturing the crashes, some in the editor and one in the player:

Faulting application name: Signal Person Training.exe, version: 2018.2.21.8949, time stamp: 0x5c73f900
Faulting module name: libovravatar.DLL, version: 1.2.3.0, time stamp: 0x5cdb398c
Exception code: 0xc0000409
Fault offset: 0x000000000004c23c
Faulting process id: 0x1968
Faulting application start time: 0x01d5212f09126f1a
Faulting application path: C:\Program Files (x86)\Steam\steamapps\common\CLS Signal Person\Signal Person Training.exe
Faulting module path: C:\Program Files\Oculus\Support\oculus-runtime\libovravatar.DLL
Report Id: c1a76a9f-e118-4675-bae6-6308941d7cf2
Faulting package full name:
Faulting package-relative application ID:

Faulting application name: Unity.exe, version: 2018.2.21.8949, time stamp: 0x5c73f759
Faulting module name: libovravatar.DLL, version: 1.2.3.0, time stamp: 0x5d003cf3
Exception code: 0xc0000409
Fault offset: 0x000000000004cc3c
Faulting process id: 0x3188
Faulting application start time: 0x01d521e815270000
Faulting application path: C:\Program Files\Unity\Hub\Editor\2018.2.21f1\Editor\Unity.exe
Faulting module path: C:\Program Files\Oculus\Support\oculus-runtime\libovravatar.DLL
Report Id: 208d504e-7332-474f-b614-47dc04020bc3
Faulting package full name:
Faulting package-relative application ID:

Faulting application name: Unity.exe, version: 2018.2.21.8949, time stamp: 0x5c73f759
Faulting module name: libovravatar.DLL, version: 1.2.3.0, time stamp: 0x5d003cf3
Exception code: 0xc0000409
Fault offset: 0x000000000004cc3c
Faulting process id: 0x1e2c
Faulting application start time: 0x01d521f05faa3373
Faulting application path: C:\Program Files\Unity\Hub\Editor\2018.2.21f1\Editor\Unity.exe
Faulting module path: C:\Program Files\Oculus\Support\oculus-runtime\libovravatar.DLL
Report Id: 59ac9a4a-6b19-42d5-badf-eedd001667db

After investigating the reported exception code, it looks like
this is a stack buffer overflow fault, according to this Microsoft forum thread (https://social.msdn.microsoft.com/Forums/sqlserver/en-US/aa84a49e-6bfe-4b89-928a-ea477e73c07e/clr-exception-0xc0000409?forum=clr) and supported by other search results. I couldn't find any information on my version of libovravatar.dll (1.2.3.0), but since it's part of the runtime I expect Oculus to update it automatically. Our recording and playback system is based on the sample scripts provided in Assets/Oculus/Avatar/Samples/RemoteLoopback and the guide found here: https://developer.oculus.com/documentation/avatarsdk/latest/concepts/avatars-gsg-unity/

Here's the playback code:

    TextAsset asset = Resources.Load(filename) as TextAsset;
    if (asset != null)
    {
        MemoryStream s = new MemoryStream(asset.bytes);
        reader = new BinaryReader(s);
        loadPacket();
    }

    /// <summary>
    /// This is called recursively for every frame in the avatar recording.
    /// </summary>
    void loadPacket()
    {
        if (reader.BaseStream.Position >= reader.BaseStream.Length)
        {
            // We've reached the end of the file; load it again and start over.
            reader.Close();
            this.DelayedInvoke(delegate { loadFile(); }, .1f);
            return;
        }
        int sequence = reader.ReadInt32();
        int size = reader.ReadInt32();
        byte[] sdkData = reader.ReadBytes(size);
        // sdkData is exactly `size` bytes long; passing a larger length here
        // (e.g. size + 8) tells the SDK to read past the end of the buffer.
        IntPtr packet = CAPI.ovrAvatarPacket_Read((UInt32)size, sdkData);
        OvrAvatarPacket p = new OvrAvatarPacket { ovrNativePacket = packet };
        remoteAvatar.GetComponent<OvrAvatarRemoteDriver>().QueuePacket(sequence, p);
        this.DelayedInvoke(delegate { normalize(); loadPacket(); }, remoteAvatar.PacketSettings.UpdateRate);
    }

    private void OnDestroy()
    {
        if (reader != null)
        {
            reader.Close();
        }
        if (remoteAvatar)
        {
            Destroy(remoteAvatar.gameObject);
        }
    }

normalize() adjusts the scale and position of the playback avatar to make it look nice and appear where we expect. I'm hoping someone can tell me whether this is a bug on Oculus' end, or whether we are using the API improperly in some subtle way I haven't discovered yet.
In either case, is there a suggested workaround? We could bypass the problem altogether by moving away from the avatar system and just tracking position and pose data ourselves, but that seems an awful waste.
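The "track the pose data ourselves" fallback mentioned above can be sketched with no Avatar SDK involvement at all, so libovravatar.dll is never touched during playback. This is a minimal, hypothetical illustration using plain Unity transforms, not a drop-in replacement for the avatar recording system:

```csharp
using System.Collections.Generic;
using UnityEngine;

// Samples a set of transforms (e.g. head and hand anchors) at a fixed rate
// into a list of poses, then replays them onto stand-in objects.
public class PoseRecorder : MonoBehaviour
{
    public Transform[] sources;   // e.g. centerEyeAnchor, left/right hand anchors
    public Transform[] targets;   // stand-in objects driven during playback
    public float sampleInterval = 1f / 30f;

    private readonly List<Pose[]> frames = new List<Pose[]>();
    private float timer;
    private int playhead = -1;    // -1 = recording, >= 0 = playing back

    private void Update()
    {
        timer += Time.deltaTime;
        if (timer < sampleInterval) return;
        timer = 0f;

        if (playhead < 0)
        {
            // Recording: capture one pose per source transform.
            var frame = new Pose[sources.Length];
            for (int i = 0; i < sources.Length; i++)
                frame[i] = new Pose(sources[i].position, sources[i].rotation);
            frames.Add(frame);
        }
        else if (frames.Count > 0)
        {
            // Playback: loop over the recorded frames.
            var frame = frames[playhead % frames.Count];
            for (int i = 0; i < targets.Length; i++)
                targets[i].SetPositionAndRotation(frame[i].position, frame[i].rotation);
            playhead++;
        }
    }

    public void StartPlayback() { playhead = 0; }
}
```

The obvious cost is losing the Avatar system's hand poses and rendering, which is exactly why the poster calls this route "an awful waste"; it only makes sense if the crash cannot be fixed on the packet path.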
Cross-platform avatar hands not activating

Hi, I'm trying to add support for cross-platform avatars to the OpenVR version of my app, but I've run into an issue. This is how I've set up the Avatar component: I've disabled the component, but I enable it from code once I've set the "Oculus User ID" property to one of the preset IDs. The body of the avatar is working as intended, but the hands never show up. I've tried enabling the "Show First Person" property as well, but it had no effect. After some digging, it looks like this call to the Oculus API in OvrAvatarSkinnedMeshRenderComponent::UpdateSkinnedMeshRender (line 23) always returns a visibilityMask without any flags set for the hand components:

    ovrAvatarVisibilityFlags visibilityMask = CAPI.ovrAvatarSkinnedMeshRender_GetVisibilityMask(renderPart);

Is this intended behavior for cross-platform avatars, or a bug? Unity version: 2017.4.18f1, Oculus Integration: 1.32.1

How to get a model of only the index finger?
In my app, I would like to enter a "selection mode" by pressing a button, in which you can raycast from your finger to an object and select points on it. My issue is that it's annoying to select with your index finger on the standard hand, because it's very easy to accidentally touch a trigger, contort the hand model, and throw off the raycast direction. My question is: how can I get only the model of a normal, extended index finger? I want to press a button, enter "selection mode", and then have a static model of the index finger that can't be closed or contorted in any way.
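One way to get the effect described above without extracting a finger mesh from the hand model: parent a rigid (non-animated) finger model to the hand anchor and cast the ray from its fixed tip, hiding the animated hand while selection mode is active. This is a hypothetical sketch; `handAnchor`, `staticFinger`, and `fingerTip` are placeholders you would wire up in your own scene.

```csharp
using UnityEngine;

// Swaps the animated hand for a rigid index-finger model during selection
// mode, so trigger presses can no longer bend the finger and change the ray.
public class SelectionModeFinger : MonoBehaviour
{
    public Transform handAnchor;     // e.g. RightHandAnchor on the OVRCameraRig
    public GameObject animatedHand;  // the normal animated hand model
    public GameObject staticFinger;  // rigid finger model, child of handAnchor
    public Transform fingerTip;      // tip transform on the static finger

    private bool selecting;

    public void SetSelectionMode(bool on)
    {
        selecting = on;
        animatedHand.SetActive(!on);
        staticFinger.SetActive(on);
    }

    private void Update()
    {
        if (!selecting) return;
        // The ray direction is fixed relative to the hand anchor, so it is
        // immune to trigger-driven hand animation.
        if (Physics.Raycast(fingerTip.position, fingerTip.forward, out RaycastHit hit, 10f))
            Debug.Log("Pointing at: " + hit.point);
    }
}
```

The static finger can be as simple as a capsule, or a one-off export of the hand mesh frozen in the pointing pose.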
Good practice for standing (still) Avatar experience?

We're making a Unity-based surfboard experience and expect players to be standing, using their hands with Touch controllers to steer in the game. We want them to see themselves and their hands on the board (via the first-person Avatar visualization). This all works fine with OVRCameraRig and OVRAvatar, with one exception: because it's room scale, players can "walk" off their in-game surfboards or otherwise get themselves into odd positions relative to their board. Disabling position tracking in OVRManager keeps the player camera correctly positioned on the in-game board, but then you lose the Avatar hand tracking. Any suggestions on ways we can keep the player fixed on the surfboard but still track their hands (without having the hands drift away if the player moves around in their room)?
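One possible approach to the surfboard question, sketched under assumptions: leave position tracking on (so the Avatar hands keep tracking) but shift the rig every frame to cancel the head's horizontal drift, keeping the player centered on the board. All field names here (`cameraRig`, `boardCenter`) are placeholders, and this is a design sketch rather than established Oculus guidance.

```csharp
using UnityEngine;

// Counter-translates the camera rig so the tracked head stays over the board
// center. Hands remain fully tracked because tracking itself is untouched;
// only the rig's world position moves.
public class StayOnBoard : MonoBehaviour
{
    public Transform cameraRig;       // the OVRCameraRig root
    public Transform centerEyeAnchor; // the tracked head
    public Transform boardCenter;     // where the player should stand

    private void LateUpdate()
    {
        // Horizontal offset between the tracked head and the board center.
        Vector3 offset = centerEyeAnchor.position - boardCenter.position;
        offset.y = 0f; // preserve real-world height so crouching still works

        // Move the rig opposite the drift.
        cameraRig.position -= offset;
    }
}
```

Be aware that fully cancelling real locomotion can feel uncomfortable for some players; a gentler variant only corrects drift beyond a small dead zone, or eases the rig back gradually.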
Oculus Avatar launch incorrect position on Go

I am developing in Unity, followed the instructions on the official blog (https://developer.oculus.com/downloads/package/oculus-avatar-sdk/), and also did some digging on this issue. Honestly, the avatar and hands should just align correctly with the HMD camera instead of requiring manual adjustment of the LocalAvatar transform. Since I have to run the app to see what's changed, this becomes a pain. Any tips are welcome.
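A sketch of one way to avoid hand-tuning the LocalAvatar transform in the editor: snap the avatar to the rig's tracking space at startup so both share the same frame of reference. This assumes an OVRCameraRig with a `trackingSpace` transform (present in the Oculus Integration) and is an illustration, not an official fix.

```csharp
using UnityEngine;

// Aligns the LocalAvatar instance with the camera rig's tracking space at
// startup, so the avatar and HMD share the same origin without manual tweaks.
public class AlignAvatarToRig : MonoBehaviour
{
    public Transform localAvatar;   // the LocalAvatar prefab instance
    public OVRCameraRig cameraRig;

    private void Start()
    {
        Transform space = cameraRig.trackingSpace;
        localAvatar.SetPositionAndRotation(space.position, space.rotation);
    }
}
```

If the offset changes at runtime (e.g. after recentering), the same alignment can be re-run from a recenter callback instead of only in Start().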