12-05-2023 05:19 AM
Hi there,
I'd like to expand the use of Scene actors to be able to replace Scene elements with any Blueprint rather than being limited to static meshes (e.g. I want to be able to replace the user's table with a table BP that I can control to open drawers, place things on it, etc.).
I'm hitting a wall with the provided OculusXRSceneActor not being extensible (nothing in it is virtual), and I can't duplicate it because it uses API entry points that live in the private section of the plugin (specifically the OculusXRAnchorManager set of static functions). At this point, I feel like I have to switch to building from source to get what I want (which I'd rather avoid at this stage), because nothing is allowed beyond the (quite restricted) options showcased in the demo projects.
I guess another option is to spawn the anchors as they are by default (with an empty static mesh), then collect them and spawn my own BPs based on their location / scale. But that's quite dirty to begin with, and since there are no available callbacks to hook into to know when the scene is done populating, I'd have to check on tick, which makes it dirtier and less performant.
Any help / insight for me? Would it be possible to edit the plugin to make some things accessible without compiling the whole source?
Thanks!
12-05-2023 07:01 AM
Use "Get Actors by Semantic Label" on Begin Play. For the walls, get just one entry. I'm not sure now, but I think the first wall you placed when setting up the scene in the headset is index 0 in the array. That way you can know for sure which wall to start from.
You can also put sockets on the meshes (of the walls and so on) and then use those sockets as specific locations.
12-08-2023 10:17 AM
I'm able to use semantic labels to find the scene component and spawn actors at its location... but does anyone know how to scale a spawned mesh to match the scene actor component's scale?
In other words, when I spawn a mesh at the location of WINDOW_FRAME, feeding in the transform from the component, it spawns at the mesh's original scale and doesn't seem to pick up the modified scene actor component scale that is represented in the scene. Where do I get the actual scale of the component as shown in the scene? Or am I missing something?
This is my setup:
12-08-2023 02:34 PM
You could try some clever math to achieve this. I've never tried it, but this is the idea:
You put sockets on the vertices of the planes for the walls and so on. When the headset populates the scene with the mesh, you can measure the distance between these sockets on a specific mesh (a specific entry from the semantic label array) and work with that distance.
If you know the measurements of your own mesh, you can apply a scale so the distance between two sockets of your own mesh matches the distance between the two sockets of the populated one.
I once did a simple test to place a button inside the room using the distance between two sockets on opposite walls (array entries 1 and 3, or 2 and 4, for example). With that distance I can create a spline and use a time along the spline to place a button at a specific location inside the room. Because the walls don't have depth information, you need to obtain that information by other means (attaching meshes to the sockets and so on).
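To make the idea above concrete, here's a minimal sketch of the math in plain C++: socket-distance scaling and placing a point along a straight two-point "spline" between opposite walls. The `Vec3` struct is just a stand-in for Unreal's `FVector`, and in a real project the socket positions would come from something like `GetSocketLocation` on the mesh component; none of this is Oculus API code.

```cpp
#include <cassert>
#include <cmath>

// Stand-in for Unreal's FVector so the math is self-contained.
struct Vec3 { double X, Y, Z; };

double Distance(const Vec3& A, const Vec3& B) {
    const double dx = B.X - A.X, dy = B.Y - A.Y, dz = B.Z - A.Z;
    return std::sqrt(dx * dx + dy * dy + dz * dz);
}

// Uniform scale that makes the socket distance of your own mesh match the
// socket distance measured on the scene-populated mesh.
double ScaleToMatch(const Vec3& SceneSocketA, const Vec3& SceneSocketB,
                    double OwnSocketDistance) {
    return Distance(SceneSocketA, SceneSocketB) / OwnSocketDistance;
}

// Linear interpolation between two opposite-wall sockets; T in [0, 1]
// plays the role of the "time" input on a straight two-point spline.
Vec3 PointBetween(const Vec3& A, const Vec3& B, double T) {
    return { A.X + (B.X - A.X) * T,
             A.Y + (B.Y - A.Y) * T,
             A.Z + (B.Z - A.Z) * T };
}
```

For example, if your wall mesh is 100 units wide between its sockets and the scene reports the matching sockets 250 units apart, `ScaleToMatch` returns 2.5; `PointBetween` with T = 0.5 gives the midpoint of the room along that axis.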
12-08-2023 05:28 PM
I was overthinking it... I realized I didn't need to spawn my mesh using labels. I just add the mesh to the scene actor and it gets scaled under the hood. My only issue now is that I keep getting some weird rendering issues (I think either z-fighting or render bounds) where window planes don't always show up, or only show up partially at certain angles and then disappear. It seems like the more things I have set to render in the scene actor, the less stable the display of the room meshes becomes. In some instances room elements don't show up at all (i.e. a missing door, missing windows).
12-08-2023 06:41 PM
This is a transparency sorting issue; enable Order Independent Transparency under Project Settings. It'll affect performance, and I'm not sure it runs on Android. The other workaround is less fun 🙂
01-09-2024 10:47 AM - edited 01-09-2024 10:55 AM
I tried this idea and ran into the issue of not being able to access the static mesh within the OculusXRSceneAnchorComponent associated with the walls. Any thoughts there on how to access that mesh so I can access the sockets?
I've been trying to spawn portals along the wall by getting the wall actor's bounds and then picking a random point in that bounding box. When I attempt it that way, the spawned portals are sometimes in front of or behind the wall plane. It seems to me the reported bounding box is 3D, not a 2D plane?
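One way to deal with the depth problem described above is to flatten the random bounding-box point onto the wall plane. A minimal sketch in plain C++ (`Vec3` stands in for Unreal's `FVector`; the wall's location and normal are assumed to come from the scene anchor component — this mirrors what `FVector::PointPlaneProject` does in-engine):

```cpp
#include <cassert>

// Stand-in for Unreal's FVector.
struct Vec3 { double X, Y, Z; };

double Dot(const Vec3& A, const Vec3& B) {
    return A.X * B.X + A.Y * B.Y + A.Z * B.Z;
}

// Remove the component of P along the wall's unit normal, flattening a
// random bounding-box point onto the wall plane (given any point on the
// wall and the wall's unit normal).
Vec3 ProjectOntoWallPlane(const Vec3& P, const Vec3& WallPoint,
                          const Vec3& UnitNormal) {
    const Vec3 Offset{ P.X - WallPoint.X, P.Y - WallPoint.Y, P.Z - WallPoint.Z };
    const double D = Dot(Offset, UnitNormal);
    return { P.X - D * UnitNormal.X,
             P.Y - D * UnitNormal.Y,
             P.Z - D * UnitNormal.Z };
}
```

Projecting every random point this way guarantees the portal sits exactly on the wall plane, regardless of how thick the reported bounding box is.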
01-09-2024 01:01 PM
Try a node called "Get Actors by Semantic Label". On this node you pass the string WALL_FACE, get an array back, and pick one entry from that array. I think the first wall you set up in the headset is the first in the array. From that selection you can make sure it has a socket, then get the location from the socket and so on. You can also do some math from the wall to the VR Pawn so you only allow gameplay inside the room (get the distance and so on).
See this link:
https://developer.oculus.com/documentation/unreal/unreal-scene-actor-functionalities/
List of available labels to search:
https://developer.oculus.com/documentation/unreal/unreal-scene-supported-semantic-labels/
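The wall-to-pawn distance check mentioned above can be sketched as a signed plane distance. Plain C++ again, with `Vec3` standing in for `FVector`, and assuming the wall normals face into the room (whether the scene's wall normals actually point inward is something to verify in your own setup):

```cpp
#include <cassert>

// Stand-in for Unreal's FVector.
struct Vec3 { double X, Y, Z; };

double Dot(const Vec3& A, const Vec3& B) {
    return A.X * B.X + A.Y * B.Y + A.Z * B.Z;
}

// Signed distance from the pawn to a wall plane. With inward-facing wall
// normals, the result is positive while the pawn is inside the room, so
// gameplay can be restricted to positive distances against every wall.
double SignedDistanceToWall(const Vec3& PawnLocation, const Vec3& WallPoint,
                            const Vec3& InwardUnitNormal) {
    const Vec3 Offset{ PawnLocation.X - WallPoint.X,
                       PawnLocation.Y - WallPoint.Y,
                       PawnLocation.Z - WallPoint.Z };
    return Dot(Offset, InwardUnitNormal);
}
```

Checking this against all four WALL_FACE entries each frame (or on a timer) gives a cheap "is the pawn still inside the room" test.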
01-09-2024 01:10 PM
Yes, that's exactly what I'm already doing. The "Get Actors by Semantic Label" node only returns an array of Actor references; it doesn't give you direct access to the static mesh, and therefore doesn't let you access the sockets.
01-09-2024 03:01 PM
Are you trying it like this? Get the OculusXRSceneActor from the level (I suppose there is only one, so you take the first entry in the array) and use that actor to get the semantic labels and so on.
I have a project set up like this and it works.