External depth sensor - 3D meshes
I am doing a project where I would like to mount a depth sensor, such as a 3D depth camera, sonar, or LiDAR, onto the HMD to detect obstacles in real time and represent them as point clouds or 3D meshes on the display, so the wearer can navigate more safely. It is important that I can scan and mesh the objects dynamically in real time, not from a static scan made beforehand. I would then have the meshes change colour based on the user's distance from each object.

Bear with me, as I am clueless about how to start and have never used sensors, written an algorithm for the Oculus Quest, used Unity, or anything of the sort. For example, would I need an Arduino or Raspberry Pi to handle the depth data, or can it somehow be handled by the Quest 2 itself using Unity (or Unreal, I don't know which would be best)? I would also appreciate any recommendations for a cheap depth sensor under €100, preferably with a FOV of 90 degrees or better. It also needs to go on the participant's head, so it should be light and compact.

In the end I would like a result similar to the image below. However, that developer used the HTC Vive Pro and its built-in depth-sensing developer feature to create static meshes, not dynamic ones. (Link to his paper: https://ir.canterbury.ac.nz/handle/10092/16777 )

Any assistance, material, links, or videos to get started, or to confirm this is even possible, would be very much appreciated.
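To show what I mean by handling the depth data externally, here is a rough sketch of what I imagine the Raspberry Pi side could look like: grab depth frames from the sensor and stream them to the headset over Wi-Fi. I am assuming a RealSense-style sensor and its pyrealsense2 SDK purely as a placeholder (not a recommendation, and probably over my budget), and the headset IP, port, and chunk size are made up. No idea yet if this is the right approach:

```python
# Hypothetical Raspberry Pi side: grab depth frames and stream them
# over UDP to the Quest. Assumes an Intel RealSense-style sensor and
# the pyrealsense2 SDK purely as an example; other sensor SDKs look similar.
import socket
import numpy as np
import pyrealsense2 as rs

QUEST_IP = "192.168.1.50"   # hypothetical headset address on the LAN
PORT = 9000
CHUNK = 60000               # stay under the typical UDP datagram limit

pipeline = rs.pipeline()
config = rs.config()
config.enable_stream(rs.stream.depth, 640, 480, rs.format.z16, 30)
pipeline.start(config)

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
try:
    while True:
        frames = pipeline.wait_for_frames()
        depth = frames.get_depth_frame()
        if not depth:
            continue
        # 16-bit depth in millimetres, 640x480 = ~600 kB per frame,
        # so split each frame into UDP-sized chunks. A real version
        # would add frame/chunk headers so the receiver can reassemble.
        buf = np.asanyarray(depth.get_data()).tobytes()
        for i in range(0, len(buf), CHUNK):
            sock.sendto(buf[i:i + CHUNK], (QUEST_IP, PORT))
finally:
    pipeline.stop()
```

On the Unity/Unreal side I would presumably listen on that port and rebuild the frames, but that is the part I understand least.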
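For turning each depth frame into a point cloud, from what I have read it is the standard pinhole back-projection using the camera intrinsics. A sketch of my understanding, with made-up intrinsic values (real ones would come from the sensor's calibration):

```python
# Back-project a depth image into a 3D point cloud with the pinhole model.
# fx, fy, cx, cy are the sensor's intrinsics; the defaults here are
# made-up placeholder values, not real calibration data.
import numpy as np

def depth_to_points(depth_mm: np.ndarray,
                    fx: float = 460.0, fy: float = 460.0,
                    cx: float = 320.0, cy: float = 240.0) -> np.ndarray:
    """Convert an HxW uint16 depth image (millimetres) to Nx3 points (metres)."""
    h, w = depth_mm.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))   # pixel coordinates
    z = depth_mm.astype(np.float32) / 1000.0          # mm -> m
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    points = np.stack((x, y, z), axis=-1).reshape(-1, 3)
    return points[points[:, 2] > 0]                   # drop invalid (zero) depth
```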
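And the distance-to-colour behaviour I described is, as I picture it, just a linear ramp between a near and far threshold (the 0.5 m / 3.0 m values are guesses I would tune in testing):

```python
# The colour rule I have in mind: meshes fade from green (far) to red
# (near) as the wearer approaches an obstacle. Thresholds are guesses.
def distance_to_rgb(distance_m: float,
                    near: float = 0.5, far: float = 3.0) -> tuple:
    """Map a distance in metres to an (r, g, b) tuple in [0, 1]."""
    t = (distance_m - near) / (far - near)
    t = max(0.0, min(1.0, t))        # clamp to [0, 1]
    return (1.0 - t, t, 0.0)         # red when close, green when far
```

In Unity I assume this would end up feeding a vertex colour or material property per mesh, but the mapping itself is only this interpolation.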