Development of Metaverse Controllers
I have developed a metaverse controller that simultaneously controls spatial computing devices such as the Quest and is compatible with controllers on all platforms. It is available in two types: portable and board. It features an innovative input method in which a touch confirms finger position and a press executes the function, adapting the virtual interface to the requirements of each program. With its accuracy, versatility, and scalability, along with an ergonomic design that minimizes body movement, it can become a cornerstone of metaverse expansion, offering developers high productivity in virtual environments and users diverse content across platforms. Please email me at woogle554@naver.com and I will send you an introduction video of the metaverse controller and two related patents. Woo Yeol Jung

Oculus Design Improvement Concept
So I included this in some of the public-facing forums as well, but I figured it worthwhile to share with the developer side too, and I was interested to hear what people have to say about it. This would increase the cost of the device, but it would also improve its capability to natively support augmented reality. The Rift currently only natively supports VR, with AR possible through external attachments such as the OVRVision device. That device, on its own, has frankly subpar performance compared to the Rift proper, so I have to wonder whether the OVR team could design a system that handles this better. My suggestion, then, is a system in which the front opaque panel can shift between opaque and transparent, or can move out of the viewer's way by either folding or rotating around the user's head. One way to do this would be to shift the hardware in the front-facing lens laterally or upward to make room. Assuming you'd prefer the transparent-to-dark backing transition with no motors or moving parts (which I expect), add to the API a single function that takes the desired transparency value and automatically sets it in the backing hardware. Additionally, this may require changes to your display technology, based on my understanding of how it works. I would argue that the technology to handle this exists and that it should be doable, but I'm not sure whether you would deem the necessary R&D costs appropriate. Another concern might be latency, if you also want the feature to feed back an image of what the player can see through the lens, but regardless I do think this is worthy of consideration. Anywho, I'm Forrest Shooster, a VR/AR/MR game dev who just had a passing thought I wanted to share with the Oculus team. I'll be sharing this elsewhere as well. Thanks. Cheers, Forrest Z. Shooster AKA Argzero

Virtual Reality for scientific outreach
Hello, I am a physics PhD student from King's College London and I would like to do scientific outreach using Virtual Reality. I am a new user to VR, and I hoped you might be able to give us some initial information to get started.

Hardware: Our target is to visualise astrophysical events from our simulations. We hope to play back simulated movies (> 15 models/sec) built from models comparable to https://skfb.ly/Z8DH (~383.2k faces, ~194.5k vertices). The main question for us is whether we should invest in an Oculus Rift or just use a simpler setup with a mobile phone. We hope to use money designated for outreach, so we have to argue why we should spend such a significant amount on an Oculus Rift plus a compatible PC instead of a mobile phone with a simpler Gear VR.

Software: We use coding daily for doing science, but we have never used it to work with VR. We have a set of .obj files, which we extracted from our data. What kind of software would be best for creating a VR movie from these files which we could use for outreach events?

About us: Two years ago, gravitational waves were measured for the first time (http://www.bbc.co.uk/news/science-environment-36540254). This is one of the most significant discoveries of this century and has opened a new window for exploring our universe. To say the least, we are very excited about this discovery and hope to share that excitement with other people. Our group specifically does simulations of the astrophysical events we expect to produce gravitational waves. These events are the most violent in our universe; examples are colliding black holes or neutron stars. We produce these simulations using supercomputers, and we hope to use Virtual Reality to show these very violent and fascinating events. Any help would be appreciated.

Best, Thomas Helfer
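Before picking hardware or software for a question like the one above, it helps to know the per-frame polygon budget the .obj sequence actually implies. The sketch below is a minimal, standard-library-only Python illustration (not any particular VR tool's API) that counts vertices and triangles in one .obj frame; the 15 models/sec figure is taken from the post, and the tiny quad model is an invented example for demonstration.

```python
def obj_stats(obj_text):
    """Return (vertex_count, triangle_count) for one .obj file's text.

    Faces with more than three vertices are counted as fans of
    (n - 2) triangles, which is how most engines triangulate them.
    """
    vertices = 0
    triangles = 0
    for line in obj_text.splitlines():
        parts = line.split()
        if not parts:
            continue
        if parts[0] == "v":          # geometric vertex ("vt"/"vn" don't match)
            vertices += 1
        elif parts[0] == "f":        # face with len(parts) - 1 vertex indices
            triangles += max(len(parts) - 1 - 2, 0)
    return vertices, triangles

# A tiny example frame: a unit quad (4 vertices, one 4-gon -> 2 triangles).
quad = """
v 0 0 0
v 1 0 0
v 1 1 0
v 0 1 0
f 1 2 3 4
"""

verts, tris = obj_stats(quad)
print(verts, tris)   # 4 2

# Rough throughput requirement at the post's 15 models/sec target:
print(tris * 15)     # 30 triangles per second for this toy frame
```

Multiplying the per-frame triangle count of the real models (~383k faces) by the target rate gives the number the rendering hardware must sustain, which is the concrete figure a Rift-versus-Gear-VR argument can be built on. For the playback itself, general-purpose tools such as Blender, Unity, or Unreal can import .obj files and are a common starting point.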