Recent Discussions
Amelia Winger-Bearskin- VR Talk- Performance of Self at IFP (NYC)
(Edited) Link to the video of this talk from 2016: https://www.facebook.com/563688046/videos/10155028922648047

The talk was "From Film to Headset"; however, I didn't present it, as we ended up having a pleasant, informal conversation instead of the talks we had scheduled. But I thought I would share the slides for anyone interested (first link).

Info about the event:

MADE IN NY MEDIA CENTER by IFP & FILM FATALES SHORTS: From Film to Headset - An Interactive Panel
Monday, July 11th at 7:00pm

Join us for an engaging and informative discussion with some of the standout women & men currently working in the field of Virtual Reality. Panelists will share how they successfully made the leap into VR, dive into their creative process and the biggest challenges they face, and explain how their past work informs their current work. But most importantly, they will answer any questions you have - we are calling it an Interactive Panel for a reason. So come prepared with your questions!

Panelists:
Lilian T. Mehrel
Adrian Vasquez de Velasco
Catherine Rehwinkel
Javier Molina
Amelia Winger-Bearskin
Skye Von (moderator)

Panel: 7:00pm - 8:30pm
Reception: 8:30pm - 10:00pm
NY Media Center, Brooklyn, NY

About the Panelists:

Lilian T. Mehrel is an award-winning writer & director with a visionary sense of feeling and humor. Lilian's films have premiered at Tribeca and screened internationally at Clermont-Ferrand, and her awards include ABC/Disney and HFPA. She earned her MFA from NYU Tisch Grad Film and her BA from Dartmouth with a Senior Fellowship. A mixed-ethnicity daughter of immigrants, she also authored/illustrated a family memoir and is a PD Soros Fellow. Lilian inspires audiences to laugh and cry with her funny, hopeful swirl of cinematic poetry.

Adrian Vásquez de Velasco is a film director based in NYC and a graduate of NYU's Tisch School of the Arts. His work has been screened at the Warsaw Film Festival, the Palm Beach International Film Festival, the Smithsonian Institution in Washington, DC, and the World Sustainability Forum in Brazil. He currently works at Ticklish Subjects and Phraxos Films.

Catherine Rehwinkel is a narrative theorist, filmmaker, and recent graduate of NYU ITP with a background in commercial production and feature development. She focuses her research and creative development on considering narrative as a cognitive system which exists outside individual minds. Her driving curiosity is to ask what we are capable of understanding about reality by piecing together chunks of selectively framed motion data. Can mapping the computational to concepts and techniques of human-centric narrative help us understand how to create artificial minds?

Javier Molina is an engineer, actor, and media artist based in NYC, working with Virtual Reality, Motion Capture, Performance Art, and Experimental Film. He is a professor teaching motion capture with performance art and the director of the VR Lab at NYU Tandon School of Engineering in MAGNET, a facility researching live performance and social virtual reality.

Amelia Winger-Bearskin is the co-founder of the "Stupid Hackathon." She is currently the Director of the DBRS Innovation Lab, where beautiful things are made with Big Data, and an Artist in Residence (Tech) at Pioneer Works. She is also the Chief Advisor for a new MoCap/VR non-profit space in New Rochelle opening January 2017. In the past she was an Assistant Professor of Film and Performance Art at Vanderbilt University (2008-2012).
Her video artwork was included in the 2014 Storytelling: La Biennale d'Art Contemporain Autochtone, 2e édition (Art Biennale of Contemporary Native Art) at Art Mur (Montreal, Canada). She performed as part of the 2012 Gwangju Biennial and created an interactive portion of The Exchange Archive at the Museum of Modern Art (MoMA) in 2013. https://studioamelia.com

About the Moderator:

Skye Von is a director, writer, and VR creator based in NYC. Having started in theater back in London many eons ago, she feels like she has come full circle to be telling stories again in a three-dimensional space. Currently she is working on the VR experience Falling Out of Love, which reimagines the music video for the band Aloud. Skye was selected, with her team, as the 2016 IFP/Indiegogo Fellow at the NY Media Center for this very project. She contributes to such publications as UploadVR and Hammer&Tusk on emerging trends in Virtual and Augmented Reality.

WHEN: Monday, July 11, 2016 from 7:00 PM to 10:00 PM (EDT)
WHERE: Made in NY Media Center by IFP

— ameliawb

Trauma Care VR: week 2
Week 2: Getting close to finalizing the script and design doc. What we have is four scenes: an intro/splash, boundaries (personal and professional), communications, and outburst.

In the intro scene, what I had in mind is to focus on the school itself. I want the user to be able to access old footage of commercials that Hanna Boys Center has aired over the years; each small window represents a commercial. In the front is the start game button that brings the user to the splash page, which is the course selection. What we had in mind for the splash scene is a 360 aerial view that ascends from ground level to a sky view of the campus, shot with a drone. This week we played with the drone and had everyone involved get familiar with the device.

Boundaries will focus on two aspects: personal and professional. The personal portion teaches the difference between public, social, personal, and intimate space; the professional portion is a reenactment of a situation. This week we worked on the script for these scenes.

Communications is a set of 360 photos. The user is introduced to these photos along with pop-up info that points out non-verbal communication. This helps users understand the tells of a boy in need.

Outburst is a reenactment of a boy going out of control. It starts with the boy moving around the camera, upset. From there the user chooses how to interact, picking from three choices. Based on the choice, the user gets info on the action and an outcome reenactment (see the sketch below for how this branching could be wired up).

Goals for this week:
- continue finalizing the script and design
- start working on the 360 photos for the communications scene
- get started filming boundaries: personal
- get started filming boundaries: professional

— AngeloHizon
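Since the outburst scene is essentially a three-way branch, here is a minimal Unity C# sketch of how that choice logic might be wired up. All names here (choice labels, scene names, the component itself) are hypothetical illustrations; the post doesn't describe the actual implementation.

```csharp
using UnityEngine;
using UnityEngine.SceneManagement;

// Hypothetical sketch of the "Outburst" branch: three choices, each with
// feedback text and an outcome reenactment scene.
public class OutburstChoices : MonoBehaviour
{
    [System.Serializable]
    public struct Choice
    {
        public string label;        // e.g. "Give him space" (illustrative)
        public string feedback;     // info shown about the chosen action
        public string outcomeScene; // scene containing the outcome reenactment video
    }

    public Choice[] choices = new Choice[3];

    // Hooked up to a gaze trigger or controller button for choice 0, 1, or 2.
    public void Select(int index)
    {
        Choice c = choices[index];
        Debug.Log(c.feedback); // in practice: display on a world-space UI panel
        SceneManager.LoadScene(c.outcomeScene); // then play the matching outcome reenactment
    }
}
```

Project: Trauma Care VR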
Synopsis: I plan to leverage interactive 360 video as a teaching tool for teachers and students. I want to create an interactive, immersive VR educational tool for training professionals to properly support their clients at home and at school. I would like to focus on crisis prevention courses on violence in schools and at home, using VR and 360 videos. The current curriculum basically uses paper materials and videos that do not expose the true environment.

By exposing teachers to modern techniques in 360 video and VR as a normal everyday teaching tool, we would not only improve teaching but also increase the adoption rate. The more teachers and students use and understand VR, the more easily it will be accepted in daily use. Currently, the majority of VR focus is on entertainment and games; there is little impact on the educational field, which is in need of modernization. More exposure to practical uses in educational settings should lead to greater adoption in this field. Our application is a bridge that will assist in the adoption of upcoming VR technology.

Week 1:
- acquired 2 assets needed for the project: YouTube Mobile Video and Easy Movie Texture
- got an experienced Unity developer (George Katsaros)
- started an after-school video editing elective at Hanna Boys Center; currently 3 students have volunteered to be on the project
- new Gear 360 camera arrived in the mail
- acquired 2 x 64GB microSD cards

Goals for next week:
- finalize the design doc for the project
- have the students understand the 360 camera
- acquire Adobe Premiere and learn its editing tools for the 360 camera
- start testing the Unity assets and see their limitations (a rough sketch of the core playback idea is below)

— AngeloHizon
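The core of the project is playing 360 footage inside a headset. The post relies on the Easy Movie Texture asset, whose API isn't shown here; as a rough stand-in, this sketch shows the same idea using Unity's built-in VideoPlayer (added in Unity 5.6): render an equirectangular clip onto the inside of a sphere surrounding the camera.

```csharp
using UnityEngine;
using UnityEngine.Video;

// Stand-in sketch for 360 playback: drive a video onto the material of an
// inward-facing sphere that surrounds the VR camera. Easy Movie Texture
// (used in the actual project) achieves the same effect with its own API.
[RequireComponent(typeof(VideoPlayer), typeof(Renderer))]
public class Play360OnSphere : MonoBehaviour
{
    public VideoClip clip; // an equirectangular 360 video clip

    void Start()
    {
        var player = GetComponent<VideoPlayer>();
        player.clip = clip;
        player.isLooping = true;
        player.renderMode = VideoRenderMode.MaterialOverride;
        player.targetMaterialRenderer = GetComponent<Renderer>();
        player.targetMaterialProperty = "_MainTex";
        player.Play();
    }
}
```

The sphere needs inverted normals (or a shader that culls front faces) so the video is visible from the inside.

XENTITY: Progress Update.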
After Connect3, I've made a few more steps toward my final goal. Here is my progress. IF YOU HAVE GOOD SUGGESTIONS OR COMMENTS, FEEL FREE TO LEAVE A REPLY.

You can get a sense of the final VR experience by combining the three video clips below. The first one is a pre-visualization from before adopting mesh texturing and particle animation in the VR background simulation. The second video is my latest prototype VR, which was made with Cinema 4D and simple C# scripting. The visual contrast between cityscape and landscape is handled a little differently in each: the first makes the contrast within a single scene using an upside-down cityscape, while the second uses a sequential scene change. Either one is a possible final choice.

The clip below is the old version of my concept video. It gives you an idea of what pointillized photo particle animation looks like (a rough sketch of the technique follows).

— Anonymous
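For anyone curious how a "pointillized photo" effect can be built, here is a minimal Unity C# sketch under my own assumptions (the post doesn't show the actual implementation): sample a texture's pixels and emit one colored particle per sample.

```csharp
using UnityEngine;

// Hypothetical sketch: scatter a photo into colored particles for a
// pointillized look. The source texture must be marked Read/Write enabled
// in its import settings; disable the ParticleSystem's own emission and
// raise its Max Particles setting to fit the sample count.
[RequireComponent(typeof(ParticleSystem))]
public class PointillizedPhoto : MonoBehaviour
{
    public Texture2D photo;          // source image
    public int stride = 4;           // sample every Nth pixel to limit particle count
    public float worldScale = 0.01f; // world units per pixel

    void Start()
    {
        var ps = GetComponent<ParticleSystem>();
        var emit = new ParticleSystem.EmitParams();
        emit.startLifetime = 1000f; // effectively persistent

        for (int y = 0; y < photo.height; y += stride)
        {
            for (int x = 0; x < photo.width; x += stride)
            {
                emit.position = new Vector3(x * worldScale, y * worldScale, 0f);
                emit.startColor = photo.GetPixel(x, y);
                emit.startSize = worldScale * stride;
                ps.Emit(emit, 1);
            }
        }
    }
}
```

Aleem Hossain - Launchpad Weekly Updates - "I Never Told You"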
WEEK 1

I'm a filmmaker who just finished my first feature film as a writer/director. It's a sci-fi drama and I'm just starting to send it out to festivals. For the past year or so, as I was completing my film, I've been getting more and more excited about VR. My current goal is to create my first immersive experience (a fictional live-action one) by early fall.

What have I been thinking about creatively?

As a filmmaker, one of my big goals is to approach VR with an open mind. I don't want to bring a lot of filmmaking assumptions to the process by default. And I've been searching for project ideas that feel truly, essentially VR… and not just a film idea shoved into the 360 format for no real reason. I'm nowhere near the first person to say this, but I think the key is to figure out what's truly essential about VR, what's unique to it. In cinema, I think editing and framing are such central tools because film is an amazing way to manipulate what the audience sees and doesn't see, how fast or slow information reaches them, what order they see or feel things in, and what information and emotions are combined or isolated. Many (though of course not all) of the best films really take advantage of these strengths or explicitly subvert them.

But what about VR? It's early days in this art form, but people have certainly zeroed in on the idea of "presence" as one of the essential parts of the VR experience. And of course there's the inescapable fact that the "audience" in VR is much more conscious of themselves in the art form… they are almost always made to feel like they are inside the art, a participant. So I decided I wanted to try to do something that evokes presence and explicitly makes the viewer conscious of themselves in the experience.

I know, I know… that's pretty vague. I felt that way too… until I thought about Sam. Do you remember Sam? Quantum Leap blew my mind as a kid. I just loved the idea, the characters, everything. If you've never seen the show, it's about a guy named Sam Beckett who is traveling through time in a very particular way: his consciousness is constantly being dropped into the body of someone in the past. He's a white guy in his 30's in the present day, but he finds himself in the body of a mafia hitman in the 1960's, an older black man in the segregated South of the 1950's, a single mother, etc. The show is a 1980's network drama, so it's not particularly subtle. But it's pretty profound if you ask me. We see Sam get sexually harassed as a woman. We see him racially attacked as a black man. We all know the old adage about walking a mile in another person's shoes… Sam gets to literally do this. And with VR, we all can.

Which brings me to the specifics of my project: I'm going to develop an immersive VR experience about being looked at. One of the most personal and unique experiences we have is the way people look at us. Until now, it's been very hard to let someone besides yourself viscerally experience this. I don't want to be too simplistic, but just to make my idea clear, I'll start with some pretty clunky examples. The moment when you arrive at a party and everyone looks over at you… I think it's really different for women vs. men (or even just walking down a city street). Getting pulled over by the cops and the first moment the officer makes eye contact with you… I think it's really different depending on your race. Being in a room full of strangers vs. your closest family – you get looked at differently.
I'd like to film some scenes where I direct the actors to treat the camera as a very specific person (in terms of gender, race, age, relationship to them, etc.)… and cycle the user through these scenarios. Some may feel very familiar and others will hopefully feel very strange. You will hopefully get to feel, in a truly visceral way, what it is like for someone other than yourself to be looked at. Basically, I'm going to do a meditation on being looked at. I'm imagining this as more of an experimental/emotional piece than an actual narrative one… but there will be an emotional journey for the user. A gut-level version, pared down to its purest form, of walking a mile in other people's shoes. Those are my creative thoughts at the moment.

What have I been thinking about technically?

VR live-action camera technology is driving me crazy. It's so obvious to me that everything being used to shoot stuff right now will seem totally outdated in a year… and probably laughable in a few years. I'm coming from the film/video side of things, where, now that we're past the crazy early years of digital, the camera technology is no longer a significant obstacle in the process and can pretty much do whatever we want it to. But it's also pretty exciting to be at the beginning of an art form, not just artistically but also technically. I'm going to choose to focus on this. I was never going to be there with Edison and the Lumière brothers figuring out 35mm film cameras… but the VR versions of those folks are working on the tech right now.

I've got a Ricoh Theta which I'm using to do tests. I want to shoot my actual project on a better rig, but I love the Ricoh for testing/brainstorming. (Thank you Marcy for encouraging me to play around with it.) When I'm making films, my iPhone (and before that my miniDV handycam) is an essential tool. I'm constantly testing shots and edits. For me, it's way more productive than storyboarding. The Ricoh feels like it will play a similar role in my VR experiments.

Eye contact will be extremely important for my project. I want the user to feel like they are being looked in the eye – I want them to get that buzzy feeling I get when I've got the headset on and a character (live-action or CG) really looks at me. So I've started doing tests for live-action eye contact. At the moment, I'm experimenting with distance to camera (in terms of how readable a face is). I shot a series of tests standing at progressively farther distances from the camera. With the Ricoh at least, the falloff was pretty quick and pretty sharp. Once I was 4-5 feet away from the camera, my gaze into the lens already felt less powerful when I looked at it in the headset.

But there's more than one variable at work. Partially, it's a problem of resolution – both the camera's and the headset playback's. It's also a problem of optics – how good the lenses are and how they represent depth, space, etc. I'm going to do some side-by-side tests with my DSLR. The flatness/fuzziness of even multi-camera GoPro-based rigs is a challenge. These cameras were just not created to capture the subtlety of human emotions flickering momentarily on a face… which is not to say they can't be used to do this, but I'm using my DSLR as a comparison point because I know it can capture clear and powerful human emotion directly into the lens at a pretty great distance from the subject.
A second issue I'm pondering is what an actor looks like when they're staring directly at the camera but the user has not fully turned into this direct gaze… is it weird to see this? Does it undermine the feeling of being looked at once you turn fully into the gaze, because you just previously saw the gaze not being at you? Another issue I foresee is the ability of an actor to maintain eye contact with the user while moving. On a cheap camera like the Ricoh, this is pretty easy… an actor can traverse 180 degrees of a fixed 360 frame staring at the same lens. But on a multi-camera rig… what's the answer? This is something I won't be able to test until I get access to a more complex rig. (A small sketch for quantifying the partial-gaze question is below.) Those are my technical-ish thoughts for the week. Thanks for reading!

— aleemhossain
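One way to study the partial-gaze question empirically in a prototype: measure how far the viewer's head is turned away from the actor and log (or switch footage) at chosen thresholds. This is a hypothetical Unity C# sketch, not anything Aleem describes building; the transforms and the 15-degree threshold are illustrative assumptions.

```csharp
using UnityEngine;

// Hypothetical sketch: measure the angle between where the viewer is looking
// and where the actor's face sits in the scene, to test how eye contact
// reads at different head angles.
public class GazeAngleMeter : MonoBehaviour
{
    public Transform head;  // the VR camera (center eye)
    public Transform actor; // approximate position of the actor's face

    void Update()
    {
        Vector3 toActor = (actor.position - head.position).normalized;
        float angle = Vector3.Angle(head.forward, toActor);

        // Illustrative threshold: within 15 degrees we call it "fully facing".
        if (angle < 15f)
            Debug.Log("Viewer facing actor (" + angle.ToString("F1") + " degrees off-axis)");
    }
}
```

Lisa Walkosz-Migliacio- Launch Pad "Reviva"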
Hi, I'm Lisa, a game designer at my company Intropy Games. We make cute games that tug on heartstrings, and we currently have games on Steam, Wii U, and iOS. I frequently attend game jams to hone my craft; most recently I attended Train Jam, where a bunch of developers hopped an Amtrak train from Chicago to San Francisco to get to the Game Developers Conference, and made a silly game along the way. I was happy to see the Oculus Bootcamp when it came up in my feed, and I had to apply.

The reason I jumped into making games is that I enjoy story-based games (remember Final Fantasy?) and loved exploring and being taken to new worlds. I started looking at the hits of the day and saw more and more violence simulators, and I wasn't interested in playing them. Not that you can't enjoy them; they just aren't for me. Because of the lack of more whimsical, non-violent games, I decided it was time for me to lend my creative expression to the world of video games, and I hope you'll continue to have me!

Here is the pitch I sent when applying: I've always been a big fan of the idea of flying. It's something that most people have dreamed of as children and then forgotten as they became older and more jaded about the realities of gravity. However, I believe that feeling of wonderment is needed in VR. I want to bring to it an environment of fantastic views from the clouds, finding new and interesting things from above tree level and gaining another perspective on our everyday grounded life. On your adventures in the skies, you will meet other free-fliers and others who wish to ground you, on a journey to find what is important to you. Flying is indeed a metaphor for freedom, and we can all relate to the challenges that tie us down and the times we can roam wherever our imagination takes us.

And as you know already, they picked me! After I learned I was going, I met up with a narrative designer right away to flesh out the idea and get a handle on the story aspects. I quickly coded up (remember, I'm a game jam nut) a little prototype that had a character flying around and plants growing, and I filled the world mythos with the ideas I came up with alongside my narrative designer. I was all ready to go!

Here's our 2-sentence pitch: Revive the world - it's never too late to redeem yourself. Uncover the story of the ruin you caused through talking with those left behind. Take flight and watch nature bloom beneath you, reviving the landscape you helped destroy. Oh yeah, and you get to take pictures.

The bootcamp was great: meeting other awesome people and getting some major knowledge dropped on how to succeed on the VR platform. As soon as I returned, I was ready from a hardware and overview perspective to start making this experience fit right into the VR world. Now I'm taking all of this knowledge, putting things down into to-dos, and getting ready for August. Frames per second, cameras, making sure the players aren't getting sick, narrative pacing, how to get players to go to the next piece of the story, text meshes in 3D! Basically, there's a lot to do, and over the next few weeks I hope I will be solving it all. (A tiny sketch of the comfort-first flight idea follows.)

-Lisa
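Since flight plus motion sickness is the core design tension here, this is a minimal, hypothetical Unity C# sketch of one comfort-minded approach: constant-velocity gliding along the gaze direction with no sudden acceleration. Nothing here is Lisa's actual implementation; all names are illustrative.

```csharp
using UnityEngine;

// Hypothetical comfort-first flight: glide at a constant speed in the
// direction the player looks. Avoiding abrupt acceleration changes is a
// widely cited way to reduce motion discomfort in VR.
public class GlideController : MonoBehaviour
{
    public Transform head;   // the VR camera
    public float speed = 3f; // constant cruise speed, in world units per second

    void Update()
    {
        // No easing in or out: velocity stays constant and predictable.
        transform.position += head.forward * speed * Time.deltaTime;
    }
}
```

Xentity (old name: Digitized) Progress - Cinema4D-to-Unity, Vice Versa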
Hello, guys. I was so pleased to meet many fellow members at the Oculus Diversity Leadership Luncheon and Connect3 in San Jose last week. The demos were so interesting, and the sessions were very insightful. Most of all, the opening keynote was such an honor, because Ebony introduced the Launch Pad program as a speaker. I am honored to be one of 100 awesome VR enthusiasts, and I am still making progress on my project. I posted my lessons from the awesome sessions on the FB group; you can read them there.

Luckily, as of this Monday I have a new VR dev environment in San Mateo, through contributing UX/UI design to a stealth-mode AR startup. I tried to load my very simple Unity scene with C# scripting on the HTC Vive, and it was successful. I'd like to share my experimental materials and other items as a kind of project journal.

A. New name and logo design (thank you for the thoughtful feedback on the Facebook group)

B. Cinema 4D modeling and concept visualization (converting to FBX format to import into Unity)
*** Drone View Experiment (in Unity 3D)
*** Cityscape Bending Test (in Cinema 4D) - this style of environmental manipulation could be used to deliver an unusual feeling in the Simulated World experience.
*** Unity high-fidelity city night scene test (scene submitted for the scholarship proposal)

C. HTC Vive controller test (my new dev environment)
*** Vive Controller Demo https://www.facebook.com/jikkimi/videos/1249576438439757/

— Anonymous

I am continuously stepping out to progress. (Design update / Event participation / Experimenting)
Today, I am going to blog about three things:
1. Visual design update (new project name and logo design)
2. Conference and event experience
3. Trials for VR prototyping (A-Frame WebVR, a simple Android Unity scene from Cinema 4D modeling, Leap Motion)

I. The new name of my project is XENTITY. Xentity can be read as Cross + identity. My project is to express a correlation between social media and our real identity, so I am trying to make a visual contrast between our digitized identity and our genuine one. In addition, "Transcending" is another source word of the new name: the prefix "X-" is a kind of abbreviation meaning "Trans-" (e.g., Transformer: Xformer), so XENTITY can also be read as "Transentity," from "Transcend" + "Identity."

Here is the ideation process. Digitized was the old name of my project, labeled "Current Name and Design" in the screenshot below. After sketching new logo designs, I refined them, as you can see in the attached images. I chose two candidate fonts to express a techy look: Copperplate Gothic Light and Niagara Bold. Finally, four major concepts were completed, as shown below. Could you vote for the one you think is best? (Top-left to bottom-right: 1-2-3-4)

II. FBstart Conference + Kaleidoscope VR Showcase Vol. 1

FBstart is somewhat similar to our program, Oculus Launch Pad, but FBstart is targeted at startups or individuals about to launch a new mobile app. I attended FBstart SF on Sep 28 at Bespoke SF. Since my project is related to the Facebook API and the Facebook SDK for Unity, I was there to learn how to use them and to meet real Facebook developers. I listened to and learned a lot from 5-6 session speakers, and I met a Facebook software engineer who recommended that I join the Facebook Developers Group on Facebook. If I can become an FBstart member, I will have enormous benefits for making a real mobile app. As you know, our VR app package is aiming to launch as a Gear VR app, so I think I can be a candidate for this program as well.
https://fbstart.com/
https://developers.facebook.com/docs/unity/

The Kaleidoscope VR summer showcase was super amazing. I joined this showcase as a volunteer, but I got to enjoy groundbreaking VR projects at Obscura Digital, SF on Sep 30.
http://kaleidovr.com/updates/summer-showcase-launches

Besides the projects linked above, an amazing WebVR piece (1st below) and an interactive projection (2nd below) caught my eye. I also encountered the warm, lovely style of VR storytelling in PEARL (by Google Spotlight Story, https://youtu.be/WqCH4DNQBUA) (3rd below). Most of all, though, I was so pleased to experience the SF city experience VR (by Amber Garage, Skywand.com).
https://skywand.com/
https://youtu.be/3xbnwzOUdYA
That VR experience has a concept similar to my project: the main motif of both is the cityscape of San Francisco. Although the final goal and main purpose differ, the experiential visualization is quite similar. I introduced my project, and I am going to meet the developer Amber and the designer Anqi in their office soon to discuss collaboration.

III. Still toddling, but on the right path

III.1. A-Frame WebVR test
This is for grabbing remote images via the Internet in a VR scene, which is a key technical feature of my project. I am still researching and coding to access my profile image from the Facebook database using the Graph API. This trial scene experiments with that and works toward a solid algorithm.
http://jikkim.com/AAU/Fall2016/WNM499/aframe/square1.html
Currently, I have succeeded in mapping my logo image onto a cube from an external URL rather than my local HDD. You can try this very basic WebVR scene with Cardboard by accessing the URL above. WebVR is a very useful alternative for delivering a more affordable experience to a much larger audience. (A sketch of the equivalent remote-image step in Unity is below.)

III.2. Simple test for importing Cinema 4D modeling and animation into Unity3D
The current Unity scene for my project was purchased from the Unity Asset Store, but in order to fit my own project better, I am going to do the 3D modeling myself in Cinema 4D. Cinema 4D will also be a tool for making a more sophisticated POC video. Anyway, I tested exporting a simple 3D model from a C4D scene to an FBX file and then importing the file into Unity, and it was successful, as you can see below. The next step will be exporting C4D animation and importing that animation into Unity3D.

III.3. Leap Motion test
These are my current VR gadgets. I unboxed the Leap controller and tested it on my desktop PC. To use the Leap controller in my VR project, I am supposed to have a Vive or Rift, so I listed a VR-ready bundle package in my scholarship budget. https://youtu.be/DFfWb_EP6oI

I am ready to meet you at the Oculus Connect3 conference. I am going to attend the Diversity Luncheon hosted by the Oculus Diversity Program. I can't wait to meet you again. Thanks.

— Anonymous
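Since fetching a remote image (e.g., a profile picture URL returned by the Graph API) is the project's key feature, here is a minimal Unity C# sketch of the same idea on the Unity side, to complement the A-Frame test. It uses Unity's stock UnityWebRequestTexture API, not the project's actual code, and the URL is a placeholder.

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.Networking;

// Hypothetical sketch: download an image over the network and map it onto
// the cube this component is attached to.
[RequireComponent(typeof(Renderer))]
public class RemoteImageCube : MonoBehaviour
{
    // Placeholder; in the real project this would come from a Graph API response.
    public string imageUrl = "https://example.com/profile.jpg";

    IEnumerator Start()
    {
        using (UnityWebRequest req = UnityWebRequestTexture.GetTexture(imageUrl))
        {
            yield return req.SendWebRequest();

            if (req.result != UnityWebRequest.Result.Success)
            {
                Debug.LogWarning("Image download failed: " + req.error);
                yield break;
            }

            Texture2D tex = DownloadHandlerTexture.GetContent(req);
            GetComponent<Renderer>().material.mainTexture = tex; // texture the cube
        }
    }
}
```

MJ Dev Weekly Updates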
Week 1: Catching Up

I've been a little late getting started on the forum, so I've got some catching up to do! I had always planned to make my first post a recap of my experience at the Launch Pad event itself, so here it is, a little late (I had previously posted this on my personal blog as well):

A month ago I had the fantastic opportunity to attend the Oculus Launch Pad event at Facebook Headquarters. The event was intended to promote diversity in the field of VR, and it met those intentions quite well. I, along with 99 others who all came from diverse backgrounds and varying ages, genders, races, ethnicities, orientations, etc., spent an incredible day learning about the technical side of putting a game on the GearVR, tips and tricks for designing "comfortable" VR content, how to keep a project on track and finish on time, and the importance of story and immersion in the VR world.

On the technical side, it was great to have an opportunity to go through the process of getting an application from Unity onto the GearVR. I'm outlining the process here for anyone else who may have had trouble getting it working during the boot camp (and also to help myself remember the steps for future use):

1. Get the latest version of Unity, and during the initial download be sure to check the box for "Android" in the list of development platforms (if you already downloaded Unity and don't have the Android package, you will need to go back to the Unity downloader to get it).

2. Visit the Oculus developer website, download the Oculus SDK and Oculus Unity Utilities, and import the Oculus Unity Utilities package into your Unity project.

3. Download and install the Android SDK. It will launch a download manager where you can select different versions. You can download as many versions as you like, but you will at least need API level 19. When you select download, you will need to agree to the terms and whatnot, but it is not super clear from the prompt that you need to select "agree" for each of the download items in your selected list; if you have two items selected, you will need to scroll to the bottom to "agree" for the second one.

4. Create an OSIG for your device. This one is a little complicated; here are the details:
   a. Put your Android Galaxy device into debug mode (go to Settings > About device and tap the build number a bunch of times).
   b. On your Android device, go to Developer Settings and check the box for "USB Debugging", then plug it in to your computer.
   c. Open Command Prompt (or Terminal on a Mac).
   d. Navigate to the Android SDK Tools folder (use 'cd' to change directories; 'cd ..' takes you up a directory, 'cd Directory_Name' takes you down into the specified directory).
   e. Use the command 'adb devices' and it will give you a list of IDs for connected devices (you will probably only have one). If you have trouble with this step, you may need to update the drivers on your computer.
   f. Copy your device ID, then go to developer.oculus.com/osig and enter the ID.
   g. Download the osig file generated by the Oculus website and copy it into the following directory inside your Unity project: Assets/Plugins/Android/assets. Note that you will need a separate OSIG file for each device you want to test on; you can have as many OSIG files in the assets directory as needed.

5. In Unity, go to File > Build Settings. Be sure to add the scene you want to build to the list of scenes.

6. Select 'Android' from the list of platforms, then click "Player Settings" at the bottom.
You can also hit "Build" right now to check whether your SDK is set up properly; it will let you know if it cannot find the Android SDK (it may also prompt you to download the latest JDK, which is also needed to run the Android SDK).

7. In the player settings panel, which should have opened on the right side of the screen, scroll down to "Other Settings", check the box "Enable VR", and fill in the bundle identifier (you can use whatever company and product name you want, but they need to match the company and product name at the top).

8. Sign your application: under "Publish Settings", check the box for "create new keystore", then click browse and name your keystore file, then give it a password (be sure to remember this password). Then set an alias and password for this keystore (it can be the same password or a different one).

9. Be sure your Android device is still plugged in and the screen is on, then hit "Build and Run" and the game will automatically deploy to the phone.

10. Unplug the phone from the computer. It should prompt you to put it in the GearVR, but if it does not, then find your newly made build (it probably has the Unity logo right now), run it, and when it prompts you, put it in the GearVR.

I hope this is helpful to anyone trying to get something working on the GearVR :) Feel free to respond with feedback if you notice any issues with these steps, or with questions if you try it out and it doesn't work! Thanks for reading :)

— mjohns