Week 7: Oculus Launch Pad (Due Aug 6 Midnight)

Anonymous
Not applicable
Week 7 - we are at the halfway point. Looking forward to seeing the updates you are sharing. See you on the call Friday!


jacqueline_assa
Explorer
This week got off to a great start - I ran into fellow Launchpadders at the Games For Change Festival in NYC. It was great to catch up in person and talk about our projects. The whole festival was incredibly inspiring, and I took that energy with me throughout the week as I continued to work on Space Noodle. This week we continued to work on the design of our "tech planet," and I'm really pleased with the visuals/environment we are creating. This upcoming weekend, we have an internal hackathon set up to hopefully complete an end-to-end demo of the main interactions/components in the universe. We have so many ideas for how this platform can support fun educational adventures, but I am trying to keep myself and everyone on the team focused on putting forth the best proof of concept we can and a really compelling end-to-end demo. Looking forward to posting updates (and pictures) from the internal hack this weekend!

icy_violets
Explorer
This past week we began to look into different possible strategies for implementing the visuals within our Unity project. The main visuals consist of a generative GLSL shader, which relies on tight timing of parameter changes to create the visual drama of paint strokes drawing in time with the music soundtrack. Up until now, we have been doing most of the work in the TouchDesigner environment, which expedites prototyping. Both my collaborator and I are new to Unity, so we are running several experiments to decide whether we should generate video that plays back like 360 video within Unity, or whether we should instead port the shader into Unity and script the event changes there. After looking into this for a little while, we are still undecided but hope to come to a final decision this coming week.
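For reference, the "port and script" option would look roughly like the sketch below: a Unity script samples the soundtrack's playback position each frame and pushes keyframed parameter values into the shader material. This is only a sketch under that assumption; the _StrokeProgress property and the animation curve are hypothetical stand-ins for whatever the ported GLSL would actually expose.

```csharp
using UnityEngine;

// Rough sketch: drive a ported shader's parameters off the soundtrack's playback time,
// so paint-stroke "events" stay locked to the music as they do in TouchDesigner.
[RequireComponent(typeof(AudioSource))]
public class ShaderEventDriver : MonoBehaviour
{
    public Material paintMaterial;        // material using the ported shader (placeholder)
    public AnimationCurve strokeProgress; // keyframed in the editor to match the track

    private AudioSource music;

    void Start()
    {
        music = GetComponent<AudioSource>();
        music.Play();
    }

    void Update()
    {
        // AudioSource.time is the current playback position in seconds,
        // which keeps the visuals in sync even if frame rate fluctuates.
        float t = music.time;
        paintMaterial.SetFloat("_StrokeProgress", strokeProgress.Evaluate(t));
    }
}
```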

shiud
Explorer
We started working towards our whitebox, but I haven't been able to make the time to find rigs for our characters. I have some that should work, but they will need to be tested against our needs to see how much we can modify them. Luckily the production phase of this project isn't as intense as our planning phase. That's all I have for this week, unfortunately; it wasn't as productive as I would have liked.

elisabethdeklee
Explorer
Creator: Elisabeth de Kleer
Project: Communities

Reading through your blog posts over the last couple of weeks, I realized that many of you are working with teams, so I sent some emails and spent some time reaching out to friends who might come on board to write a couple of short scripts for me and, in general, help me with programming in Unity. I am independent to a fault, but like everyone, I have natural strengths and weaknesses. One thing I've been learning recently is that working in a well-balanced team means everybody wins AND the project gets delivered on time.

Over the last month, I spent a lot of time with my 3D scanner. This week, I got more acquainted with my Gear 360 camera. I filmed a lecture, some promotional material for my friend's startup, and some footage of a prison reentry program that involves Dungeons & Dragons (the subject of a documentary I've been working on for the last year). I am becoming known as the girl with the 360 camera. It's a good way to get invited to things. :wink:

Here's a shot of the D&D group showing off their miniature figurines.

dcj2s61ou2br.jpg

Next week, I need to:

-Work with someone to make sure my VR/360 integrations are compelling enough for the demo (or find a workaround).
-Decide which of the many communities spaces I've filmed to feature in my proposal.

msourada
Explorer
Week 7: AthenaeumVR

After a hiatus last week (coming back from vacation and getting back to real life subsumed us all), the push began anew. First things first: my co-founder is pushing ahead with our art gallery partner to move forward on our agreement for the work, and on my end there are several prongs. First, looking for talent to help us with our endeavor. Thanks to Cecile Esenazi, I've been connected to an artist enclave over in Bulgaria, so I'm broaching first conversations to explain our concept and see if there is a potential fit for them to help us.

Secondly, starting to formally (really putting ink to paper) sketch out what the interim app will look like. I spent a fair amount of time going through personal and public photos of the gallery and conceptualizing the best fit for where a user will start in the VR experience. Admittedly, I have a vision of the experience being based out of their classic courtyard in the antique wing. But with the new facility being constructed starting next year, it's crucial that we also offer this as a design choice (for a starting point).

5gr84xthvcw6.jpg tllf2umvpstam.jpg anrev2szn22jci.jpg bnh5bq5cqzil.jpg 82d0e3pgtrjs.jpg
Thinking a bit (on a tangent, admittedly) about how well we could tweak our experience for AR as well, but at this juncture I'm erring on the side of conservatism and attempted perfection. As the saying goes, 'you can do many things half-heartedly, or do one or two things perfectly.' I believe we are all on the side of getting our VR experience wired and working on Gear, Rift, and Vive, and then looking at what the next steps hold.
Expanding creative conceptualization to more 'esoteric' concerns, such as: should we offer different 'light' settings for a user? For example, does he/she want to tour the works/gallery in dawn light, twilight, night, or full daylight? How much overhead/work is that (or is it just equivalent to a 'skinning' process)?
Or is it just easier to create one setting and bake? (My desire runs towards the former rather than the latter.)
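To make the overhead question concrete: runtime switching (as opposed to baking) could be as small as swapping a lighting preset, roughly like the hedged sketch below. The preset fields and names are placeholders, and this approach only works if the lighting stays real-time rather than fully baked.

```csharp
using UnityEngine;

// Hypothetical sketch: switch the gallery between dawn/twilight/night/day "moods"
// by swapping a skybox, sun rotation, and ambient color. If we bake instead,
// this becomes a choice between pre-baked scenes or lightmap sets.
[System.Serializable]
public class LightingPreset
{
    public string name;           // e.g. "Dawn", "Twilight", "Night", "Daylight"
    public Material skybox;
    public Color ambientColor;
    public Vector3 sunEulerAngles;
    public float sunIntensity;
}

public class GalleryLighting : MonoBehaviour
{
    public Light sun;             // the scene's directional light
    public LightingPreset[] presets;

    public void Apply(int index)
    {
        LightingPreset p = presets[index];
        RenderSettings.skybox = p.skybox;
        RenderSettings.ambientLight = p.ambientColor;
        sun.transform.rotation = Quaternion.Euler(p.sunEulerAngles);
        sun.intensity = p.sunIntensity;
        DynamicGI.UpdateEnvironment(); // refresh ambient lighting from the new skybox
    }
}
```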
Also thinking about motion through the experience.
Having spent a fair amount of 'research' time at the NY Met this summer, I'm keenly aware of how motion is coupled to the experience, but given the constraints in VR I'm still trying to work out how best to leverage this in our own.
Accessing portals is the easy part, but once within, steady motion (as we learned on bootcamp weekend) is probably the first-order solution; I'm wondering whether, for Rift, we can actually explore some 'minimal' motion... (exploring)

Continuing to explore the artists who are now doing more interesting pieces, online and gallery-side, in VR only.
Also continuing with Unity tutorials/education.
This week will be a bit throttled back due to a journey down to San Diego, but there will be some education along the way. We are going to take the 360 camera with us, and experiment, experiment EXPERIMENT. Looking forward to having a ton of footage to import into Unity next Sunday 😉 and mess with.....




marmishurenko
Protege
This week might have been totally off the project, as we visited the Casual Connect conference, but nevertheless we managed to translate the breathing algorithm to Unity somewhere in between game showcasing and sessions.
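For context, the Unity side of a breathing loop can be quite small; the sketch below is a generic placeholder (a simple linear inhale/exhale cycle driving a scale), not the actual algorithm we translated.

```csharp
using UnityEngine;

// Placeholder sketch of a breathing driver: a looping inhale/exhale cycle
// that outputs a 0..1 "breath" value and applies it as a gentle scale.
// The real algorithm may shape the curve very differently; this is just the skeleton.
public class BreathingDriver : MonoBehaviour
{
    public float inhaleSeconds = 4f;
    public float exhaleSeconds = 6f;
    public Transform chest;        // whatever visual the breath drives (placeholder)

    private float timer;

    void Update()
    {
        float cycle = inhaleSeconds + exhaleSeconds;
        timer = (timer + Time.deltaTime) % cycle;

        // 0..1 during the inhale, back down to 0 during the exhale
        float breath = timer < inhaleSeconds
            ? timer / inhaleSeconds
            : 1f - (timer - inhaleSeconds) / exhaleSeconds;

        chest.localScale = Vector3.one * (1f + 0.05f * breath);
    }
}
```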
Also tried some Gear VR games at the conference and shared experiences with Brazilian developers. Everyone is complaining about how hard it is to implement the controller. I really hope we can get some more tools for implementing controller features and testing them in the editor.
Hope next week will be better. 

aerialshading
Explorer
My team is still chugging along on their assigned tasks, and I'm finally back home and able to work in my lab again. This last week has been a whirlwind of tasks and problems, and it's been quite the ride. We had a meeting early this week to decide on a game plan for the next week, as our 3D artist is currently moving to Australia. Our programmer now has direct access to a Rift rig and has implemented an interaction system and a teleportation system, as well as some scripts for triggering audio logs. It was definitely nice to pass off my rudimentary scripts to a competent developer! Our other 3D artist sent me a message that just said "I'M HAVING A BLAST WITH THIS" along with screenshots of a half dozen models he had made that day, so I'm very happy with the momentum we have with CURIO. I have also consulted a composer who has made some demo tracks for our experience, and as soon as my department's recording studio is revamped I'm going to have a go at recording some VO.
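For context on the audio log piece, that kind of trigger script is conceptually tiny; the sketch below is a generic version (the one-shot flag, the trigger volume, and the "Player" tag are assumptions, not our programmer's actual code).

```csharp
using UnityEngine;

// Sketch of an audio log trigger: when the player walks into a trigger volume,
// play the attached clip once. Assumes the object has a trigger collider
// and the player object is tagged "Player".
[RequireComponent(typeof(AudioSource))]
public class AudioLogTrigger : MonoBehaviour
{
    private AudioSource log;
    private bool played;

    void Awake()
    {
        log = GetComponent<AudioSource>();
        log.playOnAwake = false;
    }

    void OnTriggerEnter(Collider other)
    {
        if (!played && other.CompareTag("Player"))
        {
            played = true;
            log.Play();
        }
    }
}
```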

Since my wonderful team is hyped up and making incredible strides, I've been learning what it means to be a producer. I've realized over the course of this project that my role (which we've jokingly referred to as Professional Cat Herder) is actually, well, what a producer does in a studio anyway! I've been lucky enough to go to several events with my team, including the launch of Fullbright's Tacoma, and I've been picking the brains of artists and actors and developers and producers about what they do. It's an interesting thing to discover that I've fallen into a role that, frankly, I didn't even know existed until the bootcamp this year.

I know these posts are supposed to focus on the progress of our experiences, but reading back a lot of the things I've written this summer has taught me a lot about what games are, what it takes to be a studio, and what I need to do to be successful - and I think the most important thing I've learned overall is that my skills are important and valuable to the team. This is something I struggle with a lot! But doing this and having Oculus believe in my ability to do and make great things - that's what's really pushing me to make something really, really cool.

sjhx
Honored Guest
This past weekend I went to Siggraph to learn more about the state of the art of graphics (especially for VR and realtime rendering). It was pretty on topic for the work I was trying to do for my immersive audiovisual experience! There were many great AR/MR/VR exhibits which were very creative and got me thinking outside the box on all the various types of interactions and inputs for these kinds of experiences. Apart from the conference, my goal this week was to port over my work from the Gear to the Rift. I managed to get a good baseline ported over and next will work on more intensive graphics work for the audioreactive part of the experience.
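For the audioreactive part, the basic hook tends to look something like the sketch below: sample the spectrum each frame, average a few low bins, and feed the smoothed level into a shader property. The _AudioLevel property name and the band choice are placeholder assumptions, not the actual setup.

```csharp
using UnityEngine;

// Sketch of an audioreactive hook: pull a small FFT window from the playing track,
// average the low bins into a single level, and push it into a shader property.
[RequireComponent(typeof(AudioSource))]
public class AudioReactiveDriver : MonoBehaviour
{
    public Material targetMaterial;   // material whose shader reacts to the audio
    public float smoothing = 8f;

    private AudioSource source;
    private readonly float[] spectrum = new float[256];
    private float level;

    void Start() { source = GetComponent<AudioSource>(); }

    void Update()
    {
        source.GetSpectrumData(spectrum, 0, FFTWindow.BlackmanHarris);

        float sum = 0f;
        for (int i = 0; i < 16; i++) sum += spectrum[i];   // low-frequency bins
        float target = sum / 16f;

        // Smooth the value so the visuals pulse rather than flicker.
        level = Mathf.Lerp(level, target, smoothing * Time.deltaTime);
        targetMaterial.SetFloat("_AudioLevel", level);
    }
}
```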

shuandang
Protege

Lantern Models

Put together a model of a lantern this weekend, and also got the Unity project building to the Gear VR.  Celebrating this minor success!  

hq0qy7o7bcnr.png

It's a completely different feeling viewing this inside of the headset!

I had to tackle a lot of 3D processes, such as how to make a skybox, model, and texture. The lanterns, and eventually the level itself, are being created in Blender. I'm amazed at the in-VR art that others are creating; it really adds that hand-crafted feel! My quest to try out Quill on the Oculus begins!


s7q24t2g9f53.png

Lighting the lanterns has also been a bit challenging, especially when it comes down to the minute changes in each setting. My preliminary lighting had me putting 5 spotlights on each lantern to light every side (including the inside).
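If the per-lantern spotlights get too heavy, a common lighter-weight setup is an emissive shade material plus a single point light inside each lantern. The sketch below is only an illustration of that idea; the field names and values are placeholders, and it assumes the Standard shader's emission properties.

```csharp
using UnityEngine;

// Rough sketch of a lighter-weight lantern glow: one point light inside the lantern
// plus an emissive shade material, instead of five spotlights per lantern.
public class LanternGlow : MonoBehaviour
{
    public Renderer shade;                        // the lantern's paper/shade mesh
    public Color glowColor = new Color(1f, 0.75f, 0.4f);
    public float intensity = 1.2f;

    void Start()
    {
        // Emissive shade so the lantern surface reads as lit from inside
        // (Standard shader emission keyword and property).
        shade.material.EnableKeyword("_EMISSION");
        shade.material.SetColor("_EmissionColor", glowColor * intensity);

        // One point light for the spill onto the boat and water.
        Light glow = gameObject.AddComponent<Light>();
        glow.type = LightType.Point;
        glow.color = glowColor;
        glow.range = 3f;
        glow.intensity = intensity;
    }
}
```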

Progress has been made!  My goal for this demo is to build out a prototype of the first chapter, and have lots of great interactions with the boat and lanterns for the player to act out!  Getting the blocky outline of the starting area gave me a better idea of how large of a space the boat is going to move through.

02un8nr60a6a.png

Given the speed of the boat (it's going to move very, very slowly), it'll be a couple minutes of gameplay.
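For what it's worth, if the boat ends up being a constant-speed mover, the demo length just falls out of path length divided by speed (a hypothetical 30 m route at 0.2 m/s is about two and a half minutes). A minimal sketch of that kind of mover, with placeholder waypoints and speed:

```csharp
using UnityEngine;

// Minimal sketch of a slow, constant-speed boat moving between waypoints.
// Total duration is simply total path length / speed.
public class SlowBoat : MonoBehaviour
{
    public Transform[] waypoints;
    public float speed = 0.2f;     // metres per second: very, very slow

    private int next;

    void Update()
    {
        if (next >= waypoints.Length) return;

        Vector3 target = waypoints[next].position;
        transform.position = Vector3.MoveTowards(transform.position, target, speed * Time.deltaTime);

        if (Vector3.Distance(transform.position, target) < 0.01f)
            next++;
    }
}
```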

Andrew

robotliliput
Protege
Updates from past week:

1. After receiving feedback from the Oculus team, I decided to whittle down my app structure and content format even further. Rather than trying to capture the complexity of a complete cycle of abuse in each experience, I want to focus each one on a specific abusive behavior. Rather than a menu where portraits of storytellers are portals to an experience, I will create a simple menu with objects representing each behavior. The menu will have a label with the name of the behavior, as well as a short text description of it. There will still be 5 experiences, covering the following subject matter: Triangulation, Gaslighting, Minimizing, Stonewalling, and Hoovering.

To start, each experience will contain a single sketch, with some elements potentially animated. The sketch will be accompanied by audio/dialogue that describes a specific scenario relating to the abusive behavior. In the future, I may continue to add additional scenarios from stories and memories contributed by other people. 

2. I started work on the first experience, a representation of Triangulation. Triangulation is defined by Melanie Tonia Evans as "one individual attacking, discrediting (smearing) and/or abusing another person with the use of third-party people or institutions." I created a short script, with helpful feedback from fellow launch-padder Jesmin. She suggested including pauses between lines of dialogue to allow for maximum impact. The resulting script is below:

  • You: “What do you want to do for dinner later?”

  • Them: “Hmm. Maybe that Thai place down the street?”

  • You: “Oh yeah, let’s do that.”

  • Them: silent, 5 seconds.

  • Them: “My friend offered to watch my dog for me.”

  • You: “Oh, that’s nice! When?”

  • Them: “Next weekend.” Silent 3 seconds. “It would have been nice if my own girlfriend offered...”

  • You: “My apartment doesn’t allow dogs... otherwise I would have.”

  • Them: silent 3 seconds, “They wouldn’t even know, she’s so quiet.”

  • You: “I could get evicted if someone saw and reported it.”

  • Them: “You know what, forget about it...”


3. I spent a day working in Tilt Brush to create the scene for the Triangulation experience. I still don't have my laptop back from Razer support and it has been the cause of much delay. I am so thankful that Jesmin and her collaborator helped me with their equipment and space to work for the weekend! I would have been much further behind without them. I am hoping the laptop finally comes back by the end of this week, though I can't be sure.

4. I hope to have the first sketch loaded onto a GearVR device later tonight so that I can do demos at the Tilt Brush event tomorrow night. Currently, I am running into some issues with Tilt Brush toolkit scripts not working on one of my backup computers. I am going to try loading the scene in an older version of Unity to see if that helps. Below is the sketch loaded in Unity, without the correct shaders applied. The scene is of two women sitting at a table in the kitchen of an apartment.

382u4rnsjccj.png

In the next week I plan to put together the app menu scene and use proxy models to trigger the loading of a scene. If there is time, I would also like to get the rough audio recorded and loaded into the scene. I will continue reaching out to find contributors to submit their stories for additional experiences. Jesmin may help with some of the Tilt Brush sketches and is attending the event at Google with me tomorrow to get feedback.
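The proxy-model triggers themselves can be very small; the sketch below is a hedged example of that idea. The scene name, the pointer-click hookup, and the assumption of a gaze/controller input module (plus a collider on the object and a raycaster on the camera) are all placeholders for whatever ends up in the project.

```csharp
using UnityEngine;
using UnityEngine.EventSystems;
using UnityEngine.SceneManagement;

// Sketch of a menu proxy object: an object representing one behavior
// (e.g. "Triangulation") that loads its experience scene when selected.
// Assumes an input module that sends pointer click events (gaze or controller),
// a Collider on this object, and a PhysicsRaycaster on the camera.
public class BehaviorPortal : MonoBehaviour, IPointerClickHandler
{
    public string sceneName = "Triangulation";   // placeholder scene name

    public void OnPointerClick(PointerEventData eventData)
    {
        SceneManager.LoadScene(sceneName);
    }
}
```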