
Week 6: Oculus Launch Pad (Due July 30th Midnight)

Anonymous
Not applicable
We are about halfway through! Let's see how everyone is doing!

CatsAndVR
Explorer
This week it was a little tough to get Launch Pad work done, but I did manage to work on some music for the project and do some testing on the phone. I decided to take a mental break and get some video gaming and music creation in for inspiration to continue the project.

Thanks to the Unity 360 update, I also decided to go out and shoot some 360 elements, and I finally got the microSD card I needed to shoot with the Gear 360 camera.
Hopefully today I can figure out what 360 content I want to add to my project.
[screenshots]


 


sjhx
Honored Guest
This week I worked on learning shader code. It feels a lot more primitive compared to other programming languages, so it took time getting used to. There were a couple of tutorials online, but the most helpful resource was reading The Book of Shaders. Focusing on shaders is useful for my project since the experience consists of many similar objects doing somewhat uniform movement and interactions (which, apparently, a particular type of shader is great at). As a reminder, I mentioned having performance issues in my last update, which is why I am transitioning the interactions/"animations" to the shader. I haven't converted my project to shader code yet, but I aim to do that this coming week now that I've familiarized myself with shaders.
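The kind of per-vertex, time-driven motion that typically gets moved from CPU animation into a vertex shader can be sketched in plain Python; the function and constants below are illustrative, not from the project:

```python
import math

def displace_vertex(position, t, amplitude=0.1, frequency=2.0):
    """Offset a vertex along y with a phase based on its x/z position.

    This mimics the per-vertex math a vertex shader runs on the GPU:
    every object evaluates the same cheap function of (position, time),
    so many similar objects can share one animated material instead of
    each being animated on the CPU.
    """
    x, y, z = position
    phase = x + z                                   # vary the wave across the mesh
    offset = amplitude * math.sin(frequency * t + phase)
    return (x, y + offset, z)

# The same function evaluated at two times moves the vertex smoothly.
print(displace_vertex((1.0, 0.0, 0.0), t=0.0))
print(displace_vertex((1.0, 0.0, 0.0), t=0.5))
```

Because the motion is a pure function of position and time, no per-object state needs to be uploaded each frame, which is where the performance win comes from.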

marmishurenko
Protege
This week was very productive. We figured out how to generate some organic stuff/fractals in real time, started porting the breath detection algorithm from openFrameworks to Unity, and made some environments. We also took the opportunity to participate in the VR Brain Jam and kept working on our project (which is called AWERE, by the way). We met some wonderful scientists there, and they were kind enough to explain to us how to gather statistics to prove our hypothesis.

Some L-systems (generated at runtime)
[screenshots: L-system renders]
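For readers unfamiliar with the technique: an L-system grows structures by repeatedly rewriting a string of symbols, which is what makes runtime generation cheap. A minimal sketch in Python, using Lindenmayer's classic algae rules rather than anything from AWERE:

```python
def lsystem(axiom, rules, iterations):
    """Expand an L-system by applying all rewrite rules in parallel.

    `rules` maps a symbol to its replacement string; symbols without
    a rule are copied through unchanged.
    """
    s = axiom
    for _ in range(iterations):
        s = "".join(rules.get(ch, ch) for ch in s)
    return s

# Lindenmayer's original algae system: A -> AB, B -> A
print(lsystem("A", {"A": "AB", "B": "A"}, 4))  # ABAABABA
```

A renderer then walks the final string, interpreting symbols as turtle-graphics commands (draw, turn, push/pop), which is how the branching organic shapes emerge.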

Breath tracking data: [chart]
Starting environment

[GIF]

Headed to Casual Connect this week to promote our other game! Good luck everyone with the development!

sskolnik
Explorer

Week 6 Oculus Launch Pad, Shayna Skolnik



This week has been incredibly productive despite the fact that I was traveling to yet another conference, this time the Earth Science Information Partners (ESIP) summer meeting in Bloomington, Indiana, where I presented VR/AR for Science and the VR study that I did for NASA.



Fernando worked on stitching the 360 footage that we took in Delaware, and we ended up with about 45 seconds of usable clips. I’m reworking the script to include more data and maps to see if that makes the piece work without having to film more. At least for the demo, it should be sufficient, as I want the finished piece to be 60-90 seconds.



Adam and I also continued to work on the menu interface for the app. As noted in the Oculus group, Unity's new video player component can now be used to render to a texture that the skybox uses, creating a great and simpler way to play 360 videos. I switched to this method, and appreciate how much simpler it made the project, and the fact that it has less distortion than the inside-out sphere method.

[screenshot]



The next step was to test this method on the Gear VR platform. We were having some trouble getting the project to build to Android; however, this turned out to be caused by an outdated Java SDK. The video plays fine now.



Previously we were using textures on meshes for menu objects to give them a curved appearance. Adam had high hopes for the CurvedUI Unity plugin as a way to replace this with a system that has the same look but uses Unity's native UI canvas system, but so far it's not working. We are going to contact the developer.



Since I was at ESIP, I also had the chance to network with a lot of data scientists and practitioners (NOAA, NASA, USGS, many universities, etc.) and get new leads on where to find data that will work for the sea level rise component. This coming week will be a lot of follow-up and research to make sure that the science and data component is “right” before we start any animation sequences and integrate them with the 360 video clips.

nicole_babos
Protege

We worked on updating the start screen by creating a bulletin, a train station depot, and some foliage. I was really inspired by using a sphere with inverted normals, like the 360 video technique Sarah from Unity showed us during Launch Pad. So I created my own sphere (with a flat top and bottom) and made a mountain range background in Illustrator. When I put it in the scene, I couldn’t decide if I liked it or if I wanted to create some 3D mountains.

[screenshot]



We also added face-randomizing logic. When the scene loads, each passenger gets a randomized personality that determines their face texture. As the tension in the train increases, their faces become less content and change to progressively more irritable expressions. Basically, everyone will get mad if you don’t clean up the vermin or remove stowaways.

[GIF: passenger faces changing]
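The personality-and-tension idea can be sketched as a couple of small functions. This is a hedged guess at the shape of the logic, with placeholder personality and expression names rather than the game's actual assets:

```python
import random

# Placeholder names; the actual textures/personalities are the game's own.
PERSONALITIES = ["cheerful", "grumpy", "anxious"]
EXPRESSIONS = ["content", "uneasy", "annoyed", "furious"]

def assign_personality(rng=random):
    """Pick a personality at scene load; it selects the face texture set."""
    return rng.choice(PERSONALITIES)

def expression_for(tension, patience=1.0):
    """Map rising train tension (0..1) to a progressively angrier face.

    A passenger's patience scales how quickly they advance through
    the expression list.
    """
    index = min(int(tension / patience * len(EXPRESSIONS)),
                len(EXPRESSIONS) - 1)
    return EXPRESSIONS[index]

print(expression_for(0.0))   # content
print(expression_for(0.95))  # furious
```

Keeping tension as a single scalar means unresolved tasks (vermin, stowaways) only need to bump one number for every passenger's face to react.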



Additionally, we added nine hats to add to the uniqueness of each passenger. With the hats working as a proof of concept, we liked the execution enough to work on neckwear such as scarves, neckties, and a few other accessories. Below is a screenshot of the hats, and further down is a GIF of the passengers wearing the hats in a scene.

[screenshot: hats]



Additionally, we added new logic for passengers to show their train tickets, resolving a passenger’s anxiety about having their ticket checked. If a passenger doesn’t have a ticket, they are a stowaway and must be kicked off the train. We added the logic for stowaways, so when they don’t have a ticket, you can grab them and toss them off the train. This adds to the set of tasks the player must complete to provide a satisfactory ride for the passengers.

[GIF]

[screenshot]



Since you will need to toss out the vermin in order to make the passengers happy, I added animations to the doors and windows. When you click on a door or window, it will slide open, stay open for a little while, then slide closed automatically.

[GIF: door and window animations]
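The open/hold/close behavior is essentially a tiny timeline driven by elapsed time. A hedged sketch in Python (in Unity this would typically be a coroutine or animation clip; the durations here are made up):

```python
def door_state(elapsed, open_time=0.5, hold_time=2.0, close_time=0.5):
    """Return the door's phase and 0..1 slide progress at a given time.

    Progress 0 is fully closed, 1 is fully open. The three durations
    partition the timeline into open -> hold -> close.
    """
    if elapsed < open_time:                       # sliding open
        return "opening", elapsed / open_time
    elapsed -= open_time
    if elapsed < hold_time:                       # staying open
        return "open", 1.0
    elapsed -= hold_time
    if elapsed < close_time:                      # sliding closed
        return "closing", 1.0 - elapsed / close_time
    return "closed", 0.0

print(door_state(0.25))  # ('opening', 0.5)
print(door_state(1.5))   # ('open', 1.0)
print(door_state(3.5))   # ('closed', 0.0)
```

Expressing the animation as a pure function of elapsed time makes it trivial to interrupt or restart when the player clicks again.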



I want to focus on getting the game into the VR headset next week, so we can begin testing in the production environment. We’re pretty satisfied with our testing of the mechanics in the test environment, because they provide a gameplay loop that the player can continually satisfy until the session ends.

Nicole
www.fancybirdstudios.com

youtnodknoem
Protege

Some weeks are fun, and some weeks are rough. You have to be ready to modulate emotions when things go poorly, and not to rest on your laurels when you are winning. This week presented a good opportunity for reflection on the state of our game.

As many readers will know, my team set out with an ambitious vision to complete within three months. We were going to create a 2v2 team-based multiplayer game in VR, inspired by MOBAs and recent shooters like Overwatch. Last week we had our first playtest with members of the Oculus Developer Strategy team. We uploaded our prototype to the Oculus Store (word of advice: don't wait until the submission deadline to upload your build; the build process takes some getting used to).

The demo gods were not kind to us on Thursday. Bugs that we had never seen before began to surface. The game started crashing for the dev rel team, and we spent the first 25 minutes of the call trying to get everyone set up and running properly. After that, the dev relations team only needed five minutes with our prototype before we were asked to go back to the drawing board and keep working on it. Looking at it objectively, it made sense for them to give us that feedback, but it still stung since we had worked so hard on it.

Never take anything personally. With every negative piece of feedback, you gather valuable nuggets of data to improve your next experience. We've already taken steps to improve stability, and their feedback has absolutely reshuffled our roadmap priorities. Stability, aesthetics, and ease of learning are going to be a large part of our final push in the last month and a half of the program.

We're eager to prove to the developer relations team that we can release a polished, fun demo before the end of Launchpad, and look forward to our next test with them.

In the midst of all that, I am proud of what our team did to pull off the test. We got our bases textured in our main level, and it is already making a big difference in the aesthetic feel of the game. All of our core mechanics and multiplayer components are now in place. We'll have demo and gameplay footage for you to see soon!

Below is some in-editor footage of our base and our lobby scene to tide you over until you see the actual multiplayer gameplay. :)

Textured Base -- still need to add lightmaps and normals.

Dark Lobby with view of Sky

 

ben_cone
Explorer
Week 6!  So fast!

Since we are developing for the Gear VR, and lighting is a bit taxing, especially in a game where dynamic lights are the core mechanic, we needed to find a way to artificially "light" the world using flat-shaded textures.  We created a function that, each frame, draws the location of a light into a render target in relation to the UV of a material.  The material then unmasks its "lit" portion.  We had previously thought to use decals, but those became expensive and were ineffective.  This method is much cheaper and easier to work with.  Each material will have a lit and an unlit version, which will be masked by the lights to create the illusion of light.

https://youtu.be/OMvDsms9vH4

[screenshot]
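The core of the masking idea can be sketched as a per-texel test against lights projected into UV space. This is a hedged Python sketch over a plain 2D array standing in for the render target; the names and linear falloff are assumptions, not the team's actual shader:

```python
def bake_light_mask(width, height, lights):
    """Write a 0..1 lighting mask into a render-target-like 2D array.

    `lights` is a list of (u, v, radius) tuples in UV space (0..1).
    A material shader would then blend between its unlit and lit
    textures using this mask, instead of evaluating real-time lights
    per pixel on the Gear VR.
    """
    mask = [[0.0] * width for _ in range(height)]
    for y in range(height):
        for x in range(width):
            u, v = (x + 0.5) / width, (y + 0.5) / height
            for lu, lv, radius in lights:
                dist = ((u - lu) ** 2 + (v - lv) ** 2) ** 0.5
                # Linear falloff toward the edge of the light's radius.
                lit = max(0.0, 1.0 - dist / radius)
                mask[y][x] = max(mask[y][x], lit)
    return mask

mask = bake_light_mask(8, 8, [(0.5, 0.5, 0.4)])
# Texels near the light are bright; corners outside the radius stay 0.
```

Because the mask is redrawn only where lights are, the per-frame cost scales with light count rather than scene complexity, which is the win over decals.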

We have also devised a system for character movement.  Copito's pathing will be simple and follow a few rules: most recently planted flower, flower proximity to danger, flower lifetime, past experiences, etc.  Copito's logic will be a core concept of the game, since his emotional state is the basis of the experience.  Once the demo is created, his pathing logic will be refined as the game develops and elements are added.
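Rule lists like that often reduce to a weighted score per flower, with Copito walking toward the best-scoring one. A hedged sketch of that idea; the field names and weights are placeholders, not tuned game values:

```python
def score_flower(flower, now, weights=(1.0, 2.0, 0.5)):
    """Score a flower for Copito to walk toward; higher is better.

    `flower` is a dict with 'planted_at' (timestamp), 'danger_dist'
    (distance to the nearest threat), and 'age_limit' (seconds of
    life remaining).
    """
    w_recent, w_safety, w_life = weights
    recency = 1.0 / (1.0 + (now - flower["planted_at"]))  # newest wins
    safety = min(flower["danger_dist"], 10.0) / 10.0      # clamp to 0..1
    life = min(flower["age_limit"], 60.0) / 60.0
    return w_recent * recency + w_safety * safety + w_life * life

def pick_target(flowers, now):
    """Choose the flower with the highest combined score."""
    return max(flowers, key=lambda f: score_flower(f, now))

flowers = [
    {"planted_at": 10.0, "danger_dist": 9.0, "age_limit": 50.0},
    {"planted_at": 99.0, "danger_dist": 9.0, "age_limit": 50.0},
]
print(pick_target(flowers, 100.0))  # the freshly planted flower wins
```

Keeping each rule as an independent 0..1 term makes it easy to retune or add rules (like "past experiences") as the design evolves.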

We have also just begun the modelling process for the environment now that the core mechanics are done. 

We plan to have the beginning of the map done and working in the game by next week.  This will give us a template to follow every time we add new elements.  We also hope to have the character animated for the intro story element, plus walk cycles for the game elements.  The next week will be about the Snake, repeating these steps for it.


Anonymous
Not applicable

23 days left to complete my Oculus Launch Pad project! Week #6

This post is also published on my blog

Week 6 marks the halfway point of the Oculus Launch Pad program. The cohort of 100 participants in this 2017 edition has until September 10, 2017, to deliver their prototypes to compete for funding. In my case, things are slightly different, since I am no longer eligible for funding (check out Week 5’s blog post to find out why). So technically, I am not tied to this deadline. But I still want to make the most of the program and welcome a deadline to help me move things along. I even decided to move my own deadline forward! Well, I am going on vacation on August 23, and realistically I won’t be able to do any work during my travels. This means that my own personal deadline is August 22 and I only have 23 days left to complete and submit my prototype! That’s tight, but doable.

This past week

  1. I have captured a lot of footage at a couple of capoeira events (catch up with Week 4’s blog post if you missed the shooting story) and it is now time to edit it. I started by transferring all the files from my 360 camera onto my laptop. Between the two events, I captured 33 GB of video! Yes, 360 videos are heavy, particularly in 4K. Then I started stitching the videos. I used CyberLink's ActionDirector software, which comes with the Samsung Gear 360 camera. My fear was that my laptop would not be capable of stitching the 4K videos. I started with a 2-minute video, then managed to stitch videos up to 6 minutes so far. It is a slow process and would certainly work better on a better laptop, but it looks like I may be able to get away with my current machine for now. I should be able to upgrade once I start my new job, but for now I’m trying to do everything with my limited budget. If I bought a new laptop, I’d invest in a high-end model good for video editing as well as VR, so I’d be looking at around $1,400–2,000. Definitely a cost I’d like to avoid for now.
  2. Through this process, I now have a bunch of videos to review, to decide which ones I will keep in the final edited pieces. This is great! Well, sort of. This is also the step that highlights what went wrong during the shooting and what I could have done better. Video issues. Sound issues. Shaking issues. Dirty lenses, etc. I have started writing a list of all the mistakes I made so you don’t have to make them yourselves (you are welcome!). I will publish the list in a few weeks, once I have made ALL the mistakes 😉 Seriously, this is a very humbling process. But hey, I’ve been transparent all along: I am a complete beginner and I’m going through the learning curve, one step at a time. So I am bound to make mistakes. This week I also stumbled upon this article, which I found very comforting. It encourages ‘everyday people’ to build VR. ‘You gotta try DIY Virtual Reality, even if you make hideous things.’ Ok, sir, I’m on it.
  3. Finally, I have found more meaning for my project. You may be wondering why I picked capoeira as a topic for my first VR experience. I haven’t spent a lot of time talking about my grand vision for the end product, and I will write more on that in the coming weeks. In the meantime, I have spotted the video below, illustrating how capoeira can unite people who may otherwise be divided. Capoeira is about peace, togetherness, love, respect for others, and self-control. I’d like to raise awareness of this tradition, as I believe everyone could become a better version of themselves by joining the capoeira community and practicing capoeira.
https://www.facebook.com/theguardian/videos/711871115667321/

Capoeira is about peace, togetherness, love, respect for others, self control and unity

The week ahead

  1. More stitching! I think it may take me several weeks to be done with all the stitching. I’ll do a little bit every day (and night!)
  2. Editing. I aim to start learning Adobe Premiere Pro, as I’d like to use it with the Mettle plugin to edit my 360 video. Another steep learning curve ahead. I can do it!
  3. Starting the creation of the app using Unity. My fantastic Unity teacher at the Launch Pad boot camp, Sarah Stumbo, has just shared this article with us: How to integrate 360 video with Unity. Perfect timing!

More updates next week! Thanks for reading! 

chinniechinchil
Protege
Tania Pavlisak - Enliven VR week 6:

I started this week by tinkering with the pet-swapping module: I created 3 different points in space related to the user's position (sitting, reclining, or lying down flat), placed planes to represent the pets, and wrote scripts to instantiate and destroy objects at run time based on preference.  I was a bit worried about performance at this point, but since there will be only one pet at a time in the level, the instantiate/destroy method shouldn't affect performance much.
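The "one pet at a time" guarantee can be captured in a tiny manager object. A hedged sketch in Python standing in for Unity's Instantiate/Destroy pattern; the pet names and anchor labels are placeholders:

```python
class PetSwapper:
    """Keep at most one pet instance alive at a time.

    Swapping destroys the current pet before creating the next, so
    the scene never holds more than one pet, which is what keeps the
    instantiate/destroy cost bounded.
    """

    def __init__(self):
        self.active = None

    def swap(self, pet_name, anchor):
        if self.active is not None:
            self.destroy(self.active)        # Unity: Destroy(gameObject)
        self.active = {"name": pet_name, "anchor": anchor}
        return self.active

    def destroy(self, pet):
        pass  # engine-specific cleanup would go here

swapper = PetSwapper()
swapper.swap("cat", "sitting")
swapper.swap("dog", "reclining")
print(swapper.active["name"])  # dog
```

The anchor argument mirrors the three position points (sitting, reclining, lying flat), so the new pet spawns at the point matching the user's pose.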

Afterward, I modified the menu system a bit. For easy access, I want the menu to be available at all times, probably floating somewhere the user can gaze at. However, I don't want the menu to obstruct the user's view of the world. After pondering for a while, I used the menu system from my favorite mobile game, Neko Atsume, as a reference. I like how its menu is always available at the top left corner, and it is really easy to use.

Here is a sample of the pet swapping module and the new menu system.

[GIF: pet swapping and menu]

For the positional menu, I decided to line the buttons up vertically instead of horizontally like before. This way, it is a lot more comfortable for the user, in any position, to pick the menu button matching his or her current position.

As I tested the opening scene in space, I realized this level could be pretty relaxing as well. So instead of just having the user float in space, I started creating a mock-up space pod. For the art direction of this project, I try to avoid square or sharp-edged geometry and favor curvier shapes, so I made the space pod a transparent sphere.

For the pod seat, I looked up some reference pictures of sci-fi captain's chairs and found that most of the designs look really uncomfortable, like the ones from Star Trek.  Then I turned my search to pool floats, and they look really comfortable. However, having a pool float inside a bubble felt a bit weird, so I added some structure, like curvy metal beams, to the pod. The beams help create the feeling of being inside a personal space pod, and interestingly enough, I feel safer inside this room than I did just floating in space.

Below is a video of the space pod experience. I haven't had the time to implement the space as its own relaxation level.

[GIF: space pod]


I let my husband test this build last night. Unfortunately, due to time, I wasn't able to create custom materials for the mock-up space pod, so I used an existing glass material from my previous project, which was a survival horror game, and the glass was cracked. As my husband tried the demo, he felt very uncomfortable in the space area since the cracked glass made him nervous.

For next week, most likely I won't get much done in terms of development, since I will be at DevGAMM, Casual Connect Seattle, and also a 48-hour game jam hosted by Seattle Indies. I will definitely bring my latest build and Gear VR, and test it with whoever wants to give it a try. We'll see how it goes.

Here is the video of the latest build. See you next week!
- Tania

prettydarke
Protege
I'm in LA finishing up my teaching gig. It's been a wonderful week being back home and hanging out with artists. I had the chance to check out the Underground Museum, and it was a really transformative experience. I sat with the work for about 3 hours. Walking around. Taking notes. Sitting and thinking and writing. I realized some things I could do with my current VR work.

I head back to Madison on Tuesday; a dev friend is picking me up, and we're going to jam on some ideas. My artist friend is not going to be available for the SotU project, but I think that's okay. I'm now thinking we can try two different paths.

1. Touch base on SotU and see if we can execute that project to a satisfying point. I like our initial idea but it continues to develop into something more interesting. It's also important that it's the kind of project we want to work on long term so that if it doesn't fit for launch pad it isn't just a shelved demo. If we like the new direction but need more time to make something we feel can be presented to a wide audience, I'm thinking we'll pivot to another option. 

2. I was lucky enough to spend some time with one of my former professors and now a very good friend. They sent me along to the Underground Museum where I sat and really thought about my work, more specifically, my latest VR project. I had a solo show a few months ago and the work was well received, but personally I found many flaws that I was unsure of how to fix. Sitting in the Museum reflecting on talks I'd had with other artists - I came up with some new ideas. I'm excited. I think I've found a way to improve upon the last piece while also addressing some of the immersive/interactive limitations of other VR work. 

Ideally we take steps in both directions. I don't know which project I'll end up submitting; we'll see what works out. I will say, creating a work related to In Passing would allow me to take a more active role and be responsible for not only the art assets and creative direction, but some of the code. Lots to consider. Looking forward to next week.