
Week 7: Oculus Launch Pad (Due Aug 6 Midnight)

Anonymous
Not applicable
Week 7 - we are at the halfway point. Looking forward to seeing the updates you are sharing. See you on the call Friday!


Anonymous
Not applicable
OLP 2017 Blog Post #7

This week I got some feedback from Oculus about my proposal and also from one of the Oculus mentors.

The proposal feedback says my budget is too high, unless I can pull off a great demo, in which case it might actually be too low and/or short on necessary production time. (paraphrasing)
That's good to know, and a great demo is exactly what we are shooting for. It's going to be tough for sure.

The Oculus mentor simply said..."Hey, it needs to tell a story! Give the player something to chew on" (also paraphrasing). He is totally right, and it's exactly what I needed to hear to move forward. This was what I was thinking too...and it's funny how it didn't really hit home until hearing it from the outside.

The latest build of my game is indeed lacking a motivation for the player, and this week we are talking about, acting on, and hopefully solving that.

Hearing all this feedback has energized me, and now it's up to me to energize my team. One thing I am learning from this process is that I don't seek outside critique enough. Even if it's just to reinforce my own beliefs, it's good to hear other people's ideas and takes on mine.

Note: Listen closer to gut instinct. Believe.
Note2: Living in a bubble es no bueno.

Anonymous
Not applicable
OLP 2017 - Week 7

More building in Unity, testing in Gear, and running video compression tests. I am still having issues with the encoded .webm video filetype (VP8) and an "error recognizing video header" message, so I am troubleshooting and pursuing other options at the same time. I have been using FFmpeg, Adobe Media Encoder, and Handbrake for these tests.
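Since the error complains about the video header, one quick sanity check (just an illustrative sketch, not part of my actual pipeline) is whether the file really begins with the EBML magic bytes that every valid WebM file starts with. A clip that one of the encoders muxed into the wrong container would fail exactly this check:

```python
# Every WebM (Matroska-based) file begins with the four EBML magic
# bytes 0x1A 0x45 0xDF 0xA3. If a "webm" file doesn't, the encoder
# likely wrote a different container and players will reject the header.

EBML_MAGIC = b"\x1a\x45\xdf\xa3"

def header_is_webm(first_bytes):
    """True if the given leading bytes start with the EBML magic."""
    return first_bytes[:4] == EBML_MAGIC

def looks_like_webm(path):
    """Read the first four bytes of a file and check the EBML magic."""
    with open(path, "rb") as f:
        return header_is_webm(f.read(4))
```

For example, an MP4 starts with an `ftyp` box instead of the EBML magic, so it fails the check even if it was renamed to `.webm`.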


Also, a lot of trial and error with mapping video textures onto meshes and seeing what I can get away with in terms of performance. I was excited to be able to use some Tiltbrush meshes. 


On the workflow side, I do wish that there was a way to view the Unity project more directly in Gear without having to plug in and out all of the time. I was going to try publishing over wifi, but given my lack of techspertise I'm not sure the time savings would be that significant for me. I really wish the Rift had a GearVR simulator setting!


Been a busy week, but still making progress. Enjoyed the “Friday Office Hours” from my perch on a mountain in Eastern Ky (family reunion).  Connection was spotty, so I wasn’t able to hear it all, but what I did hear was interesting and helpful. Thanks again to Ebony and team for making it happen. 


Have a great week! 

brendachen
Protege
Just got back from Japan this week and was super jetlagged and had a huge deadline at work so did not get much of a chance to work on anything. I did however set a few more deadlines for artwork and music and will have some new content by next week. Currently I’m working on the 3D models while the rest of my team is working on concepts and scene building in Unity. I will be taking the rest of this weekend to work on this project. That’s about all for this week. Concept art by team member Dave Zhu below.
r43qpit3anow.png

virtuallyfifer
Protege
NEURO-EXPLORER WEEK 7
Learning Unity with VRTK and Oculus SDK

This week we trained in the art of watching YouTube videos. We started by getting the headset up and running within Unity.




We are still having a difficult time getting the Tiltbrush SDK working properly and hopefully this next week we will be able to figure that out when we add a developer into the mix. 

We were able to get flying working with the touch controllers visible which is exciting.



This next week we will work heavily on artwork and storyline and really get the experience moving. 

haskins
Protege

Thoughts from week 7:


Virtually Home


A small setback here, someone stole one of our GearVR headsets … however the GearVR is $99 on Amazon Prime, so we had it replaced in 2 days. The show must go on!



We've been working on our "walk up to the house" 360 shots. We were able to capture the shot using a helmet mount, but there is some noticeable shake. I'm about to research YouTube's shake correction, and hopefully that will be an easy solution to the problem. I've read about Facebook's 360 video stabilization as well, so I'm optimistic about testing these solutions. For our deliverable, we're taking the 360 footage we capture of the homes and creating short movies in Action Director with our own music and (optional) voice-overs. The final videos will be uploaded to Virtually Home's YouTube and Facebook pages so our clients can share their content easily. We were originally going to offer the service of capturing the video and just handing it to our client, but from a customer experience perspective, that actually puts more problems on their plate. The business of properly cutting video and delivering it to the web in the highest quality is a valuable service in itself, so we're rolling that into the deal. The experience for the client ends up being that they get two links to share immediately.



We also get the right to share their content, and each new location could be a nice portfolio piece. My thought is that as the service becomes more popular and we get followers on FB and YT, the added exposure on the listings could help our clients get more views for their properties. One bit of technical discovery I'd like to share: on iPhone and Android, it seems the YouTube app is required to view the videos in proper 3D format (e.g., for Cardboard). Our clients needed some coaching that if they want to view the video on their phones, they must use the YouTube app. Again, it's nice to learn these lessons at a small scale! Another pain point is that the super popular real estate service Zillow is making it nearly impossible to share video on a listing that is not created with their mobile app. I think I understand why they would want to corner the market there, but I do think it's an inconvenience for the clients. It seems the only way to do it is to paste a URL into the listing description and hope the user then copies and pastes it into the address bar (no hyperlinks).


Danciverse


I had anticipated more time to work on Danciverse this coming week; however, web development work calls. The project I'm working on had a series of client requests, scope changes, and a developer taking a vacation (?!), all in the last week. I'm not the project manager on this team, so I don't have all the details as to how this situation occurred... but that's just how it goes sometimes. The surprise workload is simultaneously a setback and an opportunity. It's a setback in that my production time for Danciverse is going to lose about 30 hours, but it's an opportunity for some nice billable hours. The hours lead to more money, and more money means buying more time to work on VR... so I'm going to have a long and highly productive web development week next week. Features will be started and finished, tickets will be closed, a site will be launched, and deadlines will be met!



We did get in our weekly meeting as BunnyGun, and we figured out our designs and opening story shots, as well as some of the sound effects, set dressing, theme, and 'bits'. What I've discovered is that trying to storyboard for 360 is tricky. For each major shot, we're breaking the scene up into four sections: North, South, East, West. For each direction, we want something interesting for the player to examine. For example, in the first scene the major action is going to come in from the ceiling... but in the meantime, we want to make the soundscape and the surrounding environment entertaining, so we have bits with a cuckoo clock, some 80's parody posters, and a few other 'bits'. One other helpful design thing I've stumbled upon is planning out the environments on paper from an overhead perspective. That helps me to quickly think through the set arrangement without getting caught up in specific shot framing details. Once that is laid out, I make a couple more panels of each of the 'direction' shots... those look like traditional storyboard frames.

JDakotaPowell
Protege
Week 7: The Red Flute

Sincere apologies for deleting an earlier post (tried to completely erase it but no can do), but wanted to show more of the interactivity and UI that's emerging for this piece.

I've changed the genre of The Red Flute from an RPG to an interactive VR adventure / psychological thriller (think "Wilson's Heart"). In the opening scene, the player meets an Asian man in a padded cell with a strange box hanging around his neck. He claims to have slipped through a wormhole from 2000 years in the past and has been interrupted on a quest. As the player, you can enter his world and finish the quest. The dramatic question that looms throughout: is this Asian man crazy or is he telling the truth? This change in premise gives me the poetic license to create an entirely fictitious world and take imaginative leaps. Although inspired by Chinese history, the story and world of The Red Flute exists only in this man's mind.

As for the UI, I'm using gesture recognition to clue the user in on what to do. For example, the player can draw circles with his/her hands to show/hide tool tips, as shown below:

374dljl40tnx.gif
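The core of a circle gesture check is plain geometry. Purely as an illustrative sketch (this is not my Unity code, and the names and thresholds are invented), one approach is to accumulate the signed angle a controller trail sweeps around its centroid and fire when it nears a full turn with roughly constant radius:

```python
import math

def is_circle_gesture(points, full_turn=0.9, roundness=0.25):
    """Rough circle detector for a 2D trail of (x, y) samples.

    Sums the signed angle swept around the trail's centroid; a near-full
    sweep with a roughly constant radius counts as a circle. Thresholds
    are illustrative placeholders, not tuned values from the project.
    """
    if len(points) < 8:
        return False
    cx = sum(p[0] for p in points) / len(points)
    cy = sum(p[1] for p in points) / len(points)
    angles = [math.atan2(y - cy, x - cx) for x, y in points]
    swept = 0.0
    for a0, a1 in zip(angles, angles[1:]):
        d = a1 - a0
        # unwrap jumps across the -pi/+pi seam
        if d > math.pi:
            d -= 2 * math.pi
        elif d < -math.pi:
            d += 2 * math.pi
        swept += d
    radii = [math.hypot(x - cx, y - cy) for x, y in points]
    mean_r = sum(radii) / len(radii)
    if mean_r == 0:
        return False
    spread = max(radii) - min(radii)
    return abs(swept) >= full_turn * 2 * math.pi and spread / mean_r <= roundness

# A sampled circle should register; a straight swipe should not.
circle = [(math.cos(t / 32 * 2 * math.pi), math.sin(t / 32 * 2 * math.pi))
          for t in range(33)]
swipe = [(t / 10, 0.2) for t in range(11)]
```

The same sweep-plus-roundness idea carries over to controller positions projected onto a plane in front of the player.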

The player exists in the present and always has access to a persistent tablet (note: movie is placeholder only), which he/she can access by drawing a "Z". To hide the tablet, the user simply swipes left to right:

x2jzlje4i8hj.gif

As for scene transitions, I'm aiming to create transitions that reflect the story. For example, instead of using a portal, the player grabs the Imperial Seal (red cube - placeholder) and the asylum walls, ceiling and floor crumble -- the walls of the mind disappear -- while the next scene (Diyu) blooms under the player's feet.

zsuvqc5qox6i.gif

The nav map and inventory shoot out of the player's left and right controllers, respectively - using button controls - and will store objects and notes that the player collects throughout the journey. This functionality is done; I just need to create icons and figure out the objects required for an MVP.

I decided to use shaders by Keijiro Takahashi (a Unity dev based in Tokyo) because they fit well with this particular vision/story. For example, I modeled the tea maker and gave her Takahashi's wig shader. The movement and flow of wild hair and skirt (cloth) are mesmerizing in a headset.

b3o5u3iv4l9w.gif

I've gotten the bow-and-arrow functionality down for shooting the demon monkeys in the bamboo forest, but my focus will be less on combat and more on my strengths -- theatre, storytelling and dance -- to make this piece come alive. Aiming to choreograph a circle of Kung Fu warriors moving to Chinese Opera music and incorporating Takahashi's skinner shader, which, from a visual standpoint, may be quite interesting.

While I'll weave in some twitch elements, the guiding light will be story.

doubleeye
Protege

Double Eye Blog - Week 7 -Mechanics, Mechanics, Mechanics!

We have four main technical requirements in this game. From my filmmaker point of view, I break them down into two primary mechanics and two effects on the system.

The first is the mechanic of destruction. We've had two approaches for this: hammering down to level a building, or tossing it in the air. Since the narrative is key to me, I like the metaphor of "tossing" a building away -- that basic idea of tossing its history, a business, or its people away. We've accomplished this first mechanic this week and it has some nice physical properties.

We're currently working on the second main mechanic of construction -- adding a property, whether a residency or an amenity. This mechanic gets tricky because not only does it involve logic, but it also brings the next three aspects to the forefront: a) What are the effects on the systems each time a new building is added? b) How does the player add a building? c) Where do they select it from? I've played with lots of ideas with my UX designer and we want to keep this playful. I love using scale in VR, so at the moment we are trying to find ways to make the buildings small and grow them to human scale. Our goal is to master this component by early Week 8.
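The grow-from-miniature idea is, at its core, an eased interpolation of the building's scale over time. As a thinking-aid sketch only (not our actual engine code; the numbers are placeholders), the math looks like this:

```python
def smoothstep(t):
    """Classic ease-in/ease-out curve for t in [0, 1]."""
    t = max(0.0, min(1.0, t))
    return t * t * (3.0 - 2.0 * t)

def building_scale(elapsed, duration, start_scale=0.05, end_scale=1.0):
    """Scale factor for a building growing from miniature to full size.

    elapsed/duration gives normalized time; smoothstep softens the
    start and end so the growth doesn't pop. The start/end scales are
    illustrative placeholders, not tuned values.
    """
    t = smoothstep(elapsed / duration if duration > 0 else 1.0)
    return start_scale + (end_scale - start_scale) * t
```

In-engine, the same curve would drive the object's uniform scale each frame: miniature at t=0, human scale once the duration elapses.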

The two effects on the system are harder. The first is population count. I'm not sure how we're going to do this technically, and I'm hoping our Tech Advisor can help us figure it out. The idea is that every time you add a building into the system, it affects the population.

The other main system is what I originally called the “Happiness Meter.” For example, if you construct many apartments but you don’t add in a bar or a grocery store, your residents don’t have places to get food or a way to enrich their social lives. Since “happiness” is really subjective, we’ve re-named this “Quality of Life.”
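Neither system is built yet, so purely as a thinking aid (every building type, number, and rule here is hypothetical, not from the game), the two effects can be sketched as a tiny model: each building type carries a population delta, and Quality of Life compares residential capacity against the amenities around it:

```python
# Hypothetical sketch of the two systems: population count and the
# "Quality of Life" meter. Building types and numbers are invented
# for illustration only.

POPULATION_PER_BUILDING = {
    "apartment": 40,
    "bar": 0,
    "grocery": 0,
}

class Neighborhood:
    def __init__(self):
        self.buildings = []

    def add(self, kind):
        """Adding a building immediately affects both systems."""
        self.buildings.append(kind)

    @property
    def population(self):
        return sum(POPULATION_PER_BUILDING.get(k, 0) for k in self.buildings)

    @property
    def quality_of_life(self):
        """0..1 score: residents need amenities (bars, groceries) nearby."""
        homes = sum(1 for k in self.buildings if k == "apartment")
        if homes == 0:
            return 1.0  # nobody around to be unhappy
        amenities = len(self.buildings) - homes
        # hypothetical rule: one amenity per three apartments keeps score at 1.0
        return min(1.0, amenities / (homes / 3.0))
```

So six apartments with no bar or grocery would give a population of 240 and a Quality of Life of zero; adding one amenity pulls the score partway back up.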

Updates on UI: I am hoping we can get through the second main mechanic and connect the UI in a fun way. Since this game deals with such a serious topic I’m always looking for elements that feel like “play.” Once we get through this mechanic then I am hoping we can begin to cover the effects on the systems.

Updates on the Art: We’re working on the 3D modeling. We’ve tried 3D Max modeling from scratch. The first building Cecile has done has nice depth but the windows have a scooped-out gradient effect that doesn’t seem right. Cecile is also trying some techniques with the texturing to achieve that classic, Brooklyn red-brick color (see below). This coming week we’ll try a new approach working off of the concept art as a kind of blueprint.

xra4673y9bd2.jpg

Updates on the 360 docu-stories: I am continuing to film in Gowanus and to gather stories. I’ve had to cast my net a little wider in Gowanus beyond the main area of focus to capture more stories. I’ve now spoken to an entrepreneurial Moving Man of sorts. He transports residents and furniture in his van to their apartments. He loved pointing out the changes in the skyline. I’ll have to stitch to see what footage is usable. I’m still trying to access a certain location for a shot I’d love. Let’s see if my new friends in Brooklyn can help me make it happen!

Anonymous
Not applicable

Cecile

(post also published on my Medium blog)

It's the end of Week 7 and here is where I am at in the development of my first VR experience, Capoeira World VR. If you missed previous posts, I've been blogging weekly about my VR journey into the Oculus Launch Pad and you can find all posts on my Medium page. I've also been vlogging weekly and the videos are all on a dedicated YouTube playlist.

https://youtu.be/Hx665WBBQTU

360° footage review

This week I have been doing more stitching and I now have a lot of 360-degree video footage to review. I got to review some of it in a VR headset and that's quite an exciting experience! The first clip I chose to watch in my Samsung Gear VR was one of a fast-paced roda from the UCA-Berkeley batizado. Everything looks quite big in a VR headset, and that makes for some quite impressive action scenes! I took a screen capture to show you, but unfortunately, this does not do justice to the real experience, mainly because: a) there is no sound and b) the only way to experience VR is in VR — a 2D screen capture just won't do it.

Get a sneak peek of what watching this in VR could be like (but be prepared for a very different experience in a VR headset!)
https://youtu.be/cZX1R8zAZIY

While reviewing my footage, I uncovered an issue. For a reason that I don't understand, Cyberlink's Gear 360 Action Director (the software I use to stitch videos on my PC) flips videos around during the stitching process. The result is that the default point of view in the stitched video is that of the rear camera, which is not what I want; I filmed with the main action in front of the front camera. I will have to fix that in post-production.
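For what it's worth, a 180-degree yaw turn on an equirectangular frame is mechanical: longitude maps linearly to the x axis, so turning the viewpoint around is just a circular shift of the columns by half the width. Here's a sketch of the idea on a single frame (in practice I'd apply the fix in my editor, not in Python):

```python
import numpy as np

def yaw_180(frame):
    """Rotate an equirectangular frame 180 degrees in yaw.

    In an equirectangular projection, longitude maps linearly to x,
    so flipping the viewpoint front-to-back is a circular shift of
    the pixel columns by half the frame width.
    """
    h, w = frame.shape[:2]
    return np.roll(frame, w // 2, axis=1)
```

After the shift, the rear camera's center column lands in the middle of the frame, which is exactly the correction the flipped stitch needs.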

Editing to the next level

Speaking of post-production… I installed Adobe Premiere Pro CC and have started learning how to use it. It's a steep learning curve, as I had only been using basic video editing software so far (Windows Movie Maker! Yes, that's right). There are tons of free tutorials on the Adobe website, so I started with those. I even edited my first video with the new tool: it's the above video of the VR screen capture. I'm starting with 'flatty' (2D) videos so that I'm more comfortable with the tool by the time I get to edit my 360 footage.

My first project in Adobe Premiere Pro CC

I have heard that the Mettle SkyBox plugins for Adobe Premiere Pro are really useful tools for editing 360 VR content.

https://twitter.com/aVRjourney/status/878039441096495104

Now that Mettle's SkyBox has been acquired by Adobe, the plugins are available for free to paid Adobe Creative Cloud subscribers. I emailed them to request my access and hope to get it next week. Another tool to learn!

More learning!

I'm always trying to brush up on my skills, and in that spirit I found this free online course on Udemy: Cinematic VR Crash Course — Produce Virtual Reality Films. The purpose of the course is to "Learn How to Produce 360° (3D) Video Content Like the Pros and Become a Virtual Reality Expert". I've only covered the Introduction and Pre-Production sections so far, and I hope to find more good pointers in the subsequent sections.


Next week, my focus will be on getting better at using Adobe Premiere Pro CC. I also need to create the interactive part of the app in Unity. I have been delaying this activity for a while; I think it's time to make it a priority!

prettydarke
Protege
This week was really taxing, kind of hit with a bout of depression due to some unfortunate news. Anyway, I'm back in Madison with about 10 more days of teaching, followed by a conference at UW that I now have some responsibility for (surprise!). I'm tired and drained but trudging along.

Project: Animals

I'm about half-way done with the script. Still trying to figure out the end of the experience. Things have departed from the initial idea - not aesthetically - it's just been abstracted in a manner that feels weird and exciting. We're leaning toward more surreal elements and thinking about how to work with scale and place in a way that won't make users sick. 

I'm nervous about splitting my time between my actual job and this project. I am already overworked with the first; I feel tired, anxious, and distracted. Somehow I need to get some 3D models together this week...and get the head gestures implemented.

Honestly I don't know how this week is going to go. I'm excited about the idea, I feel more invested in it, I'm just not sure the timing is going to work out for Launch Pad. My team is fractured and we all seem to be going through a lot.

I keep reminding myself that I only had a team for the last 72 hours last year, and we made something pretty rad. I've done more with less, gotta keep the faith, keep my job, keep my head and just move forward.

Goals this week: Finish the script. Model the first 2 characters. Prototype the head gestures. 

CourtsideXperie
Protege
s3dlzsynho6n.jpg

This week's blog post is to share the logo design for Beat The Expert Virtual Trainer. These logos appear in the menu of the VR app and allow players to look around and select from various playing modes.


BTE Virtual Trainer has 5 different game modes: Dribble Instruction, Rapid Trainer, Math Dribble, Dribble Arcade, and Bench Dribble.


In the next couple of weeks, we will explain what each game mode is and post screenshots of what the game looks like.


Overall, we are happy with the design of the app, so far.


We began a light beta test of the app with some student athletes, and they said that they enjoyed the experience and that, if available, they would use the app regularly.


This week we will continue to beta test and make improvements based on the feedback we receive from the student athletes.


peace.




http://www.studygreatness.com/olp/6th-olp-blog-bte-virtual-trainer-logo-design/
FYI: Study Greatness is where we will study the greatest in history to learn how to move from them.