Week 4: Oculus Launch Pad 2018 - Due July 30 @ 11am PST

Anonymous
Not applicable
Let's Keep It Going - Week 4!

As a reminder, Keep It short and sweet unless you feel really excited! We won't mind reading the details. 

Anonymous
Not applicable

Week 4: VR SeaLegs 

Our first step during design was to develop the methodology behind our choices. 

Here's our plan. Read our blog posts to unlock more content!

Why should we pick one movement style over another? Which can we trust for VR newcomers, and which are more advanced?

We are starting with teleportation, as it is the most comfortable initial experience. Once users are accustomed to this type of movement, we will take them into hand-guided grounded movement, which uses the direction of the hand to control movement angle, as opposed to the direction of a joystick.

Why not jump right into joystick movement? People coming from a gaming background find stick movement familiar and already know how to control it, but VR users with little to no controller-based gaming experience find it difficult. Users without much joystick experience will often press slightly diagonally on the stick instead of perfectly “forward.” If you press forward, you expect to go forward, not diagonally. When this occurs, the user experiences an extreme mismatch between their intention and their movement, which translates to VR sickness.

When you signal your direction by pointing and pressing a button, we call it movement intention. Teleportation causes the least discomfort in part because it takes time to target where you are going, which solidifies intention: you know exactly where you want to go, and you declare your intention by pointing and pressing a button.

However, teleportation can be extremely limiting. When you have close combat, teleportation breaks the user experience and essentially cheats death. It’s also problematic for vertical traversal of the game space.

SpaceDragons don't teleport. They fly!

Our first areas after the intro use hand-guided movement. You point the controller, press the movement button, and you're off. You signal your intention, and then movement follows. This is one of the core aspects of reducing VR nausea. For the very first experience, we showcase grounded movement because it’s familiar to the user, and lessens the risk that they’ll experience vertigo.
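A minimal sketch of that idea, assuming a generic engine loop (names like `hand_guided_step` and `hand_forward` are hypothetical, not our actual code): each frame, the movement direction comes from the hand's forward vector projected onto the ground plane, rather than from a joystick axis, so the player moves exactly where they point.

```python
import math

def hand_guided_step(position, hand_forward, button_pressed, speed, dt):
    """Move the player along the hand's pointing direction, kept on the ground plane."""
    if not button_pressed:
        return position
    # Project the hand's forward vector onto the ground plane (zero out vertical).
    gx, gz = hand_forward[0], hand_forward[2]
    length = math.sqrt(gx * gx + gz * gz)
    if length < 1e-6:  # pointing straight up or down: no grounded direction
        return position
    gx, gz = gx / length, gz / length
    return (position[0] + gx * speed * dt,
            position[1],
            position[2] + gz * speed * dt)
```

Because the direction is read from the hand itself, a slightly tilted wrist still produces movement that matches where the user is pointing, which is the intention-matching property described above.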

In order to get them accustomed to the style of movement, we start slow and with simple tasks. After these tasks are complete, users will unlock the ability to sprint, and we’ll start getting them accustomed to faster movement.

As you can see in the image, we have more to unlock…up next: Flight Path!

MichaelAdrian
Protege



UNDER PRESSURE
It may sound strange to many people, but some of us (myself included) like a degree of pressure applied to various parts of our lives. I have been steadily building my 360 film project:
I. Idea nailed down and refined specifically for a 360 space as opposed to other storytelling methods
II. Script written and being revised, tweaked, and elevated
III. Researching the right camera(s) and spaces for the project
IV. Creating my team to help me pull this off (editor, cinematographer, director, casting director)
V. Negotiating finances
VI. Negotiating locations
VII. Negotiating partnerships, if only for promotions
VIII. Creating storyboards
IX. Checking the time. “What the… week 4!”


Around Wednesday it hit me that this is week four, and although I have done a lot, there is so, so, so much more to do. For a few minutes (about 120, to be exact) there was a lot of cursing and rushing around looking at to-do lists. Then I stopped, took a deep breath, and embraced the pressure. Deadlines are good because (at least for me) they force extra creativity, even if it's just in time management.


Totally looking forward to putting together my August 10th check-in and getting feedback from all the good people at Oculus.

kinjutsu
Protege

Happy Sunday!

This past week I've been making steady progress on the Mail Snail prototype. It's finally starting to feel like a real game, which is pretty cool.


This week's rundown

  • First pass Snail model and rig created
  • First pass on controls completed
  • Level travel system implemented
  • First pass on hub level almost done
  • Made a foliage "wiggle" shader
  • Worked on the budget and pitch documents
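As a rough illustration of what a foliage "wiggle" shader does (this is a sketch of the general technique, not the project's actual shader): each vertex gets a sideways offset from a sine wave driven by time and the vertex's position, so plants sway without any animation data. In an engine this math runs in the vertex stage; here it is in plain Python:

```python
import math

def wiggle_offset(vertex_pos, time, amplitude=0.05, frequency=2.0):
    """Sideways sway for a foliage vertex: a sine of time, phase-shifted by
    position so neighboring plants don't move in lockstep, and weighted by
    height so the base of the plant stays rooted."""
    x, y, z = vertex_pos
    phase = x * 1.7 + z * 2.3           # arbitrary per-position phase shift
    sway = math.sin(time * frequency + phase) * amplitude
    weight = max(y, 0.0)                # taller parts of the plant sway more
    return (sway * weight, 0.0, 0.0)
```

The height weighting is the important trick: vertices at the base (y = 0) never move, so the plant looks anchored while the tips wave.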

WIP snail model and idle animation

This week I got the first iteration of Post the snail in game. The model and animation were created by Tim Curry (@bronomelon) and have increased the adorable factor by 100%. There's still a lot of work to be done, so look forward to updates on the model, textures, and animations in the coming weeks, along with additional characters.


I had hoped to have a first pass on the hub level done to share, but I still have a few things left to wrap up, so I'll be sharing images of that next week.


Goals for next week

  • Have pitch and budget documents completed for the mid-point check in
  • First pass on hub level completed
  • Continue work on Snail model, textures, and animations
  • Complete the narrative system

Have a great week all!



@elysebk  |  elysebk.com
Oculus Launchpad Project: Mail Snail

maja_manojlovic
Protege

On the one hand, everything seems to be coming together like magic; on the other hand, it seems as if all the massive blocks of stuff that need to be considered for the Reconnect! The Amazon Medicine Garden project are “gelling” way too slowly.



Since our extremely informative Tech Talk on how to pitch our projects, I've been doing lots of research:



1. What technology do I envision using to bring to life the organic microcosm of flora and fauna in the Amazon, as well as the “different” vision enabled by the Yawanavá's communing with their environment?

2. How do I define my project in order to pitch it to appropriate festival categories and funders?

3. How do I get to experience as much compatible VR work as possible, to better understand both how to place my project and how to differentiate it from those already made?

One of the artists creating work I find both very appealing and similar to how I see the experiences of my project is the Australian artist Lynette Wallworth. She developed her two projects Collisions (2016) and Awavena (2018) at the Sundance Institute New Frontier/Jaunt VR Residency Program.




Collisions tells the story of Nyarri Morgan, whose first encounter with Western culture and science was his witnessing of an atomic bomb test in the 1950s.






Awavena tells the story of the first woman to become a Yawanavá shaman.






A stereo pair of a fluorescent caterpillar filmed in the Amazon forest at night. It was meant to bring to life the way Yawanavás see their environment using their visioning and altered-states-of-consciousness technology – the Ayahuasca rituals.



These fluorescent caterpillars are imaged with a process called “confocal imaging.” Anya Salih, a fluorescence and marine biologist, describes it as a technique that uses a laser beam to scan the surface and subsurface of an organism or a cell, and then uses these images to generate a 3D view of the inside of cells or creatures. Anya Salih was director Lynette Wallworth's close collaborator on the creation of Awavena.
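As a loose illustration of that 3D reconstruction idea (a sketch of the general principle, not the actual imaging pipeline): the microscope captures a stack of 2D slices at increasing depths, software treats the stack as a 3D volume, and a view such as a maximum-intensity projection can then be rendered from it.

```python
def stack_slices(slices):
    """Assemble depth-ordered 2D slices (lists of pixel rows) into a 3D volume."""
    return list(slices)  # volume[z][y][x]

def max_intensity_projection(volume):
    """Flatten the 3D volume to a 2D image by taking the brightest value along depth."""
    depth, height, width = len(volume), len(volume[0]), len(volume[0][0])
    return [[max(volume[z][y][x] for z in range(depth))
             for x in range(width)]
            for y in range(height)]
```

The projection step is why fluorescent structures "pop": the brightest signal at any depth survives into the final image.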



Here is what the artist used to make Awavena (the information below is collected from Lynette Wallworth's interview with Jay Holben at the Sundance Canon Creative Studio):

  • 2D photography, 3D photography, photogrammetry, laser scanning, drone photogrammetry, micro-photogrammetry, and extreme low-light fluorescent photography of plants and insects in the Amazon jungle.
  • They utilized highly sensitive Canon ME20F and ME30F cameras.

I am now writing out how Reconnect! The Amazon Medicine Garden, which is in many ways an outgrowth of Awavena (although I envisioned it well before I even found out about the amazing work Lynette Wallworth does), differs from it. What does it bring to the users? What are its “target” audiences?



For now, I know that Reconnect! The Amazon Medicine Garden can be marketed to several communities:



1. Academia & Education:

  • Anthropology, Cultural Studies, Climate Change, Environment and Sustainability: an introduction to a culture that is intricately connected with its environment and is acutely aware of their interdependence with it.
  • Entrepreneurship, Research & Development: a conceptual machine, a fluid framework for accessing intuitive and creative approaches to find out-of-the-box solutions for problems in both the sciences and humanities.
  • This makes it appropriate for any classroom demonstration and/or project, as well as applicable for experimenting with alternative pedagogies in classrooms at any level.

2. Social Change & Justice:

  • This is a project that changes people's hearts and minds about the nature of our interrelatedness with our self, others, nature, culture – the world at large.
  • This makes it eligible for Sundance's partnership with Stories of Change and other similar philanthropic projects.

3. (Alternative) Medicine & Meditation:

  • This project incorporates elements of self-healing that may be somewhat similar to (guided) meditation or biofeedback.
  • This makes it marketable to various physical yoga studios, meditation sites, etc.



 Thank you for reading!!!!



 

erinjerri
Protege
Below I'll outline the technical, artistic, design, and other business decisions for BaybayinVR that have been on my mind as of late: what I'm working on, and other considerations for the future that have to be tabled to some extent so I can focus on having a solid prototype I'm happy with for the mid-point check-in. (Though not required, it's something that is long overdue, since I spent a good amount of time onboarding the rest of the team and figuring out a few things that led me to other things I hadn't thought about – for example, NLP nuances that can get me sidetracked.)

I'm attempting to compartmentalize when having to think in several different streams of thought:
1) working in Python with advanced word embeddings, before having data;
2) keeping interaction design in consideration, which is critical as I'm simultaneously learning more about my ethnic heritage, language, and culture in this project from my friend Kristian, one of my collaborators on Baybayin;
3) while also thinking about the technical developments that have been missing in past years (besides the cool interaction design, storytelling, and edtech we're doing here), as I delve deeper into reading the HXStory of the language.
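For context on the word-embedding stream (a generic sketch, not the project's actual model – the tiny vectors below are made-up toy values): each word maps to a vector, and cosine similarity between vectors measures how related two words are, which is the basic operation most embedding work builds on.

```python
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two word vectors: 1.0 means same direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy embeddings, purely illustrative.
embeddings = {
    "baybayin": [0.9, 0.1, 0.3],
    "script":   [0.8, 0.2, 0.4],
    "snail":    [0.0, 0.9, 0.1],
}
```

With real data, vectors like these would come from a trained model rather than being hand-written; the similarity math stays the same.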

In this post:
1) Technical
2) Integrating sound and visuals
3) More technical stuff for cross-platform mobileAR I have to table
4) Budget prep - updating this to accurately reflect time spent on the project for the mid-point check-in (which I'm currently trying not to freak out about).
5) NLP + More Interaction Design Choices 

Next week I hope to have more code done, and at least a video of the storyboard in Sketchbox (if it exports properly) as well as a prototype video – which means I will write a whole lot less on the blog, write a lot more code, and show you more cool stuff I'm working on!

1) Technical
Erin fangirling Mike Alger (Designer, Google Daydream)

1) I'm still debating the actual training of data, once it's captured properly, so that it's statistically significant with user-generated content after the app is launched. I got feedback from Mike Alger when visiting the Google Tiltbrush meetup this week that I should have folks paint in 3D but compress the 3D as flat to make processing easier. Easier said than done? Suzanne was trying to translate what he was saying, which I may have misunderstood.
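One way to read that advice (my interpretation only, and the function name is mine, not a Tilt Brush API): drop the stroke's depth axis and keep each painted point's 2D footprint, so downstream processing works on flat data.

```python
def flatten_strokes(strokes_3d, keep_axes=(0, 1)):
    """Compress 3D paint strokes to 2D by keeping two axes (default x and y)
    and discarding the third. strokes_3d: list of strokes, each a list of (x, y, z)."""
    i, j = keep_axes
    return [[(p[i], p[j]) for p in stroke] for stroke in strokes_3d]
```

A smarter version might project onto the stroke's best-fit plane instead of a fixed axis pair, but the payoff is the same: 2D data is far cheaper to process and train on than raw 3D gesture captures.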

More folks at Google Tiltbrush Meetup, along with fellow participants from 2018 Oculus Launch Pad 

FilipinX Americans at Google Tiltbrush Meetup

I'll have to consider the visual guide system (a constellation-style tutorial) that bounds the user to physical space whenever they draw a character, so that the gesture/paint data coming from the Tiltbrush/Quill controller is actually accurate from scratch.

2) Integrate visual and sound. 

Another to-do item I'm working on is gluing together some code snippets from a Berkeley City College course I TAed last quarter to hook up mp3 sounds appropriately. I need to finish this tonight so that I at least have something halfway meaningful in preparation for Niko's lecture, in case I have any good questions to ask.

Erin with Kristian at UNDSCVRD Filipino Night Market + Arts Event last week

Sound
Erin with Filipino American DJ world champion: DJ Q-Bert (a.k.a. Richard Quitevis) at UNDSCVRD

Here's a link to the long video of DJ Q-Bert and DJ Shortkut's performance last week; I wanted to use actual DJ sounds from it for our interaction, and I'm looking forward to Niko's lecture on adaptive music this week. We may have to ask Niko and Q-Bert if there are any issues with crediting the sound.

I am looking at making the scratch itself an erase tool in the palette/controller on the left hand and making the paintbrush sound and graffiti spray attached to the paint interaction. 





Visual
Here's some new art by our technical 3D artist to serve as inspiration for me this week. 

3) Figure out some other polycount/rendering issues (what's the max I can push to BOTH Oculus Rift and ARKit - mobileAR).

4) Budget - updating for more work (the # of hours already spent by my team and what I project even after a launch). I'll need to ask Ebony how much we should account for and how specific we need to be. There's also the question of what hardware may be necessary (fast GPUs) for AI once we have user-generated content (3D data capture of Baybayin characters and gestured movements). We'll need good hardware to make VR demos easy and more portable than carrying around my extremely heavy and expensive PC rig at home, while also really taking into account the speed at which I can acquire/capture data and do actual machine learning.

5) Other technical + design outstanding issues I'm thinking about when it comes to cross-platform development include:
  • Whether it's worth applying to Adobe's Project Aero to figure out if iPad and iPhone mobileAR is worth adding as a second step later; right now I just want to get something working on the Rift.
  • Earlier this week, after another call with my book anthology publisher, O'Reilly, my co-editors and I were discussing our current work. My friend and co-editor of the O'Reilly book Creating Augmented and Virtual Realities was offering a way to partner on integration with AcrossXR, but I still need to get enough built from scratch that I'm happy with, so I can properly integrate with him and have a cross-platform experience/application that makes sense.

Baybayin carved onto bamboo slate - a question of controllers (besides paintbrush)

5) NLP Nuances + More Interaction Design Choices.
I'm also having fun thinking about my native tongue and any other nuances I may have forgotten in the development of Philippine Baybayin fonts, reading Hector Santos's work and referring to videos by my collaborator, Kristian Kabuay, on his work and the hXstory of the writing system itself. I also emailed Kristian asking about the use of knives on bamboo slates (how Baybayin was previously practiced, besides a paintbrush) in case we want the option of different object controllers in the future.

sherveenuduwana
Explorer

It’s one of those rare weeks where the things I say I want to do actually get done! In addition to the reading mechanic I got up and running last week, I also made a simple writing mechanic, and talking to NPCs is implemented as well. The writing mechanic is really bare-bones; it’s just directly modifying the pixels on the existing texture. The fun part was getting it working with the OVR mouse pointer, so now when you look at a writable object (paper, a graffiti-able wall, etc.), a cursor appears, and a pointer click enables writing, much like clicking a pen. Ideally, I’d want to make the drawn pixels smoother, but I think this is enough to put it in people’s hands and get feedback.
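The core of a pixel-writing mechanic like that can be sketched as follows (a generic illustration, not the project's actual Unity code, where this would be `Texture2D.SetPixels`-style calls): stamp a small square brush into a 2D pixel buffer at the cursor position, clipped to the texture edges.

```python
def stamp_brush(texture, cx, cy, radius=1, color=1):
    """Write a (2*radius+1)-square brush of `color` into `texture` (a list of
    pixel rows), centered at (cx, cy), clipped to the texture bounds."""
    height, width = len(texture), len(texture[0])
    for y in range(max(0, cy - radius), min(height, cy + radius + 1)):
        for x in range(max(0, cx - radius), min(width, cx + radius + 1)):
            texture[y][x] = color
    return texture
```

Calling this every frame while the pointer click is held produces the "clicked pen" behavior; smoothing would mean interpolating stamps between successive cursor positions so fast strokes don't leave gaps.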

NPCs are pretty simple right now. They wander randomly, and if I define some “food” for them, they might go towards that instead. You can highlight them with gaze, which displays their name and, eventually, some kind of UI prompt. A pointer click will trigger their dialogue, which can come in multiple batches.

From the editor view, you can see I’ve set up a rough “territory” for the NPC, which they’ll mostly stick to when they wander.
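The wander-with-territory behavior could be sketched like this (an illustrative guess at the logic, not the actual implementation): take random steps, bias toward food when it exists, and clamp the result back into the territory bounds.

```python
import random

def npc_step(pos, territory, food=None, step=1.0):
    """One wander step. `territory` is (xmin, xmax, ymin, ymax).
    With food defined, move toward it; otherwise take a random step.
    Either way, clamp the result into the territory."""
    x, y = pos
    if food is not None:
        fx, fy = food
        dx = step if fx > x else -step if fx < x else 0.0
        dy = step if fy > y else -step if fy < y else 0.0
    else:
        dx = random.uniform(-step, step)
        dy = random.uniform(-step, step)
    xmin, xmax, ymin, ymax = territory
    return (min(max(x + dx, xmin), xmax), min(max(y + dy, ymin), ymax))
```

The clamp is what makes the territory feel soft rather than walled: the NPC can press against the edge but always stays inside it.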

I’d say all the mechanics are at a good alpha stage now, which means I can build more of a footprint: get more of the scenes and systems set up so that there’s an actual playable experience here for next week. This is good timing for working on the pitch doc this week, hopefully. I’m going to try to limit myself from polishing everything too much until I have a footprint, or I will just spend hours on that and not get anything done.


doubleeye
Protege

This week has been dedicated to my multiplayer game METROPOLES, which I began last year in Oculus Launch Pad. I traveled to CA for some meetings.


First and foremost, I’ve been able to meet with my new VR producer, Adam Rogers. I’m thrilled to have Adam join forces with my team because he has had experience working with the multiplayer experience (CHORUS) that I admire. Adam is excited about “the breadth of the project and how it crosses so many branches of interactivity and film.” Adam cares about the social impact potential of this story and about creating meaningful interactive content in the VR industry.

The amount of real estate development sprouting across the urban landscape is apparent. There is endless construction; it’s hard to navigate the common sidewalk. The worst feeling is stepping around a stream of homeless ghosts who wander the sidewalks without a place to call home and a place to belong. My heart broke when I saw a woman with an “LA County Jail” shirt trudging down the street like a zombie. There are many problems to fix in our great cities, and gentrification is tied to many of these systems.



I finally had the chance to put my composer, Lucas Lechowski, into an Oculus Rift to hear his music in this fully immersive experience. The demo is still a work in progress, but it gave him a chance to experience the feeling of the storyworld. He’s going to work on a new mix for the opening 360 video. He will also work on sending us some music for the main game scene. Tomorrow I meet with my editor. Our developer is cranking away on working the new logic in to adapt to the latest 3D models and artwork. So much left to go, but it feels great to share these tiny advancements in person with important creatives on our team.




Next up: on the Theater VR experience, we’ll take a look at working with improvisation.

ndshort
Protege

Neil D. Short
OLP18: Week 4
Project Title: Adventures of PICL Jones
Type: VR Game
Genre: Comedy/Adventure, Sci-Fi/Western



Our team has grown this week. I added two additional sound designers, bringing our sound team up to three members. One member of the sound team is going to focus on sound asset generation, and the others will focus on implementation in addition to asset creation.





On the art side, character concept art is underway. We’ve decided to include three characters in addition to the player character. We’ve also added a character modeler and a rigger to the team.






 



Concept art for the ship is underway, with exterior design and initial floor plans.










Finding people on the tech side has proven more challenging. My first round of recruiting has not been very successful. I've gotten good advice, but no one to work on the project yet. I will continue to work on that next week!





marmishurenko
Protege
Maria Mishurenko and Gordey Chernyy
Hello! This week has been busy, but we managed to add some sweet stuff into the game build: scissors sounds, music, and some new art.
New scoreboard (NYC-subway-like):
An attempt to prevent players from stepping forward all the time (yet to be tested)

Logo!!


And Gordey managed to do some research on low-poly face styles (certified weirdness):



Next week we'll refine interactions and finally model some good scissors!