
Week 7: Oculus Launch Pad (Due Aug 6 Midnight)

Anonymous
Not applicable
Week 7 - we are at the halfway point. Looking forward to seeing the updates you are sharing. See you on the call Friday!


dilanshah
Protege
I've had a great time working on the project using Google Blocks––it's allowing someone like me with no 3D modeling experience to get the ball rolling and learn. After you try Oculus First Contact, it's pretty tough to imagine the Future of Farming scenes without that level of interaction with every object in the scene. To achieve this, I had to work through an obstacle in my workflow: getting individual pieces to be interactable.

I found several efficiencies after conferring with a more skilled artist than me about making discrete objects (e.g., CDs on a desk) out of a scene created in Google Blocks. For flow purposes, it's much better to create a scene all in one Google Blocks session, so if I have a TV remote sitting on a table, it's part of the overarching mesh that I export from Blocks. This more experienced artist showed me how to open the living room scene in Maya, select the TV remote using either the edge or face tool, and then follow these steps, which are mostly the same for both:

1. Use Mesh > Extract while the focal object is selected
2. Go into object mode and select the separated geometry
3. Use Mesh > Combine
4. Select the newly created mesh
5. Use Modify > Center Pivot
6. Rename
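Steps 2-6 can also be scripted from Maya's script editor, which is handy when extracting many props from one Blocks export. This is an untested sketch using maya.cmds (it only runs inside Maya, and the object name is hypothetical):

```python
import maya.cmds as cmds

def finalize_extracted(new_name="tvRemote_geo"):
    """Steps 2-6: combine the separated geometry, center its pivot, rename.
    Run with the extracted pieces selected in object mode."""
    pieces = cmds.ls(selection=True)                # the separated geometry
    merged = cmds.polyUnite(pieces)[0]              # Mesh > Combine
    cmds.xform(merged, centerPivots=True)           # Modify > Center Pivot
    return cmds.rename(merged, new_name)
```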



With WRLD, I've found that certain longitudes and latitudes don't offer as much in the way of buildings as other coordinates; (34.236137, -77.941537) is the pair I've tried for this shot. Next week, I've got to narrow down where the Unity world space will be set and carve out space for my rooms. @Micah brought up a relevant question pertaining to the beginning of our experiences: should we have a static load screen, or should there be a kind of load-in area where the experience starts? For now, I'm trying to make the menu scene feel like the user is looking at Earth from a satellite and is then able to select a location.
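For a satellite-view menu like this, picking a location comes down to mapping latitude/longitude onto a point on the globe. A small engine-agnostic sketch of the standard spherical conversion (Y-up, as in Unity; the function name is my own):

```python
import math

def latlon_to_point(lat_deg, lon_deg, radius=1.0):
    """Map latitude/longitude in degrees to a 3D point on a sphere (Y-up)."""
    lat = math.radians(lat_deg)
    lon = math.radians(lon_deg)
    x = radius * math.cos(lat) * math.cos(lon)
    y = radius * math.sin(lat)   # latitude lifts the point toward the poles
    z = radius * math.cos(lat) * math.sin(lon)
    return (x, y, z)

# The coordinates mentioned above, placed on a unit globe:
px, py, pz = latlon_to_point(34.236137, -77.941537)
```

In Unity the resulting point could anchor a selectable marker on the globe mesh, with the inverse mapping used when the user gazes at a spot.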



tyruspeace
Protege
Week 7

Traveling again. This time we're driving home from Ventura, CA with my parents. I'm not going to get much work done if I can't stay in the same place for more than a week. Family calls and airlines fail; drives across the country with a 13-month-old happen.

I want to get my head down and work, but I only just figured out how to get my hand on the keyboard through this baby car seat. The mouse situation looks dire.

Earlier this week was pretty productive! I got some great feedback on my project proposal. I also made lots of minor tweaks/fixes for VR comfort and consistency between in-editor and GearVR testing. It doesn't sound like much, and I do wish I'd had more time this week. But! It's actually hugely motivating for things to feel as solid as they do in VR. I'm relatively new to VR-specific game development, and it's a nice familiar touchstone to find the fun in a prototype like this.

There are a few problems that I could actually tackle on this car trip. Maybe concept art or level-serialization code? Skill switchin'? I'll see what I can do.

Oh. My Surface pen is missing its nib. Bumpy code it is!

nicole_babos
Protege

This week we focused on finalizing and fine-tuning gameplay scripts. We tested deployment to the Gear VR and using the VR remote. We improved several of the gameplay scripts to increase performance and maintain a solid framerate. We also polished some game assets with improved shaders and built out the functionality for our Start Screen/Hub.




Me testing the game in the headset.



We also worked on improving the user experience through the reticle. We created assets for our reticle to be responsive to the type of action taken when interacting with an object. We are hoping our reticles help guide the players' actions in a non-verbal way.
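The core of a responsive reticle is a mapping from "what can I do with the thing under my gaze?" to "which reticle do I show?". A language-agnostic sketch of that selection logic in Python (the action and sprite names are made up for illustration):

```python
# Map each interaction type to a reticle sprite; None covers empty gaze.
RETICLE_FOR_ACTION = {
    "grab":   "reticle_hand",
    "toggle": "reticle_gear",
    "travel": "reticle_arrow",
    None:     "reticle_dot",    # default: nothing interactable under gaze
}

def reticle_sprite(target_action):
    """Return the sprite name for the action the gazed-at object supports,
    falling back to the neutral dot for unknown or missing targets."""
    return RETICLE_FOR_ACTION.get(target_action, RETICLE_FOR_ACTION[None])
```

In Unity this would typically live in the gaze raycast handler, swapping the reticle's sprite whenever the hit object's interaction type changes.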






The game was really lacking audio feedback, so I found several sound clips for our train on www.soundbible.com and purchased a few clips from www.audiojungle.com. Satisfied with what I found, we implemented those sounds, really improving the feedback and feel of the game. There were still a few sounds we needed, so I recorded some clips myself using my Yeti mic. This was fun and new for me. My cats were really intrigued when I made recordings and kept trying to meow into the mic.



Now that we’re confident in the gameplay loop, we’re focused on polishing our assets. The biggest feedback suggestion from Oculus was to focus on asset quality, so we are going to spend next week enhancing our art assets.

Nicole
www.fancybirdstudios.com

chinniechinchil
Protege
Tania Pavlisak - EnlivenVR Week 7

This week I did not have any time to work on my own project due to conferences and a local game jam. There was DevGAMM on Monday, Casual Connect from Tuesday through Thursday, and the Seattle Indies game jam from Friday through Sunday. In fact, I'm writing this from the game jam, now that I have some free time after churning out art assets over the last three days.

I took a lot of notes from the conferences:

From DevGAMM:

  • localization boosts sales & visibility/discovery (which is a big issue for current game developers)

  • VR experiences have more impact than video games. Positive - people who play more VR can overcome certain phobias, such as fear of heights. Negative - a traumatizing or bad experience can leave a user no longer interested in VR (not just your product). Design carefully

What I learned from Casual Connect:

From "Things to Learn from Raw Data":

  1. Know your target audience (hardcore, expressive, or creative players).

  2. Community management. Target VR enthusiasts and engage them. Know where to reach the community based on your audience (PC gamers use Reddit). Do content marketing - for example, create a graphic novel about in-game content for Comic Con. Run a Freaky Friday event where VR devs from different companies try each other's products and share the results with the community.

  3. Identify the influencers. Give keys at pre-alpha to streamers or influential VR enthusiasts.

  4. Key relationships. Build & maintain relationships with platform companies, like Oculus, HTC, Microsoft, etc. Look for opportunities to get featured in the platform store.

  5. Analytics - use data to find actionable information.

  6. Launch. Use appropriate information when presenting at conferences & events. For example, highlight technical talks at GDC, while talking more about content and features at E3.

  7. Demo & arcade. Give the public an opportunity to try the product.

  8. Optimization - do it early & often; level-of-detail art assets need to be tested in VR; sound location matters (e.g., for a gun, the cocking sound, bullet, and shooting SFX need to be separated)

  9. Breaking rules - let the user keep control of the head; if devs need to add a feature, make sure it enhances the experience rather than making the user uncomfortable.

From Shipping the first VR game (Archangel):
  • make no assumptions

  • VR / AR needs a lot more testing than games

  • Need testing using all elements (textured 3d assets, effects, lighting, audio) instead of just greyboxing/ temp assets

  • understand what works

  • expect delays; VR needs more iteration than traditional games

  • expect technical challenges, especially if managing multiple platforms

  • expect process challenges. It's even more difficult when working with remote team members

  • When designing interactions, consider different skill levels. Consider user position. There’s a reason stand-up meetings are super short. Consider players’ daily activities. People who work long hours standing might not want to go home & play standing up.

  • UI - text is hard to read

  • Locomotion - still the hardest problem in VR

  • Budgeting - add 100% to the budget compared to game dev due to delays, not including hardware. Always add extra time.

  • Stylized art assets can make a big difference in VR. Can’t rely on existing video game assets; they will need to be optimized for VR

From Opportunities and Challenges of Building Games in VR/AR/MR:
  • Challenges: user comfort & fatigue. Unlike video games, it's hard to play a game in VR when we are not feeling well. VR can be isolating rather than a shared experience. Locomotion & motion sickness are still big issues

  • Design tips: don't go immediately to high-resolution assets. If needed, reduce the amount of things in the world; if something's not interactable, maybe it needs to be removed. Keep the frame rate at a minimum of 90 (check your device specs). Use spatial sound, cues, haptics. Get experimental to enhance presence.
  • For testing, put a lot of people through the experience; listen to them and also watch their body language. The first test is a motion test, to see how users react to motion sickness. The second test is to see if people understand the experience. The third test is design balancing.
  • Comfort & accessibility - consider designing an experience that can be used with only one controller. Implement height adjustment for those in wheelchairs. Give people alternatives for sensor (?) input; for example, after a while a player can get tired of talking, standing, etc.
From State of VR Content in 2017:
  • VR is not casual, nor is it like everything else (mobile games, PC games, etc.)

  • Don't coast on novelty. People are no longer shocked. They want more from the experience.

  • Explore both PC and mobile. Mobile systems are getting closer and closer to PC.
  • Be broader with genre. People are curious.

From Investment Opportunity for VR Gaming Studios:

  • In VR/AR, a lot of creative work comes from women. However, much of the funding is not going to them.
  • Investors look at a company's portfolio before investing. They look at the leadership of the company, what they're thinking about the future, whether they have longevity and enough money to survive 18-24 months, and whether they're willing to learn and grow.
  • VCs currently avoid investing in hardware, since it takes a while to tweak. They like to invest in market trends, not in long-term R&D projects. They invest in tools & content. They're big on location-based entertainment.

For the game jam, the theme is "it's not a scam". I'm working in a team of 5 people (2 programmers, one audio person, one 3D artist). We are working on a VR experience of buyer's remorse: the user has bought their dream spaceship, only to find that things break along the way. The development process is interesting. We have one person with an Oculus Rift + Touch and two people with HTC Vives, all working on the same project at the same time. We are mainly using SteamVR and NewtonVR for physics. If anyone is interested, you can download our project at https://github.com/chinnie/SpaceLemonVR . Currently the main scene is integrated for the HTC Vive build.

Plan for next week:
- sleep in and relax on Monday
- create art assets for Enliven VR

See you next week!






ben_cone
Explorer
Week 7!

This week was mainly focused on the assets that will be used in the final product. The pipeline to create the assets and import them into the editor to work with our homebrew "lighting" workflow is interesting, and we've been trying out a few tests.

Here is an example of the lighting working correctly using the material functions.

https://youtu.be/qTDwJ9hd4Zs

Also, there was a big focus on the art side of the assets; concept art was explored and discussed, and now we are on to building the assets themselves.


skatekenn
Explorer
Hi everyone! I got a little more development work done this week, and I also found some inspiration in a talk I attended! Boston Unity Group had a meetup on Thursday, at which Seth Alter of Subaltern Games gave a talk about his game, Traitor Nightly. It's really motivating to hear devs talk about their work, so I really appreciated being there for that!

ndshort
Protege

Neil D. Short
Project Title: Achilles Heel
Type: 360 Video Series
Genre: Sci-Fi Sitcom

OLP17 - Blog 7



Like many other OLPers, I was traveling this week, to the mountains of Idyllwild, California, and then a day in L.A. to attend the screening of a film I helped produce at the L.A. Shorts Film Festival. I mostly did remote work via email and Google Hangouts with my production designer and concept artist.



I received feedback on my proposal this week, and it was good to get questions now that I can address them while still in this development stage. One of the big hurdles for the demo is the cost to hire the actors I will want and to build a physical set. With this in mind, and given the feedback I received, I have decided to use concept art to create a virtual version of the set in 360 and shoot a few scenes with temporary actors on green screen and composite them in. This will hopefully be a way that is “cost/time friendly to deliver the concept clearly.” It is risky, since I want to use practical sets, but not having the time and money at this point, I feel this is my best option.



We have a good floor plan for the main room the series will take place in, and my production designer is creating it in SketchUp to hand over to the concept artist to flesh out the look.









Another question in the proposal feedback was regarding interactivity and how to provide the viewer with a feeling of inclusivity (agency). My initial plan was to have the experience be mostly passive, but considering the feedback, I’ve decided to add an element of interactivity.



The viewer’s role is observer through a journalist’s camera placed in the center of the room. The action takes place around the room in various places, and my initial thought was to block the actors in such a way that they move closer to and further away from the camera based on whether they are part of the immediate “scene” or conversation. I’ve decided to add strategically placed 180-degree ship cameras, built into the wall like security cameras, that are accessed by the ship’s computer. As characters move away from the 360 camera, I want to give the viewer the ability to use gaze selection to choose a ship camera near the characters to spy on their conversation. By doing this, the viewer will have knowledge of information revealed to one or two characters that is not shared with the whole group, which can change the perspective of a conversation. For example, if Drew says something off topic and silly, and his younger supervisor Samantha tells him to “shut up and go find something to clean” and he immediately complies, it may seem like he is just following the orders of his boss. Now, if we jump to a ship camera and listen in on Drew as he explains to Taylor that the only reason he has a job is because Samantha tolerates and protects him from their superiors, then the viewer will have a different context for future exchanges between these characters. At the same time, if the viewer does not switch camera views they don’t lose out on the story; they just don’t have as full an experience as is possible.
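Gaze selection of this kind is usually implemented with a dwell timer: the camera switch fires only after the viewer holds their gaze on a hotspot for some fraction of a second, so a passing glance doesn't yank them to a new view. An engine-agnostic sketch of that logic (class and camera names are my own):

```python
class GazeSelector:
    """Dwell-time gaze selection: report a camera id once the viewer has
    held their gaze on its hotspot for `dwell_time` seconds."""

    def __init__(self, dwell_time=1.5):
        self.dwell_time = dwell_time
        self.target = None    # hotspot currently under the gaze, if any
        self.held = 0.0       # seconds the gaze has rested on it

    def update(self, gazed_camera, dt):
        """Call once per frame with the hotspot under the gaze (or None)
        and the frame's delta time; returns a camera id when dwell completes."""
        if gazed_camera != self.target:
            self.target = gazed_camera   # gaze moved: restart the timer
            self.held = 0.0
            return None
        if self.target is None:
            return None
        self.held += dt
        if self.held >= self.dwell_time:
            self.held = 0.0              # fire once, then re-arm
            return self.target
        return None
```

In Unity, `update` would be fed from a forward raycast out of the headset each frame, and the returned id would trigger the cut to that ship camera's 180-degree feed.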



Next week’s focus will be on camera tests and Unity integration. I’ve been learning on my own, but like so many others doing 360 film, the learning curve vs. time vs. to-do list is proving to be a challenge. I’m going to reach out and see if I can find a Unity developer to join the team.

higazy
Explorer
August 6th - Week 7

As part of my pitch I wanted to figure out how to plan a location-based build-out for my project, so I linked up with OptiTrack to budget for a multiplayer experience with the Oculus. Luckily, their system ports directly to the Oculus, and for a 40' x 40' room (about the size for my experience), I can just use their system.

For the rigs, I got in touch with HP and will be planning to use HP Z Workstations for my build. Finally, for the weapons themselves, I hope to source the Strike VR system as part of my budget pitch.

While I don't initially plan to have the build-out for my first phase of the experience, I definitely wanted to get a realistic budget for what the project would cost once we moved to a location-based experience.


Thudpucker
Honored Guest

Another short one this week, as I was out a bit at the start of the week…



I took a step back from a lot of the conversation detail this week and focused on working with my team on the atmosphere. It's really important that this scenario feel grounded in reality. If this comes across as comic-booky or arcade-y, a lot of the point is missed.



The character (customer/guest) is now looking really good. We got a great hair sim in there (using cloth, not hair, but it works!) and played around a lot with shaders to really give her skin and clothes a polished look.



The environment is near complete. It's close enough to start playing with the lighting aspects. Lighting is always tricky in VR. Too many dynamic lights give some huge performance problems. Too few lights leaves your scene unappealing. And baked lighting can take a long time. The goal is to use all dynamic lights, but sparingly. We'll bake the lighting when we're happy with the result of the dynamic lights. Luckily, our NPC isn't going to move much, so we don't have too much of a concern about shadows, etc.



As for the conversation, I've played around with that some, but honestly just didn't have the time to do as much there as I wanted to this week.

Sneak peek!:


rbazelais
Protege

Week 7: Games For Change & Next Steps


Week 7 has been a bit of a whirlwind for me. I'll have around six weeks left to polish the potential submission of our game, Spark a Memory, and much of this week went to demoing and presenting the game at the VR For Change day, where we received good feedback and interest. Wick and I took the stage to represent our team, show what we were able to accomplish with Spark a Memory, and field a few questions from the five judges.


One of the most challenging questions was about our next steps and what our team would want to do if we continued to develop our prototype. I remember explaining that most of the nodes (neurons) are, for now, procedurally generated and random. So far we had made the game primarily to show how neurons connect to each other and send signals; in doing so, the player completes a neural net that unlocks a memory, which is essentially a 360 video. However, to make the game full and engaging, we would encourage exploration. The first step would be to rework the system we have now to design a small series of puzzles that present a challenging element, rather than 'ooo this is cool, but what's the point?'. Implementing this might prove to be an obstacle. I've never built active puzzles before, but that's what game jams are about: learning new things. I'll have to spend some time sketching out simple puzzles that play with a neuron's ruleset of heating up or cooling down the connected cell based on the connection's signal and strength. The current system allows us to demonstrate this, but it is a pre-formed neural net every time the game starts. Once I'm able to place neurons and have them connect to each other this way, level design will actually be a possibility. We'll see what's feasible.
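The heat/cool ruleset described above can be prototyped outside the engine. This is a toy sketch of my reading of it (all names, signs, and thresholds are hypothetical): each connection carries a heating (+1) or cooling (-1) signal scaled by its strength, and only neurons above a firing threshold propagate to their neighbors.

```python
def step(temps, connections, threshold=1.0):
    """Advance the net one tick.

    temps: dict of neuron id -> temperature.
    connections: list of (src, dst, signal, strength) tuples, where signal
    is +1 (heat) or -1 (cool). A src neuron at or above `threshold` fires
    and pushes signal * strength onto dst's temperature.
    """
    new = dict(temps)
    for src, dst, signal, strength in connections:
        if temps[src] >= threshold:     # only firing neurons propagate
            new[dst] += signal * strength
    return new
```

Iterating `step` over a hand-placed net is enough to check whether a candidate puzzle layout is solvable before building it in Unity.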

During development I pushed for the game to run on both the Oculus Rift and the Gear VR, so it would be accessible to a larger population of players not limited by their systems. It's easy to say "hey, let's just run this on the Rift," but it's important that we think about our potential market with this game. As it stands right now, our main hardware guy Scott made a great effort to make the game as comfortable as possible in terms of running at 60fps on both platforms. He was excited that our play experience had no hiccups and our demo didn't make anyone sick. One problem is that the Unity Video Player makes our Gear version crash after the video scene ends, so we'll be buying a video player asset for that. The other issue is that compressing the 360 video completely diminished its quality on Gear. It was a quick fix to avoid large APK files, but considering that the video went from 1080p to 480p, you can imagine that the production value has also diminished. This is a priority on the list of things to do as we add memories and level design.

This project combines VR with 360 video in a unique way. There's a little bit of science, exploration, and wonder in it, as well as the kind of impact we want our game to have. We've been able to accomplish so much in just a weekend; we'll have to see what we can do now that we're not all huddled in the same room for a couple of days. I look forward to giving you more updates. If anyone has solutions to the video compression issue, I would be very grateful for any advice.