Week 5: Oculus Launch Pad (Due July 23 Midnight)

Anonymous
Not applicable
Looking forward to week 5 updates!

elisabethdeklee
Explorer
Elisabeth de Kleer
Communities Project

This week, I obtained a 3D scanner, scanned a couple community spaces, and began working with the files in Unity. 

Room object files, before textures:
x1wg29u1s77j.jpg

In wireframe mode, where you can see the polygons. Lots and lots of polygons!
2efa54g7k2j3.jpg

With the textures:
9aaguo4ym4f8.jpg

Next week's goals:
- Finish scanning some of the spaces I started this week.
- Work on integrating 360 videos and scans.
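A quick way to keep an eye on those polygon counts once the scans are in Unity is a small script like the one below. This is only a minimal sketch; the class name and logging are illustrative, not part of the actual pipeline.

```
using UnityEngine;

// Logs vertex and triangle counts for every mesh in the scene,
// to spot which scanned rooms may need decimation before a mobile VR build.
public class ScanStats : MonoBehaviour
{
    void Start()
    {
        int totalTris = 0;
        foreach (MeshFilter mf in FindObjectsOfType<MeshFilter>())
        {
            Mesh mesh = mf.sharedMesh;
            if (mesh == null) continue;
            int tris = mesh.triangles.Length / 3;
            totalTris += tris;
            Debug.Log(mf.name + ": " + mesh.vertexCount + " verts, " + tris + " tris");
        }
        Debug.Log("Scene total: " + totalTris + " triangles");
    }
}
```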

rbazelais
Protege

It's been a tough week. As many of you know, Chester Bennington committed suicide earlier this week. His music has touched many around the globe and helped me through some of my darkest times. His suicide has shaken me as someone who also battles depression and anxiety. Frankly, it's left me questioning how well I've been taking care of myself. Honestly, I haven't been taking the time to sort myself out. My normal routine is a weird concoction of chaos: I work at my internship 9-6, then work on games stuff, then show up for local events and network in both tech and games. Then I do it all again the next day. Don't get me wrong, I love what I do, but it comes at a cost. Most of the time I end up giving my work priority over self-care, to the point where I am constantly working. Which, in retrospect, is not a good way to manage my well-being. So this week I've taken some time for myself, mostly to grieve, to rebudget how I'm spending my energy, and to reach out to my support group here at home. I'm almost a little afraid to post this, but it is part of my journey.

Well back to the grind. Take care.

Next weekend is the VR Brain Jam being organized by the Games For Change festival. I'll be participating with my team. Should be a great chance to get creative juices flowing.

If there's any Launch Padder out there who maybe just wants to talk or just needs a friend, I'm out here. I know what it's like, and you're not alone.

jacqueline_assa
Explorer
This week we had a focus group with 20 high school students and 10 teachers to ask what they would find most compelling in terms of how to use this product in the classroom, and the kinds of projects they would find the most value in developing virtually within the platform. We learned a lot from the science teachers about what kinds of lesson plans currently exist around learning about technology, such as circuits, batteries, and chemistry experiments. We are looking forward to integrating their ideas into the product. One of the most encouraging pieces of feedback we got is that buying raw materials to do experiments can be costly (particularly since many of these items allow only one use), but that doing science and technology experiments in virtual reality is actually less costly over time, because all of the "materials" and "tools" can be reused as many times as the students or teachers would like. This week, we continue to be heads down on development and fixing bugs. Really excited about how this is developing and looking forward to continuing to work on a project that we genuinely believe is going to add so much value for students and educators.

Anonymous
Not applicable
Jasmine Roberts
threadVR

I signed up for a shadow puppetry course for the next month to further develop my aesthetic design. I have also decided to build the experience out of 2D shapes, layering them to create the illusion of a three-dimensional space--similar to a pop-up card.

I am now examining shaders in OpenGL to best replicate this effect.
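The final look will come from the shader work, but the layered pop-up-card structure itself can be roughed out with flat sprites offset in depth. Here is a minimal Unity-style C# sketch of that idea; it's only an assumption about tooling, not the eventual OpenGL shader, and all names are placeholders.

```
using UnityEngine;

// Rough illustration of the pop-up-card idea: flat silhouette layers
// spaced along the z-axis so parallax creates a sense of depth.
public class PopUpLayers : MonoBehaviour
{
    public Sprite[] silhouettes;      // back-to-front 2D cutouts
    public float layerSpacing = 0.3f; // metres between layers

    void Start()
    {
        for (int i = 0; i < silhouettes.Length; i++)
        {
            var layer = new GameObject("Layer_" + i);
            layer.transform.SetParent(transform, false);
            layer.transform.localPosition = new Vector3(0f, 0f, i * layerSpacing);

            var renderer = layer.AddComponent<SpriteRenderer>();
            renderer.sprite = silhouettes[i];

            // Darken farther layers slightly to mimic shadow-puppet depth cues.
            float shade = Mathf.Max(0.2f, 1f - 0.15f * i);
            renderer.color = new Color(shade, shade, shade, 1f);
        }
    }
}
```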

GangstaClo
Retired Support

Week 5 - Back to the Research Board - Clorama Dorvilias



Anti-Bias VR Testing & Training Tool
i1u9l2t0anv1.png

Building on the momentum of having a concept prototype developed for the classroom scene, Jessica Outlaw and I decided to take the week to research and gather case studies to better inform our further design and development decisions for the testing.

kqlwpqn5zsl3.png

She sent me a detailed document outlining a proposal for the testing plan. It provides a great overview of what we should aim for in user testing, how to set up the testing scenes, some ideas for how we can model the interactions, and some research studies she has come across that put a tangible picture to the problem we are trying to solve: the impact of bias on students in the classroom.

Here are a few of my thoughts on the testing plan and what I'll be thinking hard about for the development of the experience.
  1. The visual charts give a great, tangible look at what is being impacted by the problem our solution is designed to solve/alleviate.
    • This makes me think we should start putting together a list of sources that cite this problem, to further defend it when it comes to 'pitching', etc.
    • We should also look at sources that show how bias is being reduced. I can start putting that together, as well as a document of studies that inform/will inform the decisions we are making.
  2. I think it's a great framing for the testing scenes to narrow the subjects to two children side by side, similar to the setup of the IAT (Implicit Association Test).
    • When designing the questions/answers, we should figure out whether we want to integrate the same logic of a positive/negative association being framed in the Q&A that could be associated with the child.
    • For example: do questions with negative/positive connotations get associated with the children who get picked to answer them?
  3. Identifying the factors that went into the IAT and why it's respected, and making sure that all those elements are aligned in our scenarios, will be extremely vital, I think.
  4. I like the idea of 50-100 people being tested on this.
    • One of the many hats I wear is UX Designer, and it's common practice that 5 testers will give you about 80% of what you need to refine a prototype (which has been validated in my personal experience of regularly conducting user testing sessions with far more than 5 participants).
    • So I think we can factor in a small user testing session with a prototype, trying out a couple of the interaction designs.
  5. The Interaction Design Question
    • I'm thinking a point and click with Gear VR can be close enough--in the same way we point a remote at the TV to change the channel, pointing a controller at a student physically resembles calling on them to answer. (See the raycast sketch after this list.)
    • Additionally, this will make it easier to test with teachers, since the Gear is more affordable/portable and the nature of the game doesn't necessarily require optimal graphics, engine power, etc. What do you think?
  6. Controlling for racial bias as a factor
    • We should include a control test where skin color is either ambiguous or a non-human color, to validate whether skin color is a factor in the decision-making in the other prototype tests.
    • Additionally, it would be interesting to see whether gender should also be tested for while we are at it. A couple of prototypes with white male/white female and Black male/Black female? Could this be an option?
  7. Find real-life case studies, or a real-life story of this problem, to really solidify the messaging/importance of this type of testing in the classroom.
    • How do we defend why 'raising hands' and 'who is picked to answer questions' is the determining factor or element to be tested in solving this problem? How do we link this to other potentially racially biased teaching practices that could harm student development/educational outcomes?
    • For example: why not 'who gets punished more' or 'verbal feedback', etc., based on the case study cited that points to it? Are there other ways we can measure racially biased associations to real-world student behavior in the classroom? (To demonstrate that we also considered other ways to reveal this problem and that it's not just a surface example.)
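To make the point-and-click idea in item 5 a bit more concrete, here is a minimal Unity C# sketch of controller/gaze pointing. It assumes each child avatar has a collider plus a hypothetical StudentTarget component, and that the Gear VR tap maps to "Fire1"; all names and mappings are placeholders, not final design.

```
using UnityEngine;

// Hypothetical marker component on each selectable child avatar.
// (In practice each class would live in its own file.)
public class StudentTarget : MonoBehaviour
{
    public string studentId;
}

// Casts a ray along the pointer's forward direction and, on tap,
// records which student was "called on" to answer.
public class CallOnStudent : MonoBehaviour
{
    public Transform pointer;     // camera or controller transform
    public float maxDistance = 10f;

    void Update()
    {
        // Assumes the Gear VR tap is mapped to the default "Fire1" button.
        if (!Input.GetButtonDown("Fire1")) return;

        Ray ray = new Ray(pointer.position, pointer.forward);
        RaycastHit hit;
        if (Physics.Raycast(ray, out hit, maxDistance))
        {
            var student = hit.collider.GetComponent<StudentTarget>();
            if (student != null)
            {
                Debug.Log("Called on student: " + student.studentId);
                // TODO: log the selection alongside the question's positive/negative framing.
            }
        }
    }
}
```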
My To Do List:

  • Seriously, make the time this week to brush up on my research on bias in the classroom and look at different case studies, starting with the ones you have included in the document.
  • Start doing a comparative solution analysis and gather proven research studies on testing for and alleviating bias. I'm going to create a couple more documents where we can both add links to case studies/research on implicit bias and its impact in education, and I'll do a separate one for the workplace.
  • Technically, I'm going to do research on Unity 2017.1 today and look at what features it offers that could make development for this smoother. (see below)
  • I can start creating some demo prototypes of two children side by side with the interaction in place for the specific proposal that Jessica outlined in the testing plan.
  • I have a friend, Victoria Duran, who is a high school teacher and is also finishing up a Ph.D. in human rights. I told her about the work Jessica and I are doing; she was extremely excited about the project and offered to supply us with a group of teachers to help test, and to organize participatory design workshops so they can lend their insight into how best to develop this. We're going to have a dedicated follow-up on this in the middle of August.
-------
Testing Unity 3D 2017.1 - Timeline Editor feature.

In terms of quick and dirty prototyping, since our testing will require several rounds of game scene interactions that add or omit factors for testing, I am excited about the Unity 2017.1 Timeline editor feature.

This is a great source for an overview: Unite Europe 2017: Overview of Timeline and Cinemachine

I got it up and running this afternoon, and it seems like, once all the animation clips are in, scenes can come together more quickly by triggering responses with simple scripts. I put this together this evening, but between unexpected bugs and some real-life interruptions, I hope to have a better demonstration soon: two applicants for a job, with the user in position to decide which one to interview for the position. The female avatar is experiencing extreme errors, so as of this blog deadline:
28daaacvnzm0.gif
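For reference, the script side of triggering a Timeline is tiny. A minimal sketch, assuming a Timeline asset already authored in the editor and assigned to a PlayableDirector; the component and method names below are illustrative of that setup.

```
using UnityEngine;
using UnityEngine.Playables;

// Plays a pre-authored Timeline (e.g., an avatar's response animation)
// when some interaction fires, instead of hand-scripting each clip.
public class ResponseTrigger : MonoBehaviour
{
    public PlayableDirector responseTimeline; // wired up in the Timeline editor

    public void PlayResponse()
    {
        responseTimeline.time = 0; // rewind in case it already played
        responseTimeline.Play();
    }
}
```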

I downloaded Morph 3D assets, which have the cool ability to customize character features with a sliding scale.
- Positive side: I can create multiple adult characters with different features in minutes.
- Downside: each character is like a million megabytes, and I'm worried this could really drag down the game.
- They are also super buggy, and Mixamo animations aren't vibing with their skeleton.
- And this just in: the female avatar has been deprecated. 😞

So I'll have to look into this the next chance I get. In any case, more research still needs to be done on my part to get a better sense of what direction we should take in designing a testing and training concept for the classroom and a high-risk workplace scenario.

Personal stuff!

Aside from dealing with crunch-time tasks for a successful product launch in my Code for America fellowship, I'll be in Cuba for my sister's birthday next week. Then I'm headed straight to NYC to participate in a panel discussion at the VR for Social Impact Conference during the Games for Change Festival at Parsons. I had a conference call with the moderator and all the other panelists today and we all vibed really well. I'm excited for the discussion we'll be having on "Forging the Cyber-Feminist Future". Hope to run into a lot of Launch Padders there!


icy_violets
Explorer


a7yobfknz4a4.jpg  i4udxglkf65v.jpg 
This week brought a lot of forward motion in terms of creating relationships between the music and the visuals, that is, in relating the musical gestures to the painterly 3D gestures. The key to this work lies in establishing certain norms between the music and the graphics and then violating those norms at certain points for surprise and expressive effect.

We established key textural corollaries: for instance, staccato musical textures now generate pointillistic visual brush strokes, while legato music generates long and fluid brush strokes. Another norm we created had to do with camera motion. Here, slow and steady camera movements in circular trajectories on the horizontal plane surrounding the paint strokes bring out the 3D nature of the visual environment, making it an engaging and inviting experience. Violating this horizontal motion with a vertical orbit at a particular moment in the music also brings a thrilling, unexpected, and slightly vertiginous moment to the piece. We also will seek to move inside the morphing musical paint-sculpture at another crucial moment in the work. At still another point, the brush strokes quickly shift from a dominating color palette of cool blues and blacks into deep, vibrant reds, a shift that mirrors a change in the music towards bright and dissonant metallic sounds.

Subtle as these changes may be, in a non-narrative work such as ours, they can carry the emotional weight of the piece and, hopefully, suggest a wide range of experiences and emotions.
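For anyone curious how the horizontal-orbit "norm" and its vertical violation might be prototyped, here is a minimal Unity C# sketch; our actual tooling may differ, and the speeds and the trigger for switching orbits are placeholders.

```
using UnityEngine;

// Slowly orbits the camera around the paint strokes on the horizontal
// plane; flips to a vertical orbit when the chosen musical cue arrives.
public class OrbitCamera : MonoBehaviour
{
    public Transform focus;            // centre of the paint-sculpture
    public float degreesPerSecond = 5f;
    public bool verticalOrbit = false; // set true at the chosen musical moment

    void LateUpdate()
    {
        Vector3 axis = verticalOrbit ? transform.right : Vector3.up;
        transform.RotateAround(focus.position, axis, degreesPerSecond * Time.deltaTime);
        transform.LookAt(focus);
    }
}
```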







robotliliput
Protege
Updates from the past week:

1. Broke down individual tasks at a high level, and then ordered them by week. In doing this, I realized that I would prefer to create a single experience with a higher level of craftsmanship than to create all 5 experiences at a lower level of quality. This one experience can serve as an example of what future installments will look like and be used to show potential content contributors what the final result would look like. I also want to be realistic about the complexity involved in the project and give myself time to refine the process and make something I can be proud of. I don't have the ability to focus on this project full-time, though I do have help from another OLP participant as an assistant. I am not sure how hard it will be to find people who want to contribute their experiences to the project, so allowing extra time for that seems safer. If I can have the four additional content contributors lined up by September, I will consider things to be in good shape.

2. Started an end-to-end test of the production process. I wanted to make sure that what I hope to achieve is possible before I put too much time into script writing and sketching. I created a test sketch in Tilt Brush, exported it as an .fbx file into Unity, and used the Tilt Brush Toolkit for Unity to load the meshes into the scene with the appropriate shaders attached. This process seems to work well; see the screenshot below. I am excited to experiment with the scripts that come with the Toolkit to allow animation of the sketches in a sequence or in response to audio. If I hit a major roadblock with this, I may also try a similar test with Quill.
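Separately from the Toolkit's own scripts, a minimal Unity C# sketch of the audio-reactive idea might look like the following; the component and field names are placeholders, and it only scales a stroke object with the music's loudness.

```
using UnityEngine;

// Scales a Tilt Brush stroke object up and down with the audio level,
// as a quick stand-in for fancier Toolkit-driven animation.
public class AudioReactiveScale : MonoBehaviour
{
    public AudioSource music;
    public float sensitivity = 4f;

    private Vector3 baseScale;
    private float[] samples = new float[256];

    void Start()
    {
        baseScale = transform.localScale;
    }

    void Update()
    {
        music.GetOutputData(samples, 0);
        float sum = 0f;
        for (int i = 0; i < samples.Length; i++)
            sum += samples[i] * samples[i];
        float rms = Mathf.Sqrt(sum / samples.Length); // rough loudness
        transform.localScale = baseScale * (1f + rms * sensitivity);
    }
}
```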



I am still completing Android build setup work in Unity and plan to test building and loading the mini application in the GearVR later tonight or tomorrow.

3. I got my laptop back from support, but unfortunately the problem was not fixed. I will need to send my laptop back in again and may not have it back in time for next weekend. If this happens, I will spend my time working on tasks that don't require Tilt Brush and shift the schedule accordingly.

Next Steps:

This week I need to continue to refine the script for the first experience. This process took longer than expected; the act of writing the memories brings up a lot of difficult feelings. This may turn out to be the hardest part of the project, but it's also something I want to push through so I can understand how to help other participants to be more comfortable sharing their stories. I also hope to complete a rough application structure blocked out with proxy models and scripts for basic interaction. Later I can start adding in the final sketches and audio as I complete them.

dustin_harris_7
Explorer
I spent this week measuring and blocking out the gameplay area for my experience. The area is based on a small apartment, called an "apodment." The room has a bed, closet area, bathroom, and kitchenette--everything you need in an apartment, at a scale that is almost explorable using roomscale VR. The total area is 13' x 13.5'. Here are some notes from my measurements and greyboxing:

j1ztw61ro7j6.jpg
xkgwecz0j0gn.png
1bpm7jdxxekl.png
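Since Unity works in metres and the measurements above are in feet, a small sketch like this keeps the greybox floor faithful to the 13' x 13.5' footprint; the class and object names are just illustrative.

```
using UnityEngine;

// Builds a floor plane matching the 13' x 13.5' apodment footprint,
// so roomscale bounds and the 360 video line up with the real space.
public class ApodmentGreybox : MonoBehaviour
{
    const float FeetToMeters = 0.3048f;

    void Start()
    {
        float width = 13f * FeetToMeters;   // ~3.96 m
        float depth = 13.5f * FeetToMeters; // ~4.11 m

        GameObject floor = GameObject.CreatePrimitive(PrimitiveType.Plane);
        floor.name = "ApodmentFloor";
        // A default Unity plane is 10 x 10 units, so scale accordingly.
        floor.transform.localScale = new Vector3(width / 10f, 1f, depth / 10f);
    }
}
```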


Along with my ongoing 3D modeling education, next week I plan to model out some of the objects within the space, create rough versions of some of the fantastical elements, and start testing the 360 camera. My goal is to create a virtual environment that directly maps to the 360 video physical environment.

aa_lique
Protege
Week 5 Blog Post July 23

Brief blog today.
Had to take a few days to work on freelance assignment. But back to work on OLP project.
Wow! Time is flying, and I'm very aware that we have 7 weeks left. I wish I could work faster.
I continue with modeling game assets and developing textures, filling in the scene as per the schematic.

fr9jmcj9l0zo.png


Working out the mechanics for the interactive puzzles: I have four, but I'll be happy to complete two.
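While the puzzle mechanics are still on paper, the core loop is usually just a set of steps completed in order. Here is a minimal Unity C# sketch of that skeleton, with every name hypothetical.

```
using UnityEngine;

// Bare-bones sequential puzzle: steps must be solved in order,
// and the puzzle reports completion when the last one is done.
public class SequentialPuzzle : MonoBehaviour
{
    public int totalSteps = 3;
    private int currentStep = 0;

    // Called by interactable objects (levers, dials, etc.) when used.
    public void TrySolveStep(int stepIndex)
    {
        if (stepIndex != currentStep)
        {
            Debug.Log("Wrong order - resetting to step 0");
            currentStep = 0;
            return;
        }

        currentStep++;
        if (currentStep >= totalSteps)
        {
            Debug.Log("Puzzle solved!");
            // TODO: trigger the reward animation, open the gadget, etc.
        }
    }
}
```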

On Tuesday I start modeling a functional mechanical gadget I'll call the Astroscope, based on this clock and the Fishlaphon.

740946vajvby.png

Goals for this week:
- Complete all modeling by the end of next weekend
- Write up a developer's brief for coders
- Start designing the opening UI menu




doubleeye
Protege

Double Eye Blog Week 5 - Farewell to Cardboard -- Hello Metropoles

This week we abandoned the cardboard aesthetic. This was a difficult choice to make as it has been a part of our narrative on gentrification in other VR experiences such as my 360/VR film “Cardboard City.” Using cardboard has also been a part of the workshops we are designing for communities. We made a huge pivot for the following reasons.

First, the process of photogrammetry isn't perfect yet. I've played with 123D Catch as well as more sophisticated scanners like the Sense and Structure scanners. After the first capture, it was very exciting to see how much detail the scans preserved. And when we brought the cardboard models into Unity, it was exciting to see a giant, life-sized version of these structures that we crafted with our hands. However, if the lighting is altered in Unity, it suddenly reveals many imperfections. In fact, depending on the model, the edges can look very sloppy.

Second, we need more consistency. Since we need to recreate a cityscape, we need to be more accurate with our edges, and messy edges no longer look cute; they just look inconsistent. The other major inconsistency was size and scale. If people make whatever they envision, one person may make a 6-story building while another creates a single-story building. We don't want to limit their imagination in the workshops, so when these imaginative buildings are assembled together it can be exciting and meaningful for that group; but the result loses accuracy, context, and meaning if it is meant to represent an existing neighborhood. Surprisingly, these tiny details greatly affect our narrative choices. Within the context of this game, do we want communities to envision any future, or do we want them to work more specifically with the present they have and build the future into that present situation? I've leaned towards the latter.

Lastly it comes down to interactivity in the 3D format suited to VR. Cardboard will still be useful for our workshops because it is a great prototyping material. And as an artist I love to merge analogue and digital; but at the moment it feels more natural to work with 3D objects in a 3D space especially if we want the player to interact with them. The final restriction of using these cardboard models is that they inhibit our ability to make them dynamic; and what is VR if it isn’t interactive?

I’ll leave this week with that final question and keep that as my metaphor to move the team forward and deeper into art and architecture this coming week. As a result of bidding adieu to the cardboard aesthetic we have renamed this experience “Metropoles.”

bbm1wbvib2te.png ldt1nunnoml4.png p95f6gp248kj.png