Oculus Launchpad Dev Blog- 8th Entry- DevLog 757.008
In my 7th entry I explained that the post had grown too long to include some of the Immersive Experience information I wanted, so I'm addressing a lot of Immersive Experience points here. I'm also including some info from a post I never got around to about a month ago, where I time-stamped a video by Unity trainer Mike Geig. His training videos are awesome, and the one I've time-stamped here, on Assets and Game Objects, is great for a beginner like me to check out. It's not the standard format I've been using lately, but being a little over the hump in terms of all the blog entries that need to get submitted, I think this one still has a good deal of useful information for new storytellers entering the VR medium.

Immersive Experience

One of the things I think works well for VR is having a liberal arts background. Not because it gives a specific advantage per se, but one of the things a liberal arts education is supposed to do is train your mind to synthesize information from different sources and then apply it in a context where that information can take form in a new way, maybe even a new, unique context. Thinking like this isn't special or new; most schools just don't do a good job of training students in higher-order thinking. I was fortunate Amherst College helped me learn to think in these ways. I also believe this is the particular 'competitive advantage' in the way a lot of people on the autism spectrum think and see things, because many times what they're really good at is associative thinking. For example, in the movie Temple Grandin with Claire Danes, someone mentions something about Temple's shoes. Temple then describes how the mention of shoes makes her think about every pair of shoes she's ever owned, and the movie runs a visual montage of all the different pairs she may have owned from childhood to adulthood.
Think of the possibilities if you're into engineering concepts, you dig VR, and you're a technologist who loves sci-fi and thinking about the future. If Temple could do that with shoes, and your thing is engineering, there's no reason to believe new and interesting concepts wouldn't come out of your train of thought, because you were fortunate enough to be an associative thinker. It's my personal opinion that most people who are considered 'geniuses' are really good at exactly that: associative thinking. There are so many different combinations of ideas bouncing against each other; the inputs are myriad, but the outputs are just as creative. In that sense, I've been looking at a lot of different source materials (experiences, games, books, and what have you) that I've found to be the most 'immersive' when I've played or interacted with them. Sometimes a really good story is just that immersive; you can't help but feel like you're there with the characters. But telling (and finding) a story that good is rare. The games I've played that were super immersive, I've concluded, used multiple sources of information to draw the user's attention into the experience, perhaps tapping into several different modalities of experience and learning. I'm thinking of 'Where in Time is Carmen Sandiego' (which used maps and an encyclopedia in addition to the computer clues), 'The Oregon Trail' on the old Commodore 64, and JJ Abrams' book 'S.', which has all kinds of postcards, personal letters, and a handwritten correspondence running through the margins of every page as you're reading. And even though I know the primary motivation was money, I think part of the reason the Angry Birds movie got made was that if you have stimulation available for people on multiple platforms, all aiming at a central theme, what you're really aiming at is a multiple-modality way to engage the user experience (and their money) at a very deep level.
Angry Birds messed up (somewhat) because the movie came out so long after the height of the game's popularity. But what if a product timed it so that a purely visual experience came out at the same time as the VR product? Easter eggs placed within a movie experience that would only help or enhance the user experience in VR. Full (or added) immersion through engagement. One conclusion I've come to regarding Underdog is that, being such a unique and innovative VR product, it should be tied to something equally unique and innovative in the real world. This could be the immersive experience aspect, even though I already have specific plans for how players of the game can and will engage other users in the physical world. I think Underdog should be attached to products, policy, and reform momentum aimed at changing special education in a significant way in this country. Being a special education teacher for nearly 10 years gives me a unique perspective on the ways special education tries, but often fails, students and families. With Underdog, if the one issue we attached ourselves to, besides raising awareness about autism and bullying, were improving outcomes for young people on the autism spectrum who are exiting or 'transitioning' out of high school, it could mean a great deal not just for the autism community, but for the special needs community as a whole. It's no secret that many parents who have a child on the spectrum fear their child leaving public school because, up to that point, the school system took care of services like speech and language, occupational therapy, physical therapy, identifying appropriate technology that would be beneficial in school, etc. While you're in school, the school system finds service providers, screens them for skills, pays for services, arranges appointments, and is contractually (read: legally) obligated to keep those appointments.
Once a child graduates from public school, every single one of those responsibilities falls to the parent, including payment for services. I've read stats claiming up to 90% of people on the spectrum are unemployed and largely living at home with their parents after they graduate. There HAS to be a better way to serve families with kids on the spectrum who are leaving high school. Attaching Underdog to a documentary film that opens a national conversation about what happens to kids on the spectrum after they leave high school, after they've experienced some of the traumatic bullying that Underdog will help address, would be phenomenal. You could do it like Michael Apted's 7 Up documentary series, where every 7 years you go back to the same group of kids and check in on their progress. I know of a documentary film opportunity at the school where my girlfriend is a principal that could help connect Underdog to a conversation about special education reform. Is a documentary film way too much to take on, in addition to building the game? Of course. But I know these guys at Spectrum Laboratories who just might be the right crew to help pull this off. To be continued...

Some random Immersive Experience material

https://www.youtube.com/watch?v=k12NZLh_Xvg -- Sleep No More (theatrical immersive experience), mentioned in Charlie Melcher's Voices of VR podcast interview as one of the few experiences allowing the viewer/participant to have agency in the creation and outcome of the story.

http://worldbuilding.usc.edu/ -- Established in September 2012 at the University of Southern California's famed School of Cinematic Arts, the World Building Media Lab has emphasized the power of using technology as a vehicle to enhance storytelling capabilities. With explorations into Virtual and Augmented Reality, the WbML has established itself as a leader at the forefront of technology-based entertainment.
I made this a month or so back; I see no reason why it shouldn't be put to good use :)

Unity Tutorial on Game Objects (YouTube). Mike Geig. http://unity3d.com/learn/tutorials/topics/interface-essentials/game-objects

3:26 - coordinate systems (how games are built and rendered on screen), in 2D and 3D. Not all game engines use the same coordinate system: in Unity the Z axis is in and out (depth), while in UDK, Z is up and down and Y is in and out.
4:52 - how Blender and Maya use coordinate systems (not the same as Unity)
5:40 - using an orthographic camera to make 2D
6:45 - changing camera perspective to make 2D
7:28 - discussion of Unity's 2D toolkit
8:47 - the 2 types of coordinate systems (global and local)
9:25 - the local coordinate system
9:40 - all objects are based in global coordinates, but everything that happens to or attaches to them happens in the local coordinate system
9:45 - graphic for coordinate systems; World (left), Local (right)
10:46 - definition of the world coordinate system (position relative to every other item in the world)
10:49 - definition of the local coordinate system (position relative to 1 particular object)
11:28 - fundamental definition of a game object
12:38 - ways to create game objects
13:04 - Command+Shift+N creates an empty game object on Macs (an empty game object in a scene is just a point in space at 0,0,0 with a Transform)
14:38 - 'Create Other' in the GameObject menu
15:00 - aspects of the Cube that was created (adding and changing functionality)
15:53 - Sphere game object creation
16:02 - Plane creation
16:15 - the underside of a Plane is not viewable (not rendered)
16:53 - finding the camera, getting it back in perspective view, and rotating the plane
18:16 - creating 'light' game objects
18:21 - point light creation
18:38 - creating a directional light
19:00 - components
20:00 - adding components
22:53 - adding a particle system component
23:38 - removing components
24:34 - why objects look dark in an empty scene
24:54 - universal lighting (on and off)
25:37 - transforms (an object's unique position, rotation, and scale in the world)
27:18 - translation (left/right, up/down, in/out)
28:08 - translation tool; 'W' key
30:00 - rotation tool; 'E' key
32:26 - scale; 'R' key
32:44 - scale is a coefficient
33:55 - order of operations with transformations
35:30 - Ctrl+Z is the undo shortcut
37:28 - parenting/nesting/grouping
39:15 - a child's coordinates are based on the parent, not the child
40:24 - un-nesting
40:45 - movement of a child is relative to the parent's movement

Unity Courseware: Using Game Objects and Assets

Here are examples of the course objectives and learning outcomes from the Unity Courseware Objectives and Learning Outcomes documents (both publicly available through the Unity 3D website).

Oculus Launchpad Dev Blog- 5th Entry- DevLog 757.005
I'm sticking to the format I've selected for my blog, which specifically addresses the areas I've come to believe are important for my app, Underdog, and its development. This format has 6 sections, but some sections might not be as long as others from week to week, because usually I end up maxing out the character limit and have to push things back to the next week's entry. That being said, the 6 sections are: Scripting, Audio, Made With Unity, GitHub, Storytelling/Narrative, and Immersive Experience. This week the Storytelling/Narrative section is light because Immersive Experience ran long, but many times those two bleed into each other anyway.

Scripting

Variables and Functions -- from the Unity Scripting tutorial: https://www.youtube.com/watch?v=WQnpeR7u7HM
- The 2nd part of a declaration (shown in purple) is the 'initialization', where the declared variable gets assigned a value.
- Raja: C# is a more organized programming language than JavaScript; also, when applying for jobs, most game dev studios code in C#, not JavaScript.
- The space between the 'curly braces' defines the body of the class.
- Highlighted in blue in the tutorial: the 'namespace' and the 'inheritance'.
- 'Start' and 'Update' are functions.
- Typing a Debug.Log statement inside the body of the class tells the Console View to log a message; don't forget the semicolon at the end of your line of code when you have completed a scripting statement.
- A script makes its connection with the internal workings of Unity by implementing a class which derives from the built-in class called MonoBehaviour. (MonoBehaviour is a class.) You can think of a class as a kind of blueprint for creating a new Component type that can be attached to GameObjects.
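To make those scripting notes concrete, here is a minimal sketch of what such a script looks like, assuming Unity's standard C# template (the class name and the playerHealth variable are made up for illustration):

```
// LearningScript.cs -- the class name must match the script's file name.
using UnityEngine;   // the namespace where Unity's built-in classes live

// Inheritance: deriving from MonoBehaviour connects this script to Unity,
// and lets it be attached to a GameObject as a Component.
public class LearningScript : MonoBehaviour
{
    int playerHealth = 50;   // declaration plus initialization in one statement

    // Start runs once, when the script's GameObject enters the scene.
    void Start()
    {
        Debug.Log("Health at start: " + playerHealth);   // logs to the Console View; note the semicolon
    }

    // Update runs once per frame.
    void Update()
    {
        // per-frame logic goes here
    }
}
```

Attach it to a GameObject with 'Add Component', press Play, and the message shows up in the Console.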
(The MonoBehaviour description above is from the Unity Manual's scripting section.) I've come to realize that figuring out how to talk about and discuss scripting is harder than I'd like it to be.

Audio

#Sounddesign and #3DAudio on Twitter are a great way to find articles that could be super helpful in researching and designing your audio needs.

HRTF (Head Related Transfer Function): http://www.3dsoundlabs.com/hrtf-101/

"When a sound wave travels from a given location to our eardrums, it interacts with several parts of our body such as the head and the ears. All these reflections, diffraction and absorptions (depending mainly on the shape of our head and the structure of our outer ear) change the nature of the incoming sound. These sound wave filters depend on the source's direction. Our auditory system has learned over time to associate each specific filter to its direction. This is how we recognize where a sound comes from. The representation of these directional changes as filters is called 'Head Related Transfer Function' or HRTF."

Since HRTF filters are based on our morphology, and we are all different from each other with respect to our features, HRTF differs from one individual to another! If you use somebody else's HRTF, you end up with inaccurate or fuzzy sound localization. "Besides providing audio immersion in Virtual Reality, personalized HRTF are also used in applications where good sound localization is mission critical. Precise localization of an enemy or an alarm by sound can be lifesaving."

https://www.ossic.com/blog/2016/8/9/how-virtual-reality-will-change-the-music-industry -- a great story, published Aug. 11, 2016, about the intersection of music, sound, and VR.

How does this relate to my project?
Since I'm dealing with bullying and harassment as the major theme of my game, sound localization will be important. As the quote above says, "localization of an enemy or an alarm can be lifesaving," so it definitely gives me awareness (and a couple of ideas) of how and why I should play around with sound in my game, to make sure that those who are being bullied get exposure and training in becoming aware of a bully's proximity, and thus potential danger, whether physical, mental, or emotional. Because people with autism can also be highly sensitized to sounds (learning more about this area makes me wonder if it's sensitivity to particular frequency bands), it will be important not only to 'get the sound aspect right', but to be cognizant of it *in the development stage*, so as not to overwhelm the user with sounds that could be debilitating or fear-inducing because they occurred too suddenly. I do think it's important for some sounds to be immediate (a bully isn't always going to give you warning when they're in your proximity), but perhaps something should be included in the UI/UX of the game design so that sounds can be adjusted to the needs of individual users.

Made w/ Unity

http://bit.ly/2aZRJt7 -- a Made With Unity story about a physics-based game called 'Treple' that pivoted from being about building things with cranes into a numbers-based (seeming) version of Sudoku. Two main things stuck out to me about this story. First, I thought it was cool and interesting that this team consciously took time out of their product development to work on and design experiences that were not their 'bread and butter'. They took time out for passion projects, because they know that 'all work and no play makes Jack a dull boy'. I know Google incorporates the same practice as part of their 80/20 rule.
Work on things that pay the bills 80% of the time, and 20% of the time do the stuff that excites you, so you don't get bored and your work doesn't look or feel uninspired. The second thing that stood out was how, when they got stuck on the gameplay, they focused on polishing the visuals of the game. The aesthetics. So they had a really nice-looking game, but the playability sucked. Which is a theme I've been seeing in developer books and articles: no matter how pretty it is, if it's not fun, people won't play it. So make sure you get the mechanics down first, before you dump a bunch of time into making it look good. Your time is valuable. Act like it. The other thing I liked about this article is the indirect reference to rapid prototyping. You know, just build stuff so you can get the experience of building something. It doesn't matter if it's bad, because you're going to figure things out along the way even if you make something you don't like. And what you figure out may be the answer to something important later on down the line. How does this relate to my game? First, the 80/20 rule. I know that for me, my passion project will be an experience that involves the writing and words of James Baldwin. Yeah, Underdog will be great, but there is so much bubbling beneath the surface of what's going on in America. I think the lessons of history will be vital in guiding us along paths that can teach us *why* we don't want to go down certain roads. I also think one of the driving forces for VR in education, initially, will be history-based teaching. Re-creating historical events and putting students inside them will be incredibly engaging learning material. It could effectively begin the revolution in teaching within the American system that we've been waiting for since the Industrial Revolution. Baldwin will be one of the first experiences built along these lines. I can't wait.
GitHub

https://www.youtube.com/watch?v=_ALeswWzpBo -- an AWESOME explanation of Git vs. GitHub (this is actually an important distinction). Also discusses repos, commits, branches, master branches, pulls, and pull requests. Done BY a noob FOR noobs in about 6 minutes. That's why Travis is my guy. Plus, he does a stop-motion visual explanation of what actually happens when you're using Git, which is really helpful for someone like me, who is less technical and more visual, to get the understanding. This video (and the series itself) was *extremely* helpful. Highly recommended.

@brntbeer -- GitHub training dude

Immersive Experience

http://www.social-marketing.com/immersive-engagement.html

Greater immersion leads to greater expectations; you can hear fakeness just as easily as you can see it. Story can influence and cause iterations to design, and design can influence and cause iterations to your story; keep that in mind as you're developing, so you're not tied to one linear flow and outcome of a story. User testing answers the question 'is this story design working or not'; anybody working in VR is gambling that immersion is going to be enough to sell an experience. The most helpful iterative process is usability: sitting and watching a player encounter a moment. At 26:00 he's in the middle of a discussion about how different platforms have different constraints, and he mentions that VR will be different from console gaming because you probably shouldn't sit with a VR headset on for 7 hours straight the way you would sit in front of a console game. This applies to my development in particular because one of the primary features of autism, the community which is the target audience for my game, is that people perseverate on specific things. Mainly, things that they enjoy doing and seeing.
So, if the game I'm developing becomes something they enjoy playing and participating in (and I hope it does), it will be important to build in a 'length of play' feature that causes the game to shut off after a certain amount of playing time. If allowed, I think it's possible that some people might immerse themselves in the VR world for hours upon hours. And especially with Gear VR platform deployment, having your phone screen at close range in front of your eyes for hours at a time can't be a good thing, especially for a target age range of 9-16 years old, whose eyes are still developing physically. But Rob Morgan does mention at this point how it may become an issue of legality to have gameplay cease after a certain amount of time.

Launchpad Dev Blog- 4th Entry- DevLog 757.004
Ok, so I did a little math and figured out that in order to submit to the Launchpad scholarship portal on Aug. 27, I need to do essentially 13 blog posts between now and that date. A little less than 1 every other day. To meet that demand with some expediency, I decided to break the remainder of my blog posts into the sections I felt were the most useful/practical and the most inspirational creativity-wise. I feel like a good balance of the two is important to keep the process interesting. With that said, my remaining blogs will be broken into 6 sections within each post: 1. Audio, 2. Scripting, 3. GitHub, 4. Made With Unity, 5. Storytelling/Narrative, 6. Immersive Experience. In addition to the blog, I've decided to start keeping a daily 'productivity journal' to help keep myself accountable for the stuff I need to be paying attention to, prioritizing, and *doing*. It's been a great process for the last 2 weeks or so. You learn a lot about yourself and your process when all you do is write down what you get done. And try not to freak out about what you don't. That being said, here's the first installment of the new format. I'll attend to the previous blog's James Baldwin subject matter in the next post.

Audio

Audio for VR talk, VRLA 8/6/16. Skullcandy Presents: Audio for VR: Robert Dalton (Dysonics), Oda Bjorling (MediaMonks), Sam Paschel (Skullcandy), & Martin Walsh (DTS, Inc.)
- Sound is the emotional guide of your game/experience.
- 3D audio was developed for situational awareness; cockpit and fighter pilot training was its first use.
- We listen to music because it presses the emotional buttons we want pressed.
- BACCH 3D Sound from Princeton; check it out: https://www.princeton.edu/3D3A/Projects.html
- Will mass adoption come from mobile deployment? Does it need to?
- Oculus added headphones to its shipment kits so that sound devs have a known target to mix for.
- Audio production is going through a paradigm shift (and Martin Walsh said he hates using that term); traditional flat viewing is 3rd person, but VR is 1st person.
- What's needed is to storyboard audio into production from the get-go; there isn't a single tool that lets you do that right now.
- Right now almost all sound production happens in post-production, which makes it an afterthought.
- When the visuals get shut off, you shift to audio to enhance your perception and experience.
- Pushing audio forward as a critical component of VR is how the field is going to move forward.
- We know what we don't know about the possibilities of 3D audio.
- The tools don't work together; even with the same formats, channel ordering differs among tools, so you have to hack together a bunch of tools to make a good 3D audio environment.
- Apple's iPhone 7 is rumored not to have a headphone jack, so what does that mean?
- What is HRTF? (I did some digging: head-related transfer functions.) And why is it necessary to know about these in sound design?

Amazon recently patented noise-canceling headphones that listen for the sound of your voice: http://bit.ly/2ag3pEg ; "Amazon has recently been awarded a patent for a new design of noise-canceling headphones that can actively listen for distinct sounds — like sirens or someone shouting your name — and stop the noise-canceling functions of the headphones, allowing you to hear the outside world."

https://www.princeton.edu/3D3A/Projects.html -- the projects are as important to know about as the lab itself.

Scripting

- JavaScript and UnityScript are the same thing.
- The name of a class has to match the name of its script file.
- Class names aren't allowed to have spaces or special characters; using them causes a compiler error.
- A 'comment' is text inside a script that isn't code; it's there so people viewing the code can read notes from the original coder about what's happening in the script, i.e.
"// Use this for initialization" on a MonoDevelop start page (usually indicated by lighter grey text).
- Coding in JavaScript and C# at the same time is a great way to learn both languages, and the distinctions in their functions.
- You can add scripts 2 different ways: in the Project window, just underneath the Project tab, hit the 'Create' button; OR hit 'Add Component' in the Inspector, after you have selected a Game Object in the Hierarchy.

Made w/ Unity

https://madewith.unity.com/ ; I like looking through the Made With Unity section for a couple of reasons. First, it keeps you current with the types of games and experiences being made right now. They have a 'Games We're Playing This Week' section on the page, so you know it's super current. Second, from a developer perspective, it gives you ideas of what's possible with all the new technology coming out, and how you could relate that tech to the experience you're building yourself. Maybe if I see something that looks like a solution to a problem I'm having, I could reach out to the developer directly and ask how they solved it. No guarantee they'll share their answer, but I hear the Unity community is pretty generous with knowledge. And it never hurts to ask.

https://madewith.unity.com/stories/attention-unity-devs-submissions-are-now-open -- Unite 16 is the big annual Unity dev conference. This year it's in LA, Nov. 1-3. The submission portal, which could potentially earn you 2 free passes ($950 value) if your submission is selected by the panel, closes Aug. 14, 2016. More info below, from the Unite 16 website.

What: Unite 16 is our annual developer conference and we're hosting a showcase gallery of up to 45 developers to display their awesome upcoming content. Submissions are open until August 14, 11:59pm PDT!
When: Setup is October 31 and the event runs November 1-3, at the Loews Hollywood Hotel in Los Angeles

Why: Show off what you're creating, be part of Unite, get awesome feedback and exposure to new audiences

The game I liked looking at most this week was 'Inks'; it's basically a pinball machine game that does some crazy artistic painting as the ball travels around the table. It's beautiful, uses lively colors, and is pretty captivating to watch. It's only for iOS, and I have an Android phone, so I haven't played it yet, but I'll be sure to check it out as soon as it's available for Android.

GitHub

Downloaded and installed GitHub. Created an account (public, which is free; figured I wouldn't need the personal account for $7/month just yet, until I really start building IP for my company). My username is dhenry21, so if anyone is messing around on GitHub and would like to connect, that's how you can find me.

Signed up for the 'GitHub for Everyone' webinar training, Aug. 16, 9am-12pm PST, as an intro primer to get a good foundation in GitHub. Here's where you can find more info on training within GitHub: https://services.github.com/

I'm a little wary of spending 3 hours on a webinar for tech I have no idea about, but Mike Geig (@mikegeig) says that anyone building stuff in Unity needs to know how to use GitHub, or some sort of 'source control', so you can keep track of all the code you're writing in an organized format and share it with others if you're working on a team. Mike is a great trainer, and I've learned a lot from his Unity tutorials, so if he says it's needed, I believe him.
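Since the ideas of repos, commits, and branches come up a lot in this section, here is a minimal terminal sketch of that workflow using Git alone (the folder, file, and branch names are made up for illustration, and the email is a placeholder; a pull request is just GitHub's way of proposing a merge like the one at the end):

```shell
# Start a brand-new local repository (this is Git; GitHub is the hosting service).
mkdir underdog-prototype && cd underdog-prototype
git init
git config user.name "dhenry21"                 # identity attached to commits
git config user.email "dhenry21@example.com"    # placeholder address

# First commit: snapshot a file into the repo's history.
echo "Underdog dev notes" > README.md
git add README.md                               # stage the change
git commit -m "First commit"                    # record it in history
git branch -M master                            # make sure the default branch is named 'master'

# Branch: try out ideas without touching master.
git checkout -b audio-experiments
echo "3D audio ideas" > audio.md
git add audio.md
git commit -m "Add audio notes"

# Merge the branch back into master (on GitHub you'd open a pull request instead).
git checkout master
git merge audio-experiments
git log --oneline                               # both commits now show in history
```

From here, `git push` would upload the repo to GitHub so teammates can pull it down and collaborate.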
Found the YouTube channel with the foundational lesson trainings for GitHub: https://www.youtube.com/watch?v=FyfwLX4HAxM&list=PLg7s6cbtAD15G8lNyoaYDuKZSKyJrgwB-

Went ahead and did everything on the checklist for the webinar training so I wouldn't have to worry about installing new tech and scrambling around to get it working the day of the webinar. The checklist can be found here: https://services.github.com/checklists/#everyone

Installed the text editor GitHub recommends, called 'Atom'. I haven't had a lot of time to get familiar with it yet, but it looks really cool and I'm excited to check it out. Here's the website: https://atom.io/

Storytelling/Narrative

http://voicesofvr.com/411-living-stories-what-vr-can-learn-from-immersive-theater/ -- Interview with Charlie Melcher, who founded The Future of Storytelling Summit to gather the most cutting-edge innovators in immersive and interactive storytelling. For the past four years, they've been featuring more and more virtual reality technologies at their yearly summit, which is happening again this year on October 5th and 6th.
- Using code as a canvas for storytelling, using interactive media.
- A book with JJ Abrams called 'S.': https://www.amazon.com/S-J-Abrams/dp/0316201642 ; a book that inspired some pretty neat storytelling techniques. I'm looking forward to reading this.
- High-level takeaways on the future of storytelling (Melcher): we haven't figured out what the natural language of VR storytelling is; in movies it's 'pans, cuts, or montage'. It's important to have agency in your virtual space. Most of the things he's seen so far in VR would be just as good, if not better, as regular film documentaries; too many filmmakers are picking up VR and using the same techniques and approaches they would bring to a film.
- How do you let people have control over the narrative that's happening? VR is ultimately supposed to be social, and we're only seeing the beginnings of social now.
- Alex McDowell's worldbuilding work is an important resource.
- Local vs. global agency, at 11:45 within the podcast. Very interesting distinctions.
- AI is the tech that's missing to allow us to have responsive narratives; perhaps it could read our moods or emotions and respond to us in a way that doesn't interrupt our experience or presence within the story.
- Living stories: personalized, responsive, immersive, multisensory; you're living the story. Powerful to the point of being transformative. Meant to meet the emotional complexity of being alive.
- He mentioned that The Void could play a big role in this.

Immersive Experience

https://www.youtube.com/watch?v=k12NZLh_Xvg -- Sleep No More (theatrical immersive experience), mentioned in Charlie Melcher's Voices of VR podcast interview as one of the few experiences allowing the viewer/participant to have agency in the creation and outcome of the story.

http://worldbuilding.usc.edu/ -- Established in September 2012 at the University of Southern California's famed School of Cinematic Arts, the World Building Media Lab has emphasized the power of using technology as a vehicle to enhance storytelling capabilities. With explorations into Virtual and Augmented Reality, the WbML has established itself as a leader at the forefront of technology-based entertainment.