Aleem Hossain - Launchpad Weekly Updates - "I Never Told You"

aleemhossain
Expert Protege

WEEK 1

I'm a filmmaker who just finished my first feature film as a writer/director. It's a sci-fi drama and I'm just starting to send it out to festivals. For the past year or so, as I was completing my film, I've been getting more and more excited about VR.

My current goal is to create my first immersive experience (a fictional live-action one) by early fall.


What have I been thinking about creatively?

As a filmmaker, one of my big goals is to approach VR with an open mind. I don't want to bring a lot of filmmaking assumptions to the process by default. And I've been searching for project ideas that feel truly/essentially VR… and not just a film idea shoved into the 360 format for no real reason.

I'm nowhere near the first person to say this but I think the key is to figure out what's truly essential about VR, what's unique to it.

In cinema, I think editing and framing are such central tools b/c film is an amazing way to manipulate what the audience sees and doesn't see, and how fast or slow information reaches them, and what order they see or feel things in, and what information/emotions are combined or isolated. Many (though of course not all) of the best films really take advantage of these strengths or explicitly subvert them.

But what about VR? It's early days in this art form but for sure people have zeroed in on the idea of “presence” as being one of the essential parts of the VR experience. And of course there's the inescapable fact that the “audience” in VR is much more conscious of themselves in the art form… they are almost always made to feel like they are inside the art, a participant.

So I decided I wanted to try and do something that evokes presence and explicitly makes the viewer conscious of themselves in the experience.

I know, I know… that's pretty vague. I felt that way too… until I thought about Sam.

Do you remember Sam?


Quantum Leap blew my mind as a kid. I just loved the idea, the characters, everything. If you've never seen the show, it's about a guy named Sam Beckett who is traveling through time in a very particular way: His consciousness is constantly being dropped into the body of someone in the past. He's a white guy in his 30s in the present day but he finds himself in the body of a mafia hitman in the 1960s, an older black man in the segregated South of the 1950s, a single mother, etc.

The show is a 1980s network drama, so it's not particularly subtle. But it's pretty profound if you ask me. We see Sam get sexually harassed as a woman. We see him racially attacked as a black man. We all know the old adage about walking a mile in another person's shoes… Sam gets to literally do this.

And with VR, we all can do this.

Which brings me to the specifics of my project: I'm going to develop an immersive VR experience about being looked at.

One of the most personal and unique experiences we have is the way people look at us.

Until now, it's been very hard to truly let someone besides yourself viscerally experience this.

I don't want to be too simplistic but just to make my idea clear, I'll start with some pretty clunky examples.

  • The moment when you arrive at a party and everyone looks over at you… I think it's really different for women vs men (or even just walking down a city street).
  • Getting pulled over by the cops and the first moment the officer makes eye contact with you… I think it's really different depending on your race.
  • Being in a room full of strangers vs your closest family – you get looked at differently.

I'd like to film some scenes where I direct the actors to treat the camera as a very specific person (in terms of gender, race, age, relationship to them, etc)… and cycle the user through these scenarios. Some may feel very familiar and others will hopefully feel very strange. You will hopefully get to feel in a truly visceral way what it is like for someone other than yourself to be looked at.

Basically, I'm going to do a meditation on being looked at. I'm imagining this is more of an experimental/emotional piece than an actual narrative one… but there will be an emotional journey for the user. A gut-level version of walking a mile in other people's shoes, pared down to its purest form.

Those are my creative thoughts at the moment.


What have I been thinking about technically?

VR live-action camera technology is driving me crazy. It's so obvious to me that everything being used to shoot stuff right now will seem totally outdated in a year… and probably laughable in a few years. I'm coming from the film/video side of things, where, now that we're past the crazy early years of digital, the camera technology is no longer a significant obstacle in the process and can pretty much do whatever we want it to.

But it's also pretty exciting to be at the beginning of an art form, not just artistically but also technically. I'm going to choose to focus on this. I was never going to be there with Edison and the Lumiere brothers figuring out 35mm film cameras… but the VR versions of those folks are working on the tech right now.

I've got a Ricoh Theta which I'm using to do tests. I want to shoot my actual project on a better rig but I love the Ricoh for testing/brainstorming. (Thank you, Marcy, for encouraging me to play around with it.) When I'm making films, my iPhone (and before that my miniDV Handycam) is an essential tool. I'm constantly testing shots and edits. For me, it's way more productive than storyboarding. The Ricoh feels like it will play a similar role in my VR experiments.

Eye contact will be extremely important for my project. I want the user to feel like they are being looked in the eye – I want them to get that buzzy feeling I get when I've got the headset on and a character (live-action or CG) really looks at me.

So I've started doing tests for live-action eye contact.

h8wghmjh34fu.jpg

At the moment, I'm experimenting with distance to camera (in terms of how readable a face is). I shot a series of tests standing at progressively farther distances from the camera. With the Ricoh at least, the falloff was pretty quick and pretty sharp. Once I was 4-5 feet away from the camera, I felt like my gaze into the lens was already less powerful when I looked at it in the headset.

But there's more than one variable at work. Partially, it's a problem of resolution – both the camera and playback in the headset. It's also a problem of optics – how good the lenses are and how they represent depth, space, etc. I'm going to do some side by side tests with my DSLR. The flatness/fuzziness of even multi-camera GoPro based rigs is a challenge. These cameras were just not created to capture the subtlety of human emotions flickering momentarily on a face… which is not to say they can't be used to do this, but I'm using my DSLR as a comparison point b/c I know it can capture clear and powerful human emotion directly into the lens at a pretty great distance from the subject.
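
Just to put some rough numbers on the resolution side of this, here's a quick back-of-the-envelope sketch in Python. It assumes roughly full-HD equirectangular output (about 1920 pixels spread across the full 360 degrees) and an average face width of around 16 cm; both are ballpark assumptions, not specs I've verified. It estimates how many pixels a face covers at different distances, compared to a conventional camera with a roughly 40 degree horizontal field of view recording at the same resolution.

```python
import math

def face_pixels_equirect(distance_m, face_width_m=0.16, horiz_pixels=1920):
    """Rough pixel width of a face in an equirectangular 360 frame.

    Assumes the camera spreads horiz_pixels across a full 360 degrees.
    """
    angle_deg = math.degrees(2 * math.atan(face_width_m / (2 * distance_m)))
    return angle_deg * (horiz_pixels / 360.0)

def face_pixels_flat(distance_m, face_width_m=0.16, horiz_pixels=1920, hfov_deg=40.0):
    """Same estimate for a conventional camera with a ~40 degree horizontal FOV
    (ballpark for a 'normal' lens), recording at the same horizontal resolution."""
    angle_deg = math.degrees(2 * math.atan(face_width_m / (2 * distance_m)))
    return angle_deg * (horiz_pixels / hfov_deg)

for feet in (2, 4, 6, 10):
    d = feet * 0.3048  # feet to meters
    print(f"{feet:>2} ft: 360 cam ~{face_pixels_equirect(d):4.0f} px across the face, "
          f"flat cam ~{face_pixels_flat(d):4.0f} px")
```

Even with those generous assumptions, the 360 camera is putting roughly an order of magnitude fewer pixels on the face at 4-5 feet than a normal lens would, before you even factor in lens quality or headset playback resolution. That lines up with how fast the gaze seems to lose its punch.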

A second issue I'm pondering is what an actor looks like when they're staring directly at camera but the user has not fully turned into this direct gaze… is it weird to see this? Does it undermine the feeling of being looked at once you turn fully into the gaze, b/c you just previously saw that the gaze wasn't directed at you?

Another issue I foresee is whether an actor can maintain eye contact with the user while moving. On a cheap camera like the Ricoh, this is pretty easy… an actor can traverse 180 degrees of a fixed 360 frame staring at the same lens. But on a multi-camera rig… what's the answer? This is something I won't be able to test until I get access to a more complex rig.

Those are my technical-ish thoughts for the week.

Thanks for reading! 


39 REPLIES

aleemhossain
Expert Protege

WEEK 11


I shot my proof-of-concept footage for “I Never Told You” this past weekend. It went great.




I'm stitching the footage right now and getting ready to start editing it… but here are a few things I've already learned from the first shoot. I was really blown away by how much my volunteers embraced the idea. They shared some really personal stuff.

1. Yes – people will share super personal stuff even if they know others will see it.

2. Earlier, I mused about whether the project would be highly edited or would let individual speakers talk at some length… My guess was the former but I now think it's the latter. Many of the volunteers delivered such great messages – messages that really hold up on their own, with a beginning, middle, and end.

3. The Samsung 360 was a great choice for this proof-of-concept because of its ease of use. I was juggling so many new things that it was nice to just hit record and know the camera would get usable footage. Once I scale up the project I do want to upgrade to a more professional camera system, but for now it worked great (during the shoot, at least… I'll post more once I've reviewed the footage).

I'm diving into post-production… 11 days to get this proof of concept ready!

It's been awesome to see all of the projects coming together. Good luck everybody!




therealvr
Protege
Awesome work here. Love reading about your experiments, learnings and growth.

How did you like working with the Samsung 360 camera?

aleemhossain
Expert Protege
@therealvr I like the Samsung 360 - I think it's the best consumer camera - but I can't wait for a better camera, even at the consumer level. You have one, too, right? Did you have it at the Launchpad bootcamp?

My mini-review is that its stitching is not great but I love the 4K... The Ricoh Theta S does a much better job stitching; sometimes it even does a good job when something crosses the stitch line (slowly) in the mid-range. The Samsung sometimes has bad stitching with even just static backgrounds. 

But if you're doing something where you can strategize around the stitching problem, the image quality is way better than the Ricoh. 

It's funny how much I can complain about these cameras though... a few years ago they were entirely theoretical. They really are a minor tech miracle but we so quickly take that all for granted and start griping. 😉


therealvr
Protege
Yah I had it when we met

I really love the Samsung 360. It particularly saved me on a recent VR shoot. We were primarily using Kodaks, back to back, for a mounted-bike shot. The Kodaks failed in every which way to match up on stabilization and stitching. Complete waste of a day.

On the reshoot we shot with the Gear 360. The default stitching software was not bad but we ultimately used Autopano for the control. 

Glowing success for both the reshoot and stitch. 🙂 

aleemhossain
Expert Protege
@therealvr I've yet to try and stitch it with Autopano... glad to hear you got improved results b/c the stitching is really my only major complaint with the camera.

aleemhossain
Expert Protege

WEEK 12


I submitted my application!

This week was a flurry of post-production.

A big creative question has been on my mind since I zeroed in on the “I Never Told You” concept:


Will the final project be a highly edited (and more experimental) experience where I quickly cycle the user through snippets of the various messages my volunteers recorded? Or will the individual messages be powerful enough (and interesting and coherent enough) to stand pretty much on their own, unfolding one at a time in order?


I guessed the former.


The answer was actually the latter.


I was really blown away by how much the volunteers were willing to share – it was some really personal stuff. But I was further blown away by how compelling it was to watch each of them deliver their “I Never Told You” message from beginning to end. They hold up on their own… and that's underselling it… really, they are WAY more powerful when I just let them play.

I did do some editing within some of the messages, removing a few digressions and just shortening here and there, but the finished proof-of-concept is basically 4 of the best messages back to back.


I wasn't sure if the editing would break the spell of the experience, but my test users barely noticed the jumpcuts in the footage (I very intentionally locked off the camera so that only the person speaking changes in a jumpcut; the rest of the background seems to stay the same… a little nudging here and there in post made this totally seamless).


The finished piece is 25 minutes long… which is a lot of time… but I had three people who had literally never put on a VR headset try it out, and they all watched from beginning to end and said their interest never lagged and they didn't experience any eye/brain fatigue. Going forward, I think I'd probably aim to keep each “episode” or “segment” (whatever I end up calling them) shorter than that, but the range of emotions and content that the 25 minutes covers is exciting. There are literally tears and laughter, anger and joy.


Here's a little graphic I put together using some images from the proof-of-concept footage.







I'm really happy with the proof-of-concept and I can't wait to expand the project!



So what's next?


I am happy with the visual side of things but I want to work on the sound. The quality of the production sound is great. It's now time for me to dive in and really learn how to spatialize it. In some ways, my concept is so simple that it might not seem like spatialized sound matters. But I really really want it to feel like you are literally sitting in that room with the person talking – and so I think the subtle impact of their voice really sounding like it's coming from 3 feet in front of you (and responding as such if you look around the room, etc.) will add a lot to the realism.


I've never really done much post-sound myself. But as a filmmaker I've worked with post-sound people a lot on my films.

Today I'm diving into learning more about it.


The pathway I'm taking a crack at is:

Adobe Premiere (where I edited the project) > Vordio > Reaper plus FB360 Spatial Workstation


So I'm exporting an XML from Premiere and using Vordio to turn it into a Reaper project. And then hoping to use the FB360 tools to spatialize the audio.


I'm also starting to make plans to do a brand new installment of I Never Told You based around a specific theme.

I'll report back on audio and future installment plans next week.

aleemhossain
Expert Protege

WEEK 13


My audio experiments continue now that the rush to submit is over and my application is in.

I'm having really basic struggles with the FB360 Spatial Workstation audio tools... like I can't get my on-the-fly spatialization movements to stick. I've watched tutorials and seen someone do it at a seminar, but for some reason I can't do the thing where you play the sequence, watch an object moving through space in the video, and simultaneously move the "puck" in the plugin interface to try and match the spatial location of the moving object. I can move the puck; my choices just won't record.

I've been able to spatialize fixed objects and confirm that headtracking is working. I'm a little underwhelmed by the final result b/c most of the existing output and playback formats are not really taking full advantage of the sound. I'm glad 360 sound with head tracking is even available at all on YouTube, etc., but it's all kinda the least sophisticated version possible... I look forward to better audio! I think it is SO important for creating realism and immersion.
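
The head tracking piece is conceptually simple once the audio is in an ambisonic format: when you turn your head, the renderer rotates the sound field the opposite way before converting it to binaural, so the voice stays planted in the room. Here's a small sketch of a pure yaw rotation on first-order AmbiX channels (W, Y, Z, X); the real renderers obviously do more (full 3-axis rotation, higher orders, HRTFs), so treat this as the core idea only.

```python
import numpy as np

def rotate_first_order_yaw(bformat, yaw_deg):
    """Rotate a first-order AmbiX field (channels W, Y, Z, X) about the vertical axis.

    A listener turning their head by +yaw is equivalent to rotating the
    sound field by -yaw, so a source at azimuth a ends up at (a - yaw).
    """
    yaw = np.radians(yaw_deg)
    w, y, z, x = bformat
    y_rot = y * np.cos(yaw) - x * np.sin(yaw)
    x_rot = x * np.cos(yaw) + y * np.sin(yaw)
    return np.stack([w, y_rot, z, x_rot])

# Toy check: a source encoded 20 degrees to the left should sit dead ahead
# after the listener turns 20 degrees to the left.
az = np.radians(20.0)
src = np.stack([np.ones(4), np.full(4, np.sin(az)), np.zeros(4), np.full(4, np.cos(az))])
rotated = rotate_first_order_yaw(src, yaw_deg=20.0)
print(np.round(rotated[:, 0], 3))  # [1. 0. 0. 1.] -> Y ~ 0, X ~ 1: straight ahead
```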


aleemhossain
Expert Protege
Hi everyone,

I've been moving forward with the sound work on my proof of concept, which I think has really come together as a full-fledged standalone experience rather than just a thing I can show, with a lot of caveats, in the context of pitching the project.

I submitted a plan for an I Never Told You installation to Sundance New Frontier. Good luck to all of us who applied!

I've also started working on my first narrative live action idea. I'm just in the brainstorming phase but I'm looking forward to making a fictional work while continuing my work on I Never Told You.

3DNegro
Protege
Good luck with doing your first narrative live action project.

aleemhossain
Expert Protege
Hi all, a quick update.

I've continued developing my first narrative project for VR... hoping to have a fully formed pitch and supporting docs in the next month and then I'll go look for some funding. 

And I'm planning the next steps for my Launchpad documentary project, as well. Definitely excited to shoot more episodes of that. I get a good response whenever I pitch the idea to people. 

I was sad to miss OC3. I would've enjoyed reconnecting in person. Hopefully next year!