Aleem Hossain - Launchpad Weekly Updates - "I Never Told You"

aleemhossain
Expert Protege

WEEK 1

I'm a filmmaker who just finished my first feature film as a writer/director. It's a sci-fi drama and I'm just starting to send it out to festivals. For the past year or so, as I was completing my film, I've been getting more and more excited about VR.

My current goal is to create my first immersive experience (a fictional live-action one) by early fall.


What have I been thinking about creatively?

As a filmmaker, one of my big goals is to approach VR with an open mind. I don't want to bring a lot of filmmaking assumptions to the process by default. And I've been searching for project ideas that feel truly/essentially VR… and not just a film idea shoved into the 360 format for no real reason.

I'm nowhere near the first person to say this but I think the key is to figure out what's truly essential about VR, what's unique to it.

In cinema, I think editing and framing are such central tools b/c film is an amazing way to manipulate what the audience sees and doesn't see, and how fast or slow information reaches them, and what order they see or feel things in, and what information/emotions are combined or isolated. Many (though of course not all) of the best films really take advantage of these strengths or explicitly subvert them.

But what about VR? It's early days in this art form but for sure people have zeroed in on the idea of “presence” as being one of the essential parts of the VR experience. And of course there's the inescapable fact that the “audience” in VR is much more conscious of themselves in the art form… they are almost always made to feel like they are inside the art, a participant.

So I decided I wanted to try and do something that evokes presence and explicitly makes the viewer be conscious of themselves in the experience.

I know, I know… that's pretty vague. I felt that way too… until I thought about Sam.

Do you remember Sam?

[Image: Sam Beckett from Quantum Leap]

Quantum Leap blew my mind as a kid. I just loved the idea, the characters, everything. If you've never seen the show, it's about a guy named Sam Beckett who is traveling through time in a very particular way: His consciousness is constantly being dropped into the body of someone in the past. He's a white guy in his 30's in the present day but he finds himself in the body of a mafia hitman in the 1960's, an older black man in the segregated south of the 1950's, a single mother, etc.

The show is a 1980's network drama, so it's not particularly subtle. But it's pretty profound if you ask me. We see Sam get sexually harassed as a woman. We see him racially attacked as a black man. We all know the old adage about walking a mile in another person's shoes… Sam gets to literally do this.

And with VR, we all can do this.

Which brings me to the specifics of my project: I'm going to develop an immersive VR experience about being looked at.

One of the most personal and unique experiences we have is the way people look at us.

Until now, it's been very hard to truly let someone besides yourself viscerally experience this.

I don't want to be too simplistic but just to make my idea clear, I'll start with some pretty clunky examples.

  • The moment when you arrive at a party and everyone looks over at you… I think it's really different for women vs men (Or even just walking down a city street).
  • Getting pulled over by the cops and the first moment the officer makes eye contact with you… I think it's really different depending on your race.
  • Being in a room full of strangers vs your closest family – you get looked at differently.

I'd like to film some scenes where I direct the actors to treat the camera as a very specific person (in terms of gender, race, age, relationship to them, etc)… and cycle the user through these scenarios. Some may feel very familiar and others will hopefully feel very strange. You will hopefully get to feel in a truly visceral way what it is like for someone other than yourself to be looked at.

Basically, I'm going to do a meditation on being looked at. I'm imagining this is more of an experimental/emotional piece than an actual narrative one… but there will be an emotional journey for the user. A gut-level, pared down to its purest form, version of walking a mile in other people's shoes.

Those are my creative thoughts at the moment.


What have I been thinking about technically?

VR live-action camera technology is driving me crazy. It's so obvious to me that everything being used to shoot stuff right now will seem totally outdated in a year… and probably laughable in a few years. I'm coming from the film/video side of things where, now that we're past the crazy early years of digital, camera technology is no longer a significant obstacle in the process and can pretty much do whatever we want it to.

But it's also pretty exciting to be at the beginning of an art form, not just artistically but technologically. I'm going to choose to focus on this. I was never going to be there with Edison and the Lumiere brothers figuring out 35mm film cameras… but the VR versions of those folks are working on the tech right now.

I've got a Ricoh Theta which I'm using to do tests. I want to shoot my actual project on a better rig but I love the Ricoh for testing/brainstorming. (Thank you, Marcy, for encouraging me to play around with it.) When I'm making films, my iPhone (and before that my miniDV handycam) is an essential tool. I'm constantly testing shots and edits. For me, it's way more productive than storyboarding. The Ricoh feels like it will play a similar role in my VR experiments.

Eye contact will be extremely important for my project. I want the user to feel like they are being looked in the eye – I want them to get that buzzy feeling I get when I've got the headset on and a character (live-action or CG) really looks at me.

So I've started doing tests for live-action eye contact.

[Image: live-action eye contact test]

At the moment, I'm experimenting with distance to camera (in terms of how readable a face is). I shot a series of tests standing at progressively farther distances from the camera. With the Ricoh at least, the fall-off was pretty quick and pretty sharp. Once I was 4-5 feet away from the camera, I felt like my gaze into the lens was already less powerful when I looked at it in the headset.

But there's more than one variable at work. Partially, it's a problem of resolution – both the camera and playback in the headset. It's also a problem of optics – how good the lenses are and how they represent depth, space, etc. I'm going to do some side-by-side tests with my DSLR. The flatness/fuzziness of even multi-camera GoPro based rigs is a challenge. These cameras were just not created to capture the subtlety of human emotions flickering momentarily on a face… which is not to say they can't be used to do this, but I'm using my DSLR as a comparison point b/c I know that it can capture clear and powerful human emotion directly into the lens at a pretty great distance from the subject.
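
As a side note (just an illustration, not part of my actual tests), here's a rough sketch of why the fall-off is so steep: an estimate of how many horizontal pixels a face occupies in an equirectangular 360 frame at different distances. The ~1920-pixel frame width and ~16 cm face width are assumptions I'm plugging in for a Ricoh-style camera, not measured values.

```python
import math

def pixels_across_face(distance_m, pano_width_px=1920, face_width_m=0.16):
    """Rough horizontal pixel count spanning a face in an equirectangular
    360 video. Pixels-per-degree is uniform along the equator, so near eye
    level this is a fair back-of-envelope estimate."""
    angular_width_deg = math.degrees(2 * math.atan(face_width_m / (2 * distance_m)))
    return pano_width_px * angular_width_deg / 360.0

# A few of the distances I've been standing at, converted from feet to metres
for feet in (3, 5, 10, 15):
    d = feet * 0.3048
    print(f"{feet:>2} ft: ~{pixels_across_face(d):.0f} px across the face")
```

Under those assumptions, a face goes from roughly 50 pixels wide at 3 feet to around 10 pixels at 15 feet, which lines up with how quickly the gaze stops reading in the headset.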

A second issue I'm pondering is what an actor looks like when they're staring directly at camera but the user has not fully turned into this direct gaze… is it weird to see this? Does it undermine the feeling of being looked at once you turn fully into the gaze, b/c you just previously saw that the gaze wasn't directed at you?

Another issue I foresee is the ability for an actor to maintain eye contact with the user while moving. On a cheap camera like the Ricoh, this is pretty easy… an actor can traverse 180 degrees of a fixed 360 frame staring at the same lens. But on a multi-camera rig… what's the answer? This is something I won't be able to test until I get access to a more complex rig.

Those are my technical-ish thoughts for the week.

Thanks for reading! 


39 REPLIES

silvrlake
Protege
Hi Aleem, there's been some amazing recent work done on microexpressions (e.g. https://www.youtube.com/watch?v=CWry8xRTwpo). Are you using actors to simulate these gazes? Or are you putting real situations together? I wonder if it would be possible to capture microexpressions?

aleemhossain
Expert Protege

silvrlake said:

Hi Aleem, there's been some amazing recent work done on microexpressions 


Thanks! Such an interesting topic. Hadn't seen this video before, and although I'd read about the subject, I'd never heard of the specific term "body language code."

I'd love to eventually find a way to record in a real environment with a "hidden vr camera" (whatever that would be) mounted on an actor to truly capture the real life reactions people have to gender, race, etc. For the moment I'm going to start with actors simulating different real life scenarios.

DrSzilak
Expert Protege
Do you know Marina Abramovic's "The Artist Is Present," which is all about looking and being looked at?
Also, would check out VR film Collisions with a single moment of eye contact with an aboriginal man who says "hello" in English. It is a beautiful moment. 
Lastly, go into Altspace VR for hiding/being seen in VR experience. Hope that's helpful. 

aleemhossain
Expert Protege
Thanks for the recs! I'll check out Collisions.

And yes, The Artist is Present is an amazing and inspiring work! I'd love to achieve something like that in VR.

agilibility
Protege
I worked with my multi-camera rig capturing reactions with mixed success. In some situations people are perfectly content to ignore the strange-looking tripod in their area. In other situations my shots have been completely ruined by people coming straight up to the rig and staring into it as it's recording. Maybe we need to do studies solely on people's reactions to the camera rig? haha

Lupac
Protege
There are some local camera houses – like Radiant Images – that might be willing to help out with the best available cameras, like Blackmagic or Sony, for final projects. I went to a good VR open house there and they seem open to collaborations... whatever that means. We can try!

aleemhossain
Expert Protege

WEEK 2

I made progress on two fronts this week:

1. I shot some camera tests

2. I started to get a more specific artistic vision of the experience I want to create

I'll do the quick overview first and then, if you really want to hear my nitty-gritty thoughts, you can read the rest of the post.






1. Camera Tests

I wanted to compare how readable emotions are on faces using the wide-angle low-quality optics of the Ricoh S vs the “normal length” high-quality optics of a 35mm lens on a DSLR (in video mode).

I did these tests at 3', 6', 10', and 15'. Here's 3' and 15' (I've removed the color to limit the number of variables between the images):

[Test images: Ricoh vs DSLR comparison at 3' and 15', in black and white]


To be clear, I know this is not a fair fight… a DSLR and a Ricoh are apples and oranges. It would be really hard to make a 360 VR video with a 35mm lens on my DSLR (though some folks have definitely done it with wider lenses on a DSLR). And the Ricoh was not made to take high quality video. But this “unfair fight” is the problem. “Presence” is what we are raving about in VR, right? Going forward, I think achieving amazing emotional experiences via presence will depend on capturing human emotion in faces (even at 15', which my DSLR can do).


When I looked at these tests in the headset, another thing occurred to me. When I'm standing 3 feet from myself in the Ricoh shot, I don't feel like I'm 3 feet from myself… I feel further away. I don't feel the intimacy I would feel if a real person was standing just 3 feet from me. And it makes me wonder if the initial wow factor of just any live-action human looking us in the eyes will wear off, and once we're over the novelty we won't really be that impacted by human looks until we can shoot with lenses that more truly represent space, bodies, etc. (Another side note: I don't think any of this applies to the CG world… when Henry looks me in the eyes I think that emotional power will last beyond the novelty phase, and I do feel that I am the actual distance from him that I am supposed to be in that world.)


All of this makes me lean toward shooting a VR project in quadrants with higher-end cameras and real lenses. See this case study: https://medium.com/@FutureLighthouse/ministry-vr-the-case-study-c84678765e10#.hr9r8rx6p and here's a pic of Future Lighthouse's two-camera rig (they shot in quadrants… sacrificing some ability to move actors around in order to gain much better stereoscopic representation of space/people).

[Image: Future Lighthouse's two-camera rig]



2. Creative Plan

I spent a lot of last week's post musing about the big picture ideas in my mind. But what am I actually going to make? What is the specific execution that will explore this idea of feeling what it's like for someone besides yourself to be looked at?


I thought at first that I might try and create truly literal scenarios:

1. What is it like to be the only woman at a high-powered business meeting?

2. What is it like to board a plane a few weeks after 9/11 when you are of Middle Eastern descent?

3. What is it like to be pulled over by the cops when you're a young black male?

But I worry that these scenarios are too simplistic… and that the user will be too quick to understand the scenario and therefore know who “they” are in the scenario and then too quickly be viewing it through a meta-lens rather than viscerally in the moment. I'm not saying this idea can't work – and maybe I'll even make something like that some day – but for now, it doesn't feel right to me.


Instead, I think I'm going to try and remove a lot of the context the user might focus on… and try to create some pure version of being looked at.


Using the idea that Brillhart and others have noted about jump cutting in live-action VR (that it's possible to jump cut if some frames of reference persist across the cut), I'm thinking about creating an arrangement of people looking at the user. And repeating this arrangement with different lookers in different locations… so that I can jump cut the user through a variety of spaces and lookers… and via editing create an experimental film experience that cycles through these spaces/people and creates an emotional journey through emotional looks.

Or to pare it down even more, it might be even more effective to pick a single almost context-less location and just jump cut or otherwise cycle through the lookers themselves.


I imagine this like a piece of music. Sometimes all the notes will be in harmony (all the people looking at you will be similar in their emotional tone). Other times there will be a clash (the people looking at you differ in their emotional intent). Sometimes the tones will be long and solitary, other times there will be a flurry of quick tones. I'll obviously have to experiment to figure out what is and isn't comfortable but I'm pretty certain the overall approach will work.


As a user, I think some of the looks will make a lot of sense… or will feel familiar. And some will be entirely foreign. And one of the coolest things, in my mind, is how much the user themselves will impact the piece. Whatever state of mind, background, or history they bring into the headset will have a huge impact on how they feel when looked at with love or rage or indifference by people who are the same age or not, the same race/ethnicity or not, the same gender or not, etc.


Okay – that's the overview… if you want to read my further nitty-gritty thinking this week, here we go:






More thoughts on optics and reading human faces in live-action footage:


If you read my previous post, you know that I want to create an experience based around the idea of being looked at. As I said before, I think how we are literally looked at in various situations is highly dependent on demographics, relationships, and circumstance… and it's one of the most difficult experiences to truly share with another person… but maybe VR can change that.


Capturing the emotional content (for lack of a better phrase) of a “look” will be very important in my project.


Currently, even many high-end live-action camera solutions are using GoPros… and GoPros are terrible for capturing subtle human emotion visually – especially when compounded by the distortion/flattening that occurs in most stitching scenarios. And this doesn't even bring us to the resolution/compression issues that come with playing on the VR headset.


Side note – those of you who are not filmmakers may not have spent much time pondering how the lens on a camera can have a dramatic impact on the representation of human emotion. For an example of the way lens focal length can change how something looks, check out this gif:

http://i.imgur.com/XBIOEvZ.gifv


So yeah... there are some pretty big technical challenges. And these are challenges that have a real impact on the art. I've seen very few live-action VR things that I really really love, mostly b/c of how poorly space/location in general, and human faces specifically, are captured by the current tech. Even powerful docs like Clouds Over Sidra are not immune to this problem. Try watching that film without any sound and ignoring the subtitles (yes, I did this). The images alone won't blow you away nearly as much. Of course, even traditional documentary films will not be as powerful without sound, but I can think of many documentaries I could watch without sound that still move me emotionally. (For more on this approach to analyzing the visual components of a film, check out Soderbergh's obsession with watching entire feature films without color or sound: http://www.openculture.com/2014/09/steven-soderbergh-creates-silent-black-white-recut-of-raiders-of-...)


Of course, VR isn't cinema… but I think that the reality and power of live-action VR is going to be limited until we can capture locations and people in focal lengths that more closely resemble our actual human eyes. A fish-eye lens on a GoPro is SO FAR from the way our eye sees. There's some debate, but most photographers and filmmakers will tell you that when it comes to 35mm film, a focal length somewhere between 35mm-55mm is the closest to our human field of view. I'm sure there are live-action VR folks who will say that this doesn't apply to VR… b/c our eyes themselves are present in this art form and will perceive whatever portion of the image in front of us in the headset is natural. But… that brings us back to the gif… in the headset our eyes will take in the field of view that is natural to each of us, but our eyes can't change the depth/focus/spatial distortion etc. built into the video image b/c of lens choice while shooting.
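
If you want to put rough numbers on that 35mm-55mm claim, the standard angle-of-view formula for a rectilinear lens is FOV = 2·atan(sensor width / (2·focal length)). Here's a quick sketch for a full-frame (36 mm wide) sensor – just an illustration, and note that a GoPro's fisheye projection doesn't follow this rectilinear model at all, which is part of the point.

```python
import math

def horizontal_fov_deg(focal_length_mm, sensor_width_mm=36.0):
    """Horizontal angle of view for a rectilinear lens on a full-frame
    (36 mm wide) sensor: FOV = 2 * atan(sensor_width / (2 * focal_length))."""
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_length_mm)))

# A few common focal lengths, including the 35mm-55mm "human-ish" range
for f in (24, 35, 50, 55, 85):
    print(f"{f:>3} mm lens -> ~{horizontal_fov_deg(f):.0f} deg horizontal field of view")
```

That 35mm-55mm range works out to roughly 54° down to 36° of horizontal view – far narrower, and far less distorted, than a fisheye.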

I'm excited to figure out how to overcome these challenges! More on that next week!

DrSzilak
Expert Protege
Fascinating video above. It shows the complexity of "reading" another person, something we absolutely take for granted much of the time. I look forward to developments on this. You are really diving deep into the problem.

aleemhossain
Expert Protege

DrSzilak said:

You are really diving deep into the problem. 


I need to make t-shirts with this as a slogan!