For something we use all the time, the average person knows surprisingly little about the brain. That's a problem as we approach a bold future of genetic engineering and BMIs (brain-machine interfaces).
THE GOAL: Democratize neuroscience so the average Joe can learn what's going on inside that crazy noggin.
THE EXPERIENCE: A MASSIVE virtual tour taking you through the evolution of the human mind, from the brain stem through to the neocortex, guided by a top neuroscientist.
WEEK ONE
Idea Genesis
I've been interested in the intersection of neuroscience and VR since I delved into the virtual realm. With a vision to create an EEG-driven telepathic VR experience, I contacted MIT neuroscientist Nancy Kanwisher, excited to put my idea into action. Nancy had bad news for me.
"Using an EEG to determine what is going on in the brain is similar to figuring out what is happening in a football game by putting microphones outside the stadium. You'll know when there's big activity but can never be sure of what happened and where," she explained.
Despondent, I let that idea slip away (only to recently discover a company that is now incorporating EEGs into HMDs; read my article on that if you're curious).
A couple of months ago, I came across an article that brought the brain back to the forefront of my brain (yuck, too meta). It describes, with cartoons, Elon Musk's new quest to build a brain-machine interface that would let us think symbiotically with an AI, so that the AIs won't become our robot overlords.
It shook me up; for the first time I had a very different vision of the future. We could telepathically communicate. We could organically create virtual reality simulations by sharing our direct experiences with others. But if Elon Musk was going to build a wizard hat with read/write capacity on our neurons, we should probably brush up on the basics of neuroscience. Problem is, this shit is CRAZY COMPLEX! But... I knew just the way to make it accessible 🙂
Wanna Collab, Bro?
I've known Tiltbrush artist ArtistsareScientists for a while, and boy is he on my wavelength. Having worked at the bio lab Odin and helped with the Open Insulin Project, he is knowledgeable and curious about the inner workings of the human being. Not to mention, he just built an MR studio in his house.
I shared my vision with him, along with the strangely tongue-in-cheek Wait But Why article. I told him I needed a BIG, color-coded 3D model of the brain with no bare insides (we want to encourage people to put their head through the model). I brought up a nice model of the brain and he got to work.
Augmentation Features
To introduce the power of a future read/write BMI, I want to show how our neurons really determine everything we view as reality. I explained to Micah that I would like the viewer to have a 'neuron wand' which they could use to synthetically stimulate these virtual neurons. With this, your perception would change, warping your vision.
But WHAT VISION? To give the most literal experience, I wanted this view to be of the world outside the headset. We played around with using the safety mode to get an idea of what this could be like, and I really liked the result:
However, this feature is incredibly hardware-specific (with a Rift we would have to attach a camera to the front of the setup), so we will have to keep brainstorming different creative workarounds.
Here are some bad ideas we thought of:
"Memories" filmed in 360, overlaid and filtered
A livestream from outside the headset (still hardware-heavy)
A recorded "memory" of them entering the headset
All of these are tacky and awful in my opinion. For the sake of simplicity, we may just have the view of the brain itself change (I'm thinking of applying a ripple effect).
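To make the ripple idea a bit more concrete, here's a minimal sketch of the kind of distortion I'm picturing. It's plain Python with numpy rather than anything from our actual project, and the function name and parameters (amplitude, wavelength, center) are just illustrative: each sample point gets pushed radially by a sinusoid centered on wherever the neuron wand touches.

```python
import numpy as np

def ripple_uv(width, height, center, amplitude=0.01, wavelength=0.05, phase=0.0):
    """Return distorted (u, v) sampling coordinates in [0, 1] for a ripple
    radiating out from `center` (illustrative sketch, not engine code)."""
    # Regular grid of texture coordinates covering the view.
    u, v = np.meshgrid(np.linspace(0, 1, width), np.linspace(0, 1, height))
    du, dv = u - center[0], v - center[1]
    # Distance from the 'stimulated' point (small epsilon avoids divide-by-zero).
    dist = np.sqrt(du**2 + dv**2) + 1e-6
    # Sinusoidal radial offset: the wave pattern of the ripple.
    offset = amplitude * np.sin(2 * np.pi * dist / wavelength - phase)
    # Push each coordinate outward/inward along the radial direction.
    return u + offset * du / dist, v + offset * dv / dist

# Example: advance `phase` each frame so the ripple radiates outward
# from the point the wand touched.
u2, v2 = ripple_uv(512, 512, center=(0.5, 0.5), phase=1.0)
```

In the real thing this would presumably live in a screen-space shader, with the phase advancing every frame so the warp ripples away from the stimulation point.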
Creation
At the end of Week 1, we had made a beautiful cavernous brain that's pretty fun to look at inside and out. Here's a little video to enjoy. We're going to run it by some neuroscientists to be sure we are on point with our model.
Exploring the activity of our brains will be fascinating in VR! I'm excited about your project. It sounds amazing that you have already identified an artist to begin this 3D modeling aspect; that takes so much time. I read your article about the EEG connection to VR, and I find it so fascinating that there are possible ways to explore how our brain waves are sensed inside a headset. I love the neuroscientist's quote. What a challenge though!
At the NYU ITP showcase this spring I saw an exploration of brainwaves in VR, used for a meditative experience and trying to get deeper into our brainwaves. It was just a demo but so exciting to see this area being explored.
P.S. - I don't think your concluding ideas were bad. I actually think the livestream from outside the headset could be interesting. Just maybe not at this concept stage. But don't abandon those ideas forever. They may be worth returning to down the road.
Thanks Kiira! I think you're right about incorporating this later down the line. Pass-through vision will become ubiquitous in headsets as AR and VR merge. Thanks for your thoughtful comments, they really help in the process 🙂