What a week it's been! I spent most of last week in the hospital, so this week I'll be making up for it. That said, I feel like we've made a huge amount of headway on my project despite my not yet being back to 100%. I've spent the last two weeks iterating and reiterating on the logic and functionality of my platform, and the further into development we've gotten, the more important it has become to simplify and refine the interactions we're building.
The iterations since day 1 have been as follows:
1. A tool to create a VR pitch deck.
2. A media player that allows users to annotate on top of content live, using voice and sketch tools.
3. A conversational annotation platform that not only allows users to annotate on top of experiences but also to leave messages for one another through those annotations.
The part taking up most of my time is examining the difference between capturing an annotation, re-living it, and responding to it. To capture an annotation, I've decided to do a 360 video capture of that specific moment, recording every annotation (audio and visual) and displaying it as an overlay on top of the content, like a trail of breadcrumbs. This seems to work really well for anyone leaving feedback on content; however, I need to explore further how to make it more conversational as an engagement tool.

My hope is that creators, brands, and owners will be able to leave breadcrumb annotations throughout an experience that serve as checkpoints or engagement points for users. Ideally, this would transform a traditional survey into an interactive, real-time one, but now I need to work out the technical logic of how a user would respond to those questions in a seamless way.
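To make the capture / re-live / respond distinction a little more concrete, here's a minimal sketch of how a breadcrumb annotation might be modeled, assuming a web-based TypeScript player. Every name in it (Annotation, Reply, dueAnnotations, and so on) is hypothetical, not part of our actual build:

```typescript
// Hypothetical data model for a breadcrumb annotation. A prompt turns a
// breadcrumb into a checkpoint/survey question; replies are what make it
// conversational rather than one-way feedback.

type Vec3 = { x: number; y: number; z: number };

interface SketchStroke {
  points: Vec3[];           // stroke path in 360-space coordinates
  color: string;
}

interface Reply {
  authorId: string;
  createdAt: number;
  audioUrl?: string;        // a user could answer a checkpoint by voice...
  strokes?: SketchStroke[]; // ...or by sketching on the same capture
}

interface Annotation {
  id: string;
  contentTimeSec: number;   // the moment in the source content being annotated
  captureUrl: string;       // 360 video capture of that specific moment
  audioUrl?: string;        // voice note recorded with the annotation
  strokes: SketchStroke[];  // sketch overlay drawn by the author
  prompt?: string;          // optional question, making this an engagement point
  replies: Reply[];         // conversational thread attached to this breadcrumb
}

// During playback, surface any breadcrumb whose moment the viewer has reached,
// so it can be re-lived as an overlay on top of the content.
function dueAnnotations(annotations: Annotation[], playheadSec: number): Annotation[] {
  return annotations.filter(a => Math.abs(a.contentTimeSec - playheadSec) < 0.5);
}
```

Under this sketch, "responding seamlessly" reduces to appending a Reply (voice or sketch) to the breadcrumb the user is currently re-living, without ever leaving the experience.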
I'm looking forward to testing our next build, which should be ready by Friday. Once we can test in vivo, we should be able to work through a lot of these unknowns.