Horizon CDT Research Highlights


What are the opportunities for stories told in mixed reality to utilise audience feedback to become more engaging?

  Joe Strickland (2017 cohort)

I will first investigate the current literature concerning which stories are being told, or are predicted to be told, in mixed reality media (MRM; a term covering both VR and AR), as well as which features of current and near-future technology allow stories to be told effectively. Meta-analyses of previous MRM, such as those by Shilkrot et al. (2014) and Zhou et al. (2008), will be very important at this stage.

This literature review will then inform the next step of my research: the creation of two mixed reality experiences. One will use 360-degree video to capture a live theatre performance from multiple angles, so that the audience can choose where to view the performance from outside the performance space. The other will be a computer-generated version of a live theatre performance, produced using either motion capture or free-viewpoint video (detailed in Collet et al., 2015), so that audiences can move through the performance space while it is happening and view it from any angle.

Once these experiences have been created, participants will be recruited to view them, with their behaviour during viewing recorded in a number of ways that current and near-future headsets support. Where their attention is placed in the experience, and for how long, will be determined using tracked head positioning (described in LaValle et al., 2014) and eye-tracking technology. Vocal reactions such as speech, laughter, or screams are also useful information for content creators, so on-board microphones will record audio. Surveys and interviews will also be used.
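As an illustration of how tracked head positioning could be turned into attention data, the sketch below accumulates dwell time per region of the performance space from sampled head yaw angles. The region boundaries, sample rate, and function names are illustrative assumptions, not details from the project.

```python
# Hypothetical sketch: estimating where attention dwells in a 360-degree
# experience from sampled head-orientation (yaw) data.
# Region boundaries and sample rate are illustrative assumptions.

SAMPLE_RATE_HZ = 60  # assumed headset tracking rate

# Illustrative regions of the performance space, as yaw ranges in degrees.
REGIONS = {
    "stage_left": (-90.0, -30.0),
    "stage_centre": (-30.0, 30.0),
    "stage_right": (30.0, 90.0),
}

def dwell_times(yaw_samples):
    """Return seconds of attention per region from a list of yaw angles."""
    totals = {name: 0.0 for name in REGIONS}
    for yaw in yaw_samples:
        for name, (lo, hi) in REGIONS.items():
            if lo <= yaw < hi:
                totals[name] += 1.0 / SAMPLE_RATE_HZ
                break
    return totals

# Example: 120 samples (two seconds) of looking near centre stage.
print(dwell_times([5.0] * 120))  # stage_centre accumulates ~2.0 seconds
```

In practice, eye-tracking data could replace or refine the yaw angle here, giving a finer-grained gaze direction within the headset's field of view.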

This audience feedback data will then be examined and used to inform a second round of content creation, in which the experiences are recreated with the feedback in mind to make them more engaging: better able to hold attention and to be preferred by audiences. These revised experiences will then be viewed by a mixture of new and returning participants, with viewing-behaviour data collected again, to test whether audience feedback can inform a more engaging content-creation process.
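One simple way the two rounds could be compared is sketched below: a per-participant engagement score (here, the fraction of viewing time spent attending to the performance area) averaged across each version. The metric and the figures are illustrative assumptions, not the project's defined measure or real data.

```python
# Hypothetical sketch: comparing a simple engagement metric between the
# original and revised versions of an experience. Metric and numbers are
# illustrative assumptions only.

from statistics import mean

def engagement_score(on_target_seconds, total_seconds):
    """Fraction of viewing time spent attending to the performance area."""
    return on_target_seconds / total_seconds

# Illustrative per-participant seconds on target, out of a 600-second piece.
version_1 = [engagement_score(s, 600) for s in [320, 280, 410, 350]]
version_2 = [engagement_score(s, 600) for s in [450, 390, 470, 430]]

print(f"v1 mean: {mean(version_1):.3f}, v2 mean: {mean(version_2):.3f}")
```

A real analysis would of course use an appropriate statistical test across a larger sample, alongside the survey and interview data, rather than a bare comparison of means.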

This PhD is important because MRM use brand-new consumer technology for storytelling, with features unavailable in traditional media technologies, such as the increased scope for creative interactivity (Nam, 2015). Although applications of this technology are being studied quite widely, the next step is to develop systems that collect data about audience behaviour and feed it back to content creators and broadcasters to inform future programming (for example, the BBC, my project partner, is keen to look at storytelling and the HCI of engagement in AR). Doing this places me at the forefront of investigating how this new technology can shape stories for future audiences.


  1. Collet, A., Chuang, M., Sweeney, P., Gillett, D., Evseev, D., Calabrese, D., Hoppe, H., Kirk, A. & Sullivan, S. 2015. High-quality streamable free-viewpoint video. ACM Transactions on Graphics (SIGGRAPH 2015), 34(4).
  2. LaValle, S. M., Yershova, A., Katsev, M. & Antonov, M. 2014. Head tracking for the Oculus Rift. IEEE International Conference on Robotics and Automation.
  3. Nam, Y. 2015. Designing interactive narratives for mobile augmented reality. Cluster Computing, 18, 309-320.
  4. Shilkrot, R., Montfort, N. & Maes, P. 2014. nARratives of Augmented Worlds. Mixed and Augmented Reality-Media, Art, Social Science, Humanities and Design (ISMAR-MASH’D), 35-42.
  5. Zhou, F., Duh, H. B-L. & Billinghurst, M. 2008. Trends in Augmented Reality Tracking, Interaction and Display: A Review of Ten Years of ISMAR. IEEE International Symposium on Mixed and Augmented Reality, 193-202.

This author is supported by the Horizon Centre for Doctoral Training at the University of Nottingham (RCUK Grant No. EP/L015463/1) and the BBC.