In addition to identifying some of the characteristics and experiential qualities of audio augmented objects, this practice-based research project has begun to identify the potential roles that audio augmented objects can play within audio augmented reality experiences [2,12,14]. The research has also identified ways in which collections of audio augmented objects can be used to realise virtual, though physically explorable, soundscapes that promote interaction and engagement with museums, galleries and heritage sites, along with their existing collections, spaces and the objects and artefacts within them. Furthermore, audio augmented objects have demonstrated potential as physical interfaces through which digital audio archival content can be accessed and engaged with.
Recent advances in mobile augmented reality, perhaps most importantly Simultaneous Localisation and Mapping (SLAM) technology [6,7], have realised precise indoor positioning systems that enable a move from the delivery of static binaural audio content to the delivery of dynamic binaural audio content. By tracking a listener’s real-time position and orientation within three-dimensional space, it is possible to create a binaural listening experience with a greater sense of reality, in which audio content delivered through headphones is perceived to exist at precise locations within the listener’s physical environment. By positioning this virtual audio content at the same location as a specific physical object, it can be perceived as emanating from the physical object itself, at times with a fidelity that participants have found hard to distinguish from reality.
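The core geometric step behind this dynamic rendering can be illustrated with a short sketch. This is not the project’s implementation: the function name and the two-dimensional simplification are illustrative assumptions. Given a listener pose reported by a tracking system and a fixed object position, it computes the azimuth and distance of the virtual source in the listener’s head-centred frame — the parameters a binaural renderer needs to keep the sound anchored to the object as the listener moves.

```python
import math

def azimuth_distance(listener_pos, listener_yaw_deg, object_pos):
    """Express a fixed object's position relative to the listener's head.

    listener_pos, object_pos: (x, y) coordinates in metres, in the room's
    tracking frame (a 2D simplification; real systems also track height).
    listener_yaw_deg: listener heading in degrees, measured counter-clockwise
    from the +x axis.

    Returns (azimuth_deg, distance_m), where azimuth is 0 straight ahead,
    positive to the listener's left, in the range [-180, 180).
    """
    dx = object_pos[0] - listener_pos[0]
    dy = object_pos[1] - listener_pos[1]
    distance = math.hypot(dx, dy)
    # Bearing of the object in the room frame, then re-expressed
    # relative to the direction the listener is facing.
    bearing = math.degrees(math.atan2(dy, dx))
    azimuth = (bearing - listener_yaw_deg + 180.0) % 360.0 - 180.0
    return azimuth, distance

# A listener facing along +y, one metre to the side of an exhibit:
# re-running this every tracking update, and feeding the result to a
# binaural renderer, is what makes the source appear world-anchored.
az, dist = azimuth_distance((0.0, 0.0), 90.0, (0.0, 2.0))
```

Because the azimuth is recomputed from fresh pose data on every update, turning the head sweeps the source through the binaural field while its perceived position in the room stays fixed at the object.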
As with visually orientated augmented reality experiences, where computer graphics are superimposed over the camera’s image of the physical environment, the presence of the physical object within an audio augmented reality experience can be observed to create a sense of environmental authenticity which, in turn, adds to the suspension of disbelief, propelling the virtual audio content into reality within the mind of the listener. As such, interesting associations can be made between audio and object; these associations appear to be quite fluid and malleable, and have the potential to be curatorially and creatively exploited.
It also appears that the combination of precise indoor positioning and freely explorable audio-based content helps to realise a naturalistic, intuitive and transparent interface [5,9,16], unobstructed by the screen-based interfaces of visually orientated augmented reality experiences. Additionally, within the context of an exhibition environment, this screenless experience presents the opportunity to augment without visual distraction or interference in what are, for the most part, visually orientated real-world environments [11,15].
So far, this practice-based, research-through-design approach [3,10] has realised two public-facing interactive sound installations [6,7] and a publicly available audio augmented reality mobile application. The development, deployment and ethnographic study of these projects, along with the subsequent ethnomethodological and qualitative analysis of participants’ interactions and experiences [4,8], has provided insights into, and uncovered best practices for, the application of such experiences, as well as producing supporting software.
1. Valentin Bauer, Anna Nagele, Chris Baume, Tim Cowlishaw, Henry Cooke, Chris Pike, and Patrick G. T. Healey. 2019. Designing an Interactive and Collaborative Experience in Audio Augmented Reality. In EuroVR: International Conference on Virtual Reality and Augmented Reality. https://doi.org/10.1007/978-3-030-31908-3_20
2. Benjamin B. Bederson. 1995. Audio Augmented Reality: A Prototype Automated Tour Guide. In CHI ’95: Conference on Human Factors in Computing Systems, 1–2.
3. Steve Benford, Matt Adams, Nick Tandavanitj, Ju Row Farr, Chris Greenhalgh, Andy Crabtree, Martin Flintham, Brendan Walker, Joe Marshall, Boriana Koleva, Stefan Rennick Egglestone, and Gabriella Giannachi. 2013. Performance-Led Research in the Wild. ACM Transactions on Computer-Human Interaction 20, 3: 1–22. https://doi.org/10.1145/2491500.2491502
4. Ann Blandford, Dominic Furniss, and Stephann Makri. 2016. Qualitative HCI Research. Synthesis Lectures on Human-Centered Informatics 9, 1: 1–115. https://doi.org/10.2200/s00706ed1v01y201602hci034
5. R. Casas, D. Cuartielles, A. Marco, H.J. Gracia, and J.L. Falco. 2007. Hidden Issues in Deploying an Indoor Location System. IEEE Pervasive Computing 6, 2: 62–69. https://doi.org/10.1109/mprv.2007.33
6. Laurence Cliffe, James Mansell, Joanne Cormac, Chris Greenhalgh, and Adrian Hazzard. 2019. The Audible Artefact: Promoting Cultural Exploration and Engagement with Audio Augmented Reality. In AM’19: Audio Mostly 2019, 1–7. https://doi.org/10.1145/3356590.3356617
7. Laurence Cliffe, James Mansell, Chris Greenhalgh, and Adrian Hazzard. 2020. Materialising contexts: virtual soundscapes for real-world exploration. Personal and Ubiquitous Computing 20, 3: 1–16. https://doi.org/10.1007/s00779-020-01405-3
8. Andrew Crabtree, Mark Rouncefield, and Peter Tolmie. 2012. Doing Design Ethnography. Human-Computer Interaction Series. https://doi.org/10.1007/978-1-4471-2726-0
9. David Cuartielles. 2012. Embodied Sound. In Proceedings of the Participatory Design Conference, 1–4.
10. Bill Gaver and John Bowers. 2012. Annotated Portfolios. Interactions: 1–10. Retrieved from https://dl.acm.org/doi/pdf/10.1145/2212877.2212889
11. Caleb Kelly. 2017. Gallery Sound. Bloomsbury, London. https://doi.org/10.5040/9781501305948
12. Francisco Kiss, Sven Mayer, and Valentin Schwind. 2020. Audio VR: Did Video Kill the Radio Star? Interactions: 1–6. Retrieved from https://dl.acm.org/doi/pdf/10.1145/3386385
13. Winifred Phillips. 2014. A Composer’s Guide to Game Music. MIT Press, Cambridge, Massachusetts.
14. Marjan Sikora, Mladen Russo, Jurica Derek, and Ante Jurčević. 2018. Soundscape of an Archaeological Site Recreated with Audio Augmented Reality. ACM Transactions on Multimedia Computing, Communications, and Applications 14, 3: 1–22. https://doi.org/10.1145/3230652
15. Yolanda Vazquez-Alvarez, Ian Oakley, and Stephen A Brewster. 2011. Auditory display design for exploration in mobile audio-augmented reality. Personal and Ubiquitous Computing 16, 8: 987–999. https://doi.org/10.1007/s00779-011-0459-0
16. Mark Weiser. 1999. The Computer for the 21st Century. Mobile Computing and Communications Review 3, 3. https://doi.org/10.1145/329124.329126
This author is supported by the Horizon Centre for Doctoral Training at the University of Nottingham (UKRI Grant No. EP/L015463/1).