The Online Safety Act 2023 is now in force in the United Kingdom. Focusing on the online ‘world’, the legislation obliges social media platforms to consider users’ safety, and it has been promoted as making the UK the safest place to be online (Department for Science, Innovation & Technology et al., 2023).
These provisions have been introduced in response to the urgent need for platform regulation (Woods & Perrin, 2019), stemming from increasing reports of young people coming to harm through the user-generated content that pervades and circulates on online platforms (Gordon, 2021; Jacob et al., 2017; Livingstone, 2014; Staksrud et al., 2013) and from revelations about the role platforms play in this sequence of events (Paul, 2022; Draft Online Safety Bill - Corrected Oral Evidence: Consideration of Government’s Draft Online Safety Bill, 2021; Seetharaman, 2021).
This project assesses the 2023 Act to understand how, where, and whether its provisions aimed at reducing harmful experiences for young people will deliver on the promises made. Whilst there has been extensive work, both in academia and across NGOs, charities and other organisations, on the harms young people experience from problematic user-generated content online (Revealing Reality, 2024; Ofcom, 2023), little work explores these harms without preconceived ideas of what is harmful to this user group, or investigates the everyday, mundane interactions young people have with platforms that lead to harm.
By focusing on the interactions young people have with these services and platforms in their everyday lives, an assessment of the Act’s likely success can be made. Concentrating on those aged 12–20, this project places users’ voices at the centre of the discussion, both to establish what young people themselves find harmful online and to test the following overall project hypothesis: when harm is understood from both a policy and a user perspective, there is a mismatch that has the potential to leave user-generated content unregulated and young people exposed to harm online, despite the promises that regulation will reduce the likelihood of such harm.
This project will use traditional legal analysis (Hutchinson, 2015; Hutchinson & Duncan, 2012) to examine the legislation, and design techniques (Azmi & Raza, 2015; Wyeth & Diercke, 2006; Gaver et al., 1999) to engage with users. It aims to contribute to academic and policy discussions about the implementation and potential success of the Online Safety Act. Additionally, it will provide insight into how, where, and why young people encounter harmful content online in their daily lives when using their devices.
References:
Azmi NH and Raza FHA, ‘Doing Probes Diary with Children in Reporting Social Networking Behaviour’ (2015)
Department for Science, Innovation & Technology and others, ‘Press Release - UK Children and Adults to Be Safer Online as World-Leading Bill Becomes Law’ (26 October 2023) <https://www.gov.uk/government/news/uk-children-and-adults-to-be-safer-online-as-world-leading-bill-becomes-law> accessed 26 October 2023
‘Draft Online Safety Bill - Corrected Oral Evidence: Consideration of Government’s Draft Online Safety Bill’ <https://committees.parliament.uk/oralevidence/2884/pdf/>
Gaver B, Dunne T and Pacenti E, ‘Design: Cultural Probes’ (1999) 6 Interactions 21
Gordon F, ‘Online Harms Experienced by Children and Young People: “Acceptable Use” and Regulation. Full Report’ (Catch22 2021)
Hutchinson T, ‘The Doctrinal Method: Incorporating Interdisciplinary Method in Reforming the Law’ (2015) 8 Erasmus Law Review 130
Hutchinson T and Duncan N, ‘Defining and Describing What We Do: Doctrinal Legal Research’ (2012) 17 Deakin Law Review 83
Jacob N, Evans R and Scourfield J, ‘The Influence of Online Images on Self-Harm: A Qualitative Study of Young People Aged 16–24’ (2017) 60 Journal of Adolescence 140
Livingstone S, ‘Risk and Harm on the Internet’ in Amy B Jordan (ed), Media and the well-being of children and adolescents (Oxford University Press 2014)
Ofcom, ‘Online Nation Report 2023’ (Ofcom 2023) <https://www.ofcom.org.uk/__data/assets/pdf_file/0029/272288/online-nation-2023-report.pdf>
Paul K, ‘Whistleblower Frances Haugen on the Alliance to Hold Social Media Accountable: “We Need to Act Now”’ The Guardian (12 October 2022) <https://www.theguardian.com/technology/2022/oct/12/frances-haugen-council-for-responsible-media> accessed 13 October 2022
Revealing Reality, ‘Children’s Media Lives - Ten Years of Longitudinal Research - A Report for Ofcom’ (Revealing Reality 2024) <https://www.ofcom.org.uk/siteassets/resources/documents/research-and-data/media-literacy-research/children/children-media-use-and-attitudes-2024/childrens-media-lives-2024-summary-report.pdf?v=367549>
Seetharaman D, Horwitz J and Wells G, ‘Facebook Knows Instagram Is Toxic for Teen Girls, Company Documents Show’ Wall Street Journal (14 September 2021) <https://www.wsj.com/articles/facebook-knows-instagram-is-toxic-for-teen-girls-company-documents-show-11631620739> accessed 16 June 2022
Staksrud E, Ólafsson K and Livingstone S, ‘Does the Use of Social Networking Sites Increase Children’s Risk of Harm?’ (2013) 29 Computers in Human Behavior 40
Woods L and Perrin W, ‘Online Harm Reduction – a Statutory Duty of Care and Regulator’ (Carnegie UK Trust 2019) <https://d1ssu070pg2v9i.cloudfront.net/pex/carnegie_uk_trust/2019/04/08091652/Online-harm-reduction-a-statutory-duty-of-care-and-regulator.pdf>
Wyeth P and Diercke C, ‘Designing Cultural Probes for Children’, Proceedings of the 20th conference of the computer-human interaction special interest group (CHISIG) of Australia on Computer-human interaction: design: activities, artefacts and environments - OZCHI ’06 (ACM Press 2006) <http://portal.acm.org/citation.cfm?doid=1228175.1228252>
This author is supported by the Horizon Centre for Doctoral Training at the University of Nottingham (UKRI Grant No. EP/S023305/1).