The Online Safety Act 2023 is now in force in the United Kingdom. Focusing on the online 'world', the legislation obliges social media platforms to consider users' safety and has been promoted as making the UK the safest place to be online (Department for Science, Innovation & Technology et al., 2023).
These provisions respond to the urgent need for platform regulation (Woods & Perrin, 2019), prompted by increasing reports of young people coming to harm through the user-generated content that pervades and circulates on online platforms (Gordon, 2021; Jacob et al., 2017; Livingstone, 2014; Staksrud et al., 2013) and by revelations about the role platforms play in this sequence of events (Paul, 2022; Draft Online Safety Bill - Corrected Oral Evidence: Consideration of Government's Draft Online Safety Bill, 2021; Seetharaman, 2021).
This project focuses on three main stakeholders regulated or affected by the introduction of the 2023 Act: policymakers and regulators, social media platforms, and young people as users. While there has been extensive work across academia, NGOs, charities, and other organisations on the harms young people experience from problematic user-generated content online (Revealing Reality, 2024; Ofcom, 2023), little work explores the issue without preconceived ideas of what is harmful to this user group, or investigates the everyday, mundane interactions young people have with platforms that lead to harm. This work examines these three groups of actors to highlight and evaluate the disconnects between their understandings of and responses to 'online harms' and the actual lived experiences and reports of young people themselves.
By focusing on the interactions young people have with these services and platforms in their everyday lives, an assessment can be made of the Act's likely success. Concentrating on those aged 12-18, this project places users' voices at the centre of discussion to establish what it is online that young people find harmful. It also probes whether existing platform-based interventions have been insufficient to protect them as users, and whether the provisions of the 2023 Act are unlikely to protect them fully, given the fundamental differences in, and lack of appreciation of, 'harm' as a complex phenomenon experienced individually.
This project will use traditional legal analysis (Hutchinson, 2015; Hutchinson & Duncan, 2012) to examine the legislation, and a co-production of knowledge and insight approach via focus groups (Azmi & Raza, 2015; Wyeth & Diercke, 2006; Gaver et al., 1999) to engage with young people directly about their everyday experiences with potentially harmful content online. It aims to contribute to academic and policy discussions about the implementation and potential success of the Online Safety Act. Additionally, it will provide insight into how, where, and why young people encounter harmful content online in their daily lives when using their devices.
References:
Azmi NH and Raza FHA, ‘Doing Probes Diary with Children in Reporting Social Networking Behaviour’ (2015)
Department for Science, Innovation & Technology and others, ‘Press Release - UK Children and Adults to Be Safer Online as World-Leading Bill Becomes Law’ (26 October 2023) <https://www.gov.uk/government/news/uk-children-and-adults-to-be-safer-online-as-world-leading-bill-becomes-law> accessed 26 October 2023
‘Draft Online Safety Bill - Corrected Oral Evidence: Consideration of Government’s Draft Online Safety Bill’ <https://committees.parliament.uk/oralevidence/2884/pdf/>
Gaver B, Dunne T and Pacenti E, ‘Design: Cultural Probes’ (1999) 6 Interactions 21
Gordon F, ‘Online Harms Experienced by Children and Young People: “Acceptable Use” and Regulation. Full Report’ (Catch22 2021)
Hutchinson T, ‘The Doctrinal Method: Incorporating Interdisciplinary Method in Reforming the Law’ (2015) 8 Erasmus Law Review 130
Hutchinson T and Duncan N, ‘Defining and Describing What We Do: Doctrinal Legal Research’ (2012) 17 Deakin Law Review 83
Jacob N, Evans R and Scourfield J, ‘The Influence of Online Images on Self-Harm: A Qualitative Study of Young People Aged 16–24’ (2017) 60 Journal of Adolescence 140
Livingstone S, ‘Risk and Harm on the Internet’ in Amy B Jordan (ed), Media and the well-being of children and adolescents (Oxford University Press 2014)
Ofcom, ‘Online Nation Report 2023’ (Ofcom 2023) <https://www.ofcom.org.uk/__data/assets/pdf_file/0029/272288/online-nation-2023-report.pdf>
Paul K, ‘Whistleblower Frances Haugen on the Alliance to Hold Social Media Accountable: “We Need to Act Now”’ The Guardian (12 October 2022) <https://www.theguardian.com/technology/2022/oct/12/frances-haugen-council-for-responsible-media> accessed 13 October 2022
Revealing Reality, ‘Children’s Media Lives - Ten Years of Longitudinal Research - A Report for Ofcom’ (Revealing Reality 2024) <https://www.ofcom.org.uk/siteassets/resources/documents/research-and-data/media-literacy-research/children/children-media-use-and-attitudes-2024/childrens-media-lives-2024-summary-report.pdf?v=367549>
Wells G, Horwitz J and Seetharaman D, ‘Facebook Knows Instagram Is Toxic for Teen Girls, Company Documents Show’ Wall Street Journal (14 September 2021) <https://www.wsj.com/articles/facebook-knows-instagram-is-toxic-for-teen-girls-company-documents-show-11631620739> accessed 16 June 2022
Staksrud E, Ólafsson K and Livingstone S, ‘Does the Use of Social Networking Sites Increase Children’s Risk of Harm?’ (2013) 29 Computers in Human Behavior 40
Woods L and Perrin W, ‘Online Harm Reduction – a Statutory Duty of Care and Regulator’ (Carnegie Trust UKE 2019) <https://d1ssu070pg2v9i.cloudfront.net/pex/carnegie_uk_trust/2019/04/08091652/Online-harm-reduction-a-statutory-duty-of-care-and-regulator.pdf>
Wyeth P and Diercke C, ‘Designing Cultural Probes for Children’, Proceedings of the 20th conference of the computer-human interaction special interest group (CHISIG) of Australia on Computer-human interaction: design: activities, artefacts and environments - OZCHI ’06 (ACM Press 2006) <http://portal.acm.org/citation.cfm?doid=1228175.1228252>
Colegate E, ‘Differentiating between Harm to Users and Third Parties in the UK’s Online Safety Regulations’ 16 European Journal of Law and Technology
Colegate E, ‘Safety versus Speech: The Importance of Rationales for Online Safety Interventions in the Context of Artificial Intelligence Content Moderation Systems’, Protections in International Law (2025)
Romero-Moreno F and others, ‘BILETA Response to Ofcom’s Draft Guidance: A Safer Life Online for Women and Girls’ <https://www.research.herts.ac.uk/admin/files/72087930/BILETA_Response_to_Ofcom_s_Consultation_on_Draft_Guidance_regarding_online_safey_for_women_and_girls.pdf>
Romero-Moreno F, Basu S and Colegate E, ‘BILETA Response to Ofcom’s Additional Safety Measures Consultation’ <https://researchprofiles.herts.ac.uk/en/publications/bileta-response-to-ofcoms-additional-safety-measures-consultation/>
This author is supported by the Horizon Centre for Doctoral Training at the University of Nottingham (UKRI Grant No. EP/S023305/1).