Online safety: what young people really think about social media, big tech regulation and adults ‘overreacting’
Emily Setty / The Conversation:
Young people are often reluctant to involve adults in their online lives. Many fear that parents and teachers will misunderstand or “overreact” in response to what they mostly regard as normal, unproblematic behaviour and experiences. Others say they are frustrated by adults who “trivialise” their experiences.
Over the past eight years, I have had extensive discussions with (mainly) teenagers from a diverse range of social and economic backgrounds, ethnicities, sexual orientations and genders about their experiences of social media and messaging apps. A lot of those I speak to initially try to downplay any issues. They make it clear they like being online and know how to handle any problems that may come up.
But when I ask them to tell me more about these problems – while remaining neutral and interested rather than appearing judgmental – it’s almost like the floodgates open. They want to talk about the things they don’t like and struggle with; they just worry that they’ll get into trouble if they are too honest.
Some describe a relentless stream of abuse and hate that can “ruin” the experience of being online. One 14-year-old girl says there is “so much sexism, racism, homophobia” which she thinks is wrong, but at the same time just an inevitable part of being online. A 14-year-old boy discloses: “Sometimes they’ve been racist to me … Racist comments [in] messages from other people.”
Some LGBTQ+ girls tell me about the extent of hate they experience online:
[There’s] a lot of bullying … it’s coming from both adults and other children, [even] in safe spaces. There’s group chats online where people are added and it’s purposely [so people can] hate them.
But they also point out that “in the real world”, people don’t accept their sexuality and gender identity either. Most still want to stay online despite the risks because at least there is the chance of connecting with like-minded others. Yet they often seem quite despondent about how to support each other online and challenge bad behaviour, knowing it’s risky to do so.
Similarly, girls, and boys too, seem almost obliged to accept unwanted and unsolicited sexual content as a condition of being online. “I think you just sort of keep quiet about it,” one 12-year-old girl tells me, suggesting that calling out such behaviour could have awful consequences if the sender then tells their friends.
What’s clear from all my discussions is that most young people regard reining in the big social media platforms as only part of the solution. They see the issues as social in nature – going beyond just being an online problem but as part-and-parcel of their wider lives. As one 14-year-old girl puts it: “It’s not social media which is the issue … it’s society and how we are taught.”
Legal but harmful
The content of the UK government’s new Online Safety Bill is both complex and controversial. The reported removal of a section dealing with “legal but harmful” content published by the largest and “highest-risk” social media platforms has attracted widespread criticism, but also strong support from those who regard the bill – which must be finalised by the summer of 2023 – as a threat to free speech.
In theory this measure relates to adults, as children are already protected from viewing harmful material by “under 18” gateways. However, many of those criticising the removal of this section are still deeply concerned about children’s ability to view legal but harmful content.
The coroner’s report into the death of 14-year-old Molly Russell in November 2017 concluded that her viewing of social media content had contributed in “more than a minimal way” to her death. The senior coroner, Andrew Walker, said the material Russell had viewed “shouldn’t have been available for a child to see”. In response, her father Ian Russell suggested that social media firms should “think long and hard about whether their platforms are suitable for young people at all”.
There have been numerous other examples of young people coming to harm as a result of online experiences, including after being bullied or having sexual images shared online. According to the UK’s media regulator Ofcom, more than a third of children aged eight to 17 have seen something “worrying or nasty” online in the last 12 months, while one in three lie about their age to access adult-rated content on social media. Consuming content on video-sharing platforms such as YouTube and TikTok is the most popular online activity for children, with 31% having posted content they’ve created, most commonly on TikTok.
Social media platforms have age restrictions but most lack robust mechanisms to enforce them – the user just has to enter a date of birth, which they can make up. The Children’s Commissioner’s 2020 survey found that over half of children aged 11 to 13, and over a third of those aged eight to ten, reported using platforms despite not being old enough.
Of course, children and young people (my term for 13 to 17-year-olds) vary a lot in how they talk about the issues and in their levels of critical awareness and digital literacy. But in this “post-digital era” – where the use of social media is taken for granted by children as young as 12, and in some cases ten or younger – hearing their perspectives is a critical part of understanding how best to monitor and regulate the online landscape. In this article, children and young people talk candidly about what they think the most pressing issues are – and how they want to be supported as they navigate the risks that can arise.
The idea that online spaces can be fun, informative and uplifting but also fractious and divided has emerged time and again in my discussions with young people. Some girls have been quite animated telling me about the fun they have with each other sharing dances and lip syncs on TikTok “around the world”, with the aim of “trending” and reaching as many viewers as possible.
Many are quite dismissive of any negative impacts, and breezily say that it’s just about enjoying themselves. They maintain that online spaces can include more diverse representations and “body positive” content, while pushing back against fears over narrow concepts of beauty and overly curated lifestyles.
But they also describe seeing streams of abuse and “shaming” as they scroll through posts and comments – some of which is directed toward them personally. Girls tell me about being “hated on” including about their bodies and appearance:
The way I’m dressed: people will just tell me to go kill myself and slit my wrists but it’s just something you can’t escape. If someone dresses in a smaller dress or with cleavage showing, they are called a slut and told they’re asking for it. However you look, you’ll be made fun of for it. (15-year-old girl)
Even those who think such comments are funny or insignificant at the time can be worried about digital footprints and so-called “cancel culture”:
If you said something maybe a couple of years ago … people will bring that up now and then, like, cancel you for it. They will constantly hate on you [even if] your opinion on it has changed … I know now from when I was young my opinions on many things have definitely changed. (15-year-old girl)