Kids’ online safety: A fragile balance

By Samuel Woodhams | Digital rights researcher and journalist

Last week, Joe Biden sounded the alarm over the lack of child protections online during his State of the Union address. He called for a ban on the collection of kids’ personal data and the prohibition of targeted advertising to children, saying: “We must finally hold social media companies accountable for the experiment they are running on our children for profit.”

The speech would have encouraged supporters of the Kids Online Safety Act (KOSA), one of several bills aimed at strengthening kids’ safety and privacy online in the United States. The legislation would require platforms to take a range of measures to safeguard children, including expanding parental control mechanisms and proactively removing content that could cause harm.

It’s clearly a well-intentioned proposal and there’s no doubt that the internet can be a dangerous and unpredictable place for kids – even platforms specifically made for children can enable financial exploitation and sexual abuse.

But KOSA, like many other policies being discussed around the world, ignores broader societal issues and disregards important privacy considerations. In doing so, it not only risks undermining its own effectiveness, it also threatens young people’s right to information on a huge scale.

A child looks at his mobile phone as people camp a day before the funeral of Britain’s Queen Elizabeth, following her death, in London, Britain September 18, 2022. REUTERS/Marko Djurica

Social media, ‘harmful’ content & mental health

Harmful content on social media has become a focal point for politicians around the world, with governments and child safety charities regularly highlighting the negative impact platforms have on young people’s mental health. But academic research on the topic is less clear, particularly when digital activity is analysed in isolation from other factors that could contribute to the rise in mental health problems among young people: think climate change, the pandemic, racism, rising financial pressures.

Even when the harm associated with social media content is clear, removing it is not easy. Like the Online Safety Bill in the UK, KOSA would seek to remove “unlawful, obscene, or harmful material to minors.” Removing unlawful content makes sense. But what (and, crucially, who) defines obscene and harmful material?

These grey areas have led some to warn that the legislation may actually cause harm rather than reduce it. In an open letter, a group of civil society organisations said KOSA would lead to the greater introduction of imprecise content filtering that often disproportionately affects LGBTQ+ communities.

“At a time when books with LGBTQ+ themes are being banned from school libraries and people providing healthcare to trans children are being falsely accused of “grooming,” KOSA would cut off another vital avenue of access to information for vulnerable youth.”

In the absence of federal legal protections, parents and lawyers are turning to the courts and filing cases against platforms including TikTok, Instagram and Snapchat over their negative impact on children’s mental health.

There’s no denying that platforms and governments need to intervene to stop the constant barrage of harmful material that many young people are being exposed to online. But without adequate safeguards and deliberative mechanisms built into the policies, top-down approaches like KOSA risk being used to further ostracise and harm already vulnerable communities.

A child using a laptop sits on a camp bed, provided by airport operator Fraport, at Frankfurt airport April 16, 2010. Due to a huge ash cloud from an Icelandic volcano, that caused air travel chaos across Europe, passengers in Frankfurt were left stranded and forced to stay overnight. REUTERS/Ralph Orlowski

Surveillance vs safety

Another central objective of KOSA is to give parents greater control over their child’s online activity. In the face of huge multinational corporations, giving power back to family members makes sense. However, it relies on the assumption that all parents have a healthy and supportive relationship with their children.

Provisions that would allow parents to control who their kids speak with online and determine the content they’re able to see, including for teenagers, mean the bill could increase harmful parental surveillance.

As the group of organisations wrote, “KOSA risks subjecting teens who are experiencing domestic violence and parental abuse to additional forms of digital surveillance and control that could prevent these vulnerable youth from reaching out for help or support.”

Finally, the bill would establish a task force to investigate the best methods for age verification. While slightly better than the UK’s approach of implementing first and asking questions later, the provision would not only harm children’s privacy by collecting even more data about them, but could fundamentally undermine online anonymity for all.

Children play a game on a mobile phone in a slum area in New Delhi, India July 4, 2017. REUTERS/Adnan Abidi

What’s the solution?

It’s clear that defending children’s safety online is going to come with some tradeoffs. But like many of the proposals being discussed around the world, KOSA’s potential impacts on those already at risk are too serious to be ignored.

Rather than disregarding the potential ramifications for those already marginalised in society, policymakers should place them front and centre in all discussions on child safety online. Doing so might just help pave the way for a better balance between safety, online privacy and the right to information.

Any views expressed in this newsletter are those of the author and not of Context or the Thomson Reuters Foundation.