How social media and AI regulations need to change to protect teens

August 2024 · 5-minute read

If you’re an adult who follows “only young gymnasts, cheerleaders and other teen and preteen influencers active on” Instagram, what other content is the Instagram Reels algorithm likely to recommend that you check out? The answer, according to a recent Wall Street Journal investigation, is “jarring doses of salacious content ... including risqué footage of children.”

To understand what’s going on here, let’s step out of the digital world and go “brick and mortar.” Let’s think about that friend who always encourages you to order just one more drink at the bar. This friend can be a lot of fun, in moderation and in adults-only settings. In large doses and in an all-ages setting, this friend can become a dangerous creep — and turn you into one, too.

Let’s call this friend Al. Al knows you, and he knows what you like. Al is out to show you a good time and keep the good times rolling. Al doesn’t know where the line is. Algorithms on social media platforms and search engines typically act like our friend Al.

As the U.S. Supreme Court explained in Twitter v. Taamneh, algorithmically generated recommendations mean that “a person who watches cooking shows on YouTube is more likely to see cooking-based videos and advertisements for cookbooks, whereas someone who likes to watch professorial lectures might see collegiate debates and advertisements for TED Talks.”

Let’s think about what happens when you and Al go watch football at the local high school. It’s fun to relive your glory days, until later when Al follows the students, and you follow Al ... to the girls’ locker room. There are the cheerleaders, just off the field, still in their sports bras and athletic shorts. You try to tell yourself there’s nothing wrong with seeing them like that — it’s more than they’d be wearing if you were all at the town pool. But part of you — the part that Al doesn’t have, because he’s Al — knows it’s different for you to see the cheerleaders in the locker room. That’s their private space, not a public space, and you’re not a teen girl.

But you stay, and when Al suggests that you follow some of the cheerleaders back home and look through their bedroom windows (They left the shades open! Their parents know the shades are open! They’re dressed, wearing cute clothes!), you go along.

If this scenario feels uncomfortable to you, if it feels creepy to you — that’s because it is. Super uncomfortable, super creepy. Take the technology out of it. Take the technical jargon, like “algorithm,” out of it. Adults should not be looking into young people's locker rooms, bedrooms or other private places (unless they are parents or other adults with legitimate, boundary-respecting reasons to be in those spaces).

This story about Al helps to explain how algorithms work. This hypothetical also captures a deeply disturbing reality: Many adults are peering via social media into actual locker rooms, bedrooms and other private youth places, looking at content featuring youth of all ages doing personal or intimate things. Even when this content doesn’t cross the line into the sexually exploitative or abusive, it is profoundly creepy to have adults’ eyes in these youth spaces. And when the content does cross that line, this adult behavior is criminal, dangerous and morally reprehensible.

We need to fundamentally change the game for youth digital privacy and safety, both online and off, so that youth rights are central to social media and other tech success, not stuck on the sidelines.

Here are a few significant, positive signs that the game is changing: Youth are speaking up about the harms they experience when their parents post videos of them online for money. This global, largely unregulated corner of the digital entertainment industry, built around so-called “kid influencers,” makes money for parents — but not always for the children. Earlier this year, thanks to youth advocates and their legislative allies, Illinois became the first state in the country to enact a law requiring parents to set aside a portion of the earnings from this influencer content for the kids who appear in it. (Hopefully, privacy protections will be added to future state kid-influencer laws.)

States also continue to consider, pass and defend new laws that protect youth digital privacy and online safety, with innovations such as “age-appropriate design,” which discourages kids from posting personal content and, when such content is posted, limits when and how that content may be shared with adults.

Government leaders at both the state and federal levels are urgently pursuing ways to protect youth digital privacy and safety from abuses of artificial intelligence. Such action can limit the growth of AI content that invades youth privacy, such as videos that graft real children’s faces onto AI-generated scenes. And some tech companies are adopting enhanced trust and safety measures to try to better protect youth privacy.

In the meantime, all of us parents and other trusted adults can make daily choices to protect the youth in our lives by limiting “sharenting” (putting our children’s private information online). Al may never cut us off at the bar, but we can help to cut him off ourselves by serving up less youth content online.
