Facebook and its social media peers require users to be at least 13 years old to use their sites.
That age traces back to a 1990s law that prohibits the tracking and data collection of children.
Experts called the age “arbitrary” and said even if it were raised, it wouldn’t ensure kids’ safety.
Once kids turn 13, they can use Facebook, Twitter, Instagram, Reddit, and virtually any other massively popular social app. Why? Because an internet law written in 1998 told them they could.

The Children's Online Privacy Protection Act (COPPA) was intended to prevent online platforms from collecting the personal data of kids under the age of 13 for ad targeting and tracking, but it's since become a stale framework for Big Tech — ever seeking to reach the lucrative under-16 market — to use as its minimum age limit.

Experts say the industry should support updates to COPPA that raise the age limit to keep up with modern times, even if the measure wouldn't be a catchall for keeping kids safe online.
"We're dealing with such a substantially different internet experience now, compared to in the 1990s when we had very primitive types of advertising," Jennifer King, a privacy and data fellow at Stanford's Institute for Human-Centered Artificial Intelligence, told Insider. She said the age limit of 13 is both "problematic" and "arbitrary."

Kids will be exposed to the internet, regardless of rules
Facebook and most other platforms ask users to confirm that they're at least 13 before they can use their apps. But that's all it is: a request, one that kids can skirt by simply lying, which absolves the companies of any legal liability, since they don't know for sure if users are telling the truth.

That means kids across the world can pose as users above the age of 13, and their data and scrolling habits are free for the ad beast's taking.

"If we increase the age, we're banning behavioral ads," Irene Ly, the policy counsel at Common Sense Media, told Insider. "It's still going to really help parents who are trying their best to keep an eye on what their kids are seeing."
The platforms have incentives to minimize barriers between themselves and kids. That's especially important as they struggle to attract younger users, most of whom are flocking to TikTok, the de facto Gen Z hot spot. "Facebook, in particular, is an existential issue because they realized they're not the platform for young people," King said.

Democratic Sen. Ed Markey of Massachusetts, one of the law's original authors, initially wanted COPPA's age limit to be 16. He told The Wall Street Journal in 2019 that he knew 13 was too young at the time, but "it was the best I could do."

Markey, along with Republican Sen. Bill Cassidy of Louisiana, wants to update the act to prohibit companies from targeting kids with ads or collecting data from kids aged 13 to 15, among other provisions.
"Big Tech has a voracious appetite for kids' attention and data, and these companies have no problem prioritizing their own profits over children and teens' right to privacy," Markey said in a May press release.

But Chris Olson — CEO of The Media Trust, a digital security, trust, and safety platform — told Insider that even if COPPA is updated, the age limit hardly matters. What matters is profit-driven companies keeping the consumer's best interests in mind.

"Deciding what age it should be or who it applies to is dramatically less important than people understanding when they visit a platform what the platform is doing to them," Olson said.

Still, the onus will continue to fall squarely on parents to monitor their children's digital safety.
“There’s a lot of pressure today to do better, and I think COPPA is one sliver,” Olson said.