Why Is There So Much Momentum for Protecting Kids Online?
Bills in states like Maryland, California, and Utah are paving the way for improving the digital landscape for kids, but will the federal government follow suit?
The first three months of 2023 have seen an explosion of activity aimed at protecting kids online. From powerful hearings in Congress, to new legislation in the states, to lawsuits filed by school districts, Big Tech is on everyone's mind because of how social media can undermine the well-being of young people. Meanwhile, Europe and the UK are taking strong actions of their own.
Common Sense has been working on this for a long time, and we welcome the momentum that is building now to hold social media and other tech giants accountable. Two big questions we're asking: What's driving this momentum, and how far will policymakers go to protect kids and teens?
Why kids' online safety now?
A few things have happened in the last two years that have contributed to the momentum. First, U.S. Surgeon General Vivek Murthy declared a youth mental health crisis. While a number of factors are driving the high rates of depression and anxiety in today's youth—especially among girls—social media is certainly one of them, given its connection to addictive behaviors, body image concerns, racism, hate speech, and more.
Second, we heard overwhelming evidence from tech whistleblowers, like Frances Haugen, that companies like Meta, Instagram's parent company, are aware of the negative impact their platforms can have on kids but choose to ignore it. Haugen's testimony was a turning point, spawning multiple congressional hearings and spotlighting the many ways social media can harm kids' lives.
President Biden even raised the issue during the State of the Union address, his most important speech of the year.
Two types of bills to tackle multiple harms
Congress is likely to once again take up bills to protect our data privacy and to force companies to design their platforms in safer and healthier ways for minors. But passage of any bill in Congress is never a guarantee.
Meanwhile, the states are busy. Bills that we're watching, like those in Congress, tackle many different established harms of social media, including:
- Privacy harms: By the time a child turns 13, adtech firms have gathered at least 72 million data points on them, data the firms can use to manipulate the child through targeted ads or sell to third parties.
- Amplification of content recommended by algorithms: Algorithms often take users down dark rabbit holes, recommending content that promotes eating disorders, suicidal ideation and self-harm, and dangerous physical challenges like the blackout challenge, all of which damage mental and physical health.
- Sexual exploitation: An alarming number of minors have reported sexual encounters with adults online, and many have been coerced into producing child sexual abuse material.
- Promotion of unhealthy habits: When kids and teens are targeted with online ads for unhealthy food and drinks, they consume more of these products. Advertising also exposes them to smoking, vaping, drinking, and gambling.
- Compulsive usage and addiction: Companies use manipulative design features like endless scroll, autoplay, and push notifications to glue young users to the screen.
We are particularly focused on a few bills this year that would address these harms and make a big difference for kids and families. The bills we are working on in states across the country fall into two categories: platform liability bills and platform design bills.
Liability bills would hold social media companies legally liable when they design their platforms in ways that they know, or should know, contribute to addictive behaviors and habits. In California, we are the lead supporter of one such liability bill, SB 287, by State Senator Nancy Skinner (D-Berkeley). The bill would also hold platforms accountable for content that facilitates kids' purchases of fentanyl or ghost guns online. In Utah, Governor Spencer Cox (R) just signed HB 311 into law, a bill very similar to California's. It holds companies liable for designing their platforms in ways that are addictive to kids, and it allows parents or guardians to sue companies that don't fix the problem. New Jersey lawmakers are considering similar legislation. Across the country and in both political parties, policymakers are willing to take strong action to protect kids and teens online.
Meanwhile, Maryland and several other states are working to pass platform design bills, modeled after bills we helped get signed into law last year in California. These "age-appropriate design code" bills essentially require companies to design their platforms in ways that are safer for kids and better for their well-being, rather than only prioritizing company profits. Design code bills would require platforms to determine how their data processing activities and design features could be harming kids—how they're using kids' data for targeted advertising, for example—and then determine how they can eliminate or reduce that harm.
Will Congress finally act?
While states are moving forward, we're also working closely with allies in Congress on a number of bipartisan bills from last year that put kids' online health and safety first. The House may reintroduce the American Data Privacy and Protection Act, a comprehensive privacy bill with strong protections for kids and teens. The Senate will likely see renewed attention to the Children and Teens' Online Privacy Protection Act, often referred to as COPPA 2.0, and to the Kids Online Safety Act (KOSA), which addresses platform design and safety.
We'll keep you updated on any progress around these bills and lawsuits. But remember, we really need your help too. Your voice makes a difference. Sign up to get updates on these bills, and learn where, when, and how you can show your support.