EARN IT ACT: HR6544/S3538

Problem Statement and Summary  

 

Scope of Problem

  • The circulation of child sexual abuse material (CSAM) online is astronomical. According to a months-long New York Times investigation:

    • In 2008, over 600,000 images and videos of CSAM were reported to the National Center for Missing & Exploited Children (NCMEC), a volume already described at the time as an epidemic.

    • In 2019, 70 million CSAM images were reported. The NYT called it an “almost unfathomable” increase in criminal behavior.

  • CSAM has so overwhelmed law enforcement (LE) that the FBI and LAPD, for example, prioritize material depicting infants and toddlers over material depicting older children.

    • LE does not even have enough resources to target perpetrators in REAL TIME.

  • A 2018 NCMEC/Thorn study found that CSAM is getting worse: “The most notable historical finding was a trend toward more egregious sexual content.”

 

Harm to Victims

  • CSAM is documentation of a crime. Victims suffer severe physical and psychological harm at the hands of offenders, and they suffer again as that harm is endlessly reenacted through the material’s continued circulation.

 

Tech Turns a Blind Eye: Protected by CDA 230

  • CSAM is identified through digital “fingerprints” (hashes) unique to each image. NCMEC maintains a database of the hashes of known CSAM; LE and companies can hash suspect images and check them against that database to determine whether the material has already been identified (see the sketch after this list).

  • The platforms on which CSAM circulates are not required by law to proactively search for it

    • In 2018, of the 18.4 million reports of CSAM to NCMEC, 17 million came from Facebook because it screened all images on its platforms.

    • Other companies (Amazon, Dropbox, Snapchat, etc.) simply don’t screen for it.

  • Interactive computer services (ICSs) have NO INCENTIVE to report it.

    • As interpreted by the courts, Section 230 of the Communications Decency Act (CDA 230) gives ICSs near-blanket immunity: tech companies cannot be held liable for CSAM because they are not “publishers” under the law, so ICSs ignore it.

      • CDA 230, passed in 1996, was written to encourage the growth of the Internet in its infancy; that stage is long over.

      • ICSs have been given many chances to police CSAM and they refuse.
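
To make the hash-matching idea above concrete, here is a minimal Python sketch. It is illustrative only: deployed screening systems such as Microsoft’s PhotoDNA use perceptual hashes that tolerate resizing and re-encoding, while this sketch uses an exact cryptographic hash (SHA-256). The KNOWN_HASHES set and the file name are hypothetical placeholders; NCMEC’s actual hash database is not public.

```python
import hashlib
from pathlib import Path

# Hypothetical placeholder set; the real NCMEC database is not public,
# and production systems match perceptual hashes (e.g., PhotoDNA)
# rather than exact SHA-256 digests.
KNOWN_HASHES = {
    "0000000000000000000000000000000000000000000000000000000000000000",
}


def file_sha256(path: Path) -> str:
    """Compute a file's SHA-256 "fingerprint", reading in chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 16), b""):
            digest.update(chunk)
    return digest.hexdigest()


def is_known(path: Path) -> bool:
    """Check whether a file's hash matches a previously identified image."""
    return file_sha256(path) in KNOWN_HASHES


if __name__ == "__main__":
    print(is_known(Path("example.jpg")))  # hypothetical file name
```

Note that changing a single pixel changes an exact hash completely, which is why deployed systems rely on perceptual hashing; the set-membership check at the core of screening, however, works the same way.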

     

The EARN IT Act Brings Accountability to Bear on Criminality

  • It revokes the immunity from liability for CSAM that ICSs have under CDA 230

    • Survivors and state attorneys general will be able to sue tech companies for facilitating CSAM, using federal civil law and state civil and criminal law.

    • It entails a precise, surgical, socially responsible change to CDA 230.

  • It creates a new National Commission on Online Child Sexual Exploitation Prevention

    • Technology changes so rapidly that it is hard for decision-makers to keep up and keep citizens safe. The commission will establish best business practices and make recommendations to inform policymakers, the judiciary, and the LE community.

    • The commission’s tasks include preventing sex trafficking, grooming, and predatory behavior, as well as exploring age gating and more family-friendly filtering systems.

  • It renames “child pornography” as “child sexual abuse material” in federal statute.

  • It upgrades some tools for the National Center for Missing & Exploited Children (NCMEC).

ACT NOW
Let your voice be heard!
Reach out to your local representative and make it known that the EARN IT Act must be passed.
To learn how, visit NCOSE's page here

KOSA: S3663 & COPPA: S1628

National Center on Sexual Exploitation Urges Passage of Kids Online Safety Act (KOSA) and Children and Teens’ Online Privacy Protection Act (COPPA)

WASHINGTON, DC (July 26, 2022) – The National Center on Sexual Exploitation (NCOSE) is calling on Congress to pass the Kids Online Safety Act (S 3663) and the Children and Teens’ Online Privacy Protection Act (S 1628), both designed to further protect children online and both scheduled for markup by the Senate Commerce Committee on July 27.
“The Kids Online Safety Act is an excellent bipartisan bill that creates a ‘default to safety’ standard for online platforms, one of the main safeguards we have been advocating for with tech platforms. Tech companies can and should do more to protect their youngest users given the potential for harm and exploitation on these platforms,” said Dawn Hawkins, CEO of the National Center on Sexual Exploitation.
The Kids Online Safety Act (KOSA) creates a duty of care: a covered platform must act in the best interests of minors who use it, including a duty to prevent and mitigate physical, emotional, developmental, and material harms. KOSA provides families with the tools, safeguards, and transparency they need to protect against threats to children’s health and well-being online. The bill also ensures that parents and policymakers can assess whether social media platforms are taking meaningful steps to address risks to kids.
The Children and Teens’ Online Privacy Protection Act (COPPA) will strengthen protections for children against the collection, use, and disclosure of personal information and extend those protections to minors up to age 16.
“Both of these bills have the potential to significantly improve young people’s wellbeing by transforming the digital environment for children and teens. Congress must solidify its commitment to the youngest generations by instituting these protections,” Hawkins concluded.

ACT NOW
Let your voice be heard!
Reach out to your local representative and make it known that KOSA and COPPA must be passed.
To learn how, visit NCOSE's page here

SOCIAL MEDIA CHILD PROTECTION ACT: HR821 

WASHINGTON (February 2, 2023) - Today, Rep. Chris Stewart (R-UT) introduced the Social Media Child Protection Act, which would make it unlawful for social media platforms to provide access to children under the age of 16. Rates of teen and adolescent depression, anxiety, and suicide have risen to unprecedented levels since the emergence of social media.

“Our nation’s young people are facing an unprecedented mental health crisis,” said Rep. Stewart. “More than 40 percent of teenagers say that they struggle with feelings of sadness or hopelessness, and more than half of parents express concern over their children’s mental well-being. There has never been a generation this depressed, anxious, and suicidal – it’s our responsibility to protect them from the root cause: social media.

“To all those who say this would be an overstep by our government, I understand your concern. And I share your ideological belief that more government usually makes life worse, not better. But we have countless protections for our children in the physical world – we require car seats and seat belts; we have fences around pools; we have a minimum drinking age of 21; and we have a minimum driving age of 16. The damage to Generation Z from social media is undeniable – so why are there no protections in the digital world? It’s well past time that we take bold, comprehensive action for the sake of our kids.

“President Biden recently wrote that ‘…young people are struggling with bullying, violence, trauma and mental health. We must hold social-media companies accountable for the experiment they are running on our children for profit.’ And the Surgeon General recently stated that adolescents shouldn’t be given access to social media until they’re at least 16 years old. This legislation is a real opportunity for bipartisanship in a divided Congress. Let’s get to work and give our nation’s young people the protections they so desperately need.”

The Social Media Child Protection Act makes it unlawful for social media platforms to provide access to children under the age of 16. It also…


1. Makes it the social media platform's responsibility to verify age (using methods like ID verification);
2. Requires social media platforms to establish and maintain reasonable procedures to protect the confidentiality, security, and integrity of personal information collected from users and prospective users;
3. Gives the authority to the State to bring a civil action on behalf of its residents;
4. Gives parents a private right of action on behalf of their children;
5. Directs the FTC to prevent any social media platform from violating these regulations, including by imposing fines for violations.

(Stewart, 2023)

CONTACT YOUR LOCAL ELECTED REPRESENTATIVES

AND ASK THEM TO SUPPORT THIS BILL TODAY!
