EARN IT ACT: HR6544/S3538

Problem Statement and Summary  

 

Scope of Problem

  • The circulation of child sexual abuse material (CSAM) online is astronomical. According to a months-long New York Times investigation:

    • In 2008, over 600,000 images/videos of CSAM were reported to the National Center for Missing & Exploited Children (NCMEC), a volume already being described as an epidemic.

    • In 2019, 70 million CSAM images were reported. The NYT called the increase in criminal behavior “almost unfathomable.”

  • CSAM has so overwhelmed law enforcement (LE) that the FBI and LAPD, for example, prioritize material depicting infants and toddlers over material depicting older children.

    • LE does not even have enough resources to target perpetrators in REAL TIME.

  • A 2018 NCMEC/Thorn study found that CSAM is getting worse: “The most notable historical finding was a trend toward more egregious sexual content.”

 

Harm to Victims

  • CSAM is documentation of crime. Victims suffer gross physical & psychological harm at the hands of offenders. They suffer again each time the material circulates, endlessly reenacting that harm.

 

Tech Turns a Blind Eye: Protected by CDA 230

  • CSAM is identified through digital “fingerprints” (hashes) unique to each image. NCMEC maintains a database of known CSAM hashes; LE & companies can run a program that checks an image’s hash against the NCMEC database to verify whether it has already been identified.

  • The platforms on which CSAM circulates are not required by law to screen for it.

    • In 2018, of the 18.4 million reports of CSAM to NCMEC, 17 million came from Facebook because it screened all images on its platforms.

    • Other companies (Amazon, Dropbox, Snapchat, etc.) simply don’t screen for it.

  • Interactive computer services (ICSs) have NO INCENTIVE to report it.

    • As a result of court rulings, section 230 of the Communications Decency Act (CDA 230) gives ICSs near-blanket immunity: tech companies cannot be held liable for CSAM because they are not “publishers” under the law, so ICSs ignore CSAM.

      • CDA 230, passed in 1996, was written to encourage the growth of the Internet in its infancy; that stage is long over.

      • ICSs have been given many chances to police CSAM and they refuse.
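The hash-matching workflow described above (compute a fingerprint for an image, then look it up in NCMEC’s database of known hashes) can be sketched in a few lines. This is an illustrative assumption-laden sketch, not the real system: production tools use perceptual hashes such as Microsoft’s PhotoDNA, which tolerate re-encoding and resizing, whereas the SHA-256 exact hash and the `KNOWN_HASHES` set below are hypothetical stand-ins used purely to show the lookup pattern.

```python
import hashlib

# Hypothetical stand-in for the NCMEC hash database. Real systems use
# perceptual hashes (e.g., PhotoDNA), not exact cryptographic digests.
KNOWN_HASHES = {
    # SHA-256 of the bytes b"test", used only to demo the lookup below.
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def fingerprint(data: bytes) -> str:
    """Compute a hex digest serving as the file's 'fingerprint'."""
    return hashlib.sha256(data).hexdigest()

def is_known(data: bytes) -> bool:
    """Check a file's fingerprint against the set of known hashes."""
    return fingerprint(data) in KNOWN_HASHES

print(is_known(b"test"))        # True: its hash is in the set above
print(is_known(b"unrelated"))   # False: no match in the database
```

The design point the sketch captures is that matching is a cheap set-membership lookup: a platform never needs to store or view the underlying material, only compare fingerprints against the database.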

     

EARN IT Act Brings Accountability to Bear on Criminality

  • It revokes the immunity from liability for CSAM that ICSs have under CDA 230.

    • Survivors and state attorneys general will be able to sue tech companies for facilitating CSAM, using federal civil law and state civil and criminal law.

    • It entails a precise, surgical, socially responsible change to CDA 230.

  • It creates a new Online Child Exploitation Prevention Commission

    • Technology changes so rapidly that it is hard for decisionmakers to keep up and keep citizens safe. The commission will establish best business practices & make recommendations to inform policy, the judiciary, and the LE community.

    • The commission’s tasks include preventing sex trafficking, grooming, and predatory behavior, as well as exploring age gating and more family-friendly filter systems.

  • It renames “child pornography” as “child sexual abuse material” in federal statute.

  • It upgrades some of NCMEC’s tools.

ACT NOW
Let your voice be heard!
Reach out to your local representative and make it known that the EARN IT Act must be passed.
To learn how, visit NCOSE's page here

KOSA: S3663 & COPPA: S1628

National Center on Sexual Exploitation Urges Passage of Kids Online Safety Act (KOSA) and Children and Teens’ Online Privacy Protection Act (COPPA)

WASHINGTON, DC (July 26, 2022) – The National Center on Sexual Exploitation (NCOSE) is calling on Congress to pass the Kids Online Safety Act (S 3663) and Children and Teens’ Online Privacy Protection Act (S 1628), both designed to further protect children online, and which are being marked up by the Senate Commerce Committee on July 27.
“The Kids Online Safety Act is an excellent bipartisan bill that creates a ‘default to safety’ standard for online platforms, one of the main safeguards we have been advocating for with tech platforms. Tech companies can and should do more to protect its youngest users given the potential for harm and exploitation on these platforms,” said Dawn Hawkins, CEO of the National Center on Sexual Exploitation.
The Kids Online Safety Act (KOSA) creates a duty of care that a covered platform must act in the best interests of a minor who uses it, including a duty to prevent and mitigate physical, emotional, developmental, and material harms. KOSA provides families with the tools, safeguards, and transparency they need to protect against threats to children’s health and well-being online. The bill also ensures that parents and policymakers can assess whether social media platforms are taking meaningful steps to address risks to kids.
The Children and Teens’ Online Privacy Protection Act (COPPA) will strengthen protections for children and extend them to minors up to age 16, covering the collection, use, and disclosure of personal information.
“Both of these bills have the potential to significantly improve young people’s wellbeing by transforming the digital environment for children and teens. Congress must solidify its commitment to the youngest generations by instituting these protections,” Hawkins concluded.

ACT NOW
Let your voice be heard!
Reach out to your local representative and make it known that KOSA and COPPA must be passed.
To learn how, visit NCOSE's page here