Thorn (@thorn)

Followers: 48K · Following: 4K · Media: 1K · Statuses: 9K

We transform the way children are protected from sexual abuse and exploitation in the digital age.

Joined January 2010

Thorn (@thorn) · 3 months
The Take It Down Act has officially been signed into law. The new legislation is important because it criminalizes the distribution of intimate images—whether real or AI-generated—shared without consent, and holds platforms accountable to act quickly.

Thorn (@thorn) · 3 months
During #NationalChildAbusePreventionMonth, @FBI launched Operation Restore Justice—a 5-day, nationwide effort that led to the arrest of 205 alleged child sex offenders and the rescue of 115 children. Hear directly from Darren Cox about the ongoing work to safeguard children.

Thorn (@thorn) · 3 months
It’s important to start conversations about digital safety now—before kids find themselves in risky situations. Our parent discussion guide, "Starting to Make 'Friends' Online," is designed to help.

Thorn (@thorn) · 3 months
In 2024, @AmazonPhotos used @GetSaferio to detect and report more than 30,000 images—up nearly 25% from the year before. That impact is only possible through deep investment, innovation, and collaboration. Read the full report.

Thorn (@thorn) · 3 months
How do we #protectkids in a world where danger is just a swipe away? @Bloomberg's latest piece, "How to Keep Your Kids Safe Online," asks a critical question: What can families do right now?
bloomberg.com
Experts from nonprofits say digital safety starts at home with an open dialogue that doesn’t shy away from the embarrassing — or the terrifying.

Thorn (@thorn) · 3 months
Protecting kids in a digital world means designing with safety in mind, and talking to kids early and often about the threats they may face online. #NCAPM25.

Thorn (@thorn) · 3 months
A10. Offenders exploit gaps in tech and are increasingly sophisticated and able to scale. To stay ahead, we need smarter tools, stronger standards, and real transparency from platforms. #NCAPM25.
Quoting National Center for Missing & Exploited Children (@NCMEC) · 3 months
Q10. We end each year with the same vital question: As offenders continue to exploit children online with little fear of being caught, how can we stay ahead and protect kids in an ever-evolving digital world? #NCAPM25.

Thorn (@thorn) · 3 months
The bill will also establish an important notice and removal process through which victims are able to go directly to platforms to get their non-consensual intimate visual depictions, including AI-generated deepfakes, taken down. #NCAPM25.

Thorn (@thorn) · 3 months
The Take It Down Act would criminalize the publication of intimate visual depictions of children, as well as criminalize threats from perpetrators to distribute these images for the purposes of extortion. These criminal penalties would provide important recourse for child victims.

Thorn (@thorn) · 3 months
A8. Right now, we are really hoping to see the Take It Down Act quickly signed into law. The House of Representatives passed it yesterday evening with significant support and we are hopeful that it makes its way to President Trump’s desk soon. #NCAPM25.
Quoting National Center for Missing & Exploited Children (@NCMEC) · 3 months
Q8. What would you like to see happen in 2025 to strengthen online protections for children through legislation or policy? #NCAPM25.

Thorn (@thorn) · 3 months
We are at a critical moment – and if we act now, there is hope that we can keep kids safe from the many threats posed by perpetrators using GAI. #NCAPM25.

Thorn (@thorn) · 3 months
A6. Platforms must prioritize transparency, set safety standards, and invest in early detection tech to stop online enticement before it escalates. #NCAPM25.
Quoting National Center for Missing & Exploited Children (@NCMEC) · 3 months
Q6. Offenders are using GAI to create fake accounts and lure children. What can be done to detect and prevent online enticement before it escalates? #NCAPM25.

Thorn (@thorn) · 3 months
A5. The images created on nudifying apps can be used for sextortion, grooming, & other harms. These images circulate & the emotional toll is real. We need platform accountability, early education, & safety by design to prevent this threat from escalating even further. #NCAPM25.
Quoting National Center for Missing & Exploited Children (@NCMEC) · 3 months
Q5. “Nudify” apps use AI to remove clothing from photos, often without consent. What risks do these tools pose to minors, and how can we effectively address them? #NCAPM25.

Thorn (@thorn) · 3 months
A3. GAI is being used to create CSAM—either generating synthetic sexual images of children from scratch or altering real photos into abuse content. It's also being used to groom and sextort children at scale. We are at a critical moment for safety by design #NCAPM25.
Quoting National Center for Missing & Exploited Children (@NCMEC) · 3 months
Q3. How is Generative Artificial Intelligence (GAI) being used to create exploitative content involving children? #NCAPM25.

Thorn (@thorn) · 3 months
RT @NCMEC: Join us TOMORROW from 2–3 PM EST for an X Chat marking National Child Abuse Prevention Month. Let’s amplify each other’s voices….

Thorn (@thorn) · 3 months
42% of minors who engaged in commodified sexual interactions—where they’re offered money or something of value in exchange for a sexual interaction online—say the person asking was another minor. Read the full findings and what they mean.

Thorn (@thorn) · 3 months
In 2024, more platforms than ever deployed @GetSaferio—our comprehensive child sexual abuse material and child sexual exploitation detection solution. Read Safer’s 2024 Impact Report to learn more about how our tech helps content moderators.

Thorn (@thorn) · 3 months
🚨 1 in 4 young people report being offered something of value in exchange for a sexual interaction online before turning 18. Everyone has a role to play in protecting kids growing up in our digital-first world. Read our latest report.

Thorn (@thorn) · 4 months
1 in 8 teens (aged 13-17) know someone who has been targeted by #AIgenerated deepfake nudes. This recent @USATODAY article tells the story of a teenage girl victimized by deepfake nudes created by a classmate.
usatoday.com
Mental health and cybersecurity experts say bullying using AI-generated fake nude images is increasingly part of the teen experience.