Daphne Keller
@daphnehk
Followers: 31K · Following: 20K · Media: 878 · Statuses: 22K
Platform Regulation Director, Stanford Cyber Policy Center. Former Google AGC. This is roughly my zillionth rodeo.
San Francisco, CA
Joined October 2009
We imagine that platforms can bring the whole sprawling chaos of human behavior into compliance with the law. Make our lives policeable, and policed, to a degree no government in history could have imagined. Not only do we seem to think it's possible; we think it's a good idea.
The letter features President Reagan's FCC chair, among others, signing on to extremely strong condemnation. Here's a taste. 3/3
It's not just Jimmy Kimmel. The FCC's jawboning and chill on legal speech, news reporting, and comedy shows is ongoing. Authoritarians always hate comedy. 2/
This is real news. Reporters, please report on it. ELEVEN former top FCC officials, Republican and Democratic, came together to condemn FCC Chairman Carr's abuses of power and efforts to suppress legal speech. 1/
Government has no business policing media for viewpoint bias or balance. The FCC must repeal its news distortion policy, urges a bipartisan group of 11 former FCC Chairs, Commissioners and other top officials in a petition we filed today with Protect Democracy
For researchers and DSA wonks, this Venn diagram is the heart of the post. You could probably just read this part to get about 60% of what I unpack here.
Here is the first piece in a series of short articles I'm doing about the DSA and researcher access to publicly available information. It focuses on categories of researchers under the DSA, and what data they are each authorized to use. 1/ https://t.co/g2gH8OcJTV
Many thanks to @techpolicypress and @Verfassungsblog for co-running this piece for different audiences. 3/3 https://t.co/tm8fq77Bur
techpolicy.press
The DSA opens important opportunities for researchers collecting publicly available data, but leaves key questions unresolved, writes Daphne Keller.
Perhaps you're thinking, "Of course researchers can use publicly available information! So can other people!" This piece marks the start of a lot of writing from me on how that's not true. And how broken the law on this topic is, and how the AI wars are making that worse. 2/
verfassungsblog.de
The EU’s Digital Services Act (DSA) established a host of new transparency mandates for online platforms. One of the simplest yet most critical allows researchers to collect or “scrape” data that is...
The title of this post says it all. This is a blazing big deal for the 4th Amendment. Then again, it's a blazing big deal for like ten other reasons. Mostly because it lets the FTC decide whether platforms enforced their speech rules correctly. https://t.co/qvUIMbySl2
techdirt.com
On Monday, I published a two-part blog post about the Federal Trade Commission (FTC) settlement with Aylo, parent company of Pornhub. The FTC’s complaint alleged that Aylo violated federal consumer…
Scathing political criticism is a love language like any other. It communicates a belief that ideas matter, that policymaking is an iterative process, that laws can improve. I sure hope that works for us here in the U.S. today. 18/18
I was criticizing the GDPR quite a bit back in 2017 (for reasons I also still stand by). So it felt weird to say this at the time. But it was true then and remains so now: "It would be a beautiful thing to see data protection law save the open Internet." 17/
For Europeans and many outside the U.S., data protection and privacy law can also provide important bulwarks against these threats. Sometimes Americans can use surveillance law for this, but not often enough. 16/
One important way this has played out in the real world is through the evolution of tools like Mastodon and Bluesky, and their underlying protocols. So far this has been a lot more boring and normal and nice than the sci-fi bleakness I was contemplating in 2017. 15/
I was surprised to find some notes of hope in this old post that I think have been borne out by time. The biggest one is simply that technologists and Internet users continue to innovate faster than governments, in ways that are unpredictable and often positive. Really. 14/
In the political and technical reality of 2025, even more so than 2017, "We don’t want an Internet that subjects all of us to a constant, automated, privatized 'content governance cycle.'" 13/
"Tackling dangerous content online is important. But the Internet is where we keep pictures of our kids, and embarrassing old emails, and health records. It’s where teenagers keep their diaries and activists coordinate protests and fledgling rappers post their rhymes." 12/
As I wrote then, "Few of us would really want this kind of supervision and control from even the most benign and trustworthy governments. But none of us live under those kinds of governments anyway." 11/
The "perfect, universal enforcement of rules to govern our public speech and our private communications" was a "terrifying concept" in 2017 and is utterly chilling in 2025. Look around you and think about who will set those rules, and who will enforce them. 10/
And then there's the fallacy that "The inevitable resulting errors and deletion of lawful and important speech" will be fixed when users appeal removals or "platform employees review grey-area removal decisions[.]" This was inadequate then and is inadequate now. 9/
Oooh, and here's a good one: "Lest anything escape the dragnet, platforms should share databases of illegal content, so they can all identify the same speech and enforce the same rules." Sound familiar? 8/
Many still love the idea that "the police (not courts – police) can tell the companies when a post, image, or video is illegal. Then the companies are supposed to use algorithms to make sure no one sees it or says those things again." 7/