I was walking Boomer the dog a month ago when a book caught my eye. I was charmed by its title, How Sex Changed the Internet and the Internet Changed Sex, and by its condition: the book seemed to have been left outside to get poured on and then dehydrated by the sun.
Like many of us, I think about sex and the Internet a lot independently, and I’m especially invested in their relationship to each other. And while writer Samantha Cole’s book wasn’t a complete success (it didn’t quite answer the question its title poses), it felt like I found it at the perfect time.
On Monday a bill trying to tackle an issue at this very intersection passed the House. It’s on its way to the President’s desk so he can sign it into law. More of us need to be talking about it.
WHAT IS IT?
The Take It Down Act aims to criminalize publishing or threats to publish nonconsensual intimate imagery (NCII).
Ars Technica laid out the highlights of the bill: “publishing intimate images of adults without consent could be punished by a fine and up to two years of prison. Publishing intimate images of minors under 18 could be punished with a fine or up to three years in prison. Online platforms would have 48 hours to remove such images after ‘receiving a valid removal request from an identifiable individual (or an authorized person acting on behalf of such individual)’.”
These images include what is sometimes called “revenge porn” (despite the fact that this media is abuse, not porn) and nonconsensual “intimate” deepfakes. (I’m putting “intimate” in quotes here because the word seems coquettish in comparison to what’s actually happening.)
WHAT ARE PEOPLE SAYING?
Many politicians and organizations, including RAINN (the nation’s largest anti-sexual violence organization), say the Take It Down Act will make the Internet a lot safer. Senator Ted Cruz’s assessment is that the bill is “common sense.”
But many tech writers, free speech groups, sex educators, and sex workers are asking, “an Internet safer for whom?” They’ve pointed out that we can’t rely on the Trump administration—and I’d argue any administration—to enact the Take It Down Act in good faith.
Some of us who are abolitionists (a politics that typically seeks alternatives to carceral punishment) might view all legislation coming from our existing “justice” system as obviously unhelpful. As Cole writes in How Sex Changed the Internet and the Internet Changed Sex, “Malicious deep fakes were and still are created in an attempt to own women’s bodies, but instead of an examination of things like consent, bodily autonomy and sexuality online the debate we ended up having focused on politics.”
We won’t legislate away misogyny.
But even if you don’t consider yourself working towards or adjacent to abolition—or are unsure of what it can look like—we still should question “common sense” legislation. The phrase is often used as a means of shutting down opposition, of making others feel dumb. “Common sense” eliminates the possibility of nuance—lest, in this case, you be accused of supporting child exploitation. We need to understand bills and laws like this so that we can strategize how we’ll fight back and show up for one another—materially, physically, emotionally.
WHY ISN’T THIS IT (abolition aside)?
There are a few reasons this is the case.
Even if we give the Take It Down Act our most generous reading, it doesn’t do anything to center victims of these deepfake violations. Instead, it offers ample opportunity for its provisions to be abused.
Notably, Trump is actively dismantling the Federal Trade Commission, the agency responsible for holding tech companies accountable if they fail to act after a “take down” request is made. Considering that sites are given 48 hours and companies like Meta and Twitter have slashed their (notoriously trauma-laden) moderation roles, I’m not exactly sure who will be fielding flagged content. For many victims, there already exists the trauma of having to track down and repeatedly report images of themselves (AI generated or not) in the hopes that sites will take them down. It’s unclear how the bill changes what is so often a years-long cat-and-mouse game of attempting to scrub the Internet as abusers duplicate photos and videos across sites. Where is the support for the person in the image?
What we do know is that Trump is a sexual predator who hates women and doesn’t care about kids. So if Trump’s administration enforces the Take It Down Act, it will do so only when it benefits him. He told us as much a few weeks ago: “I’m going to use that bill for myself too, if you don’t mind. Nobody gets treated worse than I do online, nobody.”
How might he do that? The Electronic Frontier Foundation writes, “the notice-and-takedown system [the bill] creates is an open invitation for powerful people to pressure websites into removing content they dislike.” Maybe Trump will want Twitter to take down a deepfake of him in a compromising position, or maybe he will want Instagram to remove a post that he simply doesn’t agree with. Maybe he’ll cherry-pick posts that weren’t removed in 48 hours on smaller sites and pursue them in court if the site’s mission conflicts with his. Maybe we will notice. Maybe we won’t. Maybe we will notice and then it will be reported that the images were deleted in an “administrative error.” There’s enough material out there for Trump (or any other president) to go after a site that isn’t helping him—in his eyes—live out his fascist fantasies.
IS THIS KIND OF THING NEW?
There’s precedent for this kind of law—a piece of legislation at the intersection of sex and technology that exacerbates the issues it purports to remedy.
In 2018, Trump signed bills known as FOSTA-SESTA into law in an (alleged) attempt to reduce human trafficking. “In actuality,” AIDS United writes, “these [laws] have been used to pressure platforms like Twitter, Reddit, Instagram, Google and more to monitor (and censor) content related to sex—including sexual health information, particularly affecting resources for communities vulnerable to HIV—under the guise of avoiding facilitation of sex trafficking and sex work.” A U.S. Government Accountability Office report confirmed these predictions in 2021. Of the report’s findings, AIDS United says, “FOSTA-SESTA has not helped prosecutors tackle trafficking cases and suggests it is not frequently applied. Instead, it has caused platforms to preemptively censor sex workers’ presence and speech online, making their jobs more dangerous and difficult.” Sex workers reported that it was harder—if not impossible—to use websites to advertise, screen clients, and protect their identities. Sex work got pushed further underground.
I have to wonder if we’ll get a report on the efficacy of the Take It Down Act in a few years, or if there’ll be anyone behind a government computer to write a report at all.
In the meantime, we have to call out government theatrics and examine how we can protect things like consent, bodily autonomy, and sexuality online.