On Instagram, sex workers have reported their accounts and images being removed even when they contain no explicit content. They’ve documented instances of Facebook discussion groups being shut down for sex work-related conversations. LinkedIn prohibits the listing of sex work in one’s profile, thus delegitimizing it as work (regardless of jurisdiction). Payment processors from PayPal to Square regularly shut down sex workers’ accounts, and nearly every online advertising tool bans sexual content.

While users can usually track takedowns of their own content, Twitter and Instagram engage in a subtler, and therefore more insidious, practice widely referred to as “shadow banning,” whereby a hashtag or keyword is suppressed from search, either temporarily or permanently, preventing users from finding content on a given subject unless they already know what they’re looking for. Twitter has explicitly denied engaging in shadow banning, but sex workers, as well as a number of other activists I’ve spoken with, offer evidence to the contrary.

Smith, who owns the URL “jackisanazi.com” (a reference to Twitter CEO Jack Dorsey), has been particularly vocal about Twitter’s methods. In a piece arguing that shadow bans deny sex workers income and community, Smith is quoted as saying: “I have multiple tweets a day that will garner hundreds of likes and then I will post a picture that will get two. That isn’t how the internet works, typically pictures get way more engagement than straight text.”

Twitter also hides certain keywords from users’ replies, deeming them “offensive” and requiring multiple clicks to reach them. Among the words deemed offensive? “Vagina.”

Like Twitter, Instagram employs subtle enforcement mechanisms in its adjudication of sexual content. In April 2019, the company announced: “We have begun reducing the spread of posts that are inappropriate but do not go against Instagram’s Community Guidelines.” Instagram admits to blocking such content from appearing on its Explore page or in hashtag searches. One recent example was the hashtag #poledancing, used more by exercise enthusiasts than by exotic dancers. Danielle Blunt, a queer-identified sex worker with a master’s degree in public health who’s spent time observing the phenomenon, notes that she’s seen Instagram ban hashtags like #femdom and even #women, while #maledom remains available.

Censoring sexuality to such an extraordinary degree is bound to have a long-term impact on society. By forbidding any potentially sexual content, tech companies are furthering the sexual ideals propagated by mainstream pornography sites, ideals that many feminists have long considered harmful. And by banning positive and realistic depictions of women’s bodies in particular, many of which are created and shared by women, Silicon Valley companies are ensuring that the status quo remains.

Erika Lust, a Swedish erotic film director, is known for making porn that features artistic camera angles, feminist points of view, and ethnically and gender-diverse performers. In her writing and public speaking, Lust has called for better sex education and for porn, which she considers “the most important discourse on gender and sexuality,” to change. Lust has experienced censorship on several platforms, including Vimeo and YouTube.

“When pages that promote female pleasure are hidden, we understand that our pleasure is invalid,” she wrote to me in an email. “When drawings of vaginas are removed, we learn that we should be ashamed of our bodies. When female nipples are censored but male nipples are not, we know that we must police our own bodies to ensure we do not arouse men … The bodies, sexualities, and desires that are allowed online translates itself into the bodies, sexualities and desires that are accepted in society.”

Indeed, the same hypersexualized imagery of young celebrities and influencers abounds on these platforms as it does on American television. The now-infamous photo of Kim Kardashian’s ample behind was permitted to remain on major platforms, as was a photo of Justin Bieber clad only in tight briefs. Meanwhile, less explicit images and information intended to empower historically marginalized communities are regularly deemed inappropriate.