Elon Musk’s Reckless Plan to Make Sex Pay on Twitter

Last week, The Washington Post reported that Elon Musk plans to fast-track a product to monetize adult content on Twitter, which he more or less confirmed on Saturday afternoon. To be clear, Musk’s apparent plans—the latest get-rich-quick scheme from the richest man on the planet—have nothing to do with supporting sex workers. Expanding adult content at a moment of heightened scrutiny of sex work and queer people is risky, especially amid reports that Musk plans to strip the platform’s protections for trans people, a population that disproportionately overlaps with sex workers.

The move toward monetization also threatens to ruin a refuge: Since the US Federal Bureau of Investigation seized Backpage in April 2018, three days before President Trump signed the Allow States and Victims to Fight Online Sex Trafficking Act (FOSTA) into law, Twitter has become the sole major social media platform to tolerate sex workers. Even in the absence of direct payouts, Twitter has long been a safe haven for sex workers (adult content creators as well as in-person providers) in an increasingly puritanical digital landscape. But for monetization to work, Twitter would have to overhaul and intensify its content moderation practices, in direct conflict with Musk’s pledge to protect “free speech.”

More likely, monetization will be rushed out without addressing the issues raised by one of the company’s internal teams sensitive to the complexities of adult content and moderation. That is in part because, as of Friday, teams like the Machine Learning, Ethics, Transparency, and Accountability (META) team no longer exist. The rush threatens to further endanger sex workers, and especially trans sex workers, who already face an astronomically elevated risk of violence, one that rises with each new piece of anti-trans legislation.

A badly executed attempt to monetize adult content could fuel the growing moral panic that casts sex workers and queer adults as sexually deviant, and thus a threat to minors, lending credence to the belief that we have no place on a mainstream social media platform that welcomes users as young as 13. And such a move by Musk threatens to change the experience of everyone on the site.

One of many ways homophobia, transphobia, and whore-phobia—the systemic oppression of sex workers—overlap is that we are all perceived as threats to children. Online sex work is not immune from this bias. Earlier this year, Casey Newton and Zoë Schiffer reported in The Verge that Twitter had been developing an “OnlyFans-style” subscription project, but efforts were stymied by fears over child sexual exploitation material (CSEM, sometimes referred to as child sexual abuse material, or CSAM). A contingent of in-house researchers, dubbed the Red Team, determined that Twitter couldn’t safely roll out the project, titled Adult Content Monetization (ACM), before getting a handle on CSEM, which they believed ACM would exacerbate. The project was tabled in May, only a few weeks after Musk’s offer to buy Twitter for $44 billion.

The Red Team’s findings hinge on two misconceptions about sex work: first, that it and human trafficking exist on a continuum encompassing “the sex trade”; and second, that adult-content platforms do not enforce aggressive moderation policies.

It’s true that CSEM is near-ubiquitous on the internet; its existence has long been used as a cudgel by religious right-wing groups like Exodus Cry and the National Center on Sexual Exploitation (NCOSE, formerly Morality in Media) against the presence of sex workers on the internet. But all online platforms that identify CSEM are required by law to report their findings to the National Center for Missing and Exploited Children (NCMEC). In their report, for example, Newton and Schiffer note that “in 2019, Mark Zuckerberg boasted that the amount Facebook spends on safety features exceeds Twitter’s entire annual revenue.”
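
To make the mechanics concrete: the dominant technique for identifying known CSEM is hash matching, in which uploads are fingerprinted and compared against databases of previously reported material, most famously via Microsoft’s PhotoDNA. The sketch below is a minimal illustration of that idea, with an invented hash list and hypothetical function names; real systems use perceptual hashes that survive resizing and re-encoding, not the exact-match SHA-256 shown here.

```python
import hashlib

# Invented stand-in for a database of hashes of known, previously
# reported material. Real deployments match against perceptual-hash
# lists (e.g., PhotoDNA) coordinated through NCMEC, not SHA-256.
KNOWN_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def file_sha256(path: str) -> str:
    """Compute a file's SHA-256 digest, streaming to bound memory use."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()

def queue_ncmec_report(path: str) -> None:
    """Hypothetical stand-in for a platform's reporting pipeline; the
    underlying legal duty to report is real (18 U.S.C. § 2258A)."""
    print(f"queued CyberTipline report for {path}")

def scan_upload(path: str) -> bool:
    """Flag an upload that matches known material and queue a report."""
    if file_sha256(path) in KNOWN_HASHES:
        queue_ncmec_report(path)
        return True
    return False
```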

Yet the claim that adult content is inextricably linked with child abuse is demonstrably untrue. NCOSE names OnlyFans specifically as enabling “child sexual abuse material, sex trafficking, rape videos, and a host of other crimes”—but the numbers tell a different story. In its latest annual report, NCMEC found that PornHub, for instance, reported 9,029 instances of CSEM in 2021, a figure dwarfed by Twitter’s 86,666. Facebook, despite its robust safety budget, reported 22,118,952 instances, nearly 2,500 times more than PornHub. OnlyFans reported zero.

And adult sites are the most careful of all about moderating content to prevent the circulation of CSEM. As every adult-content creator knows, identity verification and, more recently, biometric data are required to upload content to clip sites—an umbrella term for online platforms that monetize adult content through film clips, streaming channels, and fan sites—and to cash out on sales. Content on OnlyFans also has to pass through both algorithmic and human moderation before it is approved, to confirm that every model featured has been verified. (OnlyFans maintains in-house content moderators to review each piece of content before it goes live, in part because the company is invested in protecting its content creators, who are also its only source of revenue.) Direct messages are closely moderated too, and any reference to in-person meetings, which could expose OnlyFans to a felony trafficking charge, can result in a permanent suspension.
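
As a rough sketch of the ordering those checks imply, consider the gate below: unverified creators are rejected outright, uploads then face an automated screen, and nothing goes live without human approval. Every name here is hypothetical; this is an illustration of the pipeline’s shape, not any platform’s actual code.

```python
from dataclasses import dataclass

@dataclass
class Upload:
    creator_id: str
    media_path: str

# Hypothetical stand-ins for a platform's internal services.
def creator_is_verified(creator_id: str) -> bool:
    return creator_id in {"verified_creator_123"}  # placeholder ID check

def automated_screen_flags(media_path: str) -> bool:
    return False  # placeholder: hash matching, nudity/age classifiers, etc.

def enqueue_for_human_review(upload: Upload) -> None:
    print(f"queued {upload.media_path} for human review")

def moderate_upload(upload: Upload) -> str:
    """Gatekeeping order described above: identity verification first,
    then an algorithmic screen, then mandatory human review."""
    if not creator_is_verified(upload.creator_id):
        return "rejected: creator not identity-verified"
    if automated_screen_flags(upload.media_path):
        return "rejected: failed automated screen"
    enqueue_for_human_review(upload)
    return "pending: awaiting human approval before going live"
```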

As veterans of the industry can also confirm, clip sites have strict content guidelines—increasingly so since FOSTA’s passage—limiting what kinds of content can and cannot be sold. Hypnosis content, for example, has been restricted since at least 2017, when credit card companies deemed certain kinks “illegal or immoral” and began threatening platforms that allowed them with demonetization. As Ana Valens suggests in the Daily Dot, financial platforms’ trepidation is likely due to the fear of chargebacks and claims that customers were “coerced” by hypnosis into spending more money on content or clips.

Other restrictions are more closely tied to social mores and legislation than to buyer’s remorse; blood, needles, and consensual nonconsent were among the other “immoral” fetishes axed alongside hypnosis. Fetish model Sophie Ladder maintains a frequently updated Google Doc listing which fetishes are currently acceptable on which clip sites. Blood, for instance, is prohibited or otherwise restricted on all but one of them.

Meanwhile, I have scrolled past exceedingly graphic adult content on Twitter, which does not prohibit depictions of “bodily fluids” as long as users flag the posts as sensitive media. While Twitter does prohibit “violent sexual conduct,” the policy seems to apply only to rape and sexual assault, not to consensual violence, as in BDSM.

Twitter doesn’t restrict adult content because it doesn’t have guidelines for adult content. At all. At the time of writing, Twitter has no policies limiting adult content as long as it’s consensually produced, appropriately flagged as sensitive, and kept out of more publicly visible areas like profile pictures and header photos. (I do not count CSEM as “adult content” because, by definition, it does not feature adults.)

Historically, while Twitter was never as hostile to sex workers as, say, Meta, it has maintained what one Twitter employee told me is a “Don’t Ask, Don’t Tell” policy toward them. And despite its lack of a content policy, Twitter does algorithmically suppress adult content, whether intentionally or not.

Realistically, not even Twitter knows how its algorithms are applied, as is the case with most machine-learning systems used by social media platforms and search engines. Machine-learning models refine themselves based on user activity and, left uncorrected, codify the biases of those users. The result is what’s termed algorithmic bias, which, even when unintentional, hits sex workers hardest in this case. Already, sex workers and activists are more likely to be algorithmically flagged by Twitter for shadow-banning, a practice in which certain users or content are artificially suppressed. And much as Instagram’s algorithms flag nudity by detecting flesh-toned pixels, Twitter algorithmically identifies adult content not already marked as “sensitive” and restricts it.
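
To see how that bias gets baked in, consider a deliberately naive version of the flesh-toned-pixel approach mentioned above: flag an image as sensitive when enough pixels fall inside one fixed color window. The sketch below is illustrative only, with invented RGB bounds and an invented threshold, but it makes the failure mode concrete: a window tuned on lighter skin will over-flag some bodies and under-flag others, and nothing in the code corrects for that.

```python
import numpy as np
from PIL import Image

# Invented bounds: one fixed RGB window meant to capture "skin."
# Tuning a window like this on lighter skin tones is exactly how bias
# against darker-skinned users gets encoded into the classifier.
SKIN_LOW = np.array([120, 70, 50], dtype=np.uint8)
SKIN_HIGH = np.array([255, 200, 170], dtype=np.uint8)
FLAG_THRESHOLD = 0.30  # invented: flag if >30% of pixels look like "skin"

def looks_sensitive(path: str) -> bool:
    """Naive flesh-tone detector: flag an image when a large share of
    its pixels falls inside the hard-coded color window."""
    pixels = np.asarray(Image.open(path).convert("RGB"), dtype=np.uint8)
    in_range = np.all((pixels >= SKIN_LOW) & (pixels <= SKIN_HIGH), axis=-1)
    return in_range.mean() > FLAG_THRESHOLD
```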

Against this backdrop, Musk’s plan to monetize adult content theoretically makes sense. As a business matter, it would bring in more revenue for Twitter. As a social matter, it could curb algorithmic censorship of sex workers and perhaps even destigmatize sex work. Fears that ACM would somehow transform Twitter into “a porn site” or, worse, “a child porn site” are unfounded. If anything, monetizing adult content correctly could greatly reduce the amount of CSEM on Twitter by adding safeguards that detect it before it is posted.

In practice, however, Musk would either have to commit to more content moderation to ensure that CSEM is not on the site—which is in tension with his championing of “free speech”—or double down on the importance of expression with minimal moderation, thus opening the floodgates for gruesome content otherwise kept in the shadows of the dark web and private Facebook groups.

If Twitter rolls out ACM, it will likely face a choice: require that all Twitter users provide age verification in order to prevent posting of or access to CSEM, or ban non-monetized adult content entirely in order to make money from its subscription service. Either way, algorithmic surveillance of all visual media will necessarily intensify.

If Twitter follows in OnlyFans’ footsteps, the platform will most likely prohibit non-monetized nudity on unprotected accounts. As a result, people who are not adult-content creators but occasionally post nudity—in-person sex workers, for example, or sex educators—will have to edit their content to comply with new policies, if they’re allowed to remain on Twitter at all.

Another challenge awaits. Twitter’s function among sex workers as our last semipublic digital hideaway means that before rolling out ACM, Twitter will have to decide whether to retroactively moderate its existing media. This is simply an impossible task for human moderators. As for AI, while it may seem easy enough to perform an algorithmic sweep of all visual media to detect nudity, such AI is notoriously biased against people of color, fat people, and sex educators, as has been shown with Instagram—and Instagram has been moderating visual media since its inception. Twitter, in contrast, has over a decade’s worth of material to reassess, and developing the technology for such a herculean task would set Musk back several million more dollars, not to mention time that far exceeds the two weeks he has reportedly given employees to prepare.

Twitter may be able to introduce rudimentary algorithms to streamline the moderation process. But moderating content at scale without a large team of human moderators, especially given the heightened scrutiny of Twitter’s content-moderation policies, would be unimaginable under any leadership, let alone that of Musk, who plans to lay off the vast majority of Twitter’s workforce and has already cut nearly half. In addition, Twitter and Meta offload most human content moderation to third-party contractors, barring those moderators from company benefits, and Musk has a long history of labor violations.

In the end, Musk’s Twitter—with its lack of human moderators, bare-bones policies on doxing, and lack of interest in protecting sex workers as people instead of merely “content creators”—simply does not have the infrastructure or commitment to responsibly monetize adult content.

How Musk decides to treat sex workers will, as usual, offer a glimpse into how he plans to expand policies to encompass other users. We are, after all, marginalized and disposable, an ideal test population for new policies and technologies. The weeks and months ahead will test our ability to protect ourselves and one another from abuse. To us, these battles over our livelihoods and safety are nothing new. To Musk, however, this is a fresh hell likely to drag him into quagmires with workers, regulators, and the public. My fear is that he’s sacrificing us in this ridiculous war he has already set himself up to lose, and whether or not his latest gimmick manages to keep Twitter alive, the blood spilled to save it will be that of sex workers.
