The worst time for Donald Trump to return to Twitter is now
The “Vox Populi” has apparently spoken, and they want former President Trump back on Twitter. A slim majority of 51.8 percent of 15 million poll respondents (most likely bots) supported Trump’s return.
Musk has moved at breakneck speed since purchasing Twitter, with little regard for the consequences. Musk pushed out chief legal officer Vijaya Gadde, the leader of all things trust and safety, immediately after elevating himself to self-proclaimed chief twit. Musk then fired 3,000 of the contractors “behind the screen” who dealt with reports of hate speech, harassment, stalking, threats, nonconsensual intimate imagery, spam, and other violations of Twitter’s rules. Twitter’s content moderation was slashed in one fell swoop.
Before delving into what it might mean if Trump resumes tweeting, it is useful to understand what Musk has disassembled and what he will most likely try to reassemble once the risks are clear and advertisers flee.
Twitter has dedicated significant resources to addressing online harm over the years. That effort, however, started frustratingly slowly. In 2009, Twitter banned only spam, impersonation, and copyright violations. Then, Del Harvey, the lone safety employee, asked one of us (Citron) to write a memo about threats, cyberstalking, and harm suffered by people under attack. Harvey wanted to address these issues, but the C-suite refused, citing the “free speech wing of the free speech party.”
Twitter largely followed this script until 2014, when cyber mobs began shoving women off the platform as part of the Gamergate campaign. Advertisers decided at that point that they did not want their products to appear alongside rape, death threats, and nonconsensual pornography. Gadde assembled an impressive trust and safety team, tripling its size. Harvey, Sarah Hoyle, and John Starr created policies prohibiting cyberstalking, threats, hate speech, and nonconsensual pornography. Michelle Haq implemented those policies in the product. Thousands of moderators were hired, and product managers worked to improve the efficiency and responsiveness of reporting processes.
That was just the start. Gadde, Harvey, and Hoyle established a Trust and Safety Council in 2015, composed of global civil rights and civil liberties organizations. (We have since served on that council on behalf of the Cyber Civil Rights Initiative, where we are on the Board of Directors and hold leadership positions.) That same year, Jack Dorsey returned as CEO and prioritized trust and safety.
This was especially apparent following the 2016 election. In response to the disinformation and hate speech that plagued the platform during the election season, Dorsey and Gadde assembled a small kitchen cabinet (Citron, former New York Times editor Bill Keller, and Berkeley journalism school dean Tom Goldstein) to chart a course forward to ensure that the platform enhanced rather than destroyed public discourse.
On December 2, 2016, Dorsey, along with Gadde and Harvey, met with this group to discuss how Twitter should best combat disinformation, which was eroding trust in democracies around the world. The group did not have all of the answers, but it was clear that the company was on high alert and would devote resources to dealing with harmful online behavior.
The council met for the next two years to provide advice on new products and services. After Harvey and Hoyle left in 2018, Gadde hired Nick Pickles. This group tackled new issues, such as deep fakes and other digitally manipulated imagery. They worked on the “Healthy Conversations” initiative, which solicited feedback on how to improve civil discourse. Gadde’s team revised its hate speech policy to prohibit “dehumanizing speech.” (Of course, this is a condensed history of Twitter’s work on content moderation.)
It’s worth noting Twitter’s Achilles’ heel when it comes to rule-breaking: public officials. In violation of the company’s rules, Trump (and others) were given free rein to spew hate speech, harassment, election lies, and health disinformation. Twitter and others maintained that public officials “were different,” contrary to our mantra that “great power comes with greater, not less responsibility.”
On Jan. 6, 2021, as a mob descended on the U.S. Capitol, many called for Trump’s long-overdue removal. Gadde convinced Dorsey to act, and Trump’s account was temporarily suspended.
We wrote together on February 6, 2021, to make the case for Trump’s permanent ban from social media. In our opinion, “enough was enough”: Trump used his social media presence to downplay and politicize the deadly pandemic; he also used it to incite a mob that resulted in five deaths, numerous serious injuries, and a nation and world shaken. Twitter and other social media platforms removed Trump’s online megaphone, better late than never.
Musk has now invited him back, but the former president has indicated that he is not interested. He has a new forum where the rules are literally written for him: Trump founded Truth Social in February 2022. His reach is anemic, with fewer than 2 million users, but that hasn’t stopped him from using it to spread conspiracy theories, election lies, hatred, and antisemitic tropes.
Despite his protests, Trump will undoubtedly be tempted to return to Twitter in order to reconnect with his 86 million followers. But the platform he might return to is not the one he left in February 2021, when we opposed his reinstatement. Even then, his return would have been disastrous. Now, Trump would rejoin the campaign trail with Twitter’s trust and safety team depleted. What could possibly go wrong?
Musk has been bulldozing and backtracking on real-time content moderation since taking over. He adopted a verification scheme that had no connection to its original purpose: protecting those most vulnerable to impersonation. He is now attempting to correct that error. He will quickly realize that tearing down an entire edifice of trust and safety will cause real harm to users and scare away (more) advertisers. He will also learn that, unlike some of his previous mistakes, this one is hard to undo: rebuilding a team that took more than a decade to assemble will be difficult.
Musk most likely interprets “Vox populi, vox Dei” to mean that the people are always right, but one of the earliest references to this phrase comes from Alcuin to Charlemagne in 798: “And those people should not be listened to who keep saying the voice of the people is the voice of God, because the riotousness of the crowd is always very close to madness.”