With the recent acquisition of Twitter, potentially harmful content may be allowed back on the platform. Is ensuring freedom of speech worth the harm it may cause?
Social media platforms differ from one another in the level of freedom of expression they allow. Facebook requires real names and limits the type of content allowed on its platform. The Facebook Community Standards, for example, define precisely how much of an uncovered female nipple may appear in images. At the other extreme, on platforms like Reddit and 4chan, users can post almost anything while remaining anonymous.
Both of these approaches to content moderation have disadvantages. On the one hand, user identification and strict moderation guidelines may lead to censorship and power abuse by community leaders. On the other hand, having no constraints and full anonymity may lead to hate speech, racism, hatred, and incitement to violence.
Twitter has so far been one of the most liberal platforms, allowing adult content and performing limited moderation. For the most part, Twitter only forbids illegal content. The most famous example of moderation on Twitter is the ban of Donald Trump from the platform. Twitter removed the account on the basis that “Glorifying violent acts could inspire others to take part in similar acts of violence.” Trump’s last posts were considered highly likely to encourage and inspire the replication of the criminal acts that took place at the US Capitol on 6 January 2021.
While external regulation of social media platforms and allowable content may be useful to protect the public, it is unlikely to happen given the technological complexities involved. It may be easy for a policymaker to define which content should and should not be allowed on social media; it is far more challenging for platforms to actually implement such regulations, as they would be required to monitor and moderate massive amounts of content, much of which requires subjective judgement and is open to interpretation.
Another common challenge for global social media platforms is the need to adhere to different regulations and standards across nations and continents. For example, the European Union is in the process of approving the Digital Services Act, which will regulate the moderation of illegal content on social media as well as disinformation. At the same time, in the United States of America, the Constitution’s First Amendment protects freedom of speech. Moreover, what is culturally acceptable for Western populations may be extremely offensive for Eastern populations, such as images related to the Tiananmen Square protests in 1989. For example, Twitter is banned in Russia and China, among other countries. In most cases, such social media platforms are not allowed because they are seen as instruments that the public can use to share different points of view, for protesters to organise riots and unrest, and as channels for propagating information that is not aligned with the local government’s version of reality. Thus, social media platforms cannot maintain a single global policy on allowed content or create content moderation guidelines that satisfy all these constraints.
On 14 April 2022, Elon Musk announced his offer to purchase the social media platform Twitter. According to Musk, his main reason for purchasing Twitter is to protect free speech, which would otherwise be at risk given the increasing restrictions and content moderation policies introduced by social media platforms. While the acquisition process is still ongoing, Musk initiated it by first purchasing a large number of shares on the open market and then making a US$44 billion offer to take over the entire company.
With this move, the future of social media is increasingly being decided by a few Western corporations led by non-diverse leadership. This has the potential to produce unfair and unsafe online discussion environments. Minorities and underserved communities not fitting the “Western, educated, industrialised, rich, and democratic” pattern may feel isolated and overlooked during platform design and development. This comes with the risk of people feeling misunderstood and unappreciated, reinforcing stereotypes and leading to extremism. Trending platforms like TikTok, which have been designed and are managed in Asia rather than on the US West Coast, have so far been more inclusive.
Elon Musk will likely advance his goals for Twitter by reducing or removing content moderation entirely and increasing revenue streams by adding paid products to the platform (e.g., pay-to-post features). This has the potential to increase radicalisation and polarisation of content on the platform, especially when content is paid for, and to reinforce the voices of those who can afford to pay to be heard.
The Future of Twitter
In the weeks since the announcement, Twitter users have started to wonder about future changes to the platform and what will happen to pilot programs for community moderation, like Birdwatch. Birdwatch is a crowd-sourced misinformation annotation pilot project: selected users annotate tweets by labelling them as misleading or not, providing a justification for their labels. Recently, Twitter ran tests in which Birdwatch labels were used to highlight potentially harmful content for some users. They found that users who are shown these crowd-sourced fact-checks are around 30 percent less likely to agree with a misleading tweet.
A possible step forward is the use of human-in-the-loop artificial intelligence: tools in which humans, as in Birdwatch, label content along different dimensions, while algorithms work at scale to efficiently surface content that is relevant and acceptable to each end user. When this happens, the transparency of such algorithms is critical. End users must not only be able to understand how the content presented to them has been selected and why; they should also be able to give feedback and customise the algorithms to their liking, rather than having the algorithms serve the platform’s interests, such as maximising the time users spend on the platform, their clicks, and advertising revenue. This could be designed like a recommender system (think Netflix) that makes useful suggestions on how to customise the algorithm while still letting end users make the final decision about what content they want to consume.
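To make the human-in-the-loop idea concrete, the pipeline can be sketched in a few lines: humans contribute labels, and an automated step aggregates them at scale to decide which posts should carry a warning. The sketch below is purely illustrative; the class names, thresholds, and majority-vote rule are assumptions for this example, not Twitter’s or Birdwatch’s actual implementation.

```python
from collections import Counter
from dataclasses import dataclass, field

@dataclass
class Post:
    post_id: str
    text: str
    labels: list = field(default_factory=list)  # (rater_id, label) pairs

def add_label(post: Post, rater_id: str, label: str) -> None:
    """Human step: one crowd rater contributes a label for a post."""
    post.labels.append((rater_id, label))

def aggregate(post: Post, min_raters: int = 3, threshold: float = 0.6):
    """Algorithmic step: simple majority vote over the human labels.

    Returns the winning label only when enough raters have weighed in
    and a clear majority agrees; otherwise returns None (no warning).
    The thresholds here are arbitrary illustrative values.
    """
    if len(post.labels) < min_raters:
        return None
    counts = Counter(label for _, label in post.labels)
    top_label, top_count = counts.most_common(1)[0]
    if top_count / len(post.labels) >= threshold:
        return top_label
    return None

# Usage: three of four raters flag the post, so it gets a warning label.
p = Post("42", "Miracle cure found!")
for rater, label in [("a", "misleading"), ("b", "misleading"),
                     ("c", "not_misleading"), ("d", "misleading")]:
    add_label(p, rater, label)
print(aggregate(p))  # -> misleading
```

A real system would replace the majority vote with something more robust (for example, weighting raters by their past helpfulness), but the division of labour is the same: people provide judgement, and the algorithm applies it at scale.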
So, what will happen?
It is hard to know how Twitter will change and how much Elon Musk will influence its design. Looking at Musk’s other business ventures, he might focus first on Twitter’s revenues and valuation rather than on content moderation guidelines. He has stated a desire to make more of the platform open source, which would enable transparency and customisation as an added value.
If Twitter allows more content and becomes less stringent, will users leave? Or will more users be motivated to join the platform? These are critical questions for the monetisation and revenue that Musk is focusing on. Today, the social media platforms with the most users are those that moderate content the most, so from a business point of view, becoming more lenient about what is acceptable does not seem a good idea. On the other hand, a more forgiving platform may foster free discussion and make people more open-minded by encouraging them to listen to each other, which is what Musk expects.
Will @realDonaldTrump come back on Twitter? Probably yes.
Dr Gianluca Demartini is an Associate Professor in Data Science at the University of Queensland, School of Information Technology and Electrical Engineering. His main research interests are Information Retrieval, Semantic Web, and Human Computation. His research has been supported by the Australian Research Council (ARC), the UK Engineering and Physical Sciences Research Council (EPSRC), the EU H2020 framework program, Facebook, and Google. He has published more than 150 scientific publications in Computer Science and received multiple Best Paper awards.
This article is published under a Creative Commons Licence and may be republished with attribution.