The Censorship Effect: Radicalization, Censorship, and Social Media Echo Chambers

Mar 14, 2022 | 5 min read

Justin E. Lane

Co-Founder & CEO

Do you hear what I hear? New research from CulturePulse featured on the Joe Rogan Experience


Over the past year, the co-founders of CulturePulse, Justin Lane (CEO) and LeRon Shults (CRO), have been working with Minds.com, Daryl Davis, and a group of deradicalization experts to study the effects of censorship on the spread of extreme views on social media. The results of this research, and the platform moderation policies that emerged from our discussions, are likely to surprise you. They were recently published in a paper, The Censorship Effect, that is open for all to read. In addition, our collaborators Bill Ottman (CEO of Minds) and Daryl Davis went on the Joe Rogan Experience podcast last week to discuss the paper and other issues (you can find a link to the podcast here).


In short, our research found consistent evidence that the censorship practices of sites such as Facebook and Twitter, which were designed in part to hinder the spread of extreme views, had the unfortunate effect of reinforcing echo chambers and intensifying certainty about extreme beliefs among their members. This conclusion is supported by several studies, including computer simulations of online social networks done by CulturePulse’s founders (downloadable from arXiv.org here). 


We began with an extensive review of past research on the role of censorship and deplatforming policies in the formation of echo chambers on the major online social networks, and then conducted our own studies to explore the potential of alternative approaches. Our overall conclusion, which many initially find counter-intuitive, is that promoting free speech and using less censorship are actually more likely to have a deradicalizing effect on social media.


Is deplatforming working?


The Minds.com paper also discusses new research on how information spreads after its source has been deplatformed. For example, a widely shared paper on deplatforming the far right claimed that kicking Alex Jones off YouTube illustrated how well deplatforming can “work.” However, if one counts views of Alex Jones-related content on YouTube since he was deplatformed, it is clear that his ideas are now reaching a larger audience than before. Moreover, the engagement (likes vs. dislikes) on these videos is overwhelmingly positive, suggesting that even though Jones himself no longer has a channel, the information he wants to promote is spreading more widely than ever on YouTube.


The Minds.com paper was also intended to serve as a foundation for developing better social media policies that facilitate healthier interactions on the Internet. One feature that Minds has already released enables individual users to build their own algorithm, altering the rules that determine what appears in their feeds. This feature, developed in consultation with Justin and LeRon, gives people more direct control over their online experience and lets them shape their own news feeds, keeping them engaged without the need for ever-increasing sensationalism.
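Minds has not published the internals of that feature here, but the underlying idea is easy to sketch. The Python snippet below is a minimal illustration under our own assumptions; the post fields, weight names, and scoring formula are hypothetical, not Minds’ actual ranking code.

```python
from dataclasses import dataclass

@dataclass
class Post:
    """Hypothetical post record; field names are illustrative, not Minds' schema."""
    author_subscribed: bool  # does the viewer follow this author?
    recency_hours: float     # hours since the post was published
    engagement: int          # votes, reminds, comments, etc.

def feed_score(post: Post, weights: dict) -> float:
    """Score a post with weights the *user* chooses, instead of a fixed,
    platform-controlled ranking function."""
    recency = 1.0 / (1.0 + post.recency_hours)  # newer posts score higher
    return (weights["subscribed"] * float(post.author_subscribed)
            + weights["recency"] * recency
            + weights["engagement"] * post.engagement)

# A user who wants a calm, subscription-first feed might choose:
calm_weights = {"subscribed": 2.0, "recency": 5.0, "engagement": 0.01}

posts = [Post(True, 2.0, 40), Post(False, 0.5, 900)]
for p in sorted(posts, key=lambda p: feed_score(p, calm_weights), reverse=True):
    print(feed_score(p, calm_weights), p)
```

The point is architectural rather than algorithmic: the weights live with the user, not the platform, so two users can see entirely different orderings of the same content.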


The CulturePulse team is also working with Minds to engineer a new way to address radical or extreme views on the platform without resorting to censorship. Introducing even small amounts of centralized censorship (controlled by the site moderators) can produce dynamics that are hard to distinguish from those of totally centralized censorship programs. Decentralized censorship, on the other hand, such as the feature on Facebook and Twitter that allows individuals to block other individuals, does not seem to have the same radicalizing effects. However, these platforms also use AI to censor massive amounts of content (about 98% of all Facebook moderation is done by AI), and this kind of mass centralized moderation leads to far more belief radicalization and echo chamber formation than letting people decide for themselves whom they want to hear from. One of the easiest ways to see this is by graphing the average number of links a person has in the social network against how certain they are that their radicalized beliefs are right.
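The published model (see the arXiv preprint linked above) is considerably richer, but the core mechanism can be caricatured in a few lines of Python. The following is a deliberately simplified toy under our own assumptions, not the CulturePulse model itself: agents hold a certainty value, ties are cut either by a central moderator or by the agents themselves, and beliefs drift toward the average of whoever is left.

```python
import random

def step(certainty, edges, mode, threshold=0.8):
    """One round of a toy belief-dynamics model (not the published model).

    certainty: dict of agent id -> certainty in a radical belief (0..1)
    edges: set of frozensets {a, b} representing social ties
    mode: "centralized" cuts every tie touching a high-certainty agent
          (a moderator acting from above); "decentralized" drops a tie
          only when the two agents' own views differ too much.
    """
    if mode == "centralized":
        edges = {e for e in edges if all(certainty[a] < threshold for a in e)}
    elif mode == "decentralized":
        edges = {e for e in edges
                 if abs(certainty[min(e)] - certainty[max(e)]) < 0.4}

    # Agents drift toward the mean certainty of their remaining neighbours;
    # an agent who has lost all moderate ties hears only reinforcing voices.
    updated = dict(certainty)
    for agent in certainty:
        nbrs = [b for e in edges if agent in e for b in e if b != agent]
        if nbrs:
            mean = sum(certainty[b] for b in nbrs) / len(nbrs)
            updated[agent] += 0.3 * (mean - certainty[agent])
    return updated, edges

random.seed(1)
agents = list(range(50))
certainty = {a: random.random() for a in agents}
edges = {frozenset((a, b)) for a in agents for b in agents
         if a < b and random.random() < 0.1}

for _ in range(20):
    certainty, edges = step(certainty, edges, mode="centralized")
```

In the “centralized” mode, agents above the threshold are severed from everyone at once, freezing them at high certainty with no moderating contact; in the “decentralized” mode ties are only dropped pairwise, so moderate voices generally remain reachable. That asymmetry is the echo-chamber dynamic in miniature.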


This is illustrated in the figure below, which shows some of the simulation results from the computational model mentioned above. The vertical axis indicates the average number of connections (called degree in network analysis) held by a person with a radical belief, while the horizontal axis indicates that person’s level of certainty about the belief. The colors represent the type of censorship: centralized (red), decentralized (green), and mixed, i.e., both centralized and decentralized (blue). The graph suggests that when a social network uses centralized intervention to censor what users can see (the red and blue swaths), certainty of belief is much higher, and it increases as people have fewer links to others in the network. Here we can see the formation of echo chambers in action. The green swath, by contrast, which represents purely decentralized censorship, remains largely stable: the number of links holds steady across a wide and varied range of certainty values, with no comparable echo-chamber pattern.
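To make the figure’s axes concrete: given per-agent output from a run like the toy model above, the plot is simply a colored scatter of degree against certainty. The points below are invented placeholders used only to demonstrate the plotting convention; they are not results from the paper.

```python
import matplotlib.pyplot as plt

# Placeholder (degree, certainty) pairs per censorship mode -- invented
# purely to show the plot layout, NOT data from the simulations.
runs = {
    "centralized":   [(2, 0.95), (1, 0.97), (3, 0.90), (8, 0.55)],
    "decentralized": [(6, 0.40), (7, 0.58), (5, 0.66), (9, 0.33)],
    "mixed":         [(2, 0.92), (3, 0.88), (4, 0.84), (7, 0.61)],
}
colors = {"centralized": "red", "decentralized": "green", "mixed": "blue"}

for mode, points in runs.items():
    degrees, certainties = zip(*points)
    plt.scatter(certainties, degrees, c=colors[mode], label=mode, alpha=0.6)

plt.xlabel("certainty in the radical belief")
plt.ylabel("average number of links (degree)")
plt.legend(title="censorship type")
plt.show()
```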



Graph of censorship’s effect

The simulation results of these AI-based computational models, or digital twins of social networks, suggest that when platforms push individuals with unconventional beliefs off their sites and into the darker corners of the Internet, they are likely pushing them toward further radicalization. Those users may no longer be making people uncomfortable on those particular platforms, but they are more likely to radicalize and become potentially problematic to society.


What is the moral of the story? Even if defending free speech exposes us to aspects of humanity we find ugly, it is significantly better than censoring the legal expression of ideas.



Quote from Bill Ottman on the ugliness of censorship

Source

https://censorshipeffect.com
