When X (formerly Twitter) overhauled its verification system in 2022, many predicted the change would affect how political opinions spread on the platform. In a modeling study, researchers show that when a platform's algorithms prioritize the posts of verified users, the result can be increased polarization and the formation of echo chambers. Because X's new verification system allows almost anybody to become verified, this side effect could be exploited by users seeking to manipulate others' opinions, the researchers say.
“Our findings confirm that ideologues and verified users play a crucial role in shaping the flow of information and opinions within the social network,” says first author Henrique Ferraz de Arruda (@hfarruda), a computer scientist at George Mason University. “When verified people post things, it can reach more people, which allows them to have a significant impact on the formation and reinforcement of echo chambers.”
Though many people speculated that X’s verification system might have ramifications, its actual impact hasn’t been studied in depth—in part because the platform no longer allows researchers to access its data. For this reason, the researchers used a computational model simulating how people post and receive messages on social media platforms to investigate how having a larger number of verified users might impact polarization and the formation of echo chambers. Within the model, they tweaked the number of verified users and also varied how stubborn these individuals were in their opinions.
They showed that verified users can actually facilitate consensus on the platform if they are not stubborn in their opinions. However, if verified users are “ideologues” with entrenched opinions that they hope to disseminate, their presence can drive polarization. When verified user ideologues held extreme views, their presence triggered the formation of echo chambers in addition to driving polarization. In contrast, the presence of verified centrist ideologues decreased polarization, while the presence of stubborn but unverified centrists drove polarization without triggering echo chambers.
“We found that even centrist ideologues, who may appear as a moderating force on the surface, can have a significant impact on the opinion dynamics when in enough numbers,” says Arruda.
These differences were driven by changing connections within the network: essentially, by users following or unfollowing one another.
“When the number of ideologues in the network becomes sufficiently large, regardless of whether they exhibit centrist or extremist behavior, we observed that a significant portion of the messages exchanged in the network are either sent to or received from these influential users,” says Arruda. “This suggests that, when social network algorithms prioritize visibility over content control, the users may be able to reach others to reinforce their opinions in groups, which could entrench echo chamber structures.”
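The mechanism described above can be illustrated with a minimal agent-based sketch. This is not the authors' actual model; all names, parameters, and update rules here are illustrative assumptions: users hold opinions in [-1, 1], verified users' posts reach extra random users beyond their followers (the algorithmic boost), stubborn "ideologues" never update their own opinions, and followers who disagree too strongly with a post unfollow the poster and rewire to someone else.

```python
import random
import statistics

# Illustrative sketch of an opinion-dynamics model in the spirit of the
# study described in the article. Parameters and rules are assumptions,
# not the researchers' actual model.

N = 200              # number of users
P_VERIFIED = 0.1     # fraction of verified users
P_IDEOLOGUE = 0.1    # fraction of stubborn users ("ideologues")
BOOST = 3            # extra random recipients a verified post reaches
MU = 0.3             # how far a receiver moves toward a post's opinion
TOL = 0.8            # disagreement beyond which a follower rewires
STEPS = 20000

random.seed(42)

opinions = [random.uniform(-1.0, 1.0) for _ in range(N)]
verified = [random.random() < P_VERIFIED for _ in range(N)]
stubborn = [random.random() < P_IDEOLOGUE for _ in range(N)]
# each user starts by following 10 random others (no self-follows)
following = [set(random.sample([j for j in range(N) if j != i], 10))
             for i in range(N)]

def followers_of(u):
    """Users whose feeds include u's posts."""
    return [i for i in range(N) if u in following[i]]

for _ in range(STEPS):
    poster = random.randrange(N)
    audience = set(followers_of(poster))
    if verified[poster]:
        # algorithmic boost: the post also reaches random non-followers
        audience |= set(random.sample(range(N), BOOST)) - {poster}
    for r in audience:
        gap = opinions[poster] - opinions[r]
        if abs(gap) > TOL and poster in following[r]:
            # unfollow on strong disagreement, then follow someone new
            following[r].discard(poster)
            candidates = [j for j in range(N)
                          if j != r and j not in following[r]]
            following[r].add(random.choice(candidates))
        elif not stubborn[r]:
            opinions[r] += MU * gap  # drift toward the post's opinion

# a crude polarization proxy: variance of final opinions
polarization = statistics.pvariance(opinions)
print(f"final opinion variance: {polarization:.3f}")
```

Re-running the sketch while raising `P_VERIFIED` or making the ideologues' fixed opinions more extreme is one way to probe, in miniature, the kind of parameter sweeps the researchers describe.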
Though the study was based on X’s framework, the researchers say that the results are probably also relevant to other social media platforms. They say that social media companies should be aware of the possible impact they have on political polarization and attempt to mitigate this within their algorithms.
Though in some cases social media moguls could be attempting to polarize their networks, Arruda speculates that for other platforms, this “happens as a side effect because they want to make us use the platform more.”
In future research, the team plans to increase the realism of their model by adding features such as news feeds and reposting and to incorporate data from other social media platforms such as Bluesky.