
Social networks often function as a space for public debate, and X remains one such platform. With 4,965 profiles, the objective was to determine what type of discourse the platform amplified. After seven weeks of analysis, the authors concluded that the algorithmic feed of the network, now owned by Elon Musk, tends to push more right-wing content and encourages users to follow conservative-oriented accounts.
The study, published this Wednesday in the journal Nature, was conducted six months after Musk’s purchase of the platform and approximately a year before his public endorsement of Donald Trump in July 2024, when the Republican was seeking to return to the White House. That is, it took place in the midst of the platform’s ideological shift, when its new owner, later appointed to a government role by the Republican, modified the source code the network had used since its creation two decades earlier.
For years, the “echo chamber” and “filter bubble” theories already suggested that social networks contained biased algorithms that increased polarization and misinformation by eliminating critical perspectives. However, this study provides stronger empirical evidence supporting these hypotheses.
For Mariluz Congosto, a researcher specialized in social network analysis and data visualization, the study constitutes proof that the platform’s interests have evolved in line with the positions of Musk, who promotes transphobic, racist and supremacist theories.
For Roland Hodler, professor of Economics at the University of St. Gallen (Switzerland) and co-author of the research, X’s algorithm “is not politically neutral.” The system, the economist points out, promotes more content from political activists than from the media, which means that “more people are potentially influenced” by this content.
It does not alter political attitudes
Users in the United States were randomly assigned to one of two configurations: the algorithmic feed (“For You”) or the chronological feed (“Following”). The content they saw was analyzed using a browser extension voluntarily installed by the participants.
Researchers from Bocconi University (Italy), the University of St. Gallen (Switzerland), the Paris School of Economics and the School for Advanced Studies in the Social Sciences (France) found that users exposed to the algorithmic feed gave greater priority to issues traditionally highlighted by the Republican Party, such as inflation or immigration; they were more likely to consider the criminal investigations against Donald Trump “unacceptable”; and they adopted positions more favorable to the Kremlin regarding the war in Ukraine and less favorable toward the Ukrainian president, Volodymyr Zelensky.
Although the results point to a clear tendency in the algorithm and to a shift in perceptions on specific issues influenced by it, the research did not conclude that the system has a direct effect on people’s overall political positions.
Ramón Salaverría, a digital media researcher, takes a cautious stance in this regard. “We should not convey an impression that goes beyond what the study itself suggests,” he says. “Users have not changed their party affinity. What the authors do detect is a shift on controversial and current issues during the period analyzed.” In a relatively short time during the summer of 2023, “those sympathetic to conservative ideas became more so,” and “a certain effect” was detected among those who started from liberal positions, notes the professor of Journalism at the University of Navarra.
The effect is long-lasting
Turning off the algorithm—that is, switching from the algorithmic to the chronological feed—did not reverse the effects. Users’ positions did not shift back toward more liberal stances. “If I use the algorithm, I see more content from the right and from political activists, I follow these activists and add them as accounts that I want to follow. And when I deactivate the algorithm, I still follow these people,” explains Hodler.
In Spain, deputies and analysts from across the ideological spectrum have built a good part of their public notoriety on the platform. And many of them have stayed there, even after Elon Musk placed the Spanish government at the center of his criticism and amplified various hoaxes related to migration.
The attacks were also directed against Podemos, the party that agreed with the Socialist government to approve a measure that could benefit more than 500,000 people.
“Business is business”
Congosto, for her part, affirms that “Twitter has always had bias.” In her opinion, before Musk’s purchase the platform already showed a slant, but a progressive one; she recalls the closure of Donald Trump’s account after the assault on the Capitol. Now, the bias has shifted. “Platforms tend to align with power. Business is business,” she says.
The researchers also detected that users exposed to the algorithm interacted more with the platform. For Congosto, this is the system’s main objective: “It will always present you with things that hook you. That means more time on the platform, and that has a very clear economic value.” In her opinion, the system does not “understand” ideologies, only virality. “If in a given context right-wing content generates more activity, more of it will be shown,” she maintains.
The agenda in the hands of the algorithm
The analysis reveals that the algorithm prioritizes political content, amplifies the accounts of political activists—especially conservative ones—and reduces the presence of traditional media. In Congosto’s opinion, this entails a major risk: if the visibility of sources that verify information is reduced, the weight of more emotional or polarizing discourse increases.
Raúl Magallón, a researcher specialized in disinformation, technology and the public sphere, shares this diagnosis. In his opinion, the debate is not limited to a possible ideological bias, but rather points to a transformation in the construction of the public agenda. “Algorithms probably will not serve to change our political worldview,” he points out, “but they can be decisive when it comes to defining which issues are at the center of current events.” Magallón warns that we are moving “from the agenda of the media to the agenda of algorithms.”