A new study published in Nature has found that the algorithm of X (formerly Twitter) – the hidden system or "recipe" that governs which posts appear in your feed and in what order – shifts users' political views in a more conservative direction.
Led by Germain Gauthier of Bocconi University in Italy, it is a rare, real-world randomised experimental study on a major social media platform. And it builds on a growing body of research showing how these platforms can shape people's political attitudes.
Two different algorithms
The researchers randomly assigned 4,965 active US-based X users to one of two groups.
The first group used X's default "For You" feed. This features an algorithm that selects and ranks posts it thinks users will be more likely to engage with, including posts from accounts they don't necessarily follow.
The second group used a chronological feed. This only shows posts from accounts users follow, displayed in the order they were posted. The experiment ran for seven weeks during 2023.
Users who switched from the chronological feed to the "For You" feed were 4.7 percentage points more likely to prioritise policy issues favoured by US Republicans (for example, crime, inflation and immigration). They were also more likely to view the criminal investigation into US President Donald Trump as unacceptable.
They also shifted in a more pro-Russia direction regarding the war in Ukraine. For example, these users became 7.4 percentage points less likely to view Ukrainian President Volodymyr Zelenskyy positively, and scored slightly higher on a pro-Russian attitude index overall.
The researchers also examined how the algorithm produced these effects.
They found evidence that the algorithm increased the share of right-leaning content by 2.9 percentage points overall (and 2.5 points among political posts), compared with the chronological feed.
It also significantly demoted posts from traditional news organisations' accounts while boosting posts from political activists.
One of the most concerning findings of the study involves the longer-term effects of X's algorithmic feed. The study showed the algorithm nudged users towards following more right-leaning accounts, and that these new following patterns persisted even after users switched back to the chronological feed.
In other words, turning the algorithm off didn't simply "reset" what people see. It had a longer-lasting influence beyond its day-to-day effects.
One piece of a much bigger picture
This new study supports the findings of similar research.
For example, a study in 2022, before Elon Musk bought Twitter and rebranded it as X, found the platform's algorithmic systems amplified content from the mainstream political right more than the left in six of the seven countries examined.
An experimental study from 2025 re-ranked X feeds to reduce exposure to content expressing antidemocratic attitudes and partisan animosity. The researchers found this shifted users' feelings towards their political opponents by more than two points on a 0–100 "feeling thermometer". This is a shift the authors argued would normally have taken about three years to occur organically in the general population.
My own research offers another piece of evidence for this picture of algorithmic bias on X. Together with my colleague Mark Andrejevic, I analysed engagement data (such as likes and reposts) from prominent political accounts during the final stages of the 2024 US election.
Our findings revealed a sudden and unusual spike in engagement with Musk's account after his endorsement of Trump on July 13 – the day of the assassination attempt on Trump. Views of Musk's posts surged by 138%, retweets by 238%, and likes by 186%. This far outstripped increases on other accounts.
After July 13, right-leaning accounts on X gained significantly greater visibility than progressive ones. The "playing field" for attention and engagement on the platform was tilted thereafter towards right-leaning accounts – a trend that continued for the remainder of the period we analysed in that study.
Not a niche product
This matters because we're not talking about a niche product.
X has more than 400 million users globally. It has become embedded as infrastructure – a key source of political and social communication. And once technical systems become infrastructure, they can become invisible – like background objects we barely think about, but which shape society at its foundations and can be exploited under our noses.
Think of the overpass bridges Robert Moses designed in New York in the 1930s. These looked like inert objects. But they were designed to be very low, to exclude people of colour from taking buses to recreation areas on Long Island.
In a similar way, the design and governance of social media platforms also has real consequences.
The point is that X's algorithms are not neutral tools. They are an editorial force, shaping what people know, whom they pay attention to, who the outgroup is and what "we" should do about or to them – and, as this new study shows, what people come to believe.
The age of taking platform companies at their word about the design and effects of their own algorithms must come to an end. Governments around the world – including in Australia, where the eSafety Commissioner has powers to enforce "algorithmic transparency and accountability" and to require that platforms report on how their algorithms contribute to or reduce harms – need to mandate real transparency over how these systems work.
When infrastructure becomes dangerous or unsafe, nobody bats an eye when governments step in to protect us. The same needs to happen, urgently, for social media infrastructures.
This article is republished from The Conversation under a Creative Commons licence. Read the original article.

