Ranking algorithms can reinforce inequalities in social networks

THE WAY PEOPLE CONNECT ON TWITTER OR IN ACADEMIA CAN DISTORT AND INFLUENCE THE VISIBILITY AND RANKING OF MINORITIES, AND ALGORITHMS AMPLIFY THESE BIASES, ACCORDING TO A HUB STUDY

Online social networks claim to make connections and bring people together. But their ranking and recommender algorithms – which suggest whom to connect with, or who the most relevant scientists in a particular field are – can in fact exacerbate inequalities by discriminating against certain groups of people in the top ranks, a Hub study has found.

The study, published in Scientific Reports, investigated how social mechanisms influence the rank distributions of two well-known algorithms: PageRank – one of the main algorithms on which Google’s search engine is built – and Who-to-Follow, Twitter’s algorithm for suggesting people you aren’t currently following but may find interesting.

“The findings suggest that ranking and recommender algorithms in online social networks such as Twitter can distort the visibility of minorities in unexpected ways,” says one of the authors of the paper, Fariba Karimi, who leads the team at the Hub working on Network Inequality.

UNDERSTANDING ALGORITHMS

“It has been shown that ranking algorithms tend to increase the popularity of users who are already popular, and that can lead to a loss of opportunities for certain groups of people,” explains Lisette Espín-Noboa, a postdoc at the Hub and first author of the paper. The goal, she adds, was “to understand when these algorithms can go wrong” depending on the structure and characteristics of a network.

In the study, Lisette and her team simulated different networks, each composed of 2,000 agents, and varied the social mechanisms governing how individuals form relationships. They adjusted several properties of each network: the proportion of the minority, how active users were in connecting with other users, and the way people connected in the network – specifically, whether individuals associated with others who were already popular, and whether members tended to link with those who were similar to them, a principle social scientists call homophily.
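To make the setup more concrete, here is a minimal sketch of this kind of simulation – not the authors’ exact model – in which a directed network of 2,000 agents grows by links that favor already-popular targets (preferential attachment) and same-group targets (homophily), and is then ranked with PageRank. The parameter values (MINORITY_FRACTION, HOMOPHILY, EDGES_PER_NODE) and the networkx-based implementation are illustrative assumptions, not taken from the paper.

```python
# Illustrative sketch: homophilic, popularity-driven network growth + PageRank.
# Parameter values are assumptions for demonstration, not from the study.
import random
import networkx as nx

N = 2000                  # number of agents in the simulated network
MINORITY_FRACTION = 0.2   # share of agents belonging to the minority group
HOMOPHILY = 0.8           # weight favoring links within one's own group
EDGES_PER_NODE = 2        # outgoing links each newly arriving agent creates

random.seed(42)
groups = ['min' if random.random() < MINORITY_FRACTION else 'maj' for _ in range(N)]

G = nx.DiGraph()
G.add_nodes_from(range(N))

for source in range(1, N):
    targets = list(range(source))  # agents that already joined the network
    # link probability ~ popularity (in-degree) x group affinity (homophily)
    weights = [
        (G.in_degree(t) + 1) * (HOMOPHILY if groups[t] == groups[source] else 1 - HOMOPHILY)
        for t in targets
    ]
    for t in random.choices(targets, weights=weights, k=min(EDGES_PER_NODE, len(targets))):
        G.add_edge(source, t)

# Rank all agents with PageRank and check who ends up in the top 100
rank = nx.pagerank(G)
top = sorted(rank, key=rank.get, reverse=True)[:100]
minority_share_top = sum(groups[n] == 'min' for n in top) / len(top)
print(f"Minority share overall: {sum(g == 'min' for g in groups) / N:.2f}")
print(f"Minority share in top 100: {minority_share_top:.2f}")
```

Varying HOMOPHILY and MINORITY_FRACTION in such a toy model is the kind of experiment that reveals when minorities fall out of the top ranks.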

MAIN SOCIAL MECHANISM

The researchers found that the main social mechanism responsible for distorting the visibility of minorities in rankings was homophily, together with the proportion of the minority. When the majority group associated mostly with other members of the majority, the minority group was underrepresented in the top ranks, explains Lisette.

“However, minorities can overcome this underrepresentation by connecting strategically with others and can try to achieve at least statistical parity in top ranks,” says Lisette.

“Statistical parity means that if the minority represents 20 percent of people in the network, then the same ratio should be reflected in each top-k of the rank. Then, one way to increase the visibility of minorities in the rank is by making the minorities more active in the network. That is, minorities should create more connections to others,” points out Lisette. Another way is to diversify the connections of the majority by creating more links from the majority group to the minority group.
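As a hedged illustration of the statistical-parity check described above, the sketch below compares the minority’s share in each top-k of a ranking with its share in the whole network. The function names and cutoffs are hypothetical choices for demonstration, not the paper’s methodology.

```python
# Illustrative statistical-parity check for a ranking with group labels.
def minority_share(labels, minority_label='min'):
    """Fraction of entries carrying the minority label."""
    return sum(l == minority_label for l in labels) / len(labels)

def statistical_parity_report(ranked_nodes, groups, cutoffs=(10, 100, 500)):
    """groups maps node -> group label; ranked_nodes is sorted best-first."""
    overall = minority_share([groups[n] for n in ranked_nodes])
    for k in cutoffs:
        share = minority_share([groups[n] for n in ranked_nodes[:k]])
        status = "parity or better" if share >= overall else "underrepresented"
        print(f"top-{k}: minority share {share:.2f} vs. overall {overall:.2f} ({status})")

if __name__ == "__main__":
    # Toy example: 10 ranked nodes, 3 of which (30%) belong to the minority,
    # with the minority clustered near the bottom of the ranking.
    groups = {0: 'maj', 1: 'min', 2: 'maj', 3: 'maj', 4: 'min',
              5: 'maj', 6: 'maj', 7: 'min', 8: 'maj', 9: 'maj'}
    ranking = [0, 2, 3, 5, 6, 1, 8, 9, 4, 7]
    statistical_parity_report(ranking, groups, cutoffs=(3, 5, 10))
```

With the simulation sketch above, the same check could be run as `statistical_parity_report(sorted(rank, key=rank.get, reverse=True), groups)`.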

MORE REALISTIC SCENARIOS

“We have seen in an earlier paper how homophily can influence the ranking of minorities. The difference is that Lisette’s paper assumes more realistic social network scenarios and looks not only at ranking algorithms but also at the social recommender algorithms that platforms such as Twitter use,” concludes Fariba.

The paper “Inequality and inequity in network-based ranking and recommendation algorithms,” co-authored by CSH External Faculty members Claudia Wagner and Markus Strohmaier, has just appeared in Scientific Reports.
