Social media algorithms amplify misogynistic content to teens, study finds.
TikTok's algorithm seems to have been caught red-handed when it comes to misogyny. In the space of just five days of monitoring, the level of misogynistic content suggested by TikTok increased fourfold, according to a study by UCL researchers.
Research involving over 1,000 teenagers aged 13 to 17—conducted by UCL, the University of Kent and The Association of School and College Leaders (ASCL)—highlights a worrying reality: exposure to misogynistic content on social networks creates a vicious circle where misogynistic attitudes are reinforced and propagated.
"Initial suggested content was in line with the stated interes²ts of each archetype, such as with material exploring themes of loneliness or self-improvement, but then increasingly focused on anger and blame directed at women. After five days, the TikTok algorithm was presenting four times as many videos with misogynistic content such as objectification, sexual harassment or discrediting women (increasing from 13% of recommended videos to 56%)," the study news release explains.
The results of the study are clear: teenagers exposed to more misogynistic content on social networks are more likely to adopt misogynistic attitudes themselves. Worse still, this exposure makes them more prone to engaging in online sexual harassment, underlining the deleterious impact of misogyny on teenagers' well-being. Dr Kaitlyn Regehr (UCL Information Studies), principal investigator on the study, said: "Algorithmic processes on TikTok and other social media sites target people's vulnerabilities -- such as loneliness or feelings of loss of control -- and gamify harmful content. As young people microdose on topics like self-harm, or extremism, to them, it feels like entertainment."