Social networking algorithms could be amplifying misogynistic content to teens
TikTok's algorithm appears to have been caught red-handed when it comes to misogyny: over just five days of monitoring, the amount of misogynistic content suggested by TikTok increased fourfold, according to a study by UCL researchers.
Research involving over 1,000 teenagers aged 13 to 17—conducted by UCL, the University of Kent and The Association of School and College Leaders (ASCL)—highlights a worrying reality: exposure to misogynistic content on social networks creates a vicious circle where misogynistic attitudes are reinforced and propagated.
"Initial suggested content was in line with the stated interests of each archetype, such as with material exploring themes of loneliness or self-improvement, but then increasingly focused on anger and blame directed at women. After five days, the TikTok algorithm was presenting four times as many videos with misogynistic content such as objectification, sexual harassment or discrediting women (increasing from 13% of recommended videos to 56%)," the study news release explains.
The results of the study are clear: teenagers exposed to more misogynistic content on social networks are more likely to adopt misogynistic attitudes themselves. Worse still, this exposure makes them more prone to engaging in online sexual harassment, underlining the deleterious impact of misogyny on teenagers' well-being. Dr Kaitlyn Regehr (UCL Information Studies), principal investigator on the study, said: "Algorithmic processes on TikTok and other social media sites target people’s vulnerabilities -- such as loneliness or feelings of loss of control -- and gamify harmful content. As young people micro dose on topics like self-harm, or extremism, to them, it feels like entertainment."
The complicity of algorithms
The researchers point the finger at social networking algorithms, which, by promoting content similar to what a user has already consumed, lock teenagers into bubbles of misogyny. This echo chamber amplifies hateful messages and limits exposure to contrary opinions, depriving young people of an open, inclusive vision of the world. "Harmful views and tropes are now becoming normalized among young people. Online consumption is impacting young people’s offline behaviors, as we see these ideologies moving off screens and into school yards. Further, adults are often unaware of how harmful algorithmic processes function, or indeed how they could feed into their own social media addictions, making parenting around these issues difficult," Dr Kaitlyn Regehr continues.

Last Updated: February 14, 24 12:33:52 PM IST