A new report by Amnesty International has revealed that the social media platform X, formerly known as Twitter, has enabled technology-facilitated gender-based violence (TfGBV) against the LGBTQ+ community in Poland. The findings, released on October 9, 2023, highlight that changes made under Elon Musk's ownership have resulted in a significant increase in harmful content directed at this community.
The report emphasizes that changes to X’s Community Guidelines have left the platform “awash with content constituting TfGBV” and calls for “urgent, wide-ranging reforms” to address the situation. According to Alia Al Ghussain, a researcher and advisor on technology and human rights at Amnesty International, the platform’s inadequate content moderation and lack of human rights oversight have directly contributed to abuses against LGBTQ+ individuals in Poland.
Amnesty’s research found a disturbing prevalence of homophobic and transphobic content on X, particularly among accounts that follow politicians opposed to LGBTQ+ rights. The report points out that X’s algorithm, which determines the content shown to users on the “For You” feed, prioritizes engagement, meaning that content likely to provoke interaction, including hate speech, is amplified.
The organization characterizes X’s business model as “surveillance-based,” reliant on intrusive data collection aimed at maximizing advertising effectiveness. Amnesty International notes that the platform’s inadequate funding for Polish-language content moderation has exacerbated the problem of TfGBV: only two Polish-speaking moderators are responsible for overseeing content for a country of 37.45 million people, including 5.33 million X users.
The report further criticizes X for its failure to comply with the European Union’s Digital Services Act (DSA). Under Article 34(1) of the DSA, providers of Very Large Online Platforms (VLOPs) are required to proactively identify and assess systemic risks linked to their services, including potential human rights risks arising from the design of those services, particularly their algorithmic systems.
The DSA requires that risks to fundamental rights, including the rights to human dignity, to private and family life, and to freedom of expression, be taken seriously in these assessments. Amnesty International argues that X has undermined the ability of LGBTQ+ individuals to express themselves freely, live without discrimination, and feel safe in Polish society.
In response to these concerns, the European Commission initiated an investigation into X’s practices under the DSA in December 2023. The Commission is expected to require X to make substantial changes to its recommender system by January 2025. Amnesty International asserts that the ongoing inquiries should specifically examine X’s effectiveness in addressing the risk of TfGBV.
The implications of the report are significant: it sheds light on the obstacles marginalized communities face in navigating online platforms and underscores the urgent need for social media companies to prioritize user safety and uphold human rights standards, particularly in countries where hostility toward LGBTQ+ people remains widespread.
