Digital Inequalities: Data Workers, Gender Bias and Digital Violence
Digital technologies are increasingly shaping our lives, our perceptions and our world of work. They are not neutral: stereotypical gender images, power relations and global inequalities are reflected in digital spaces and in AI systems, and are often exacerbated by them. Contrary to the widespread idea of AI as a neutral tool, many applications reproduce and deepen existing discrimination.
Women and other marginalised groups are particularly affected by violence, devaluation and exclusion in digital spaces. At the same time, the foundation of many digital services and AI systems remains largely invisible: the work of data workers in the Global South who create training data, moderate content and keep systems running – often under precarious, stressful and sometimes traumatic conditions.
Data workers
Behind many digital applications are data workers who sort, check, tag or moderate large amounts of data. This work often takes place under invisible, insecure and poorly paid conditions. Added to this is the psychological burden that can arise from filtering violent and pornographic images, which the commissioning companies often fail to adequately support workers through. In this new, precarious and largely unexplored field of work, it is important to strengthen the labour rights of everyone involved, and above all not to repeat the mistakes of other industries by leaving multiple forms of discrimination among workers in the Global South undetected simply because no one asks them.
Gender bias
AI systems often adopt and reinforce existing societal prejudices. This can be seen in distorted datasets, discriminatory decisions or a lack of representation of women and marginalised groups. The term bias refers to this systematic distortion: generative AI builds on huge amounts of data – such as texts, websites or images – and reproduces common patterns from them, even if they are sexist, racist or hostile to queer people.
This makes visible how AI systems absorb and exacerbate societal disadvantages rather than providing ‘neutral’ responses. Training data, system prompts and power structures in the background determine which perspectives are over-represented and which – for example those of women, young people or LGBTIQ* persons – hardly appear at all. Gender bias in digital systems is therefore not a technical fringe problem but a central question of power, participation and justice, one that political decision-makers in particular must address.
Digital violence
Digital violence particularly affects women, activists and other marginalised groups. It includes hate speech, harassment, threats, sexualised attacks, nudify tools and deepfake nude images created without consent. Such forms of digital violence restrict not only individual safety but also public debate and digital participation. Digital violence is a social and legal issue, not just a technical one. The existing legal framework at EU and national level therefore needs to be sharpened and extended in a targeted manner. The production and distribution of non-consensual sexualised images must in principle be punishable. In addition, clear liability structures are needed that hold not only perpetrators but also platforms and service providers accountable.
Our goals
This is what we want to achieve:
- Policy makers are familiar with the labour, power and exploitation relations in the digital supply chain, such as those of data workers, and know how to make them visible from a feminist and global perspective and address them politically.
- Policy makers recognise how gender bias arises in AI systems and how power and gender relations are reflected in data, algorithm design and decisions about supposedly ‘neutral’ technology.
- Policy makers understand how misogynistic structures emerge, spread and affect women and marginalised groups – from hate speech to digital violence to AI-based assaults.
More information
Make digital spaces more feminist: FEMNET sets new priorities
Digital violence: What Europe can do and what Germany still needs to learn