Around the world, various tech systems are reinforcing gender inequalities and entrenching racial and socio-economic systems of power, Amnesty International said in a briefing titled, “Gender and Human Rights in the Digital Age.”
Marginalized groups, including women and LGBTI people, face threats to their human rights due to extensive and improper data collection practices that do not reflect their individual realities.
Governments justify such data grab tactics as a cost-saving solution to roll out automated systems in the public sector for benefit payments, while Big Tech companies hoard and deploy personal user data for their lucrative surveillance-based business models.
This under-regulated accumulation and processing of vast amounts of data not only constitutes harmful mass surveillance but also entrenches discrimination against women and LGBTI people.
“From the unchecked implementation of digital ID systems to algorithms being used in social benefits systems, there is a growing trend to integrate technology into every aspect of daily life amidst a pre-existing global gendered ‘digital divide’, where access to technology has been restricted for some, driven by patterns of historical inequality,” said Imogen Richmond-Bishop, Technology & Economic, Social and Cultural Rights Researcher at Amnesty International.
“Any technology introduced for governance is embedded within a discriminatory context of this existing digital gap.” For example, in Pakistan, the National Database and Registration Authority (NADRA) suspended the ‘X’ category on its Computerized National Identity Cards (CNICs). This category had allowed individuals to identify with a gender other than male or female.
The decision left thousands of transgender and gender-diverse individuals without valid identity documents, preventing them from exercising their fundamental rights, such as voting or accessing healthcare and employment opportunities. Registrations under the ‘X’ category were, however, resumed in September 2023.
In addition to the digital divide, there are several other barriers that women, girls and LGBTI people face when exercising their human rights in the digital space, including accessing information about sexual and reproductive health, rights and services, such as abortion.
When governments or social media platforms limit access to health information, particularly for key services for women and LGBTI people, it may constitute a violation of the right to health. This trend is on the rise in the United States, where abortion rights activists and organizations have reported removals of abortion-related content on Meta and TikTok, effectively preventing people from accessing life-saving information.
Systems that use algorithms to promote content on social media platforms can also facilitate bias by amplifying harmful and discriminatory content. Research by Amnesty International on TikTok found that the company infers a user’s personal characteristics, including gender and interests, based on the information it has about them to personalize and customize content and advertisements.
Targeted digital surveillance through the use of spyware can also constitute a form of tech-facilitated gender-based violence (TfGBV). Women and LGBTI people are targeted and surveilled for engaging in human rights activism and face a range of gendered impacts because of such targeting.
Amnesty International’s research in Thailand exposed how activists have been maliciously and unlawfully targeted with digital surveillance and online harassment by state and non-state actors, resulting in deeply harmful gendered impacts on women and LGBTI human rights defenders. Such pernicious targeting, including with the notorious Pegasus spyware, has created a “chilling effect”, which has led to self-censorship, or a withdrawal from activism in some cases.
Women and LGBTI activists in Thailand were also subjected to forms of online harassment, including doxing, smear campaigns, threats, and abusive messages, with the aim of intimidating, causing distress and silencing them.
“It is vital that governments and private actors take an explicitly gender inclusive approach to regulating technologies and addressing its harms. If these systems perpetuate discrimination and inequality for women and LGBTI people, then they should not be deployed,” said Imogen Richmond-Bishop.
In 2024, Amnesty International published a technical explainer about the Samagra Vedika systems used in the Indian state of Telangana. The technical explainer follows media reports blaming Samagra Vedika for allegedly excluding thousands of people from accessing social protection measures, including those related to food security, income, and housing.
In 2023, Amnesty International’s research, Trapped by Automation: Poverty and Discrimination in Serbia’s Welfare State, documented how many people, particularly Roma and those with disabilities, were unable to pay bills, put food on the table, and struggled to make ends meet after being removed from social assistance support following the introduction of the Social Card registry.