This publication is licensed under the terms of the Creative Commons Attribution License 4.0, which permits unrestricted use, provided the original author and source are credited.

Trigger warning: this article discusses themes of gender-based violence and sexual assault.

Introduction

Violence against women and girls (VAWG) is a growing national and international security problem that has consistently remained an elephant in the policymakers’ room. Or perhaps a mammoth in the room, given the scale of the threat.

This week, however, the Home Secretary announced plans to treat extreme misogyny as a form of extremism in a review of the UK’s counter-extremism strategy. Recognising extreme misogyny as a specific form of extremism is a step in the right direction, but more needs to be done to address its root causes and enablers, and the underlying role it plays in perpetuating VAWG.

This article characterises the threat of extreme misogyny and highlights the ways in which technology can enable gender-based violence, before providing recommendations on further steps towards tackling this problem. 

Context

Violence against women and girls (VAWG) is an umbrella term adopted from the 1993 UN Declaration on the Elimination of Violence against Women. The declaration defines violence against women as: 

‘[…] any act of gender-based violence that results in, or is likely to result in, physical, sexual or psychological harm or suffering to women, including threats of such acts, coercion or arbitrary deprivation of liberty, whether occurring in public or in private life.’

VAWG is a form of gender-based violence and covers a wide range of physical and psychological abuses against women and girls, including domestic abuse, harassment, stalking, sexual exploitation, and honour-based abuse.

Statistics paint a grim picture for women and girls in the UK. Between 2018 and 2023, VAWG-related crimes increased by 37%. Gendered violence is a factor in 40% of all female homicides. At least 1 in every 12 women is a victim of VAWG each year, and an estimated 1 in 20 people perpetrate VAWG each year. Between 2022 and 2023, the charity Stop Hate UK reported a 322% increase in incident reports where the motivation was gender. Deputy Chief Constable Maggie Blyth, the National Police Chiefs’ Council (NPCC) lead for VAWG, recently described VAWG as a ‘national emergency’.

The new Labour government has pledged to reduce VAWG by 50% over the next decade. But to meet this goal, the government will need to tackle the root causes of VAWG and keep pace with technological trends and their potential impacts on VAWG.

The messy web of misogyny, far-right extremism, and organised crime

Patriarchal masculinity emphasises the authority of men over women and the superiority of stereotypically masculine characteristics (such as strength or breadwinning) over stereotypically feminine characteristics. This rhetoric is a key driver of gender-based violence and can lead to misogyny, defined as ‘the dislike of, contempt for, or ingrained prejudice against women or girls’. Studies have found that boys and young men are recommended or shown misogynistic content online despite not actively searching for it. One study found that teenage boys seeking out content about the gym, video games, and sport were presented with extremist and anti-feminist content.

A growing body of evidence has also suggested that misogyny is linked to organised crime and far-right extremist activity. Research suggests that notions of patriarchal masculinity can be used to motivate or coerce young men (and women) to join criminal groups. White supremacist and neo-Nazi ideologies promote shared misogynist narratives that women are tools for reproduction and feminism is an existential threat. One study found that gendered threat narratives were espoused across a number of far-right online groups in the UK and Australia. Misogyny has also been described as ‘the gateway, the driver, or the early warning sign’ of violent extremism.

One notorious misogynist influencer is currently charged with human trafficking, rape, and forming an organised crime group to sexually exploit women. This influencer also harbours close links to far-right figures in the UK and US. In 2023, frontline counter-extremism practitioners warned of a rapid rise in referrals concerning misogynist views held by school pupils who had consumed content from this particular influencer. 

The above example shows the importance of situating VAWG in the context of extreme misogyny. Further, a gender lens must be applied to explore and understand overlapping extremist ideologies and their links to criminal groups or terrorist organisations.

Technology-facilitated gender-based violence (TFGBV)

Today’s rapid pace of technological development provides additional spaces and means for VAWG. The misuse of online platforms and technology for VAWG has been termed technology-facilitated gender-based violence (TFGBV). According to the Institute of Development Studies, 58% of women globally have experienced some form of online harassment on social media. Women are more negatively affected by hateful online content and are disproportionately affected by online harms such as coercive behaviour and intimate image abuse. It is also crucial to understand and highlight the intersectional nature of gender-based violence. Women and girls from minority ethnic backgrounds are more at risk of experiencing online harm.

Non-exhaustive list of examples of TFGBV

  • Online harassment: The use of online platforms to repeatedly threaten, scare, abuse, or pester someone using degrading, offensive, or insulting comments or images.

  • Cyberstalking: The use of online platforms for surveillance, sabotage, identity theft, and persistent, unwanted contact using technological means.

  • Online impersonation: Assuming someone’s online identity with malicious intent to harass, scare, or threaten an individual.

  • Intimate image-based abuse: The use of sexual imagery to harass, humiliate, or exploit. This can include the non-consensual taking or sharing of intimate imagery, or sending someone an unsolicited sexual image (cyberflashing).

  • Online grooming: Using online platforms to build relationships or connections with someone (usually a child) with the intent of sexually exploiting or harming them.

  • Sextortion: A form of blackmail in which attackers demand money, sexual acts, or explicit images and threaten victims with the exposure of intimate images or private information.

  • Doxxing: The public posting of sensitive or personal information (such as home addresses or telephone numbers) without permission.

  • Online sexual harassment: The use of online platforms and unwanted sexual behaviour to create an intimidating, hostile, degrading, humiliating, or offensive environment that makes a person feel threatened, exploited, coerced, humiliated, upset, sexualised, or discriminated against.

  • Coercion and control: The use of the internet, social media, spyware, and software to track and monitor a victim’s whereabouts and control or limit their contact with others.

TFGBV can be perpetrated using both new and old technologies and is constantly evolving. As emerging technologies become more accessible, the barrier to entry for perpetrators of VAWG is lowered. Online encryption and anonymisation services can also make TFGBV harder to investigate and prosecute. While gender-based violence can occur both online and offline, technology can make it easier for perpetrators to remain unknown, enable them to carry out abuse remotely, allow certain forms of abuse to be automated, make it easier to collectively organise or incite abuse, and significantly increase the scale and speed of such abuse.

The use of generative AI to create sexually explicit deepfakes has become a serious problem globally. One study found that 96% of online deepfake videos are pornographic, with 99.9% depicting women. Tracking devices and spyware are increasingly being used for stalking or coercive control. Several instances of sexual harassment and assault in metaverse environments have also been reported. Shockingly, in January it was reported that British police opened an investigation into the alleged virtual gang rape of a minor in an immersive game.

An important second-order effect of TFGBV is the silencing of women and girls online, known as the ‘chilling effect’. A poll by Amnesty International showed that between 63% and 83% of women altered their behaviours on social media platforms after experiencing online abuse or harassment. Targeted abuse campaigns and narratives can lead to self-censorship or discourage women and girls from participating in shared digital spaces. A UNESCO survey in 2020 showed that 73% of women journalists globally have experienced online violence. Women in public-facing roles, such as human rights activists and politicians, also face increased levels of TFGBV. This is particularly concerning, as TFGBV could seriously impact gender equality and representation in politics.

What needs to be done

There is a great deal of work to be done to tackle the myriad complex issues raised in this article. Tackling the root causes and technological enablers of extreme misogyny and understanding how it relates to other forms of serious crime must be a long-term priority for Government. However, a few more steps in the right direction could have a positive impact in the short term:

  • Build trust in policing and increase capabilities: The Government has pledged to increase the number of police officers and police community support officers (PCSOs) on the streets. However, women in England and Wales have very low confidence in the ability of police to address misogyny within the police service and to tackle crimes relating to VAWG. It will be crucial both to build trust in policing and to increase the capability of forces to use digital tools to investigate VAWG. Hiring specialist investigators who can work at the intersection of VAWG and emerging technology will be critical.

  • Hold technology companies to account: Government should hold technology companies to account for the role they play in amplifying or promoting misogynistic content. Social media platforms using advertisement-driven business models and algorithms that promote ‘high-engagement’ content must do more to prevent the dissemination of misogynistic content to boys and young men. The Online Safety Act should be enforced effectively and platforms that fail to protect users from harm must be appropriately penalised. 

  • Invest in further research on TFGBV: Finally, Government should recognise the threat and impact of TFGBV and invest in further research to understand its drivers and enablers. Further research must also be conducted on the impact of VAWG and TFGBV on women who experience intersecting discrimination due to ethnicity, race, sexual orientation, gender identity, age, nationality, disability, religion, and citizenship or refugee status.

Misogyny is pervasive and has a negative impact across all parts of society. Extreme misogynistic narratives can fuel other forms of extremism and violence, and be a driver of other national security threats. This threat must be addressed through research and evidence-based policy that specifically considers the evolving technological landscape. Much like the mammoth, extreme misogyny may one day face extinction.

The views expressed in this article are those of the author, and do not necessarily represent the views of The Alan Turing Institute or any other organisation.

Citation information

Megan Hughes, "How Technology can enable Violence Against Women and Girls," CETaS Expert Analysis (August 2024).