Text-Based (Sexual) Abuse and Online Violence Against Women: Toward Law Reform?
The Emerald International Handbook of Technology-Facilitated Violence and Abuse
ISBN: 978-1-83982-849-2, eISBN: 978-1-83982-848-5
Publication date: 4 June 2021
Abstract
The ideal of an open, all-inclusive, and participatory internet has been undermined by the rise of gender-based and misogynistic abuse on social media platforms. Limited progress has been made at supranational and national levels in addressing this issue, and where steps have been taken to combat online violence against women (OVAW), they are typically limited to legislative developments addressing image-based sexual abuse. As such, harms associated with image-based abuse have gained recognition in law while harms caused by text-based abuse (TBA) have not been conceptualized in an equivalent manner.
This chapter critically outlines the lack of judicial consideration given to online harms in British courts, identifying a range of harms arising from TBAs which currently are not recognized by the criminal justice system. We refer to non-traditional harms recognized in cases heard before the British courts, assessing these in light of traditionally recognized harms in established legal authorities. This chapter emphasizes the connection between the harms suffered and the recognition of impact on the victims, demonstrated through specific case studies. Through this assessment, this chapter advocates for greater recognition of online harms within the legal system – especially those which take the forms of misogynistic and/or gendered TBA.
Citation
Barker, K. and Jurasz, O. (2021), "Text-Based (Sexual) Abuse and Online Violence Against Women: Toward Law Reform?", Bailey, J., Flynn, A. and Henry, N. (Ed.) The Emerald International Handbook of Technology-Facilitated Violence and Abuse (Emerald Studies In Digital Crime, Technology and Social Harms), Emerald Publishing Limited, Leeds, pp. 247-264. https://doi.org/10.1108/978-1-83982-848-520211017
Publisher
Emerald Publishing Limited
Copyright © 2021 Kim Barker and Olga Jurasz. Published by Emerald Publishing Limited. This chapter is published under the Creative Commons Attribution (CC BY 4.0) licence. Anyone may reproduce, distribute, translate and create derivative works of these chapters (for both commercial and non-commercial purposes), subject to full attribution to the original publication and authors. The full terms of this licence may be seen at http://creativecommons.org/licences/by/4.0/legalcode.
Introduction
The ideal of an open, all-inclusive, and participatory internet has been undermined by the rise of gender-based and misogynistic abuse on social media platforms. In June 2018, the United Nations Special Rapporteur on Violence Against Women presented a thematic report to the UN Human Rights Council on online violence against women (OVAW), in which she noted that
Despite the benefits and empowering potential of the internet and [information and communications technology (ICT)], women and girls across the world have increasingly voiced their concern at harmful, sexist, misogynistic and violent content and behaviour online.
(UN, 2018, p. 5)
OVAW has indeed grown in prevalence, especially on social media platforms, and has an everyday impact on women and girls who participate online, be it in a private or professional capacity. In the United Kingdom, despite the increase in public focus and campaigning on this issue (Change.org, 2020; Mason, Davies, & Bakare, 2020), as well as the Government's attempts to address online harms more generally (UK Government, 2019), little has been done to address OVAW through the legal system in a comprehensive, holistic, and gender-sensitive manner. Instead, a fragmented and ad hoc approach to addressing OVAW has been deployed, resulting in a legal framework and proposals for further law reform (e.g., UK Government, 2019) which demonstrate a narrow understanding of the phenomenon of OVAW and of the real and continuing impact it has on its victims, as well as a limited and old-fashioned conceptualization of the harms resulting from online abuse (Barker & Jurasz, 2019c). Typically, reform proposals that focus on the online abuse suffered by women place the emphasis on selected aspects of OVAW – for example, when it involves image-based sexual abuse (IBSA) or when it is committed in domestic violence/intimate partner violence contexts. These proposals also largely exclude a gender-focused analysis of the responsibilities of platform providers (Barker & Jurasz, 2020c). Equally, the proposals for reform addressing online harms (not only those suffered by women) do not engage with the full spectrum of OVAW and exclude the gendered and non-traditional harms resulting from it. Instead, they engage with selected and politically appealing aspects of online abuse and online harms, such as the protection of children (to the exclusion of the fact that children and adolescents perpetrate online abuse too), or place a disproportionate focus on cases where online abuse results in physical harms such as suicide (Mason et al., 2020; UK Government, 2019).
However, one of the key exclusions within the proposed legal frameworks, both domestically and internationally, is the continuous omission of online, gendered text-based abuse (TBA) and the harms arising from it (Barker & Jurasz, 2019a, 2020c). Despite the prevalence of this form of online abuse directed at women and girls, TBA has not been conceptualized or enshrined in the domestic laws of the United Kingdom, and the harms arising from it have not been afforded thorough and modern judicial consideration. This chapter outlines the problems that such an omission causes and advocates for the inclusion of TBA as a distinct category. Such recognition would give credence to the harmful impact such abuses have on those subjected to them – something also missing from the current paradigm and redress mechanisms.
The chapter begins by offering an overview of text-based (sexual) abuse (TB(S)A) and its correlation to the much more widely known phenomenon of IBSA. The discussion then moves on to assess the harms caused by TBA, outlining their wide-ranging impact, especially upon prominent women, before exploring the misperceptions that pervade these harms leading to their lack of capture by the legal and judicial systems. The argument concludes by highlighting that a failure to capture TBA, and a failure to address the harms caused by such behaviors, is a failure to prioritize digital equality. This chapter argues for increased recognition of, and redress for, TB(S)As within the legal system to ensure digital and participatory rights for women.
Text-Based (Sexual) Abuse
TBA can be defined as “written, electronic communication containing threatening and/or disruptive and/or distressing content, such as, e.g. textual threats to kill, rape, or otherwise inflict harm on the recipient of such messages” (Barker & Jurasz, 2019a, p. xiv). Further examples of TBA can include hateful posts or tweets online, and textual content aimed to degrade or distress the recipient. TBA is frequently directed against women participating online and can take sexualized forms, including sexualized messages and threats. In such instances, TBA becomes text-based sexual abuse (TB(S)A). However, both sexualized and non-sexualized forms of TBA amount to OVAW. TB(S)A can also amount to online hate speech if the content of the message demonstrates prejudice because of a person's specific characteristics, such as race, religion, or gender. That said, while hate crime and hate speech are closely related, the concept of online hate speech is much broader than the legal categorization of hate crime – in the case of the latter, there is a requirement for an underlying criminal act to be committed and motivated by hostility or prejudice against a person's (or persons') characteristic or perceived characteristic (Barker & Jurasz, 2019a; CPS, 2017).
TB(S)A is demonstrative of the persisting gender inequalities that are mirrored in online environments. It is also an indicator of the persistent normalization of violence against women within modern societies – be it in its physical, structural, or symbolic forms – and of the reluctance of the legal system to adequately address gendered harms suffered by women, including through prevention and adequate reparation. Instead, the exclusion of TB(S)A from the legal framework on OVAW and online harms reinforces the existing status quo, whereby the harms suffered by women are not adequately addressed in the legal system, the gender-based and wide-ranging aspects of harms involved in TBA/TB(S)A are largely missing, and victims are left without adequate recourse to justice and reparation. The prevalence of TB(S)A has repercussions not only at an individual but also at a collective level. For example, the hypervisibility of TB(S)A and gender-based forms of TBA on social media platforms legitimizes the social acceptance of violence against women, and the perpetuation of gender discrimination and inequalities. Paradoxically, the hypervisibility of TB(S)A leads to a growing absence of women from public/online spaces and the invisibility of their voices within those spaces. As such, TB(S)A has silencing effects on women and girls participating online and contributes to the creation of hostile spaces for women.
This is a significant concern from the perspective of ensuring non-discrimination and equality of participation, as TB(S)A leads many women and girls to effectively withdraw from participating online and in public life (Barker & Jurasz, 2019b; Barker & Jurasz, 2019d). For instance, the UK Girlguiding Girls' Attitudes Survey (2016) showed that 49% of the 1,600 surveyed girls aged 11–16 and 44% of young women aged 17–21 felt unable to express their views in an online environment (pp. 17–19). The impact of online TBA on women's participation online and in public life was also evidenced in the responses of women politicians who were subjected to online abuse in the lead-up to the UK 2017 General Election (Dhordia, 2017), which pointed toward an extremely high volume of online abuse affecting women's participation and their ability to express themselves online. This has not been confined to the 2017 General Election. In the 2019 General Election, 18 of the 50 Members of Parliament (MPs) standing down were female, and several attributed their departure to the abuse and harassment suffered while in elected office (Barker & Jurasz, 2020a; Oppenheim, 2019). As such, TB(S)A affects women's rights to full and equal participation in public and political life, the ensuring of which is also an international obligation of states that are parties to the 1979 UN Convention on the Elimination of All Forms of Discrimination Against Women (CEDAW). Under Article 7 of CEDAW, states are obliged to take steps to prevent and eliminate discrimination against women in public and political life and to ensure their equal participation. While this obligation was articulated over 40 years ago (and therefore before the rise of social media and the internet), its contemporary interpretation ought to extend to online spaces as a primary location of modern-day political and public life (Barker & Jurasz, 2019c, p. 5).
Text-Based Sexual Abuse versus Image-Based Sexual Abuse
Despite the widespread and everyday nature of TB(S)A and its harrowing impact on victims, the law in the United Kingdom (and in many other jurisdictions worldwide) has not been reformed to address this pressing issue. This stands in stark contrast with the attitude of legislators in the United Kingdom and in selected overseas jurisdictions toward addressing a different type of OVAW – image-based sexual abuse (IBSA). IBSA, which involves the nonconsensual creation and distribution of private sexual images (McGlynn & Rackley, 2017), has been the subject of speedy legal reform in the United Kingdom and also in other jurisdictions, such as Australia, Canada, France, Israel, and Japan (Flynn & Henry, 2019). 1 The introduction of new laws tackling IBSA (or, to use the problematic term, “revenge pornography”) has become a legislative priority of a kind rarely seen in legal reforms aimed at addressing sexual abuse or other forms of gender-based violence. Although the new legal provisions dealing with IBSA have attracted some criticism from academic commentators (e.g., Henry et al., 2020; McGlynn & Rackley, 2015; Powell, Scott, Henry, & Flynn, 2020), the provisions nonetheless signified remarkable progress in addressing a modern form of violence that is typically perpetrated using the internet or otherwise facilitated by communications technology.
While harms associated with IBSA have gained recognition in law (leading to expeditious law reforms and an increase in accountability), harms caused by TBA have not been conceptualized in an equivalent manner, nor given the same attention. The juxtaposition of the prompt IBSA reforms with the lethargic response of the legal system to TB(S)A raises questions about the ultimate hierarchy of harms within the legal system and the visibility of non-image based gendered violence.
Both IBSA and TB(S)A, while involving somewhat different acts, lead to a number of harms in online and offline contexts. Harms associated with IBSA have been categorized to include wide-ranging consequences, both for the individual (e.g., the violation of a person's dignity, privacy, and physical integrity) and for broader society, including cultural harms (McGlynn & Rackley, 2017). Powell and Henry (2017) also note emotional distress, reputational harms, loss of job opportunities and, when IBSA happens in the context of domestic violence/IPV, nonconsensual acts (pp. 137–138) as likely implications of IBSA. As we demonstrate later in this chapter, many of the harms associated with IBSA also occur when TB(S)A is committed. However, unlike victims of IBSA, victims of TB(S)A are left without appropriate channels of redress because of severe shortcomings in the legal system, which does not capture this particular form of online abuse. Although TB(S)A can be directed against both men and women (by both men and women alike 2 ), TB(S)A directed against women typically takes sexualized and gender-based forms which are rarely addressed within the existing laws or by the key actors in the justice system.
The IBSA-TB(S)A dualism created by the contrasting approach of the legal and justice system toward these two forms of online abuse, if unaddressed, leads to the perpetuation of gender inequalities within the law. The unequal approach of the legislators toward IBSA as compared to TB(S)A results in the creation of a hierarchy of harms within the legal system, in which more credence and gravitas are given to forms of online abuse involving photographic representations of the victim than to textual, and frequently very violent, abuse. By maintaining such a status quo through legislative inaction, the legal system creates a self-reinforcing but false perception that IBSA is more harmful and serious than non-image-based (sexual) abuse. As such, the legislative gap results in non-image-based forms of abuse and violence against women remaining invisible within the legal system. This contributes to the erasure of the full narrative of OVAW from the legal system, coupled with a resulting lack of reliable data capturing the prevalence of TB(S)A. Ultimately this results in very limited accountability and redress for these violent acts. In turn, TB(S)A becomes a normalized form of OVAW and women's experiences of it remain on the periphery of the legal and justice systems.
Text-Based Sexual Abuse Harms: An Early Assessment
The early assessment of the treatment of online harms – especially those associated with TB(S)A – in British courts shows very slow progress toward an understanding of these modern behaviors and their consequences. Notably, in the few cases that actually involved TB(S)A and/or TBA directed against women, there is a remarkable absence of gender analysis of TB(S)A and harms in the judgments. The judgments also lack substantive judicial reasoning on the broader harmfulness of online TB(S)A – especially in relation to promoting and normalizing violence against and abuse of women and girls, as well as, where applicable, promoting the hatred of women and/or inciting violence against women through misogynistic TBA. The 2014 case of R v Nimmo and Sorley – the first judgment in England & Wales marking a successful prosecution for TB(S)A – is an early exemplar of such shortcomings, despite some notable steps in recognizing and beginning to conceptualize non-traditional harms. 3
Nimmo and Sorley (2014) dealt with online abuse in the form of abusive, threatening, and violent tweets sent by the defendants to the feminist campaigner Caroline Criado-Perez and Stella Creasy MP in relation to their involvement in the campaign to bring more women onto the Bank of England banknotes in 2013 (Barker & Jurasz, 2019a). The sentencing remarks of Judge Riddle made a significant contribution in that they highlighted the issue of the transference of harms between online and offline worlds. For example, Judge Riddle commented that the harm, while serious, was “entirely predictable,” and was exacerbated by the manner in which the threats and harassment were communicated: online, at all hours, and promising physical harm, too (R v Nimmo and Sorley, 2014, p. 2). Riddle identified these as aggravating features in the sentencing considerations, especially because “[t]he victims had no way of knowing how dangerous the people making the threats were” (R v Nimmo and Sorley, 2014, p. 3). Riddle's remarks also focused on the anonymity of the perpetrators as a factor heightening the victims' fear for their personal safety, and outlined several non-traditional harms, such as the economic and residential consequences of online TBA and the severe psychological impact on the victims. Riddle stated that “the effects of the harassment [they] received have been life-changing” (R v Nimmo and Sorley, 2014, p. 2). However, none of the judicial remarks engaged with the sexualized and violent nature of many of the tweets received by Criado-Perez and Creasy, nor elaborated on how the visibility of such content promotes (O)VAW and normalizes such forms of gendered abuse and violence within society.
Similarly, the 2017 judgment and sentencing comments in R v Viscount St Davids – a case dealing with extreme online abuse suffered by the anti-Brexit campaigner Gina Miller – lack any gender analysis of the TB(S)A received by Miller and of its impact. Miller was subjected to racist, sexist, and misogynistic abuse, yet the judgment focused only on the racist aspect, most likely because of the aggravation based on race, which resulted in a harsher sentence for the perpetrator (R v Viscount St Davids, Sentencing Comments, para 30). 4 As such, the intersectional nature of the TBA directed at Miller was rendered invisible in the official legal documents concerning the case, which therefore provide only a partial record of the nature, the extent, and the impact of the overwhelming amount of abuse she received. For instance, Miller received messages stating that the “Jo Cox killing would be good for you” (Anthony, 2018) – messages which suggest that the murder of Jo Cox is being used as a symbolic point of reference for extreme acts of violence targeting women. The harmful nature of such tweets, and the socio-cultural harms they perpetuate, have thus far not been addressed or condemned by the legal system, which thereby overlooks incitement to VAW. Despite a number of social campaigns against VAW and the “in principle” commitment to tackling VAW by the Government and agencies of the justice system, these everyday, symbolic acts of violence continue to be inflicted upon women on social media.
While not addressing gender, Judge Arbuthnot made an important recognition in her sentencing remarks in R v Viscount St Davids (2017) when noting that the public profile of the victim does not justify such “warped behaviour” (para 3) being directed at them. This is significant, especially in light of proposals in the United Kingdom to afford higher levels of protection to politicians (UK Government, 2018) and high-profile individuals (Barker & Jurasz, 2020b). As we argue elsewhere (Barker & Jurasz, 2017)
…the profile of the alleged victim should be entirely irrelevant in terms of the alleged abuse received – the harm and the impact of the harm should be the determining features in tackling issues of online abuse.
(p. 15)
It is equally important to recognize that online abuse is not only a matter concerning celebrities and politicians. As Miller's case evidences, extreme levels of abuse can be – and frequently are – directed at women involved in high-profile legal proceedings, causing harms not only to the individual but also undermining democratic values and fundamental rights. The harmful impact of online abuse, especially TBA, on the latter was explicitly recognized by the then President of the UK Supreme Court, Lord Neuberger, at the opening of the Article 50 appeal case brought by Gina Miller. While making the court order prohibiting publication of the names of former claimants in the proceedings, of the home address of Gina Miller and other parties, and of any information which could identify the interested parties in the proceedings, Lord Neuberger stated:
We have made this order largely because various individuals have received threats of serious violence and unpleasant abuse in emails and other electronic communications. Threatening and abusing people because they are exercising their fundamental right to go to court undermines the rule of law. Anyone who communicates such threats or abuse should be aware that there are legal powers designed to ensure that access to the courts is available to everyone.
(Scottish Legal News, 2016)
Given the overwhelming absence of substantive judicial reasoning concerning social, democratic, and participatory harms associated with online abuse and TB(S)A, Lord Neuberger's statement (although not a part of the Miller judgment) provides a significant recognition of the impact of online TBA on the rule of law, as well as on one's fundamental rights. It is nonetheless regrettable that similar recognition has not been made in relation to the impact of TBA on the normalization and promotion of (O)VAW.
Misperceiving Text-Based (Sexual) Abuse: Time to Capture Its Harmful Impact?
In addressing the phenomenon of TB(S)A, the approach witnessed to date is one that focuses on the ability of various and disparate legal provisions to capture the behaviors (not the harms) that amount to TB(S)A. The essential weakness of this attitude is that it perpetuates misperceptions about the behaviors, the motivations behind them, and the harm inflicted by allowing them to go unchecked, unchallenged, and uncaptured. For instance, in tackling TB(S)A that includes rape and sexualized death threats, as in Nimmo and Sorley (2014), the predominant legal response in England and Wales is to address this through the criminal law provisions dealing with the misuse of a public communications network under section 127 of the Communications Act 2003. In such convictions, and in those that followed, such as R v Nunn (2014), no consideration was given, nor criminal charge levied, in respect of the more serious criminal behavior of making threats to kill (Barker, 2018). The inaction in tackling these behaviors through the criminal law is broadly reflective of the wider perception that none of this is “serious” and that it resides purely “online” (Barker & Jurasz, 2020c). This attitude was particularly evident in the response of the local police when Stella Creasy MP initially reported the abusive and threatening tweets to her local police via Twitter. In their response, the police indicated that she would need to report the issue if it was not an overreaction (Barnett, 2014). By extension, responses indicating that such behaviors are trivial imply that there are no significant repercussions and that it is therefore acceptable to engage in such activity. This perpetuates the misperception that TB(S)A is not a “real” problem and, when compared to IBSA, not a visible one either.
TBAs have frequently been shrugged off as less serious than offline abuse and as “part and parcel” of what happens online (Barker & Jurasz, 2014), including by actors in the criminal justice system (Holt, Lee, Liggett, Holt, & Bossler, 2019). Such an attitude – noted in early reports of what amounts to the trivialization of TB(S)A – perpetuates the idea that such behaviors are to be expected. Given the significant impact that such threats and harassment can have on those subjected to them, there is growing recognition of the harmful and long-lasting impact, and the increasing prevalence, of TB(S)A. The Inter-Parliamentary Union (2018) highlighted that 85.2% of the women parliamentarians surveyed had been subjected to some form of psychological violence – an overwhelming majority of those surveyed and, more disappointingly, an increase from 81.8% just two years earlier (Inter-Parliamentary Union, 2016). Similarly, Amnesty International highlighted the scope of the problem in 2017 (Dhordia, 2017), and Tim Berners-Lee – the inventor of the World Wide Web – reiterated on the web's 31st birthday that it remains an incredibly hostile environment for women (Berners-Lee & Berners-Lee, 2020). Yet despite all of these indicators, the notion lingers that TB(S)A is not “real,” is not harmful, and does not require attention. This comes almost a decade on from the Gamergate fiasco, 5 which first saw Zoe Quinn, and then Anita Sarkeesian, raise widely the topic of the online abuse of women – something which Sarkeesian (quoted in Webber, 2017) says is “very easily done and there are very few consequences for it” (para 5; see also Barker & Jurasz, 2014). The notion that TB(S)A is not harmful persists, and is not changing despite the increased attention that the phenomenon receives. Awareness raising does not lead to redress or reform, and instead often confuses the legal landscape (Barker & Jurasz, 2020d); to date, little meaningful action has been taken to recognize that persistent volumes of TB(S)A lead to harms not traditionally conceptualized in law.
The transferability and intrusiveness of TB(S)A are also frequently misperceived – with suggestions to “mute” or “hide” tweets (and, by extension, the TB(S)A) not uncommon, and indeed encouraged, in particular by Twitter itself (Cuthbertson, 2019). Given the “always on” nature of digital life, TB(S)A is a personal, pervasive, and intrusive issue that cannot be left outside of our homes. While there have been few prosecutions for the high-profile abuses suffered by prominent women, TB(S)A is much more widespread, and much more encompassing in its impact. For instance, while courts have given judicial recognition to reputational harms that can manifest themselves in civil suits – such as the harmful and defamatory impact of tweets in Monroe v Hopkins (2017) – no such recognition has been forthcoming in a criminal context for similarly abusive tweets. Indeed, in Monroe (2017), the court found that there had been “serious harm” inflicted on the claimant
…on the straightforward basis that the tweets complained of have a tendency to cause harm to [Ms. Monroe's] reputation in the eyes of third parties, of a kind that would be serious to her
(para 70)
If serious harm can be recognized in reputational circumstances in civil law cases, there is clear precedent for the recognition of the harmful impact TB(S)A can have. This, along with other (legally) recognized harms, suggests that there remains a mental block on the recognition of the general harm that can be caused by TB(S)A. For instance, while there is no specific judicial recognition given to TB(S)A as a hate crime, the harm caused by disablist abuse is recognized by the hate crime framework in England and Wales. Again, though, the online aspect of this has not been judicially considered, nor has it been addressed beyond the hate crime perspective.
The treatment of TB(S)A also exposes a stark distinction between the recognition of physical harms, such as assault, and their digital equivalents. Where physical harms have occurred, such as the murder of Jo Cox, the harm is clearly captured by the legal system. Yet, where there is a real text-based threat of physical harm – which could in itself be sufficient to give rise to criminal charges – the threat receives much less attention when it is communicated electronically, again indicating that TB(S)A is still considered to be something of a lesser element in legal conceptions of harm.
As Table 14.1 demonstrates, TB(S)A invades homes, social groups, families, and non-work interactions. It is a continuous barrage of abuse that causes ongoing impacts; something Jess Phillips MP (2017) has made very clear when she indicated that its invasiveness can catch you off-guard because it is unexpected in the “warmth and safety” (p. 208) of the family home. This point was also recognized by former MPs Heidi Allen and Nicky Morgan in their decisions to stand down from front-line politics. What is perhaps the most harmful aspect of this – but one which goes almost unrecognized by the existing legal structures – is the expectation that prominent and visible women, especially online, will be subjected to TB(S)A because “any publicity will come with a whole load of abuse” (Phillips, 2017, p. 217). The enduring harm has significant psychological impacts – on both the victims and those who witness the abuse – reiterating the dangers of engaging online, but also the risks associated with being engaged digitally. Harm to bystanders, and harms to other “followers” of the target, are likewise not categorized nor recognized by the courts in respect of TB(S)A. These harms signal to women that they are not welcome online and that their opinions will be dismissed, shouted down, or manipulated. This is not, of course, confined purely to online spaces, with such behaviors replicating the shutting down of women's voices in many offline spaces, too (Vera-Gray, 2016).
Table 14.1.
Type of Harm a | Impact | Victim | Context |
---|---|---|---|
Residential | Extra security measures, panic button installed in the home. | Stella Creasy MP | Campaign to include more women on the Bank of England banknotes (2014). |
Economic | Money and time spent on trying to remain as untrackable as possible. | Caroline Criado-Perez | Campaign to include more women on the Bank of England banknotes (2014). |
Emotional | Long-term personality changes; feelings of panic, fear, and horror; distress. | Caroline Criado-Perez | As above. |
Physical | Online abuse; violent threats; assassination in her constituency in June 2016. | Jo Cox MP | Jo Cox was a Labour politician who was an outspoken feminist and who stood up to online abuse, and violence against female politicians. She campaigned for greater social equality and identified as a remainer in the lead up to the 2016 Brexit referendum. |
Social | The invasiveness of “always on” mobile devices, allowing the intrusion of text-based sexual abuse into family and social life. | Jess Phillips MP | Jess Phillips first received tranches of rape threats when wrapping presents for her children in her home. |
Reputational | Regularly subjected to abuse that alleges she is too ugly to be raped, with repeated slurs regarding her sexuality. | Mhairi Black (S)MP | In one of her speeches in Parliament, Black used the word cunt four times when reading out examples of the online abuse she receives. |
Intersectional | Extreme racist and misogynistic abuse; distress; fear for the security of herself and her family, resulting in employing a security detail. | Gina Miller | Campaign to stop Brexit (2016–2017); brought the case which led to Parliament voting on triggering Article 50 of the Treaty on European Union. |
Democratic | As above. Broader impact of the online text-based abuse on democratic values and the rule of law (as suggested by Lord Neuberger). | Gina Miller | As above. |
Disablist | Disablist online abuse because of Asperger's syndrome; comparisons between Thunberg and Nazi propaganda; portrayal as “deeply disturbed”; death threats and death wishes (e.g., Arron Banks's tweets in August 2019); a cartoon portraying Thunberg being sexually assaulted. | Greta Thunberg | Campaigning for addressing climate change at a local and global level. |
Ageist | Sexist and ageist abuse (e.g., a sexist tweet from the French philosopher Bernard Pivot) b ; comparisons to a “little brat,” “prophetess in shorts,” “deeply disturbed teen.” | Greta Thunberg | As above. |
a Two other categories of harm which are pervasive and affect all of the individuals listed in this table are psychological harm and emotional distress. For instance, Joanna Cherry QC (S)MP, Nicky Morgan, and Aileen Campbell MSP in particular have outlined the severe impact that TB(S)A has had on their lives.
b “In my generation, the boys were looking for the little Swedes who had the reputation of being less uptight than the French girls. I imagine our astonishment, our fear, if we had approached a Greta Thunberg” (@bernardpivot1, 25 September 2019, 7.38 a.m.: Dans ma génération, les garçons recherchaient les petites Suédoises qui avaient la réputation d’être moins coincées que les petites Françaises. J'imagine notre étonnement, notre trouille, si nous avions approché une Greta Thunberg…). Translation by the authors.
While perceptions of TB(S)A remain unchanged, its consequences also persist, both in their impact and in the spread of harm. To protect themselves – especially where the criminal justice system has proven unwilling or unable to act against the perpetrators – women subjected to TB(S)A are opting out of online visibility and online interactions (Barker & Jurasz, 2020b). Inaction compounds the social harms (Table 14.1) that come from making online spaces more hostile to women than to other groups. This is, in itself, a further harmful response to the original harm – but one which manifests its impact more widely than upon an individual victim. In a democratic, modern society, the internet is a source of information but also of access to networks, interactions, family, friends, and work. By essentially being driven to online invisibility through online hostility and TB(S)A, women are disengaging as a means of self-protection first, and as a response second, resulting in social, democratic, and reputational harms, too. This is damaging not just to the participatory rights of the victims themselves but to society more broadly, where visible women with powerful messages are forced into self-silencing. This damages wider societal debates and interactions, particularly those relating to topics of interest to other women.
Consequently, where the criminal justice system fails to protect women from TB(S)A, the original harm from the threats and abuse persists, manifesting itself in democratic, social, and participatory harms (see Table 14.1). This is particularly evident through the – unfortunately increasingly frequent – examples of the UK General Election in 2019 and the forthcoming Scottish Parliament Election of 2021, where a number of high-profile women politicians – including former government ministers (Oppenheim, 2019) – have opted, or are opting, out of seeking re-election because of the impact of TB(S)A, especially trolling and online abuse (Barker & Jurasz, 2020b; McLaughlin, 2020; Paton, 2020). Not only are these women victims of crime, but they are also victims of participatory and democratic silencing – harms very evident but not (yet) judicially recognized in this context. Disappointingly, given the UK Government's (2019) commitment to addressing online harms, none of these harms has been specifically listed or explicitly discussed in the Online Harms White Paper (which lists 23 types of online harm (p. 31)), which further evidences that such harmful online behaviors, although readily recognizable, are not a priority for the legislative reform agenda. The lackluster reaction of the Government in this respect reflects the manifestation of silencing and continues to contribute to the production of a system which excludes women and the voices of women. This active silencing – arguably necessary as a mechanism of self-care by victims in the absence of other responses – is the realization of gender stereotyping moving from the offline to the online, reflecting the transference of harm from online TB(S)A to offline lives.
Beyond this, the failure to recognize and address TB(S)A fails to appreciate fully the core elements of digital democracy and the protections that are required. Selective inaction on TB(S)A compared with – for example – offline public order offenses 6 sends a clear signal that online criminality will go unpunished and unprosecuted. Not only does this undermine the equality that the internet (purportedly) offers, but it also weakens the usefulness of a virtual common space for – otherwise impossible – gatherings, solidarity, and protests for change, such as the Arab Awakening 7 (Barker & Baghdady, 2017). This is broadly reflective of a much larger problem in that the existing mechanisms – including reporting – are designed to mask, rather than tackle, the problem of TB(S)A itself. This approach, built around technical tools such as muting and blocking (Arthur & Kiss, 2013), suggests that the problem is not really a “big deal” and is really just a form of social “banter,” rather than anything more menacing, despite the clear mens rea presented in the sending of threatening and abusive communications. Given that the threats are clear statements as to the intention of the sender, the (mis)perceptions of TB(S)A are even more difficult to justify. That said, the existing criminal law framework makes no provision for addressing online forms of gender-based violence such as TB(S)A, yet the official position remains steadfast in its insistence that those behaviors are captured by existing legislative provisions (Communications Select Committee, 2014–2015), despite arguments that this is not the case (Barker & Jurasz, 2019a).
Conclusion: Text-Based Sexual Abuse as Digital Equality?
The participatory harm of TB(S)A is significant in a number of ways: it silences women as victims, and it also signals to other women the target they will become should they dare to step up, speak out, and use the internet as a platform for debate and engagement. This in turn leads to a reduction in the volume and quality of voices online, especially in politics, and significantly lessens democracy more broadly. The web has been battling on this point for 31 years and is still a place that is hostile to and for women (Berners-Lee & Berners-Lee, 2020). These harms culminate in a much broader social harm whereby interactions and opportunities are lost, and where the ability to socialize and network online is compromised. This is tantamount to the silencing and exclusion not only of individuals but also of groups and entire sectors of society, reinforcing gender inequality in the digital arena. It is also evidence of the intersectional discrimination (Table 14.1) that is prevalent in the context of the harms suffered by prominent women online.
All of these non-traditional harms contribute to a much broader, and more problematic, further harm: that suffered by the bystander. The misperception of TB(S)A and the inability to address the original problem appropriately cause other, non-direct harms, especially to those to whom the abuse is visible. In other instances dealing with problematic tweets, including Monroe v Hopkins (2017), senior courts have shown the capability to assess non-traditional harms, particularly serious harm to reputation or, as in Chambers v DPP (2012) when (joke) tweets were posted threatening to bomb an airport over canceled flights, harm to the wider public. Clearly there is a distinction between threatening communications that may amount to the disruption of travel plans and physical harm to multiple victims – where action was swiftly taken and prosecution followed – and the TB(S)A received by women, which results in the relocation of families, panic button installations, and personal security recruitment, but where little action is taken without repeated requests for assistance.
Misperceptions, and a failure to capture the harmful impact of TB(S)A, mean that it is a phenomenon which goes largely unchecked. Other areas, including IBSA, have seen legal responses fast-tracked to address the harms they cause, though these have not been without criticism as to their capacity to fully address the harms or lived experiences of IBSA victims, particularly in England and Wales (see McGlynn & Rackley, 2015; Powell et al., 2020). The inequality of treatment between these two related phenomena reinforces the message that disparity, vitriol, and abuse are to be expected, and are a natural consequence, for women and girls participating online.
To protect participatory rights and tackle TB(S)A, urgent reforms are needed. These reforms need to spread beyond the alteration or introduction of legislative provisions – the law alone cannot (and should not) be charged with tackling the phenomenon. The legal system needs to pay direct attention to the issue, rather than suggesting (again) that the existing law is capable of addressing it; it must capture TB(S)A specifically within the online harms landscape as a matter of urgency, before the internet becomes a place entirely too hostile to women. TB(S)A is a pressing issue for the digital era, but also for digital equality.
Notes
1. France: Digital Republic Law 2016; Israel: Prevention of Sexual Harassment Law, 5758-1998, amended in 2014; Canada: section 162.1 Criminal Code (amended through Bill C-13, the Protecting Canadians from Online Crime Act, in force 2015); Japan: Shiji Seiteki Gazou Kiroku No Teikyotō Niyoru Higai No Boushi Nikansuru Hōritsu [Act on Prevention of Victimization Resulting from Provision of Private Sexual Image], Law No. 126 of 2014; England and Wales: section 33 Criminal Justice and Courts Act 2015; Scotland: Part 1, section 2 Abusive Behaviour and Sexual Harm (Scotland) Act 2016; Northern Ireland: Part 3, section 51 Justice Act (Northern Ireland) 2016.
2. For instance, the Daily Mail columnist Katie Hopkins sent a tweet to the journalist Laurie Penny that referred to Penny as a woman of ISIS and implied that she should be subjected to the sexual violence perpetrated against women by ISIS members. The tweet, and the context in which it was sent, was outlined in Monroe v Hopkins (2017): “Hopkins tweeted a photograph of the vandalised war memorial, and a link to @PennyRed's tweet of 8:04 p.m. with these observations: ‘@PennyRed thinks this is OK. Burn her passport, bulk buy her lube & make her a woman of ISIS.’” (para 14).
3. Here, we define non-traditional harms as harms which have not been recognized in legislation nor in judicial statements (for instance, participatory harms). We list a number of non-traditional harms, together with specific examples, in Table 14.1.
4. Given that neither sex nor gender is protected as a characteristic in the hate crime framework of England and Wales, there is no statutory aggravation based on either sex or gender.
5. Gamergate is the catch-all term for the vicious online hate, trolling, and harassment campaigns conducted largely through the use of hashtags, which arose following the release of the game Depression Quest by Zoe Quinn in 2013. The dichotomy between fans and critics highlights the contrast between the traditional preserves of online gamers and the newer era of gamers. The orchestrated and anonymous “campaign” now known as Gamergate extends to the targeting of prominent feminist media critics, as well as independent game developers.
6. Under the Public Order Act 1986 (POA), offenses usually require the perpetrator and the victim (or another who could be similarly affected) to be in physical proximity to the abusive behavior – something quite distinct from the online environment in which TB(S)A is directed at women – yet prosecutions under the POA provisions are frequent and numerous.
7. Alternatively known as the Arab Spring, a series of pro-democracy and anti-government protests, demonstrations, rebellions, uprisings, and mass gatherings in the Middle East and Northern Africa in 2010 and 2011.
References
Anthony, A. (2018, August 19). Gina Miller: I was absolutely shocked, I didn't know those attitudes still existed. The Observer. Retrieved from https://www.theguardian.com
Arthur, C., & Kiss, J. (2013, July 29). Twitter abuse: What can be done to stop trolling? The Guardian. Retrieved from https://www.theguardian.com
Barker, K. (2018). R v Nimmo and Sorley (2014). In E. Rackley & R. Auchmuty (Eds.), Women's legal landmarks: Celebrating 100 years of women and law in the UK and Ireland (pp. 608–610). Oxford: Hart.
Barker, K., & Baghdady, C. (2017). Building online hybrid identities. In N. Lemay-Herbert & R. Freedman (Eds.), Hybridity: Law, culture and development (pp. 218–234). London, New York, NY: Routledge.
Barker, K., & Jurasz, O. (2014). Gender, human rights and cybercrime: Are virtual worlds really that different? In M. Asimow, K. Brown, & D. Papke (Eds.), Law and popular culture: International perspectives. Newcastle upon Tyne: Cambridge Scholars Publishing.
Barker, K., & Jurasz, O. (2017, November). Submission of evidence on online violence against women to the UN special rapporteur on violence against women, its causes and consequences, Dr Dubravka Šimonović. Retrieved from http://oro.open.ac.uk/52611/
Barker, K., & Jurasz, O. (2019a). Online misogyny as a hate crime: A challenge for legal regulation? London, New York: Routledge.
Barker, K., & Jurasz, O. (2019b). Online misogyny: A challenge for digital feminism? Journal of International Affairs, 72(2), 95–113.
Barker, K., & Jurasz, O. (2019c, June). Online harms white paper consultation response (on file with authors).
Barker, K., & Jurasz, O. (2019d). Online violence against women: Addressing the responsibility gap? [Blog post]. LSE WPS. Retrieved from https://blogs.lse.ac.uk/wps/2019/08/23/online-violence-against-women-addressing-the-responsibility-gap/
Barker, K., & Jurasz, O. (2020a). Gendered misinformation & online violence against women in politics: Capturing legal responsibility? CoInform. Retrieved from https://coinform.eu/gendered-misinformation-online-violence-against-women-in-politics-capturing-legal-responsibility/
Barker, K., & Jurasz, O. (2020b). Online harms and Caroline's law – What's the direction for law reform? OU News. Retrieved from http://www.open.ac.uk/research/news/online-harms-and-carolines-law%E2%80%93whats-direction-law-reform
Barker, K., & Jurasz, O. (2020c). Online violence against women as an obstacle to gender equality: A critical view from Europe. European Equality Law Review, 1, 47–60.
Barker, K., & Jurasz, O. (2020d, March 26). Why misogyny and hate crime reforms need more than slick campaigns. The Conversation. Retrieved from https://theconversation.com/why-misogyny-and-hate-crime-reforms-need-more-than-slick-campaigns-134265#comment_2180791
Barnett, E. (2014, September 29). Stella Creasy hits out at police over Twitter troll case failures. The Telegraph. Retrieved from https://www.telegraph.co.uk
Berners-Lee, T., & Berners-Lee, R. (2020, March 12). Why the web needs to work for women and girls. Web Foundation. Retrieved from https://webfoundation.org/2020/03/web-birthday-31/
Chambers v DPP. (2012). EWHC 2157.
Change.org. (2020). Caroline's law. Retrieved from https://www.change.org/p/instagram-make-social-media-a-safer-place
Communications Select Committee. (2014–2015). Social media and criminal offences (HL Paper 37). Retrieved from http://www.publications.parliament.uk/pa/ld201415/ldselect/ldcomuni/37/3704.htm
CPS. (2017). Hate crime. Retrieved from www.cps.gov.uk/hate-crime
Cuthbertson, A. (2019, March 1). Twitter reveals new ‘hide tweet’ feature to let users moderate replies. The Independent. Retrieved from https://www.independent.co.uk/life-style/gadgets-and-tech/news/twitter-new-feature-hide-tweet-update-moderate-replies-a8803301.html
Dhordia, A. (2017, September 3). Unsocial media: Tracking twitter abuse against women MPs. Medium. Retrieved from https://medium.com/@AmnestyInsights/unsocial-media-tracking-twitter-abuse-against-women-mps-fc28aeca498a
Flynn, A., & Henry, N. (2019). Image-based sexual abuse: An Australian reflection. Women and Criminal Justice, online first. doi:10.1080/08974554.2019.1646190
Girlguiding. (2016). Girls' Attitudes Survey 2016. Retrieved from www.girlguiding.org.uk/globalassets/docs-and-resources/research-and-campaigns/girls-attitudes-survey-2016.pdf
Henry, N., McGlynn, C., Flynn, A., Johnson, K., Powell, A., & Scott, A. J. (2020). Image-based sexual abuse: A study on the causes and consequences of non-consensual nude or sexual imagery. London, New York: Routledge.
Holt, T. J., Lee, J. R., Liggett, R., Holt, K. M., & Bossler, A. (2019). Examining perceptions of online harassment among constables in England and Wales. International Journal of Cybersecurity Intelligence & Cybercrime, 2(1), 24–39.
Inter-Parliamentary Union. (2016, October). Sexism, harassment, and violence against women parliamentarians. Issues Brief. Retrieved from https://www.ipu.org/resources/publications/issue-briefs/2016-10/sexism-harassment-and-violence-against-women-parliamentarians
Inter-Parliamentary Union. (2018, October). Sexism, harassment and violence against women in parliaments in Europe. Issues Brief. Retrieved from https://www.ipu.org/resources/publications/issue-briefs/2018-10/sexism-harassment-and-violence-against-women-in-parliaments-in-europe
Mason, R., Davies, H. J., & Bakare, L. (2020, February 17). No 10 calls on social media firms to act after Caroline Flack death. The Guardian. Retrieved from https://www.theguardian.com
McGlynn, C., & Rackley, E. (2015, October 29). More than just ‘revenge porn’: Tackling the misuse of private sexual images. The Holyrood. Retrieved from https://www.holyrood.com/comment/view,more-than-just-revenge-porn-tackling-the-misuse-of-private-sexual-images_6029.htm
McGlynn, C., & Rackley, E. (2017). Image-based sexual abuse. Oxford Journal of Legal Studies, 37(3), 534–561.
McLaughlin, M. (2020, March 9). Quitting MSPs blame online trolls and long hours. Sunday Times. Retrieved from https://www.thetimes.co.uk
Monroe v Hopkins. (2017). EWHC 433 (QB).
Oppenheim, M. (2019, October 31). General Election: Women MPs standing down over ‘horrific abuse’, campaigners warn. The Independent. Retrieved from https://www.independent.co.uk
Paton, C. (2020, March 15). Vitriol is driving women out of politics says finance minister Kate Forbes. Sunday Times. Retrieved from https://www.thetimes.co.uk
Phillips, J. (2017). Everywoman: One woman's truth about speaking the truth. London: Hutchinson.
Powell, A., & Henry, N. (2017). Sexual violence in a digital age. Basingstoke: Palgrave Macmillan.
Powell, A., Scott, A. J., Henry, N., & Flynn, A. (2020). Image-based sexual abuse: An international study of victims and perpetrators. Summary report. Melbourne: RMIT University.
R v Nimmo and Sorley. (2014, January 24). Westminster Magistrates' Court.
R v Viscount St Davids. (2017, July 11). Westminster Magistrates' Court.
R v Nunn. (2014, September 29). City of London Magistrates' Court.
Scottish Legal News. (2016, December 5). Lord Neuberger warns against publication of parties' details in Article 50 case and emphasises role of Supreme Court. Retrieved from https://www.scottishlegal.com/article/lord-neuberger-warns-against-publication-of-parties-details-in-article-50-case-and-emphasises-role-of-supreme-court
UK Government. (2018, February 6). PM speech on standards in public life. Retrieved from https://www.gov.uk/government/speeches/pm-speech-on-standards-in-public-life-6-february-2018. Accessed on February 6, 2018.
UK Government. (2019, April). Online harms white paper (CP 57). Retrieved from https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/793360/Online_Harms_White_Paper.pdf
United Nations Human Rights Council. (2018, June 18). Report of the Special Rapporteur on violence against women, its causes and consequences on online violence against women and girls from a human rights perspective, A/HRC/38/47. Retrieved from https://www.ohchr.org/EN/HRBodies/HRC/RegularSessions/Session38/Documents/A_HRC_38_47_EN.docx
Vera-Gray, F. (2016). Men's stranger intrusions: Rethinking street harassment. Women's Studies International Forum, 58, 9–17.
Webber, J. (2017, October 16). Anita Sarkeesian: ‘It's frustrating to be known as the woman who survived #Gamergate’. The Guardian. Retrieved from https://www.theguardian.com