First Monday

Feminist Data Studies and the emergence of a new Data Feminist knowledge domain by Hana Marcetic and Jan Nolin



Abstract
Mass participation in social networking sites and online life combined with the development of tracking technology facilitates gathering data on unprecedented scales. The uptake of data collection during the 2010s coincided with the emergence of data science and data studies, along with critical perspectives such as critical data and critical algorithm studies. This paper explores one such critical perspective: Data Feminism, which merges the theories of intersectional feminism and critical data studies. Bibliometric text analysis of articles, conference papers, essays, and commentary was conducted in VOSviewer software to find commonalities among terms within texts. The most prominent terms and keywords in the research area of Data Feminism identified in this manner informed the close reading that followed. Six clusters of terms were identified, with the two largest clusters formed around the terms “big data” and “artificial intelligence” respectively. We also explored the boundaries, movements and centralities within the six clusters.

Contents

Introduction
Previous research
Feminist Data Studies and Data Feminism
Method
Results
Discussion
Conclusion

 


 

Introduction

Types of knowledge and subject areas form knowledge domains, allowing researchers to develop a specific language to talk about subject areas in unique ways (Hjørland, 2002). According to Hjørland, each domain has its own approaches, paradigms and schools, and their logic should be understood from within. This article focuses on feminist theories and perspectives that study contemporary critical issues concerning how data interacts with people. The starting point is that such studies serve as the setting for the emergence of a new knowledge domain within feminist studies.

It is difficult to evaluate one knowledge domain from the standpoint of another. For this reason, we approach our task with caution. Our ambition is to provide new insights into the emerging knowledge domain of Data Feminism through a combination of bibliometrics and close reading. Bibliometric tools are used to understand boundaries, tensions and movements within a field (Farooq, 2021; Flensburg and Lonburg, 2021; Hammerfelt, 2018), a research subject (e.g., Mora, et al., 2017), or a journal or database (e.g., Macan, 2011).

There are several obstacles to grasping the breadth and character of the emerging Data Feminist knowledge domain. For instance, citation databases favor Western science and the English language (Albarillo, 2014; Van Leeuwen, et al., 2001), have bias issues and create the Matthew effect (Merton, 1968). The Matthew effect implies that articles and authors that are already well cited will be referenced even more. Similarly, knowledge domains that are already well represented as major subject areas will grow exponentially more than the fledgling knowledge domain. With that in mind, it is perhaps not surprising that subject indexing in major databases tends to disregard feminist research (Bergenmar and Golub, 2020). Block (2020) argued that the absence of symmetrical indexing has led to systematic misrepresentation and erasure of intersectional and feminist research.

Given that, this article provides a much-needed overview and discussion of key emerging themes of research. Extensive citation chaining was used in order to reconstruct what was not visible in key databases. This approach considerably added to conventional searches in a variety of discovery systems. We found that, on the one hand, databases have started recognizing Data Feminism as a term over the last two years. This coincides with the publication of the book Data Feminism (D’Ignazio and Klein, 2020a). On the other hand, different feminist approaches to studying data emerged before the term was coined.

Academic studies that contributed to feminist discussions of what is increasingly understood as a datafied society largely appear scattered across fields and journals. The key critical concept of datafication alludes to “the transformation of social action into online quantified data” as a means to “access, understand and monitor people’s behaviour” [1]. In 2020, Data Feminism was launched as an umbrella, offering principles in place of a programmatic agenda (D’Ignazio and Klein, 2020a). Due to the diversity of scholarly work, studies conducted before the development of this knowledge domain and outside of this programmatic umbrella are in the text below referred to as Feminist Data Studies. This term is used in several instances within the literature, which motivated us to choose it as such a delineator. With the emergence of Data Feminism, the outlines of a new knowledge domain are beginning to take shape. Given that, the question arises of how Data Feminism is taking shape. The research question that follows from that is:

Q1: What issues have been the focus of the longer trend of Feminist Data Studies and the new umbrella of Data Feminism?

This study is an exploratory attempt at outlining boundaries, tensions, movements and centralities within Data Feminism and Feminist Data Studies. Publications and materials collected using scholarly databases, discovery and recommendation systems as well as citation chaining were analyzed using bibliometric software and combined with close reading. Using this software, we clustered the terms that occurred together in the texts and asked the following question:

Q2: What are the strengths within Feminist Data Studies and the Data Feminist approach when studying the datafied society?

In this paper, we provide an overview of previous research of critical discussions unravelling within the domains of technological research, followed by an overview of Feminist Data Studies and Data Feminism. We then describe the tools and methods used in this study. The findings are presented through six clusters of terms that were identified with bibliometric tools. Finally, in our discussion we outline the conversations happening within each of the clusters as informed by close reading, allowing us to draw careful conclusions.

 

++++++++++

Previous research

Over the past decade, a multitude of disciplines and multidisciplinary fields has critically engaged with numerous issues connected to the emergence of data-fueled targeted advertising. A shift in the utilization of user data became commercially viable as early as the mid-2000s. The spread of targeted advertising across companies has led to a plethora of applications in all sectors of society (O’Neil, 2017). Targeted advertisement has been driven by big data surveillance (Zuboff, 2019), setting the foundation for a datafied society.

Critical scholars have warned about the impact of data-fueled systems on vulnerable groups and communities. Surveillance through the use of metadata has been termed dataveillance (Raley, 2013; van Dijck, 2014). There has been concern about both the character and legitimacy of the divide in knowledge between data collectors and those whose data is being harvested (boyd and Crawford, 2012; Andrejevic, 2014; Zuboff, 2019). Predictive models based on uncritically utilized mathematical models and algorithms frequently use biased and re-purposed data to influence decisions about employment (O’Neil, 2017) and drive excessive surveillance (Zuboff, 2019). The use of predictive models for policing minorities (Browne, 2015; Alexander, 2019), as well as the continual development of technology without taking minorities and their cultures into account (Benjamin, 2019), call for rethinking and reimagining the way data-driven innovation is thought about and discussed.

The potential of the Internet as a tool for “activism, communication, media-making, and culture is often the focus of research about the Internet (...) but without sufficient power critique” (Noble, 2016). The feminist research tradition has a unique approach in criticizing and analyzing power structures, allowing for specific and unique discussions of datafication.

 

++++++++++

Feminist Data Studies and Data Feminism

Various theories and strands of feminism have been utilized in the critique of datafication and automation since the 1990s, increasingly so in the 2000s. Feminist data values were expressed comprehensively for the first time in a “Manifest-no” declaration, refusing harmful data regimes and committing to new data futures (Cifor, et al., 2019). Feminist Data Studies follow a diverse set of perspectives and theories. They have been scattered within numerous disciplines and publishing areas (to be discussed later) and with surprisingly scant co-citation between articles. Notably, across those studies there have been references to the likes of Haraway (1988) and Harding (1991, 1986). For example, standpoint and situated knowledge theories have been influential in arguing for better representation of women in tech environments as well as for more inclusion of minorities’ perspectives online (Atanasoski and Vora, 2019). Another theme common in feminist studies has been a consistent criticism of hierarchies (Yoder, 2018; Terborg-Penn, 1995). This was the context in which Data Feminism was introduced.

The concept of Data Feminism was introduced to scholarly bibliographic databases quite recently. The first time it appeared in the Web of Science and Scopus was in 2020. D’Ignazio and Klein’s (2020a) book Data feminism popularized the concept, catching the attention of scholars in various disciplines. Data Feminism was described as “a way of thinking about data, both their uses and their limits, that is informed by direct experience, by commitment to action, and by intersectional feminist thought” [2]. It remains to be seen what strands and directions will influence the development and future directions of Data Feminism and Feminist Data Studies.

 

++++++++++

Method

In this study, we combined bibliometrics with a close reading of texts retrieved from databases and by citation chaining. Relevant hits were identified in the Web of Science, Scopus, ProQuest, Taylor & Francis, Sage, Emerald, EBSCO Discovery Service and Google Scholar. The initial searches were conducted in November and December of 2020. Final items were added to the collection during Spring 2021. The search terms and combinations of terms that resulted in relevant literature are listed in Table 1. The stemming of the terms is shown according to the standard in Web of Science and was adjusted to the requirements of each database when searches were conducted.

 

Table 1: Search terms.
“data feminis*” or feminis* AND / “surveillance studies” AND:
◦ “critical data studies”
◦ “data studies”
◦ “data science”
◦ “data activis*”
◦ “data ethics”
◦ datafication
◦ algorithm*
◦ “Big Data”
◦ “artificial intelligence”
◦ “machine learning”
◦ data AND feminis*
◦ “data science” AND gender
◦ “critical race studies” AND “data stud*”

 

Searching additional combinations and terms (e.g., data economy) did not provide any relevant hits.

After sorting the datasets, the number of relevant results amounted to two dozen, which served as a basis for further citation chaining. The final number amounted to 103 documents (see Appendix).

For the purpose of visualizing data, VOSviewer, software for the visualization of similarities in bibliographic networks, was selected in order to identify the most important concepts as well as core terms related to each other. A master file of titles, abstracts and author-assigned keywords was created. We used textual analysis functionality, introduced in 2011, to create a term map, a “two-dimensional map in which terms are located in such a way that the distance between two terms can be interpreted as an indication of the relatedness of the terms” [3]. The relatedness was determined based on the co-occurrence of the terms in the documents. The dataset was imported and the binary counting method selected. Of 3,004 terms identified, 172 met the threshold of five occurrences that we chose for the analysis. We then manually excluded the word data, which occurred in the dataset significantly more often than the rest of the terms, and those referring to the document structure instead of the topic (e.g., abstract, research, researcher, article, paper, study and author). Ultimately, we were left with 130 terms, for which we generated network and density visualizations in VOSviewer. In the network visualizations, the size of a node (circle) represents the importance (weight) of the term it represents (van Eck and Waltman, 2020). Different colors represent clusters within the network, and each term was assigned to one cluster.
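The binary counting step described above can be sketched in a few lines of code. The following is a minimal illustration, under our own assumptions, of counting term co-occurrences with an occurrence threshold; the function name and the toy documents are hypothetical and do not reproduce VOSviewer’s actual implementation:

```python
from collections import Counter
from itertools import combinations

def binary_cooccurrence(documents, min_occurrences=5):
    """Count terms and term pairs using binary counting:
    each term (and each pair) counts at most once per document,
    regardless of how often it appears within that document."""
    term_counts = Counter()
    pair_counts = Counter()
    for doc in documents:
        terms = set(doc)  # binary counting: presence, not frequency
        term_counts.update(terms)
        pair_counts.update(combinations(sorted(terms), 2))
    # Keep only terms meeting the occurrence threshold
    kept = {t for t, c in term_counts.items() if c >= min_occurrences}
    edges = {pair: c for pair, c in pair_counts.items()
             if pair[0] in kept and pair[1] in kept}
    return term_counts, edges

# Toy documents standing in for titles/abstracts/keywords
docs = [
    ["big data", "gender", "algorithm"],
    ["big data", "ethics"],
    ["big data", "gender", "ethics"],
]
counts, edges = binary_cooccurrence(docs, min_occurrences=2)
# "big data" and "gender" co-occur in two documents;
# "algorithm" falls below the threshold and is dropped from the network
```

In VOSviewer itself, the resulting co-occurrence counts are further normalized and laid out in two dimensions so that distance reflects relatedness; the sketch above covers only the counting and thresholding.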

Following the analysis of keywords in VOSviewer, the collected literature was categorized according to clusters that emerged from the visualizations. We identified prominent terms within clusters and engaged in a close reading of the texts. During close reading, we identified the ways in which said terms were used and the contexts in which they figured in the texts. With these procedures, we were able to investigate the context of important concepts through both quantitative and qualitative approaches.

 

++++++++++

Results

Six clusters of terms were identified, with the two largest formed around the terms “Big Data” and “Artificial Intelligence”. Figure 1 depicts the network of terms, each color representing one of the clusters, and each node in the network signifying one of 130 terms. The term Big Data occurred most frequently within the dataset, followed by gender and ethic(s). The largest cluster, with 59 terms, is shown in red. The smallest, with 10 terms, is shown in a lighter shade of blue (Figure 1). In the following sections, we will take a closer look at each of the six clusters.

 

Table 2: Six clusters with most prominent terms and number of occurrences.
Cluster 1 — Big Data: Big Data (71), gender (70), technology (51), algorithm(s) (43), care (33), feminism (33), body (32), knowledge (30), bias (25)
Cluster 2 — Artificial Intelligence: Artificial Intelligence (34), system (30), machine (18), power (18), discrimination (16), inequality (12), algorithmic bias (6)
Cluster 3 — Ethics: ethic(s) (63), practice (55), discourse (23), challenge (20), value (20), justice (15), equity (11), race (11)
Cluster 4 — Identity: user (30), woman (30), person (25), identity (16), software (15), time (15), lgbtq (6), transgender (6)
Cluster 5 — Individual and societal: process (44), experience (28), individual (25), decision (16), society (15), use (14), group (11), institution (8)
Cluster 6 — Algorithmic culture and the pandemic: world (17), culture (16), covid (9), policy (9), algorithmic culture (8), health (6)

 

Cluster 1: Big Data

Big Data was a dominant concept in the literature within this dataset. Long-standing feminist discussions on gender and minorities were applied to Big Data in this cluster (e.g., van Oost, 2020; Schiller and McMahon, 2019). Micro-sociological approaches concerned with body politics in local settings were present in research on LGBTQ+ groups and identities (e.g., Gieseking, 2018; Snapp, et al., 2016) and body autonomy (Bogers, et al., 2020). The velocity of data available to data brokers creates conditions that enable obscuring the agency, context and subjectivity of “individuals who produce ‘data points’” [4]. Macro-sociological approaches were also visible. Processing Big Data to garner economic gain and social control for a handful of companies was criticized as unjust and problematic (Suárez-Gonzalo, 2019; Luka and Millette, 2018). Different authors identified four to five companies reaping the majority of the benefits of datafication. Those companies are known under the acronyms GAFA or GAFAM (i.e., Google, Amazon, Facebook, Apple and Microsoft), with Google and Facebook at the forefront. Feminist empiricism (McQuillan, 2016) and the employment of situated knowledges (Draude, et al., 2019; Cooky, et al., 2018; Luka and Millette, 2018) occurred within the literature as feminist tools for inspecting and challenging the imbalance in power and harms stemming from the misuse of Big Data. The micro-sociological contributions were frequently and uniquely feminist, while some macro-sociological studies were similar in character to other critical discussions on Big Data, where GAFA had been targeted since at least 2013 (De Filippi, 2013).

Algorithms, technology and gender were prominent terms in Cluster 1 as well. Figure 1 depicts the algorithm node as deeply intertwined with others within the network. Algorithms were discussed within the dataset as social constructs that subtly wield power. The literature within this cluster included topics of algorithmic visibility (Bishop, 2019; Agostinho, 2018), reproduction of societal values (Hayes, et al., 2020) and social relations of power (Prietl, 2019). Agostinho discussed visibility in the context of the algorithmic role in making racialized bodies visible. Algorithms that were developed on biased datasets overwhelmingly represented bodies and experiences of white people, primarily men (Browne, 2015). Agostinho’s (2018) study touched on the issue of surveillance of racialized bodies on the one hand, and the lack of their visibility on the other. Bishop (2019), in turn, described how content creators managed algorithms to achieve visibility on a social media platform, and the ways in which gender played a role in strategies that were used. Reproduction of societal values occurred through algorithms and their bias, visible, for example, in talent acquisition software and a systematic lack of hiring underrepresented candidates (Yarger, et al., 2020). This provided examples of algorithms leading to the neglect of minorities. In other cases, such as predictive policing, minorities and low-income communities received a disproportionate amount of attention (Hayes, et al., 2020). This has been discussed largely in an American context (Eubanks, 2018; O’Neil, 2017; Browne, 2015).

 

Figure 1: Terms network visualization.

 

Bogers, et al. (2020) used a term suggested by Noble (2016), algorithmic oppression, when referring to inscription of oppression into coded systems. They remarked, “[t]oo often, computational analytics like algorithms are accompanied by the firm belief that data are intrinsically neutral and bias-free” [5]. Algorithms have the power to shape material realities and are hence of concern to feminist inquiries (Adams, et al., 2020). Feminist inquiry and reasoning support a call for more accountability in studies of talent recruitment software (Yarger, et al., 2020), online representations of pregnancy (Bogers, et al., 2020), algorithmic decision-making systems (Prietl, 2019) and gender bias in social media (Schroeder, 2021).

Cluster 2: Artificial intelligence

The artificial intelligence (AI) node occurred as the most prominent term in Cluster 2. Critical research in this area incorporated issues of data used for training AI and reflected the intended or unintended bias of data and algorithms. If training data was selected from contexts rife with asymmetrical power structures, these would be reproduced and amplified through machine learning.

The AI node was intertwined with some of the topics that reoccurred within the literature. This was why we could see nodes such as algorithmic bias, power and inequality in this cluster as well. The issue of inequality in power distribution across society and its reflection in AI systems figured in much of the reviewed literature (e.g., Ciston, 2019; Mohamed, et al., 2020). In mainstream discussions, bias in AI was often overlooked since the “[c]ultural misunderstandings of AI allow their resulting data to assume a status of impartial fact, even as they operate by human intervention at every level” [6]. Much like algorithms, artificial intelligence has been a subject of inquiry with regards to agency and accountability. An intersectional framework was suggested in the literature to “analyze the biases and problems built into existing artificial intelligence, as well as to uncover alternative ethics from its counter-histories.” [7]. Algorithmic oppression and patriarchal oppression (Johnson, 2005) are, at least to an extent, similar in character. AI is increasingly invested with the power to sort people through algorithms while at the same time being omitted from processes of accountability. Feminist and gender research appeared to be well positioned to identify and critically discuss new and evolving forms of bias, creating room for further research.

The trajectory of AI has been notoriously difficult to grasp. Nonetheless, critical scholars have been compelled to try. Studies of the future of AI (Ferrando, 2014) and the ways in which AI and humans mutually shape each other (Krupiy, 2020; Mohamed, et al., 2020) appeared in this cluster. Ferrando stated that futures “do not appear out of nowhere”; instead “they are based on the presents, the pasts, and the ways they are being envisioned” [8]. She asserted that the future of AI was still developing under a predominantly male imagination. From such a perspective, the emergence of a new feminist knowledge domain would appear vital.

Cluster 3: Ethics

The topic of ethics appeared in Cluster 3. Large amounts of personal, intimate data were produced about people when they became hospitalized. Such data had traditionally been seen as highly sensitive and strictly protected. However, with the explosion of Internet-based tracking technologies, geolocation and social media there was a serious leaking of this sensitive information. Several researchers argued for the importance of feminist ethics of care for analyzing and scrutinizing the manifestations of power relations in datafied systems. Feminist ethics of care builds on Puig de la Bellacasa (2017) for whom care incorporated active involvement in maintenance and empowerment of the object of care. In Cluster 3 we noted that researchers advocated an application of such ethics to handle research data (Leurs, 2017), Big Data in social media settings (Luka and Millette, 2018) and to understand citizen data practices (Fotopoulou, 2019). Health related data practices have received special attention since the start of the global pandemic in 2020. The adoption of surveillance technologies provoked by the pandemic has been a cause of particular concern (Taylor, 2020; Robinson and Johnson, 2021). Taylor (2020) argued that policy and science handling of the pandemic moved forward with a profound lack of data on low-wage workers, the elderly and various minorities. This appeared to be another area where feminist and gender research could make prolific contributions.

Cluster 4: Identity

Cluster 4 dealt with identities, containing the nodes identity, woman, transgender and LGBTQ. Data Feminist studies have made and will continue to make important contributions about representation and the coding of the gender binary in social media (Bivens and Haimson, 2016; Bivens, 2017) and software (Spiel, et al., 2019). Cluster 4 was connected to Cluster 3 through the node race. The node was situated within the Ethics cluster, but was relevant for the Identity cluster as well. However, the articles in this cluster were concerned largely with issues related to gender and sexual orientation.

In this literature, identities were predominantly understood as complex, fluid and nuanced, which could be contrasted with the coding of Big Data categories, which predominantly followed traditional binaries. New ways of conceptualizing and capturing identity through information technology were therefore creating friction between LGBTQ+ identities and the way they have been represented (Hamidi, et al., 2018; Ruberg and Ruelos, 2020). Technology-driven misrepresentation and reductionism of queer and non-binary identities have been common and have real-life consequences for groups and minorities pushed further to the margins through datafication. Subsequently, several articles called for reimagining and queering (Gieseking, 2020; Ruberg and Ruelos, 2020) identity-based data.

There were also concerns raised by the appearance and potential perils of automatic gender recognition (AGR) systems (Hamidi, et al., 2018). Here we found an interesting widening of a long-standing criticism of societal institutions built upon limited conceptions of gender. Similar avenues of thought are intriguing in the context of automatic machine decision-making applied to gendered identities.

Clusters 5 and 6

Clusters 5 and 6 were the two smallest clusters in the network. Cluster 5 contained nodes including individual, experience, decision, society and institution. Nodes covid, policy and health were situated in Cluster 6, as well as culture and algorithmic culture.

Issues of ethics within institutions connected Cluster 5 to Cluster 3. Cluster 5 accentuated the interplay between the individual and institutions within the datafied society. In essence, the argument was that a new social contract between citizens and governing institutions will need to be developed. Suárez-Gonzalo (2019) argued that “it is critical to reduce the factors that give a few corporations the power to undermine citizens’ ability to act autonomously for the protection of their personal data. This requires that public institutions replace the logic of data business for the logic of fundamental rights, limiting the unbridled expansion of the markets to data” [9]. It was a strong statement but quite similar to what other critical traditions have put forward, for instance, within surveillance studies (Zuboff, 2019). What are the contemporary digital rights and obligations of the citizen?

The concern with ethics also signaled a penchant for philosophical reasoning. The institutionalization of ethics was criticized for reducing ethics to a set of operationalized practices and procedures. Subsequently, there was a tendency to resolve open-ended and irresolvable ethical questions prematurely (Metcalf, et al., 2019). Building further on the private-public dichotomy, the nodes individual and experience appeared in relation to identities, connecting Cluster 5 to the Identity cluster. The notion of direct experience used by D’Ignazio and Klein is relevant here, suggesting how the lack of diversity in the tech industry affects the representation of diverse human experiences and minority identities. While the issue of race was often intricately connected and interwoven in the studies and arguments within our material, it was seldom the main focus.

Cluster 6, the smallest in the map of terms, contained nodes announcing topics of the latest pandemic, but also culture. The algorithmic culture node contained an instance in which the “interconnectedness and mutual shaping of society and technology” [10] was highlighted, while the pandemic-related literature focused on issues including pandemic-driven technology adoption (Robinson and Johnson, 2021) and the politics of pandemic data (Taylor, 2020). The global pandemic has influenced the direction of much research being done in recent years, and that trend is becoming visible in Data Feminist research as well. In this cluster we found a suggestion for incorporating the seven principles of Data Feminism in pandemic-related research (D’Ignazio and Klein, 2020b). Special attention was given to working with data in the face of increasing remote work. In this way, the principles of Data Feminism are applied to and contribute to the ongoing discussion of datafication and how it has developed during the pandemic.

 

++++++++++

Discussion

The term Big Data had the most occurrences (71) in the network of terms (Figure 1). It appeared in Cluster 1, the “Big Data cluster”. Cluster 1 also had a central position in the network, connecting the remaining five clusters. The term gender was also situated in this cluster, and figured as the central term of the network with 70 occurrences. Gender studies point out the prevalence of masculine concepts within tech jargon (van Oost, 2000; Harding, 1986). Big Data is one such concept, which corresponds to the (Western) masculine attempts to control nature through technology (van Oost, 2000). Notably, the Big Data feminist literature explores themes beyond gender, establishing a critical tradition that is both closely connected to the strengths of this approach and more broadly oriented towards intersectional problems on the Internet.

Trendy technological concepts of Big Data, Algorithms and Artificial Intelligence were prevalent in the material, imported from other disciplines and technological jargon. They stemmed from techno-optimistic discourses, on whose boundaries critical discussions developed. Feminist Data Studies and Data Feminism utilized and contributed to those critical discussions. Much of the material used in this study pertained to Big Data (Favaretto, et al., 2019; Cooky, et al., 2018; Hill, et al., 2016), algorithms (Adams, et al., 2020; Prietl, 2019; Yarger, et al., 2020; McQuillan, 2016) and AI (Ciston, 2019; Baker, 2018; Ferrando, 2014). There was space for reformulating these techno-optimistic concepts, and good reason to reconfigure them in ways that communicated the traditional criticisms of feminist studies. Critical work on the concept of the algorithm was an example of that. In mainstream technological discussions, algorithms were portrayed as neutral, driven by pure mathematics without any politics. Feminist investigations have demonstrated that skewed input into algorithms manifests itself as bias. Such work can lead to the appropriation of the algorithm concept for critical discussions. Coining and utilizing terms such as algorithmic oppression and algorithmic bias (Noble, 2016; Haider and Sundin, 2019) are examples of reconfiguring the concept of algorithms to foster more critical discussions. Further development of Data Feminist studies might find more ways to contribute to such reconstructions.

Hegemonic tendencies in the development of the Internet (Frenkel and Kang, 2021), driven by masculine thinking, were intertwined with a lack of minority professionals in the development of technological structures (Yarger, et al., 2020; Way, et al., 2016). Big Tech companies claim they want to do better with regards to diversity and ethics (Hoffmann, 2020; Metcalf, et al., 2019). However, employing women and minorities, in particular in jobs that require competences besides tech development, rarely results in real change in company cultures. We can see this in the recent firing of the co-leads of Google’s ethical AI team (Simonite, 2021) and the positions minorities occupy in Big Tech in general (Yarger, et al., 2020).

Deconstruction of identity is a strength of feminist research (Cluster 4). Queer and feminist theories hold that identity changes and reshapes (Nicholas, 2014; Butler, 1990), and is not bound by biological boundaries a person is born with. Data Feminist researchers argue for queering datafied representations of gender (Hoffmann, 2020; Snapp, et al., 2016) and oppose the gender binary reinforced in datafied systems (Ruberg and Ruelos, 2020; Schroeder, 2021; Bivens, 2017). This leads to an interesting dilemma. On the one hand, feminists want to rectify systematic misrepresentation of women and other minorities by algorithms. On the other hand, richer data representations of minorities expose vulnerable groups to dataveillance and policing through data (Yu, 2020; Weinberg, 2017). In other words, the issue of representation coexists with the perils of dataveillance in these texts.

The tradition of emphasizing power distribution and inequality in feminist studies is another advantage when critically studying the tech industry. The way platforms filter the content delivered to individual feeds, thereby feeding into power structures, poses opportunities for feminist researchers to re-identify and deconstruct power relationships and implicit power structures.

The topics of healthcare arose from Cluster 6, particularly in the context of the global pandemic. As research on a female-dominated sector (Shannon, et al., 2019), healthcare research has often utilized feminist theories and perspectives (e.g., Healy, 2021; Gilberg, 2018). The increasing automation of healthcare and medicine (Cheney-Lippold, 2017; O’Neil, 2017) makes healthcare an interesting domain for feminist inquiry. A side effect of the pandemic has been increased interest in these research areas inside and outside academia. There is therefore a certain window of opportunity within which critical feminist perspectives on healthcare research may find a wider audience.

The pandemic has influenced Data Feminism in several ways. First, it has drawn the attention of Data Feminist researchers to pandemic-related topics (e.g., D’Ignazio and Klein, 2020b; Robinson and Johnson, 2021; Taylor, 2020; Yu, 2020). Second, some scholars may have turned to Data Feminist research because of challenges posed by the global pandemic (e.g., Oleschuk, 2020; Staniscuaski, et al., 2020). Lastly, the pandemic may also have hampered some of those interested in Data Feminism, or drawn them into pandemic research rather than into the development of this knowledge domain.

 

++++++++++

Conclusion

Internet studies have grown to involve a wealth of different approaches, and various critical knowledge domains are still evolving. This study focused on a new critical knowledge domain within feminist and gender studies. The feminist tradition of scrutinizing politics, technology and science for being male-dominated has been given a new object of study in the datafied society. As this knowledge domain so far lacks conventional institutionalization, it is difficult for interested researchers to get a grasp of it. This paper has applied bibliometrics to track its boundaries, centralities, tensions and movements over time.

Despite the difficulties that modern indexing systems and search tools have in representing Data Feminism and Feminist Data Studies, it is possible to note an increase in the number of published studies and the outlines of a new knowledge domain. Major terms identified in our dataset include Big Data, gender, algorithms, Artificial Intelligence and bias.
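The term maps underlying this analysis were produced with VOSviewer, but the core idea — clustering terms that frequently appear in the same texts — can be illustrated with a minimal sketch. The toy corpus and term lists below are invented for illustration and are not the study’s actual data or pipeline:

```python
from itertools import combinations
from collections import Counter

# Hypothetical corpus: each document reduced to its extracted terms.
docs = [
    ["big data", "bias", "gender"],
    ["artificial intelligence", "bias"],
    ["big data", "gender", "algorithms"],
]

# Count how often each unordered pair of terms occurs in the same document;
# term-mapping tools such as VOSviewer cluster over this kind of
# co-occurrence matrix.
cooccurrence = Counter()
for terms in docs:
    for pair in combinations(sorted(set(terms)), 2):
        cooccurrence[pair] += 1

# The most frequent pairs suggest which terms anchor the same cluster,
# e.g., "big data" and "gender" co-occur twice here.
print(cooccurrence.most_common(2))
```

In the real analysis, such pairwise counts are normalized for term frequency and fed into a clustering algorithm, yielding groupings like the six clusters discussed above.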

Our review of the literature has shown that feminist approaches make numerous and different kinds of contributions to discussions of the datafied society.

Will there be a merging of Data Feminism and Feminist Data Studies? Will one assimilate the other, or will the two strands coexist, one with a strong program and the other perhaps unwilling to be part of it? Whatever path Data Feminist studies take, these discussions will remain necessary and require further scrutiny. This article is merely exploratory in character, an attempt at outlining these knowledge domains for further study. This work has combined quantitative and qualitative approaches; future research could include more qualitative analyses.

 

About the authors

Hana Marčetić is a doctoral student at the Swedish School of Library and Information Science at the University of Borås, Sweden. Her research interests include critical data and algorithm studies, identity and social media.
Send comments to: hana [dot] marcetic [at] hb [dot] se

Jan Nolin is a professor at the Swedish School of Library and Information Science at the University of Borås, Sweden. His research interests include Internet studies, transparency movements, and strategies concerning academic involvement in Agenda 2030.
E-mail: jan [dot] nolin [at] hb [dot] se

 

Notes

1. van Dijck, 2014, p. 198.

2. D’Ignazio and Klein, 2020, p. 3.

3. van Eck and Waltman, 2011, p. 1.

4. Cooky, et al., 2018, p. 2.

5. Bogers, et al., 2020, p. 1,039.

6. Ciston, 2019, p. 3.

7. Ibid.

8. Ferrando, 2014, p. 1.

9. Suarez-Gonzalo, 2019, p. 188.

10. Draude, et al., 2019, p. 325.

 

References

Heather Brook Adams, Risa Applegarth, and Amber Hester Simpson, 2020. “Acting with algorithms: Feminist propositions for rhetorical agency,” Computers and Composition, volume 57, 102581.
doi: https://doi.org/10.1016/j.compcom.2020.102581, accessed 14 August 2021.

Daniela Agostinho, 2018. “Chroma key dreams: Algorithmic visibility, fleshy images and scenes of recognition,” Philosophy of Photography, volume 9, number 2, pp. 131–155.
doi: https://doi.org/10.1386/pop.9.2.131_1, accessed 16 August 2021.

Frans Albarillo, 2014. “Language in social science databases: English versus non-English articles in JSTOR and Scopus,” Behavioral & Social Sciences Librarian, volume 33, number 2, pp. 77–90.
doi: https://doi.org/10.1080/01639269.2014.904693, accessed 21 June 2022.

Michelle Alexander, 2019. The new Jim Crow: Mass incarceration in the age of colourblindness. London: Penguin Books.

Mark Andrejevic, 2014. “The Big data divide,” International Journal of Communication, volume 8, pp. 1,673–1,689, and at https://ijoc.org/index.php/ijoc/article/view/2161, accessed 21 June 2022.

Neda Atanasoski and Kalindi Vora, 2019. Surrogate humanity: Race, robots, and the politics of technological futures. Durham, N.C.: Duke University Press.
doi: https://doi.org/10.1215/9781478004455, accessed 21 June 2022.

Sarah Elise Baker, 2018. “Post-work futures and full automation: Towards a feminist design methodology,” Open Cultural Studies, volume 2, number 1, pp. 540–552.
doi: https://doi.org/10.1515/culture-2018-0049, accessed 21 June 2022.

Ruha Benjamin, 2019. Race after technology: Abolitionist tools for the New Jim Crow. Cambridge: Polity Press.

Jenny Bergenmar and Koraljka Golub, 2020. “Subject indexing: The challenge of LGBTQI literature,” In: Sanita Reinsone, Inguna Skadia, Anda Baklāne, and Jānis Daugavietis (editors). Digital humanities in the Nordic countries: Proceedings of the Digital humanities in the Nordic countries 5th Conference CEUR-WS ’20, pp. 203–210, and at http://ceur-ws.org/Vol-2612/short4.pdf, accessed 21 June 2022.

Sophie Bishop, 2019. “Managing visibility on YouTube through algorithmic gossip,” New Media & Society, volume 21, numbers 11–12, pp. 2,589–2,606.
doi: https://doi.org/10.1177/1461444819854731, accessed 16 August 2021.

Rena Bivens, 2017. “The gender binary will not be deprogrammed: Ten years of coding gender on Facebook,” New Media & Society, volume 19, number 6, pp. 880–898.
doi: https://doi.org/10.1177/1461444815621527, accessed 21 June 2022.

Rena Bivens and Oliver L. Haimson, 2016. “Baking gender into social media design: How platforms shape categories for users and advertisers,” Social Media + Society (12 October).
doi: https://doi.org/10.1177/2056305116672486, accessed 16 August 2021.

Sharon Block, 2020. “Erasure, misrepresentation and confusion: Investigating JSTOR topics on women’s and race histories,” Digital Humanities Quarterly, volume 14, number 1, at http://www.digitalhumanities.org/dhq/vol/14/1/000448/000448.html, accessed 20 May 2020.

Loes Bogers, Sabine Niederer, Federica Bardelli and Carlo De Gaetano, 2020. “Confronting bias in the online representation of pregnancy,” Convergence, volume 26, numbers 5–6, pp. 1,037–1,059.
doi: https://doi.org/10.1177/1354856520938606, accessed 21 June 2022.

danah boyd and Kate Crawford, 2012. “Critical questions for big data: Provocations for a cultural, technological, and scholarly phenomenon,” Information, Communication & Society, volume 15, number 5, pp. 662–679.
doi: https://doi.org/10.1080/1369118X.2012.678878, accessed 21 June 2022.

Simone Browne, 2015. Dark matters: On the surveillance of blackness. Durham, N.C.: Duke University Press.
doi: https://doi.org/10.1215/9780822375302, accessed 21 June 2022.

Judith Butler, 1990. Gender trouble: Feminism and the subversion of identity. New York: Routledge.

John Cheney-Lippold, 2017. We are data: Algorithms and the making of our digital selves. New York: New York University Press.

M. Cifor, P. Garcia, T.L. Cowan, J. Rault, T. Sutherland, A. Chan, J. Rode, A.L. Hoffmann, N. Salehi and L. Nakamura, 2019. “Feminist Data Manifest-No,” at https://www.manifestno.com/, accessed 21 June 2022.

Sarah Ciston, 2019. “Intersectional AI is essential: Polyvocal, multimodal, experimental methods to save artificial intelligence,” Journal of Science and Technology of the Arts, volume 11, number 2, pp. 3–8.
doi: https://doi.org/10.7559/citarj.v11i2.665, accessed 21 June 2022.

Cheryl Cooky, Jasmine R. Linabary and Danielle J. Corple, 2018. “Navigating Big Data dilemmas: Feminist holistic reflexivity in social media research,” Big Data & Society (30 October).
doi: https://doi.org/10.1177/2053951718807731, accessed 16 August 2021.

Primavera De Filippi, 2013. “Taxing the cloud: Introducing a new taxation system on data collection,” Internet Policy Review, volume 2, number 2.
doi: https://doi.org/10.14763/2013.2.124, accessed 21 June 2022.

Catherine D’Ignazio and Lauren F. Klein, 2020a. Data feminism. Cambridge, Mass.: MIT Press.
doi: https://doi.org/10.7551/mitpress/11805.001.0001, accessed 21 June 2022.

Catherine D’Ignazio and Lauren F. Klein, 2020b. “Seven intersectional feminist principles for equitable and actionable COVID-19 data,” Big Data & Society (30 July).
doi: https://doi.org/10.1177/2053951720942544, accessed 21 June 2022.

Claude Draude, Goda Klumbyte, Phillip Lücking and Pat Treusch, 2019. “Situated algorithms: A sociotechnical systemic approach to bias,” Online Information Review, volume 44, number 2, pp. 325–342.
doi: https://doi.org/10.1108/OIR-10-2018-0332, accessed 21 June 2022.

Virginia Eubanks, 2018. Automating inequality: How high-tech tools profile, police, and punish the poor. New York: St. Martin’s Press.

Rayees Farooq, 2021. “Mapping the field of knowledge management: A bibliometric analysis using R,” VINE Journal of Information and Knowledge Management Systems (15 September).
doi: https://doi.org/10.1108/VJIKMS-06-2021-0089, accessed 21 June 2022.

Maddalena Favaretto, Eva de Clerq and Bernice Simone Elger, 2019. “Big Data and discrimination: Perils, promises and solutions. A systematic review,” Journal of Big Data, volume 6, article number 12.
doi: https://doi.org/10.1186/s40537-019-0177-4, accessed 21 June 2022.

Francesca Ferrando, 2014. “Is the post-human a post-woman? Cyborgs, robots, artificial intelligence and the futures of gender: A case study,” European Journal of Futures Research, volume 2, article number 43.
doi: https://doi.org/10.1007/s40309-014-0043-8, accessed 21 June 2022.

Sofie Flensburg and Stine Lomborg, 2021. “Datafication research: Mapping the field for a future agenda,” New Media & Society (24 September).
doi: https://doi.org/10.1177/14614448211046616, accessed 21 June 2022.

Aristea Fotopoulou, 2019. “Understanding citizen data practices from a feminist perspective: Embodiment and the ethics of care,” In: Hilde C. Stephansen and Emiliano Treré (editors). Citizen media and practice: Currents, connections, challenges. London: Routledge, pp. 227–242.
doi: https://doi.org/10.4324/9781351247375, accessed 21 June 2022.

Sheera Frenkel and Cecilia Kang, 2021. An ugly truth: Inside Facebook’s battle for domination. New York: HarperCollins.

Jen Jack Gieseking, 2018. “Size matters to lesbians, too: Queer feminist interventions into the scale of Big Data,” Professional Geographer, volume 70, number 1, pp. 150–156.
doi: https://doi.org/10.1080/00330124.2017.1326084, accessed 20 May 2020.

Claudia Gilberg, 2018. “Feminism and healthcare: Toward a feminist pragmatist model of healthcare provision,” In: Pranee Liamputtong (editor). Handbook of research methods in health social sciences. Singapore: Springer. pp. 1–18.
doi: https://doi.org/10.1007/978-981-10-2779-6_64-1, accessed 21 June 2022.

Jutta Haider and Olof Sundin, 2019. Invisible search and online search engines: The ubiquity of search in everyday life. London: Routledge.
doi: https://doi.org/10.4324/9780429448546, accessed 21 June 2022.

Foad Hamidi, Morgan Klaus Scheuerman and Stacy M. Branham, 2018. “Gender recognition or gender reductionism? The social implications of automatic gender recognition systems,” CHI ’18: Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems, paper number 8, pp. 1–13.
doi: https://doi.org/10.1145/3173574.3173582, accessed 21 June 2022.

Björn Hammarfelt, 2018. “What is a discipline? The conceptualization of research areas and their operationalization in bibliometric research,” STI2018: Proceedings of the 23rd International Conference on Science and Technology Indicators, pp. 197–203, and at https://www.diva-portal.org/smash/get/diva2:1248151/FULLTEXT01.pdf, accessed 21 June 2022.

Donna Haraway, 1988. “Situated knowledges: The science question in feminism and the privilege of partial perspective,” Feminist Studies, volume 14, number 3, pp. 575–599.
doi: https://doi.org/10.2307/3178066, accessed 21 June 2022.

Sandra Harding, 1991. Whose science? Whose knowledge? Thinking from women’s lives. Ithaca, N.Y.: Cornell University Press.

Sandra Harding, 1986. The science question in feminism. Milton Keynes: Open University Press.

Rachel Louise Healy, 2021. “Zuckerberg, get out of my uterus! An examination of fertility apps, data-sharing and remaking the female body as a digitalized reproductive subject,” Journal of Gender Studies, volume 30, number 4, pp. 406–416.
doi: https://doi.org/10.1080/09589236.2020.1845628, accessed 21 June 2022.

Rosemary Lucy Hill, Helen Kennedy and Ysabel Gerrard, 2016. “Visualizing junk: Big Data visualizations and the need for Feminist Data Studies,” Journal of Communication Inquiry, volume 40, number 4, pp. 331–350.
doi: https://doi.org/10.1177/0196859916666041, accessed 21 June 2022.

Birger Hjørland, 2002. “Domain analysis in information science: Eleven approaches — traditional as well as innovative,” Journal of Documentation, volume 58, number 4, pp. 422–462.
doi: https://doi.org/10.1108/00220410210431136, accessed 20 May 2020.

Anna Lauren Hoffmann, 2020. “Terms of inclusion: Data, discourse, violence,” New Media & Society, volume 23, number 12, pp. 3,539–3,556.
doi: https://doi.org/10.1177/1461444820958725, accessed 17 November 2021.

Allan G. Johnson, 2005. The gender knot: Unraveling our patriarchal legacy. Revised and updated edition. Philadelphia, Pa.: Temple University Press.

Tetyana Krupiy, 2020. “A vulnerability analysis: Theorising the impact of artificial intelligence decision-making processes on individuals, society and human diversity from a social justice perspective,” Computer Law & Security Review, volume 38, 105429.
doi: https://doi.org/10.1016/j.clsr.2020.105429, accessed 21 June 2022.

Lawrence Lessig, 2006. Code. Version 2.0. New York: Basic Books.

Koen Leurs, 2017. “Feminist data studies: Using digital methods for ethical, reflexive and situated socio-cultural research,” Feminist Review, volume 115, number 1, pp. 130–154.
doi: https://doi.org/10.1057/s41305-017-0043-1, accessed 16 August 2021.

Mary Elizabeth Luka and Melanie Millette, 2018. “(Re)framing Big Data: Activating situated knowledges and a feminist ethics of care in social media research,” Social Media + Society (2 May).
doi: https://doi.org/10.1177/2056305118768297, accessed 16 August 2021.

Bojan Macan, 2011. “Bibliometric analysis of the journal Kemija u industriji for the period 2000–2009,” Kemija U Industriji, volume 60, number 2, pp. 81–88, and at http://www.hdki.hr/kui/vol60/broj02/81.pdf, accessed 21 June 2022.

Dan McQuillan, 2016. “Algorithmic paranoia and the convivial alternative,” Big Data & Society (24 November).
doi: https://doi.org/10.1177/2053951716671340, accessed 16 August 2021.

Robert Merton, 1968. “The Matthew Effect in science,” Science, volume 159, number 3810 (5 January), pp. 56–63.
doi: https://doi.org/10.1126/science.159.3810.56, accessed 21 June 2022.

Jacob Metcalf, Emanuel Moss and danah boyd, 2019. “Owning ethics: Corporate logics, Silicon Valley, and the institutionalization of ethics,” Social Research, volume 86, number 2, pp. 449–476.

Shakir Mohamed, Marie-Therese Png and William Isaac, 2020. “Decolonial AI: Decolonial theory as sociotechnical foresight in artificial intelligence,” Philosophy & Technology, volume 33, pp. 659–684.
doi: https://doi.org/10.1007/s13347-020-00405-8, accessed 21 June 2022.

Luca Mora, Roberto Bolici and Mark Deakin, 2017. “The first two decades of smart-city research: A bibliometric analysis,” Journal of Urban Technology, volume 24, number 1, pp. 3–27.
doi: https://doi.org/10.1080/10630732.2017.1285123, accessed 21 June 2022.

Lucy Nicholas, 2014. Queer post-gender ethics: The shape of selves to come. London: Palgrave Macmillan.
doi: https://doi.org/10.1057/9781137321626, accessed 21 June 2022.

Safiya Umoja Noble, 2016. “A future for intersectional Black feminist technology studies,” S&F Online, at https://sfonline.barnard.edu/traversing-technologies/safiya-umoja-noble-a-future-for-intersectional-black-feminist-technology-studies/, accessed 21 June 2022.

Merin Oleschuk, 2020. “Gender equity considerations for tenure and promotion during COVID19,” Canadian Review of Sociology, volume 57, number 3, pp. 502–515.
doi: https://doi.org/10.1111/cars.12295, accessed 21 June 2022.

Cathy O’Neil, 2017. Weapons of math destruction: How big data increases inequality and threatens democracy. New York: Broadway Books.

Bianca Prietl, 2019. “Algorithmische Entscheidungssysteme revisited: Wie Maschinen gesellschaftliche Herrschaftsverhältnisse reproduzieren können,” Feministische Studien, volume 37, number 2, pp. 303–319.
doi: https://doi.org/10.1515/fs-2019-0029, accessed 21 June 2022.

Maria Puig de la Bellacasa, 2017. Matters of care: Speculative ethics in more than human worlds. Minneapolis: University of Minnesota Press.

Rita Raley, 2013. “Dataveillance and countervailance,” In: Lisa Gitelman (editor). “Raw data” is an oxymoron. Cambridge, Mass.: MIT Press, pp. 121–146.
doi: https://doi.org/10.7551/mitpress/9302.003.0009, accessed 21 June 2022.

Pamela Robinson and Peter A. Johnson, 2021. “Pandemic-driven technology adoption: Public decision makers need to tread cautiously,” International Journal of E-Planning Research, volume 10, number 2, pp. 59–65.
doi: https://doi.org/10.4018/IJEPR.20210401.oa5, accessed 21 June 2022.

Bonnie Ruberg and Spencer Ruelos, 2020. “Data for queer lives: How LGBTQ gender and sexuality identities challenge norms of demographics,” Big Data & Society (18 June).
doi: https://doi.org/10.1177/2053951720933286, accessed 21 June 2022.

Amy Schiller and John McMahon, 2019. “Alexa, alert me when the revolution comes: Gender, affect, and labor in the age of home-based artificial intelligence,” New Political Science, volume 41, number 2, pp. 173–191.
doi: https://doi.org/10.1080/07393148.2019.1595288, accessed 20 May 2020.

Jonathan E. Schroeder, 2021. “Reinscribing gender: Social media, algorithms, bias,” Journal of Marketing Management, volume 37, numbers 3–4, pp. 376–378.
doi: https://doi.org/10.1080/0267257X.2020.1832378, accessed 16 August 2021.

Geordan Shannon, Nicole Minckas, Des Tan, Hassan Haghparast-Bidgoli, Neha Batura and Jenevieve Mannell, 2019. “Feminisation of the health workforce and wage conditions of health professions: an exploratory analysis,” Human Resources for Health, volume 17, article number 72.
doi: https://doi.org/10.1186/s12960-019-0406-0, accessed 21 June 2022.

Tom Simonite, 2021. “What really happened when Google ousted Timnit Gebru,” Wired (8 June), at https://www.wired.com/story/google-timnit-gebru-ai-what-really-happened/, accessed 21 June 2022.

Shannon D. Snapp, Stephen T. Russell and Mariella Arredondo, 2016. “A right to disclose: LGBTQ youth representation in data, science, and policy,” Advances in Child Development and Behavior, volume 50, pp. 135–159.
doi: https://doi.org/10.1016/bs.acdb.2015.11.005, accessed 16 August 2021.

Katta Spiel, Os Keyes and Pinar Barlas, 2019. “Patching gender: Non-binary utopias in HCI,” CHI EA ’19: Extended Abstracts of the 2019 CHI Conference on Human Factors in Computing Systems, paper number alt05, pp. 1–11.
doi: https://doi.org/10.1145/3290607.3310425, accessed 21 June 2022.

Fernanda Staniscuaski, Fernanda Reichert, Fernanda P. Werneck, Letícia de Oliveira, Pâmela B. Mello-Carpes, Rossana C. Soletti, Camila Infanger Almeida, Eugenia Zandona, Felipe Klein Ricachenevsky, Adriana Neumann, Ida Vanessa D. Schwartz, Alessandra Sayuri Kikuchi Tamajusuku, Adriana Seixas and Livia Kmetzsch, 2020. “Impact of COVID-19 in academic mothers,” Science, volume 368, number 6492 (15 May), p. 724.
doi: https://doi.org/10.1126/science.abc2740, accessed 21 June 2022.

Sara Suárez-Gonzalo, 2019. “Personal data is political: A feminist view on privacy and big data,” Recerca: Revista de Pensament i Anàlisi, volume 24, number 2, pp. 173–192.
doi: http://dx.doi.org/10.6035/Recerca.2019.24.2.9, accessed 20 May 2021.

Linnet Taylor, 2020. “The price of certainty: How the politics of pandemic data demand an ethics of care,” Big Data & Society (22 July).
doi: https://doi.org/10.1177/2053951720942539, accessed 20 May 2021.

Rosalyn Terborg-Penn, 1995. “Social hierarchies — Theorizing Black feminisms: The visionary pragmatism of Black women edited by Stanlie M. James and Abena P. A. Busia with a foreword by Johnnetta Betsch Cole,” Contemporary Sociology, volume 24, number 4, p. 341.

Siva Vaidhyanathan, 2022. Antisocial media: How Facebook disconnects us and undermines democracy. Oxford: Oxford University Press.
doi: https://doi.org/10.1093/oso/9780190056544.001.0001, accessed 21 June 2022.

José van Dijck, 2014. “Datafication, dataism and dataveillance: Big Data between scientific paradigm and ideology,” Surveillance & Society, volume 12, number 2, pp. 198–208.
doi: https://doi.org/10.24908/ss.v12i2.4776, accessed 21 June 2022.

José van Dijck, Thomas Poell and Martijn de Waal, 2018. The platform society: Public values in a connective world. New York: Oxford University Press.
doi: https://doi.org/10.1093/oso/9780190889760.001.0001, accessed 21 June 2022.

Nees Jan van Eck and Ludo Waltman, 2011. “Text mining and visualization using VOSviewer,” arXiv:1109.2058 (9 September), at https://arxiv.org/abs/1109.2058, accessed 21 June 2022.

Thed N. Van Leeuwen, Henk F. Moed, Robert J. W. Tijssen, Martijn S. Visser and Anthony F.J. Van Raan, 2001. “Language biases in the coverage of the Science Citation Index and its consequences for international comparisons of national research performance,” Scientometrics, volume 51, number 1, pp. 335–346.
doi: https://doi.org/10.1023/A:1010549719484, accessed 21 June 2022.

Ellen van Oost, 2000. “Making the computer masculine: The historical roots of gendered representations,” In: Ellen Balka and Richard Smith (editors). Women, work and computerization: Charting a course to the future. Boston, Mass.: Springer, pp. 9–16.
doi: https://doi.org/10.1007/978-0-387-35509-2_2, accessed 21 June 2022.

Samuel F. Way, Daniel B. Larremore and Aaron Clauset, 2016. “Gender, productivity, and prestige in computer science faculty hiring networks,” WWW ’16: Proceedings of the 25th International Conference on World Wide Web, pp. 1,169–1,179.
doi: https://doi.org/10.1145/2872427.2883073, accessed 21 June 2022.

Lindsay Weinberg, 2017. “Rethinking privacy: A feminist approach to privacy rights after Snowden,” Westminster Papers in Communication and Culture, volume 12, number 3, pp. 5–20.
doi: https://doi.org/10.16997/wpcc.258, accessed 21 June 2022.

Lynette Yarger, Fay Cobb Payton and Bikalpa Neupane, 2020. “Algorithmic equity in the hiring of underrepresented IT job candidates,” Online Information Review, volume 44, number 2, pp. 383–395.
doi: https://doi.org/10.1108/OIR-10-2018-0334, accessed 21 June 2022.

Janice Yoder, 2018. “Challenging the gendered academic hierarchy: The artificial separation of research, teaching, and feminist activism,” Psychology of Women Quarterly, volume 42, number 2, pp. 127–135.
doi: https://doi.org/10.1177/0361684318762695, accessed 21 June 2022.

Ai Yu, 2020. “Digital surveillance in post-coronavirus China: A feminist view on the price we pay,” Gender, Work & Organization, volume 27, number 5, pp. 774–777.
doi: https://doi.org/10.1111/gwao.12471, accessed 21 June 2022.

Shoshana Zuboff, 2019. The age of surveillance capitalism: The fight for a human future at the new frontier of power. London: Profile Books.

 

Appendix

  1. A. Adam, 1995a. “Artificial intelligence and women’s knowledge: What can feminist epistemologies tell us?” Women’s Studies International Forum, volume 18, number 4, pp. 407–415.
    doi: https://doi.org/10.1016/0277-5395(95)80032-K, accessed 21 June 2022.

  2. A. Adam, 1995b. “A feminist critique of artificial intelligence,” European Journal of Women’s Studies, volume 2, number 3, pp. 355–377.
    doi: https://doi.org/10.1177/135050689500200305, accessed 21 June 2022.

  3. A. Adam, 2002. “Gender/body/machine,” Ratio, volume 15, number 4, pp. 354–375.
    doi: https://doi.org/10.1111/1467-9329.00197, accessed 21 June 2022.

  4. H.B. Adams, R. Applegarth, and A.H. Simpson, 2020. “Acting with algorithms: Feminist propositions for rhetorical agency,” Computers and Composition, volume 57, 102581.
    doi: https://doi.org/10.1016/j.compcom.2020.102581, accessed 14 August 2021.

  5. D. Agostinho, 2018. “Chroma key dreams: Algorithmic visibility, fleshy images and scenes of recognition,” Philosophy of Photography, volume 9, number 2, pp. 131–155.
    doi: https://doi.org/10.1386/pop.9.2.131_1, accessed 16 August 2021.

  6. D. Agostinho, 2019. “The optical unconscious of Big Data: Datafication of vision and care for unknown futures,” Big Data & Society (5 February).
    doi: https://doi.org/10.1177/2053951719826859, accessed 21 June 2022.

  7. S. Amrute, 2019. “Of techno-ethics and techno-affects,” Feminist Review, volume 123, number 1, pp. 56–73.
    doi: https://doi.org/10.1177/0141778919879744, accessed 21 June 2022.

  8. S.E. Baker, 2018. “Post-work futures and full automation: Towards a feminist design methodology,” Open Cultural Studies, volume 2, number 1, pp. 540–552.
    doi: https://doi.org/10.1515/culture-2018-0049, accessed 21 June 2022.

  9. E. Balka and R. Smith (editors), 2000. Women, work and computerization: Charting a course to the future. New York: Springer.
    doi: https://doi.org/10.1007/978-0-387-35509-2, accessed 21 June 2022.

  10. K. Ball, 2005. “Organization, surveillance and the body: Towards a politics of resistance,” Organization, volume 12, number 1, pp. 89–108.
    doi: https://doi.org/10.1177/1350508405048578, accessed 21 June 2022.

  11. N. Barnes, 2020. “Trace publics as a qualitative critical network tool: Exploring the dark matter in the #MeToo movement,” New Media & Society, volume 22, number 7, pp. 1,305–1,319.
    doi: https://doi.org/10.1177/1461444820912532, accessed 21 June 2022.

  12. F.D. Berman and P.E. Bourne, 2015. “Let’s make gender diversity in data science a priority right from the start,” PLoS Biology, volume 13, number 7, e1002206.
    doi: https://doi.org/10.1371/journal.pbio.1002206, accessed 21 June 2022.

  13. S. Bishop, 2019. “Managing visibility on YouTube through algorithmic gossip,” New Media & Society, volume 21, numbers 11–12, pp. 2,589–2,606.
    doi: https://doi.org/10.1177/1461444819854731, accessed 16 August 2021.

  14. R. Bivens, 2017. “The gender binary will not be deprogrammed: Ten years of coding gender on Facebook,” New Media & Society, volume 19, number 6, pp. 880–898.
    doi: https://doi.org/10.1177/1461444815621527, accessed 21 June 2022.

  15. R. Bivens and O.L. Haimson, 2016. “Baking gender into social media design: How platforms shape categories for users and advertisers,” Social Media + Society (12 October).
    doi: https://doi.org/10.1177/2056305116672486, accessed 16 August 2021.

  16. L. Bogers, S. Niederer, F. Bardelli and C. De Gaetano, 2020. “Confronting bias in the online representation of pregnancy,” Convergence, volume 26, numbers 5–6, pp. 1,037–1,059.
    doi: https://doi.org/10.1177/1354856520938606, accessed 21 June 2022.

  17. S. Buchmüller, C. Bath and R. Henze, 2018. “To whom does the driver’s seat belong in the future? A case of negotiation between gender studies and automotive engineering,” GenderIT ’18: Proceedings of the 4th Conference on Gender & IT, pp. 165–174.
    doi: https://doi.org/10.1145/3196839.3196866, accessed 21 June 2022.

  18. J. Buolamwini and T. Gebru, 2018. “Gender shades: Intersectional accuracy disparities in commercial gender classification,” Proceedings of Machine Learning Research, volume 81, pp. 1–15, at https://proceedings.mlr.press/v81/buolamwini18a/buolamwini18a.pdf, accessed 21 June 2022.

  19. J.-M. Chenou and C. Cepeda-Másmela, 2019. “#NiUnaMenos: Data activism from the global south,” Television & New Media, volume 20, number 4, pp. 396–411.
    doi: https://doi.org/10.1177/1527476419828995, accessed 21 June 2022.

  20. S. Ciston, 2019. “Intersectional AI is essential: Polyvocal, multimodal, experimental methods to save artificial intelligence,” Journal of Science and Technology of the Arts, volume 11, number 2, pp. 3–8.
    doi: https://doi.org/10.7559/citarj.v11i2.665, accessed 21 June 2022.

  21. S.J. Concannon, M. Balaam, E. Simpson and R. Comber, 2018. “Applying computational analysis to textual data from the wild: A feminist perspective,” CHI ’18: Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems, paper number 226, pp. 1–13.
    doi: http://doi.org/10.1145/3173574.3173800, accessed 21 June 2022.

  22. C. Cooky, J.R. Linabary and D.J. Corple, 2018. “Navigating Big Data dilemmas: Feminist holistic reflexivity in social media research,” Big Data & Society (30 October).
    doi: https://doi.org/10.1177/2053951718807731, accessed 16 August 2021.

  23. D.J. Corple and J.R. Linabary, 2020. “From data points to people: Feminist situated ethics in online Big Data research,” International Journal of Social Research Methodology, volume 23, number 2, pp. 155–168.
    doi: https://doi.org/10.1080/13645579.2019.1649832, accessed 21 June 2022.

  24. T.M. Cruz, 2020. “Perils of data-driven equity: Safety-net care and Big Data’s elusive grasp on health inequality,” Big Data & Society (9 June).
    doi: https://doi.org/10.1177/2053951720928097, accessed 21 June 2022.

  25. V. Davion, 2002. “Anthropocentrism, artificial intelligence, and moral network theory: An ecofeminist perspective,” Environmental Values, volume 11, number 2, pp. 163–176.
    doi: https://doi.org/10.3197/096327102129341037, accessed 21 June 2022.

  26. C. D’Ignazio and L.F. Klein, 2020. “Seven intersectional feminist principles for equitable and actionable COVID-19 data,” Big Data & Society (30 July).
    doi: https://doi.org/10.1177/2053951720942544, accessed 21 June 2022.

  27. L. Dillon, R. Lave, B. Mansfield, S. Wylie, N. Shapiro, A.S. Chan and M. Murphy, 2019. “Situating data in a Trumpian era: The environmental data and governance initiative,” Annals of the American Association of Geographers, volume 109, number 2, pp. 545–555.
    doi: https://doi.org/10.1080/24694452.2018.1511410, accessed 21 June 2022.

  28. É. Dionne, 2019. “Algorithmic mediation, the digital era, and healthcare practices: A feminist new materialist analysis,” Global Media Journal — Canadian Edition, volume 11, number 2, pp. 49–65.

  29. E. Dixon-Román, T.P. Nichols and A. Nyame-Mensah, 2020. “The racializing forces of/in AI educational technologies,” Learning, Media and Technology, volume 45, number 3, pp. 236–250.
    doi: https://doi.org/10.1080/17439884.2020.1667825, accessed 21 June 2022.

  30. C. Draude, G. Klumbyte, P. Lücking and P. Treusch, 2019. “Situated algorithms: A sociotechnical systemic approach to bias,” Online Information Review, volume 44, number 2, pp. 325–342.
    doi: https://doi.org/10.1108/OIR-10-2018-0332, accessed 21 June 2022.

  31. S. Dyer and G. Ivens, 2020. “What would a feminist open source investigation look like?” Digital War, volume 1, pp. 5–17.
    doi: https://doi.org/10.1057/s42984-020-00008-9, accessed 21 June 2022.

  32. S. Elwood and A. Leszczynski, 2018. “Feminist digital geographies,” Gender, Place & Culture, volume 25, number 5, pp. 629–644.
    doi: https://doi.org/10.1080/0966369X.2018.1465396, accessed 21 June 2022.

  33. M. Favaretto, E. de Clerq and B.S. Elger, 2019. “Big Data and discrimination: Perils, promises and solutions. A systematic review,” Journal of Big Data, volume 6, article number 12.
    doi: https://doi.org/10.1186/s40537-019-0177-4, accessed 21 June 2022.

  34. F. Ferrando, 2014. “Is the post-human a post-woman? Cyborgs, robots, artificial intelligence and the futures of gender: A case study,” European Journal of Futures Research, volume 2, article number 43.
    doi: https://doi.org/10.1007/s40309-014-0043-8, accessed 21 June 2022.

  35. D. Ferreira and M. Vale, 2022. “Geography in the big data age: An overview of the historical resonance of current debates,” Geographical Review, volume 112, number 2, pp. 250–266.
    doi: https://doi.org/10.1080/00167428.2020.1832424, accessed 21 June 2022.

  36. A. Fotopoulou, 2019. “Understanding citizen data practices from a feminist perspective: Embodiment and the ethics of care,” In: Hilde C. Stephansen and Emiliano Treré (editors). Citizen media and practice: Currents, connections, challenges. London: Routledge, pp. 227–242.
    doi: https://doi.org/10.4324/9781351247375, accessed 21 June 2022.

  37. J.J. Gieseking, 2018a. “Operating anew: Queering GIS with good enough software,” Canadian Geographer/Le Géographe Canadien, volume 62, number 1, pp. 55–66.
    doi: https://doi.org/10.1111/cag.12397, accessed 21 June 2022.

  38. J.J. Gieseking, 2018b. “Size matters to lesbians, too: Queer feminist interventions into the scale of Big Data,” Professional Geographer, volume 70, number 1, pp. 150–156.
    doi: https://doi.org/10.1080/00330124.2017.1326084, accessed 20 May 2020.

  39. D. Ging, T. Lynn and P. Rosati, 2020. “Neologising misogyny: Urban Dictionary’s folksonomies of sexual abuse,” New Media & Society, volume 22, number 5, pp. 838–856.
    doi: https://doi.org/10.1177/1461444819870306, accessed 21 June 2022.

  40. F. Hamidi, M.K. Scheuerman and S.M. Branham, 2018. “Gender recognition or gender reductionism? The social implications of automatic gender recognition systems,” CHI ’18: Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems, paper number 8, pp. 1–13.
    doi: https://doi.org/10.1145/3173574.3173582, accessed 21 June 2022.

  41. P. Hayes and D. Jackson, 2020. “Care ethics and the responsible management of power and privacy in digitally enhanced disaster response,” Journal of Information, Communication and Ethics in Society, volume 18, number 1, pp. 157–174.
    doi: https://doi.org/10.1108/JICES-02-2019-0020, accessed 21 June 2022.

  42. P. Hayes, I. van de Poel and M. Steen, 2020. “Algorithms and values in justice and security,” AI & Society, volume 35, number 3, pp. 533–555.
    doi: https://doi.org/10.1007/s00146-019-00932-9, accessed 21 June 2022.

  43. R.L. Healy, 2021. “Zuckerberg, get out of my uterus! An examination of fertility apps, data-sharing and remaking the female body as a digitalized reproductive subject,” Journal of Gender Studies, volume 30, number 4, pp. 406–416.
    doi: https://doi.org/10.1080/09589236.2020.1845628, accessed 21 June 2022.

  44. R.L. Hill, H. Kennedy and Y. Gerrard, 2016. “Visualizing junk: Big Data visualizations and the need for Feminist Data Studies,” Journal of Communication Inquiry, volume 40, number 4, pp. 331–350.
    doi: https://doi.org/10.1177/0196859916666041, accessed 21 June 2022.

  45. A.L. Hoffmann, 2020. “Terms of inclusion: Data, discourse, violence,” New Media & Society, volume 23, number 12, pp. 3,539–3,556.
    doi: https://doi.org/10.1177/1461444820958725, accessed 17 November 2021.

  46. R. Hong, 2016. “Soft skills and hard numbers: Gender discourse in human resources,” Big Data & Society (12 October).
    doi: https://doi.org/10.1177/2053951716674237, accessed 21 June 2022.

  47. E. Jones, 2018. “A posthuman-xenofeminist analysis of the discourse on autonomous weapons systems and other killing machines,” Australian Feminist Law Journal, volume 44, number 1, pp. 93–118.
    doi: https://doi.org/10.1080/13200968.2018.1465333, accessed 21 June 2022.

  48. A. Kerr, R.L. Hill and C. Till, 2018. “The limits of responsible innovation: Exploring care, vulnerability and precision medicine,” Technology in Society, volume 52, pp. 24–31.
    doi: https://doi.org/10.1016/j.techsoc.2017.03.004, accessed 21 June 2022.

  49. S. Kirkwood, V. Cree, D. Winterstein, A. Nuttgens and J. Sneddon, 2018. “Encountering #Feminism on Twitter: Reflections on a research collaboration between social scientists and computer scientists,” Sociological Research Online, volume 23, number 4, pp. 763–779.
    doi: https://doi.org/10.1177/1360780418781615, accessed 21 June 2022.

  50. T. Krupiy, 2020. “A vulnerability analysis: Theorising the impact of artificial intelligence decision-making processes on individuals, society and human diversity from a social justice perspective,” Computer Law & Security Review, volume 38, 105429.
    doi: https://doi.org/10.1016/j.clsr.2020.105429, accessed 21 June 2022.

  51. A. Larrondo, J. Morales-i-Gras and J. Orbegozo-Terradillos, 2019. “Feminist hashtag activism in Spain: Measuring the degree of politicisation of online discourse on #YoSíTeCreo, #HermanaYoSíTeCreo, #Cuéntalo y #NoEstásSola,” Communication & Society, volume 32, number 4.
    doi: https://doi.org/10.15581/003.32.4.207-221, accessed 21 June 2022.

  52. A. Leszczynski, 2020. “Digital methods III: The digital mundane,” Progress in Human Geography, volume 44, number 6, pp. 1,194–1,201.
    doi: https://doi.org/10.1177/0309132519888687, accessed 21 June 2022.

  53. K. Leurs, 2017. “Feminist data studies: Using digital methods for ethical, reflexive and situated socio-cultural research,” Feminist Review, volume 115, number 1, pp. 130–154.
    doi: https://doi.org/10.1057/s41305-017-0043-1, accessed 16 August 2021.

  54. J.R. Linabary, D.J. Corple and C. Cooky, 2021. “Of wine and whiteboards: Enacting feminist reflexivity in collaborative research,” Qualitative Research, volume 21, number 5, pp. 719–735.
    doi: https://doi.org/10.1177/1468794120946988, accessed 21 June 2022.

  55. J. Llamas-Rodriguez, 2017. “The datalogical drug mule,” Feminist Media Histories, volume 3, number 3, pp. 9–29.
    doi: https://doi.org/10.1525/fmh.2017.3.3.9, accessed 21 June 2022.

  56. N.N. Loideain and R. Adams, 2020. “From Alexa to Siri and the GDPR: The gendering of virtual personal assistants and the role of data protection impact assessments,” Computer Law & Security Review, volume 36, 105366.
    doi: https://doi.org/10.1016/j.clsr.2019.105366, accessed 21 June 2022.

  57. E. Losh, 2015. “Feminism reads big data: ‘Social physics,’ atomism, and Selfiecity,” International Journal of Communication, volume 9, at https://ijoc.org/index.php/ijoc/article/view/3152, accessed 21 June 2022.

  58. M.E. Luka and M. Millette, 2018. “(Re)framing Big Data: Activating situated knowledges and a feminist ethics of care in social media research,” Social Media + Society (2 May).
    doi: https://doi.org/10.1177/2056305118768297, accessed 16 August 2021.

  59. A.N. Markham, K. Tiidenberg and A. Herman, 2018. “Ethics as methods: Doing ethics in the era of big data research — Introduction,” Social Media + Society (19 July).
    doi: https://doi.org/10.1177/2056305118784502, accessed 21 June 2022.

  60. C.D. Martínez, P.D. García and P.N. Sustaeta, 2020. “Hidden gender bias in big data as revealed by neural networks: Man is to woman as work is to mother?” Revista Española de Investigaciones Sociológicas (REIS), number 172, pp. 41–60.
    doi: https://doi.org/10.5477/cis/reis.172.41, accessed 21 June 2022.

  61. M. McGill, 2020. “The public is personal: Reflections on the ethical dimensions of selfie research,” Visual Studies, volume 35, numbers 2–3, pp. 193–200.
    doi: https://doi.org/10.1080/1472586X.2020.1779607, accessed 21 June 2022.

  62. D. McQuillan, 2016. “Algorithmic paranoia and the convivial alternative,” Big Data & Society (24 November).
    doi: https://doi.org/10.1177/2053951716671340, accessed 16 August 2021.

  63. D. McQuillan, 2017. “The Anthropocene, resilience and post-colonial computation,” Resilience, volume 5, number 2, pp. 92–109.
    doi: https://doi.org/10.1080/21693293.2016.1240779, accessed 21 June 2022.

  64. J. Metcalf, E. Moss and d. boyd, 2019. “Owning ethics: Corporate logics, Silicon Valley, and the institutionalization of ethics,” Social Research, volume 86, number 2, pp. 449–476.

  65. M. Michailidou, 2018. “Feminist methodologies for the study of digital worlds,” International Journal of Media & Cultural Politics, volume 14, number 1, pp. 19–33.
    doi: https://doi.org/10.1386/macp.14.1.19_1, accessed 21 June 2022.

  66. A. Middleton, 2022. “The datafication of pain: Trials and tribulations in measuring phantom limb pain,” BioSocieties, volume 17, pp. 123–144.
    doi: https://doi.org/10.1057/s41292-020-00203-7, accessed 21 June 2022.

  67. S. Mohamed, M.-T. Png and W. Isaac, 2020. “Decolonial AI: Decolonial theory as sociotechnical foresight in artificial intelligence,” Philosophy & Technology, volume 33, pp. 659–684.
    doi: https://doi.org/10.1007/s13347-020-00405-8, accessed 21 June 2022.

  68. K. O’Meara, D. Culpepper and L.L. Templeton, 2020. “Nudging toward diversity: Applying behavioral design to faculty hiring,” Review of Educational Research, volume 90, number 3, pp. 311–348.
    doi: https://doi.org/10.3102/0034654320914742, accessed 21 June 2022.

  69. E. van Oost, 2000. “Making the computer masculine: The historical roots of gendered representations,” In: E. Balka and R. Smith (editors). Women, work and computerization: Charting a course to the future. Boston, Mass.: Springer, pp. 9–16.
    doi: https://doi.org/10.1007/978-0-387-35509-2_2, accessed 21 June 2022.

  70. M. Pettit, 2016. “Historical time in the age of big data: Cultural psychology, historical change, and the Google Books Ngram Viewer,” History of Psychology, volume 19, number 2, pp. 141–153.
    doi: https://doi.org/10.1037/hop0000023, accessed 21 June 2022.

  71. C. Pinel, B. Prainsack and C. McKevitt, 2020. “Caring for data: Value creation in a data-intensive research laboratory,” Social Studies of Science, volume 50, number 2, pp. 175–197.
    doi: https://doi.org/10.1177/0306312720906567, accessed 21 June 2022.

  72. N. Pirani, B.A. Ricker and M.J. Kraak, 2020. “Feminist cartography and the United Nations Sustainable Development Goal on gender equality: Emotional responses to three thematic maps,” Canadian Geographer/Le Géographe Canadien, volume 64, number 2, pp. 184–198.
    doi: https://doi.org/10.1111/cag.12575, accessed 21 June 2022.

  73. B. Prietl, 2019. “Algorithmische Entscheidungssysteme revisited: Wie Maschinen gesellschaftliche Herrschaftsverhältnisse reproduzieren können,” Feministische Studien, volume 37, number 2, pp. 303–319.
    doi: https://doi.org/10.1515/fs-2019-0029, accessed 21 June 2022.

  74. L. Richardson, 2018. “Feminist geographies of digital work,” Progress in Human Geography, volume 42, number 2, pp. 244–263.
    doi: https://doi.org/10.1177/0309132516677177, accessed 21 June 2022.

  75. R. Risam, 2015. “Beyond the margins: Intersectionality and the digital humanities,” Digital Humanities Quarterly, volume 9, number 2, at http://www.digitalhumanities.org/dhq/vol/9/2/000208/000208.html, accessed 21 June 2022.

  76. P. Robinson and P.A. Johnson, 2021. “Pandemic-driven technology adoption: Public decision makers need to tread cautiously,” International Journal of E-Planning Research, volume 10, number 2, pp. 59–65.
    doi: https://doi.org/10.4018/IJEPR.20210401.oa5, accessed 21 June 2022.

  77. B. Ruberg and S. Ruelos, 2020. “Data for queer lives: How LGBTQ gender and sexuality identities challenge norms of demographics,” Big Data & Society (18 June).
    doi: https://doi.org/10.1177/2053951720933286, accessed 21 June 2022.

  78. A. Schiller and J. McMahon, 2019. “Alexa, alert me when the revolution comes: Gender, affect, and labor in the age of home-based artificial intelligence,” New Political Science, volume 41, number 2, pp. 173–191.
    doi: https://doi.org/10.1080/07393148.2019.1595288, accessed 20 May 2020.

  79. B. Schinzel, 2018. “IT-driven transcriptions: About gender and ethically relevant usage of speech and metaphors in computing and IT,” GenderIT ’18: Proceedings of the Fourth Conference on Gender & IT, pp. 3–9.
    doi: https://doi.org/10.1145/3196839.3196841, accessed 21 June 2022.

  80. J.E. Schroeder, 2021. “Reinscribing gender: Social media, algorithms, bias,” Journal of Marketing Management, volume 37, numbers 3–4, pp. 376–378.
    doi: https://doi.org/10.1080/0267257X.2020.1832378, accessed 16 August 2021.

  81. C. Sherron, 2000. “Constructing common sense,” In: E. Balka and R. Smith (editors). Women, work and computerization: Charting a course to the future. Boston, Mass.: Springer, pp. 111–118.
    doi: https://doi.org/10.1007/978-0-387-35509-2_14, accessed 21 June 2022.

  82. M. Sloane, E. Moss, O. Awomolo and L. Forlano, 2020. “Participation is not a design fix for machine learning,” arXiv:2007.02423 (5 July), at https://arxiv.org/abs/2007.02423, accessed 21 June 2022.

  83. S.D. Snapp, S.T. Russell and M. Arredondo, 2016. “A right to disclose: LGBTQ youth representation in data, science, and policy,” Advances in Child Development and Behavior, volume 50, pp. 135–159.
    doi: https://doi.org/10.1016/bs.acdb.2015.11.005, accessed 16 August 2021.

  84. K. Spiel, O. Keyes and P. Barlas, 2019. “Patching gender: Non-binary utopias in HCI,” CHI EA ’19: Extended Abstracts of the 2019 CHI Conference on Human Factors in Computing Systems, paper number alt05, pp. 1–11.
    doi: https://doi.org/10.1145/3290607.3310425, accessed 21 June 2022.

  85. S. Suárez-Gonzalo, 2019. “Personal data is political: A feminist view on privacy and big data,” Recerca: Revista de Pensament i Anàlisi, volume 24, number 2, pp. 173–192.
    doi: https://doi.org/10.6035/Recerca.2019.24.2.9, accessed 20 May 2021.

  86. R. Tassabehji, N. Harding, H. Lee and C. Dominguez-Pery, 2021. “From female computers to male comput♂rs: Or why there are so few women writing algorithms and developing software,” Human Relations, volume 74, number 8, pp. 1,296–1,326.
    doi: https://doi.org/10.1177/0018726720914723, accessed 21 June 2022.

  87. L. Taylor, 2020. “The price of certainty: How the politics of pandemic data demand an ethics of care,” Big Data & Society (22 July).
    doi: https://doi.org/10.1177/2053951720942539, accessed 20 May 2021.

  88. J. Orbegozo-Terradillos, J. Morales-i-Gras and A. Larrondo-Ureta, 2019. “Feminisms outraged at justice: The online conversation on ‘La Manada’ case,” IC — Revista Científica de Información y Comunicación, volume 16, pp. 249–283.
    doi: https://doi.org/10.12795/IC.2019.i01.08, accessed 21 June 2022.

  89. T.L. Thompson, 2020. “Data-bodies and data activism: Presencing women in digital heritage research,” Big Data & Society (23 November).
    doi: https://doi.org/10.1177/2053951720965613, accessed 21 June 2022.

  90. B. Vaitla, S. Verhulst, L. Bengtsson, M.C. González, R. Furst-Nichols and E.C. Pryor, 2020. “The promise and perils of big gender data,” Nature Medicine, volume 26, pp. 17–18.
    doi: https://doi.org/10.1038/s41591-019-0712-z, accessed 21 June 2022.

  91. M. Van Oort, 2019. “The emotional labor of surveillance: Digital control in fast fashion retail,” Critical Sociology, volume 45, numbers 7–8, pp. 1,167–1,179.
    doi: https://doi.org/10.1177/0896920518778087, accessed 21 June 2022.

  92. S.F. Way, D.B. Larremore and A. Clauset, 2016. “Gender, productivity, and prestige in computer science faculty hiring networks,” WWW ’16: Proceedings of the 25th International Conference on World Wide Web, pp. 1,169–1,179.
    doi: https://doi.org/10.1145/2872427.2883073, accessed 21 June 2022.

  93. L. Weinberg, 2017. “Rethinking privacy: A feminist approach to privacy rights after Snowden,” Westminster Papers in Communication and Culture, volume 12, number 3, pp. 5–20.
    doi: https://doi.org/10.16997/wpcc.258, accessed 21 June 2022.

  94. B.F. Welles, 2014. “On minorities and outliers: The case for making big data small,” Big Data & Society (1 April).
    doi: https://doi.org/10.1177/2053951714540613, accessed 21 June 2022.

  95. A. Werner, 2020. “Organizing music, organizing gender: Algorithmic culture and Spotify recommendations,” Popular Communication, volume 18, number 1, pp. 78–90.
    doi: https://doi.org/10.1080/15405702.2020.1715980, accessed 21 June 2022.

  96. S.M. West, 2020. “Redistribution and rekognition: A feminist critique of algorithmic fairness,” Catalyst: Feminism, Theory, Technoscience, volume 6, number 2.
    doi: https://doi.org/10.28968/cftt.v6i2.33043, accessed 21 June 2022.

  97. B. Wiens, S. Ruecker, J. Roberts-Smith, M. Radzikowska and S. MacDonald, 2020. “Materializing data: New research methods for feminist digital humanities,” Digital Studies/le Champ Numérique, volume 10, number 1.
    doi: https://doi.org/10.16995/dscn.373, accessed 21 June 2022.

  98. L. Wilcox, 2017. “Embodying algorithmic war: Gender, race, and the posthuman in drone warfare,” Security Dialogue, volume 48, number 1, pp. 11–28.
    doi: https://doi.org/10.1177/0967010616657947, accessed 21 June 2022.

  99. M.G. Worthen, 2020. “A rainbow wave? LGBTQ Liberal political perspectives during Trump’s presidency: An exploration of sexual, gender, and queer identity gaps,” Sexuality Research and Social Policy, volume 17, number 2, pp. 263–284.
    doi: https://doi.org/10.1007/s13178-019-00393-1, accessed 21 June 2022.

  100. L. Yarger, F. Cobb Payton and B. Neupane, 2020. “Algorithmic equity in the hiring of underrepresented IT job candidates,” Online Information Review, volume 44, number 2, pp. 383–395.
    doi: https://doi.org/10.1108/OIR-10-2018-0334, accessed 21 June 2022.

  101. H. Yoon, 2021. “Digital flesh: A feminist approach to the body in cyberspace,” Gender and Education, volume 33, number 5, pp. 578–593.
    doi: https://doi.org/10.1080/09540253.2020.1802408, accessed 21 June 2022.

  102. A. Yu, 2020. “Digital surveillance in post-coronavirus China: A feminist view on the price we pay,” Feminist Frontiers, volume 27, number 5, pp. 774–777.
    doi: https://doi.org/10.1111/gwao.12471, accessed 21 June 2022.

 


Editorial history

Received 30 September 2021; revised 11 January 2022; accepted 21 June 2022.


Creative Commons License
This paper is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.

Feminist Data Studies and the emergence of a new Data Feminist knowledge domain
by Hana Marčetić and Jan Nolin.
First Monday, Volume 27, Number 7 - 4 July 2022
https://firstmonday.org/ojs/index.php/fm/article/download/12295/10681
doi: https://dx.doi.org/10.5210/fm.v27i7.12295