The shift towards big data-driven decision-making and algorithmic automation across many aspects of everyday life remains a contentious subject of debate and critique. Critical social scientists and media scholars assert that this shift alters the nexus and power relations between state, citizens, and industry. Individuals and communities have little control over how their data are collected and have little to no influence on the algorithmically informed decisions that govern their lives. This chapter addresses power asymmetries that are emerging at this contemporary juncture. The chapter points to possibilities for agency in data practices, including consent practices, refusal practices, citizen participation (including citizen juries and citizen assemblies), as well as other forms of data activism. In doing so, we aim to contribute to reshaping data power from the bottom up and propose people-centred and radically contextualized approaches to imagining alternative data futures.

Introduction

The shift towards big data-driven decision-making and algorithmic automation across many aspects of everyday life remains a contentious subject of debate and critique. Critical social scientists and media scholars assert that this shift alters the nexus and power relations between state, citizens, and industry (for example, Kennedy and Moss, 2015). The extractive logic central to today’s data economy has further centralized power, wealth, and capital in the hands of a few industry leaders (Srnicek, 2017; Cohen, 2019). The desire of states to reap the perceived benefits of data use for optimization, efficiency, and control is increasing the use of (commercial) data systems. Individuals and communities have little control over how their data are collected and have little to no influence on the algorithmically informed decisions that govern their lives. We will refer to this power asymmetry as a difference in data power.

However, this data power does not affect everyone equally, and some people are more resourceful in (temporarily) pushing back against or working around processes of datafication. In that sense, we need to ask how these data-driven and automated decision-making processes are shifting power, how they work, and for whom they do and do not work. Members of marginalized, racialized, and vulnerable communities experience the brunt of data power. For example, research has shown such communities to be a target of automated decisions within welfare states, such as being more likely to be subjected to algorithmic fraud detection in social services and datafied policing (Eubanks, 2017; Roosen, 2020; Jansen, 2022). Yet people and communities are challenging and negotiating the influence and impact of this hegemonic data power. This chapter will highlight a range of practices developed by people in the face of data power.

The approaches we discuss here range from individual to community-based and collective practices that push back against state power as well as the power exerted by private companies and interests. The progression from the individual towards the collective, from the private towards the public, captures the underlying rationale that individual acts of reclaiming data power are necessary but need to be complemented by collective approaches in order to address the wider implications and transformations of increasingly datafied societies. The practices we outline and envision in this chapter exemplify these different dimensions, although they do not aim to be an exhaustive account of the power people enact through their data practices. They demonstrate what is possible, capturing a range of possibilities and potential incentives for reimagining, reclaiming, and building better datafied futures.

The practices brought to the fore are meaningful consent, refusal as an act of agency, data literacy and collective agency, community activism, and participatory governance approaches. Challenges associated with realizing meaningful forms of consent are presented as a central set of concerns that call for further privacy research and advocacy to support individuals as they attempt to realize data power. If the embedding of the meaningful consent mechanism through regulatory and technological means is an essential first step for empowering individuals, enabling refusal practices as an act of control and agency is the next. The process of refusal is about more than saying no; it is about the willingness, knowledge, and ability to exercise refusal. As an act of ‘speaking back’ and shifting power, it enables people to act according to their own will. Yet it requires the willingness and ability to invest time and energy, and therefore, as a practice, faces limitations.

People’s practices might provide important indications of the kinds of structures needed to support their communal and collective exercise of power in relation to state and commercial actors. Challenging the use of algorithmic welfare systems and exercising redress in the face of data power cannot be the sole responsibility of the individuals or communities impacted. It requires a mix of skills, knowledge, and voice(s) to mobilize what is referred to as collective agency. One such example is community data activism as a new vector for participatory power. Participatory governance approaches are another emerging and possible practice of opposing state-exerted data power. Through democratic innovations such as citizen assemblies, citizen juries, and others, citizen voices are advanced in decision-making processes. Focusing on giving voice to those impacted the most by these power apparatuses, these methods allow citizens to partake in complex conversations and are seen as positive expressions of agency, although they do not yet transfer decision-making power to citizens. While these emerging practices might seem marginal in the face of data power, they can give people and communities knowledge and voice in datafied societies.

Meaningful consent and data power – Jonathan Obar

The data subject oversees their own information protections. This assertion is fundamental to privacy law and regulation based in the Fair Information Practice Principles (Cate, 2006). The European Union’s General Data Protection Regulation (GDPR) and Canada’s Personal Information Protection and Electronic Documents Act (PIPEDA) are two international examples of this approach. Both emphasize the importance of the data subject in the realization of their own information protections, conveyed via the centrality of consent provisions. These provisions are supposed to afford opportunities for assenting (or dissenting) to the implications of service engagement, aligned with calls for ‘democratis(ing) data power’ (Kennedy and Moss, 2015: 2). Meaningful consent suggests that individuals not only engage with consent materials before deciding whether to agree or not, but that individuals also understand what they are agreeing to, as well as the implications of agreement (OPC, 2020). Those implications might be that data collected today are integrated into artificial intelligence (AI) development in the future. If consent processes shift data power towards the individual, meaningful dissent expressions might result, with individuals refusing to consent to AI development possibilities. Unfortunately, current methods for delivering meaningful forms of online consent suggest a difficult set of challenges for realizing this type of information protection.

‘I agree to the terms and conditions’ is said to be ‘the biggest lie on the internet’ (Lannerö, 2012). Research suggests individuals tend to ignore terms of service and privacy policies when clicking ‘agree’ during app/website sign-up (Obar and Oeldorf-Hirsch, 2020, 2022). In two empirical studies, one of undergraduates (n=543), and another of older adults 50+ (n=500), participants were presented with the front page of a fake social network (called NameDrop) and asked to engage with a fictitious sign-up and associated consent process (Obar and Oeldorf-Hirsch, 2020, 2022). In both studies the majority of participants agreed to the fake privacy policy via a form of clickwrap, without accessing or reading it. For those who accessed the text of the policies, average and median reading times suggested that many participants were likely scrolling to the bottom of the policy as quickly as possible without reading or understanding service terms. To assess the implications of potential policy-ignoring behaviours, in both studies ‘gotcha clauses’ were included in the terms of service. In the undergraduate study, 93 per cent agreed to give NameDrop their first-born child for service access. In the older adult study, 83 per cent agreed to give a kidney or other bodily organ (Obar and Oeldorf-Hirsch, 2020, 2022). These findings suggest that consent provisions, as currently presented to data subjects during service sign-up in particular, are not delivering information protections.

There are many reasons data subjects struggle with online consent processes. The length and complexity of service terms are a longstanding concern (McDonald and Cranor, 2008; Reidenberg et al, 2015; Obar, 2022a, 2022b). The problematic user interface designs of digital services can also make it difficult to realize information protections (Acquisti et al, 2017). The literature suggests the deceptive design of an online consent process can distract and even discourage people from engagement and understanding of service terms (Obar and Oeldorf-Hirsch, 2018; Habib et al, 2020). Due to these difficulties, it is no surprise that data subjects may also deal with feelings of resignation and apathy when attempting to realize information protections (Hargittai and Marwick, 2016; Draper and Turow, 2019).

How to achieve consistent and meaningful consent across myriad consent scenarios online is a question without a clear answer. There are a variety of approaches that are part of the ongoing discourse attempting to support consent processes. This includes efforts to encourage service provider self-regulation in the form of consent processes that are more dynamic, as opposed to obtaining consent only during service sign-up or when policies change (OPC, 2021). The ‘just-in-time’ notice (OPC, 2021) is an example of this type of ongoing consent process, which would alert individuals to new opportunities for considering whether to consent or not, associated with specific online behaviours such as turning on a device camera or posting content online. To encourage better self-regulation by digital service providers, policy makers internationally are imposing monetary penalties where the design of online consent interfaces suggests organizations are maintaining unhelpful and even deceptive practices (Obar, 2023). Strategies for making notice materials more engaging continue to be tested. This includes calls for policies with language and formatting that are easier to understand (OPC, 2021), and supplementary services that distil complex details (TOS;DR, 2023). Nutrition label-type notice materials, along with thoughtful signage examples, are also being considered, especially in contexts such as on smart devices and in public spaces where there isn’t a screen to facilitate privacy policy engagement (Emami-Naeini et al, 2021; Helpful Places, nd).
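As a rough illustration of the ‘just-in-time’ idea, the Python sketch below requests consent at the moment a sensitive capability is used, rather than burying it in sign-up terms. It is a minimal sketch under stated assumptions: the prompt mechanism, function names, and wording are hypothetical, not a design specified by the OPC or any regulator.

    # Minimal sketch of a 'just-in-time' notice: consent is requested at the
    # moment of use, not at sign-up. All names here are illustrative.

    def ask_user(explanation):
        # Placeholder prompt; a real app would render an interface element.
        answer = input(explanation + ' Proceed? [y/N] ')
        return answer.strip().lower() == 'y'

    def just_in_time(explanation, ask=ask_user):
        # Wrap an action so that it runs only after an affirmative,
        # contemporaneous choice by the user.
        def wrap(action):
            def guarded(*args, **kwargs):
                if ask(explanation):
                    return action(*args, **kwargs)
                return None  # declined: the capability is simply not used
            return guarded
        return wrap

    @just_in_time('Turning on the camera will record video that may be uploaded.')
    def start_camera():
        print('camera on')

    start_camera()

The design choice worth noting is that dissent is the default: if the person does nothing, or declines, the data-generating action never runs.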

The question of how meaningful consent will be realized as the use of AI expands complicates matters further, as AI developers are not always consumer-facing. This contributes to distance between data subjects and the organizations working with data sets, raising the possibility that the vast data sets used to train AI are built without ensuring the meaningful online consent of the data subject. These challenges may pervade AI development processes. As Crawford (2021: 95) notes, ‘The AI industry has fostered a kind of ruthless pragmatism, with minimal context, caution, or consent-driven data practices while promoting the idea that the mass harvesting of data is necessary and justified for creating (AI) systems’. It is challenging enough to address these concerns when a single company collects data from users for its own purposes. More complex are examples such as Amazon’s data exchange programme, where data sets are shared between data set providers and subscribers, creating additional distance between opportunities for consent protections and the future of AI development.

As policy makers consider the future of information protections, consent provisions must remain central to international law and regulation. The extent to which individuals realize forms of data power will be linked to whether forms of consent delivered are meaningful. Indeed, how to reduce ‘the biggest lie on the internet’ remains one of the central information policy challenges of the ongoing debate over the future of AI, and should be a primary focus of privacy research and advocacy.

Speaking back to corporate power: embedding opportunities for refusal – Ana Pop Stefanija

‘What could I have done: Refused?’ wrote one of my participants in her diary recording and detailing her interaction with a social media platform. Trying to figure out what data this platform holds about her, what kinds of inferences are made about her, and how her online and offline life is entangled with(in) the platform, she underwent a months-long process of going back and forth with the platform to obtain ‘her’ data. Becoming aware of her ‘little to no leverage in this relation’, she concludes: ‘During these months of corresponding, I’ve gotten absolutely nowhere closer to obtain, what I wanted, but have remained absolutely where they wanted: as the powerful processors (owners) of MY data’.

And she is not the only one of my participants sharing this feeling of powerlessness. Navigating datafication and algorithmic systems and ‘taking care’ of one’s data and one’s entanglement with these systems increasingly felt like a futile and frustrating endeavour (see Pop Stefanija and Pierson, 2023, for more detailed accounts). Anyone who wants to investigate this is faced with gatekeeping practices of platforms, with hidden ‘entry points’ to data about themself, with controlled flows of information even when access is ‘authorized’, with intentionally misleading, incomplete, incomprehensible, or overwhelming information, and with obfuscated automated decision-making processes and an absence of opportunities for inspection and redress.

These are just some of the rich insights collected through a participatory study that I conducted in 2020 with 47 participants, focusing on digital and social media platforms. Since I intentionally aimed for a research design that enables purposeful interaction and provides essential insights based on real-life experiences, participants were given diaries to capture their interactions, thoughts, and experiences while trying to figure out their position within and in relation to particular social media platforms. This setting, together with the adoption of critical companionship (Ziewitz and Singh, 2021), understood both as a methodology for studying the lived experiences of individuals and as research-with-care that provides support along the way, enabled a research design that was both about and with people.

The detailed diary reports, and the time frame that enabled introspection and reflection by the participants, provided rich insights from which a number of concepts emerged. One element, that of refusal, figured prominently. As envisioned by the participants and described in their diaries, this concept has two distinct arrangements: refusal as a practice and refusal as an opportunity.

Refusal as a practice relates to the wish or the need of the participants to be able to refuse. But refuse what? According to participants’ diaries, this refers to being able to refuse having data collected about them (in general or by particular entities); to refuse the use or sharing of these data; to refuse to be profiled algorithmically; and, in general, to refuse to be subjected to algorithmic decision-making (see Pop Stefanija and Pierson, 2023).

The ability to refuse can be described as an opportunity for ‘getting in the way’, to borrow Ahmed’s (2023) phrase. This strategy of getting in the way of datafication, profiling, and steering based on data and through algorithmic systems also means having the ability to critically engage with one’s data and the outputs of algorithmic systems. It also means an ability to make a decision for oneself, based on self-reflection and self-determination. It can also be seen as ‘speaking back’ – a process of being able to correct data inputs and algorithmic outputs and impose one’s own version of ‘truth’ about oneself. This ability to ‘speak back’ is hence intricately related to the existence of opportunities to refuse.

These opportunities to refuse are related to refusal taking form – in order to practise refusal, possibilities to refuse must exist; they need to be afforded in the first place. This affording should be enabled via a number of elements and mechanisms (Davis, 2020). As the participants envision it, refusal should be located in the ‘materiality of the medium’ (Bucher and Helmond, 2018: 240), enabled primarily via the interface of the platform. As such it should take the form of tabs, buttons, pop-ups, visualizations, settings, and reminders, among others. These elements should enable individuals to inspect, control, restrict, and opt out of data collection practices, but also to modify, change, delete, and repair data inputs and algorithmic outputs. Some of these are already foreseen in the EU’s GDPR (European Commission, 2016); however, their implementation in practice lacks proper compliance (see, for example, noyb’s open cases with the EU’s Data Protection Authorities (DPAs) (noyb, 2023)).
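Read as a thought experiment, these envisioned affordances amount to an interface contract in which refusal and correction are first-class operations. The Python sketch below is a minimal, assumed data model of such a contract; the class and field names are illustrative inventions, not an existing platform API.

    from dataclasses import dataclass, field

    # Hypothetical sketch of refusal-as-affordance: per-purpose switches that
    # an interface could surface as tabs, buttons, and settings, plus a way to
    # 'speak back' by repairing data inputs. All names are assumptions.

    @dataclass
    class RefusalSettings:
        allow_collection: bool = True
        allow_sharing: bool = True
        allow_profiling: bool = True
        corrections: dict = field(default_factory=dict)

        def refuse(self, purpose):
            # purpose is one of 'collection', 'sharing', 'profiling'
            setattr(self, f'allow_{purpose}', False)

        def speak_back(self, attribute, value):
            # Record the person's own version of 'truth' about themself,
            # overriding an inferred attribute.
            self.corrections[attribute] = value

    settings = RefusalSettings()
    settings.refuse('profiling')
    settings.speak_back('inferred_interest', 'not applicable')
    print(settings)

The point is not the code but the contract: opting out and correcting the record become operations of the medium itself, rather than favours requested through months of correspondence.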

Being overpowered by the corporate interests, agendas, and profit-making goals of private companies, individuals do not have much manoeuvring space around the datafication and algorithmic networks that influence their lives. For the moment, the acts of refusal and resistance, if possible at all, are experienced as labour intensive, requiring a lot of resources and time. They also require (almost expert) knowledge to navigate and understand the inner workings of algorithmic systems, as well as specific competences and particular capabilities and skills. These vary from initially having the knowledge that one has been algorithmically profiled, to knowing how to look and ask for one’s data (for example, filing a Subject Access Request under GDPR’s Article 15, or using the platform’s transparency tools), how to access the resulting files (where possible, as these are often hidden behind tabs and settings), how to read them (oftentimes they come in unfamiliar formats, such as JSON), and all the way to how to file and start a redress or complaint procedure (more information on these gatekeeping practices can be found in Pop Stefanija (2023)). Additionally, they sometimes require privilege to navigate the refusal process and to refuse at all (for example, see Jansen in this chapter). However, for the participants of my study the ability to refuse was intrinsically related to individual autonomy, control, self-reflection, self-directedness, and ultimately power. Power in relation to, or power over, the tech proprietors of these algorithmic systems, but also power over oneself, as a power to act and steer one’s actions and life in a self-determined manner. Designing algorithmic systems that embed opportunities for refusal should ensure that individuals will always have ‘the chance to refuse’ (Benjamin, 2016), if and when they want to.
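To make the knowledge barrier concrete: even ‘reading the files’ can require scripting. The Python sketch below inspects a hypothetical JSON data export; the file name and the field names (‘ads_interests’, ‘topic’) are illustrative assumptions, since real exports differ by platform and change over time.

    import json
    from collections import Counter

    def summarize_export(path):
        # Load the downloaded export and map out what the platform holds.
        with open(path, encoding='utf-8') as f:
            data = json.load(f)
        print('Sections in export:', sorted(data.keys()))

        # Count inferred advertising-interest labels, one kind of inference
        # participants in the study tried to uncover.
        interests = data.get('ads_interests', [])  # assumed field name
        counts = Counter(item.get('topic', 'unknown') for item in interests)
        for topic, n in counts.most_common(10):
            print(topic, n)

    summarize_export('export.json')  # assumed file name

That such a small script already presupposes competences most data subjects cannot reasonably be expected to have is precisely the asymmetry at issue.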

Collective agency in the face of data + state power – Fieke Jansen

‘Why is my son on that list?’ (Peled, 2022) asks a mother whose son was selected for the Top400, a youth crime prevention approach of the city of Amsterdam in the Netherlands. From 2015 onwards, children and young adults who, in the eyes of the police and the city of Amsterdam, showed concerning behaviour were selected for a crime prevention approach that combines care and control. Once selected, the municipality and its partners structurally intervene in their lives for a minimum of two years. This approach encroaches on those profiled, and there are concerns about the way the Top400 criminalizes antisocial and teenage behaviour, instrumentalizes care for crime prevention, stigmatizes youngsters and their families, limits or obstructs access to justice and redress, and places a spotlight on their younger brothers and sisters (Jansen, 2022). The case of the Top400 offers insights into what a struggle for justice in the face of data + state power entails, as it is both a story of state repression and of resistance.

Between 2016 and 2019, over 300 children and young adults were selected for the Top400 through two data models, one of which was ProkidPlus. This model identified a ‘softer’ group of ‘at risk’ youngsters: those who had been in contact with the police but had not been arrested or charged. Freedom of Information Act (FOIA) documents revealed that the inclusion of this ‘softer’ group of 125 Prokid children was hard to explain. The city advised civil servants responding to the question ‘Why is my child selected for the Top400’ not to mention the word algorithm or the name Prokid (Jansen, 2022). The municipality deliberately obfuscated the basis on which the 125 children were selected and limited their ability for redress. This case is not unique; it is just one of many stories where new forms of algorithmic governance (Dencik et al, 2019; Katzenbach and Ulbricht, 2019; Amoore, 2020) mediate and obfuscate decision-making in the European welfare state. However, the experiences of challenging the Top400 reveal that collective empowerment is a prerequisite for the struggle for justice in contemporary society.

Contemporary legal, technical, and social responses that aim to minimize the negative externalities of data power are connected to the rights, knowledge, and skills of the individual. One prominent empowerment angle is that of increased data literacy, where building competencies will increase a person’s ability to economically and socially participate in society (Pangrazio and Sefton-Green, 2020; Sander, 2020). Data literacy is conceptualized as more than learning how to read and write code or how to use new technologies; it is about a person’s ability to navigate the complexities of contemporary societies. As such, most data literacy approaches aim to build knowledge and skills that will allow people to make informed choices about their digital lives. This approach assumes that with increased competencies people can directly control and influence the relationship between themselves and the data processor (Viljoen, 2020); that they just need to skill up their knowledge of data power; and that they know how, and are able to, participate in political and social structures that enable and constrain datafication (Jansen, 2021). Reflections on our ongoing investigation into the Top400 reveal that data literacy is not enough in the struggle for justice in the face of data + state power.

In 2020 I met the documentary filmmaker Nirit Peled, who had at that time spent over four years researching the Top400. She was struggling to tell its story. Public officials from the municipality and police did not want to go on the record, the mothers of the Top400 boys were afraid that speaking up publicly would lead to more stigmatization and reprisal, and the ex-Top400 boys wanted nothing to do with it, that time in their lives, or the state. Yet, by listening, Nirit Peled noticed a discrepancy between the bureaucratic reality and success the city attributed to the Top400 and the lived experiences of those subjected to it, which foregrounded a number of serious concerns. To unravel the Top400, a collaboration was formed between the documentary filmmaker, a human rights lawyer, and me. We started a collective investigation into the Top400, in which we requested and systematically analysed 2,000+ pages of FOIA documents to gain more insight into the politics, the problems, the governance, and the data models behind the Top400. In November 2022 our investigation was made public through the documentary Mothers and the Top400 report (Jansen, 2022).

What we learned along the way was that Young’s (2011) approach of listening to those subjected to state power to locate injustice(s) is the first crucial step in the long process of collective agency. It takes resilience, courage, and determination for those impacted to speak up against injustices caused by a paternalistic and repressive welfare state (Vonk, 2014). It took an interdisciplinary team years of research, conversations with those impacted, engagement with technical experts, and discussions with the wider network of stakeholders to make sense of the Top400. It took a documentary aired at a leading Dutch film festival and on national TV for the families to be heard and get some kind of recognition for the injustice of this intervention. It took the social and political capital of what society considers ‘experts’ to put the problems of the Top400 on the political agenda. Despite the mayor’s rejection of the injustice claims made by the community, the documentary, and the report (Halsema, 2022), the practice of investigation allowed for networks of solidarity with those impacted, the building of a collective, the foregrounding of injustices, and the articulation of justice claims.

The case of the Top400 shows that claiming justice in the face of state + data power is complex and cannot be the responsibility of informed individuals alone. When state power becomes enabled and enacted through data systems, the power asymmetry between the individual and the state increases. Challenging decisions of the datafied state requires knowledge and understanding of data systems, social capital and political agency to claim rights, and resilience and courage to stand up. As such, I argue that we need to move away from the notion of individual empowerment through data literacy towards collective agency through the practice of resistance. Collective agency should be understood as a process that brings together the different competencies needed to identify and uncover the problem and jointly work towards a solution.

Datafication and community activism – Roderic Crooks and Lucy Pei

Like other forms of overt political work enacted in the register of the technological, data activism has been lauded as a vector of participatory power, including the power to subvert existing economies of knowledge production and expertise (Dencik et al, 2019; Lehtiniemi and Haapoja, 2020). But like other kinds of collective action, data activism arises from particular social locations, from people and communities dealing with the persistent consequences of structural inequality. As critical scholars have argued persuasively, the potentials of data to contribute to movements for justice (social or otherwise) are always tempered by competing, incommensurable understandings of what data can do, what kinds of political action are available to motivated parties in the present, and what kinds of people are considered legitimate civic participants: these dynamics favour those already privileged by economic and political hierarchies (Coleman, 2017; Gray, 2018; Heeks and Shekhar, 2019). In minoritized communities, however – those communities marked by socially consequential, interlocking forms of difference – building people-power via commercial tools and platforms of the tech sector poses specific risks. Minoritized communities, bound as much by material, intersecting differences such as race, class, gender, sex, citizenship, geography, and/or disability as by ‘their ordered relation to capital’ (Allen, 2021: 6), are the site of many forms of harm specific to the use of data-intensive computation. Chief among these perhaps is the datalogical enframing of community-defined problems, the brute mistranslation of the knowledge and experiences of working-class communities of colour into structures, documents, and evidence valorized by the state, academia, and the tech sector.

Since 2019, the Evoke Lab at UC Irvine has hosted an event called ‘Datafication and Community Activism’, a space where scholars and community organizers have been thinking about the relationship between minoritized communities and datafication. Community organizers work to shape voice and political strategy in the communities they serve. In the context of American political life, professional community organizers work in all kinds of communities in all parts of the political spectrum. The organizers we work with most often are based in minoritized communities, where they pursue larger social movement goals under precarious employment conditions in the not-for-profit sector. From the perspective of these organizers, to say a community is minoritized is to point to hierarchy in public life, to the way the public sphere is defined, constituted, and shaped for the benefit of dominant groups whose interests are enforced by the state, not by demography. This work has resulted (only infrequently) in meaningful and mutually beneficial working relationships with individual community organizers and community-based organizations of different kinds. Our work with organizers frequently puts us in a position to confront ‘academic and nonprofit complicity’ in the production and sale of data-intensive technologies that are used to harm minoritized communities via surveillance, criminalization, discrimination, and extraction.

Over the years, our strategy has prioritized listening to our colleagues so that we might be educated and informed about how digital data in all of their forms and manifestations relate to the self-determination and ongoing freedom struggles of working-class communities of colour. To date, we have been fortunate to work with many national, regional, and neighbourhood organizations who are generally interested in digital data but are specifically concerned with abolition of police, racial discrimination in computational systems, data-driven government service, reintegration for formerly incarcerated people, economic empowerment, and many other issues of interest to working-class communities of colour. These aspects of community organizing play out in very surprising ways, especially where the overtly political and social movement-aligned work of community organizing intersects with the pleasures and potentials of digital data. From the perspective of these organizers, datafication is not an unintended consequence of unpredictable technological change, but a continuation of the exploitation of working-class communities of colour. If data justice concerns ‘fairness in the way people are made visible, represented and treated as a result of their production of digital data’ (Taylor, 2017: 1), the community organizers with whom we work would remind technologists, academics, and civil society groups that digital technologies are inextricably linked to both state violence and private discrimination. Fairness, for many kinds of people in the United States, has never been on offer.

Participatory governance of datafication – Arne Hintz

The increasing use of data analytics for a variety of both commercial and public services occurs mostly without the knowledge of data subjects. We are profiled, categorized, assessed, sorted, and scored according to criteria that we do not understand, through processes that remain obscure, with consequences that are difficult to foresee, and with few possibilities to object or resist. This is already problematic in the context of commercial systems, such as the allocation of platform services and discriminatory pricing (Redden, 2022), but it becomes a fundamental challenge to democratic systems if it affects state functions and state–citizen relations. If our performance as citizens is permanently assessed through data systems, power is conferred onto the data collector (the state) and shifted away from citizens (Hintz et al, 2019). As a consequence, the role of the sovereign – the people – is diminished and citizens lose influence over government and public decision-making. This raises significant questions regarding people’s roles in the deployment and management of data systems. How can, and should, we participate as citizens in governance systems that are informed and infused by data and AI? How do we intervene into decision-making about the roll-out of data and AI in government and the public sector? How do we advance civic agency and democracy in the datafied state?

Practices of, and research on, ‘democratic innovations’ offer a possible way forward in exploring how to advance citizen voices in decision-making outside and beyond established processes of institutionalized democratic engagement (Smith, 2009). Citizen assemblies, citizen juries, citizen summits, deliberative polling, distributed dialogues, and similar models and practices bring together a small selection of the population for deliberation on key issues that society is facing. Supported by expert input, a group of people (from around 15 to potentially over 1,000), often recruited to represent wider society, meets for a few days and develops proposals or decisions on the issue they have debated. These methods have increasingly been applied to engage citizens in discussions on the use of data analytics in areas such as health, policing, and criminal justice. Think tanks, civil society organizations, policy institutions, regulators, and government departments have commissioned or organized such initiatives to understand people’s views and seek guidance on policies and applications.

Research by the Data Justice Lab (Hintz et al, 2022) has explored the significant promises and challenges of these practices, focusing on the UK where they have enjoyed particular prominence recently, in part as a response to scandals and wider dissatisfaction with excessive data uses. We found that non-expert citizens were, in fact, able to discuss a complex topic such as data and AI with sufficient depth and to develop thorough outcomes and policy recommendations. Participants largely viewed the experience as positive and empowering, with the rare opportunity to both learn about the subject and make their voices heard. As deliberative exercises, however, these initiatives typically do not transfer decision-making power to citizens. They provide a platform for contributing voices and concerns but do not involve a substantial power shift. In some cases, they amount to little more than an opinion poll, which underlines concerns regarding their possible use for ‘participation-’ or ‘engagement-washing’, that is, the legitimation of decisions taken elsewhere. Yet many of them do have either direct policy impact or broader normative influences on decision-making.

Despite their titles, initiatives such as citizen juries and citizen summits are typically not self-organized at grassroots level, and organizers have significant leeway in framing discussions. Possibilities for participants to define the agenda and move the goalposts of the debate are often limited, and some organizers have steered deliberations (explicitly or implicitly) towards an acceptance of data uses and a recognition of its value. Further, the goal of representing a cross-section of society comes at the expense of considering impacts on, and experiences of, particularly affected communities. People from impoverished, racialized, and otherwise marginalized backgrounds are impacted by datafication in specific ways and their voices are crucial in properly assessing data uses, but they are not always incorporated.

As this brief snapshot demonstrates, these practices come with significant shortcomings. Building on theories of participation, we may categorize them as ‘partial participation’ (Pateman, 1970), at best, and as a form of ‘tokenism’ (Arnstein, 1969) that may lead to an advisory role in policy- and decision-making. They empower participants to learn, share their views, and (ideally) affect policy, but they do not amount to a sharing of decision-making power. While they enhance people’s voices in a debate dominated by commercial and governmental actors, their characterization as a ‘people’s practice’ requires qualification. However, they contribute to a growing composite of strategies for civic participation, together with community and grassroots initiatives, civil society campaigns, emerging institutional data governance structures (such as data trusts and data cooperatives), technical approaches towards algorithmic accountability, and other strategies as explored in this chapter. Together, these different models, practices, and initiatives reflect a growing recognition that data subjects need to be involved in decisions about data, and that those who are affected by datafication should steer its future development and deployment.

Conclusion: Contextual research practices – Stine Lomborg and Anne Kaun

Research on people’s practices in the face of data power, such as the examples described earlier in this chapter, testifies to the value of centring the people implicated by data power operations to understand the promise and perils of datafication and automated decision-making systems, and people’s myriad ways of asserting their stakes in datafied societies. As we have seen, people contest and enact data power both in what may be considered small, individual acts of engagement, and in communal forms and participatory processes.

The use of data and automated decision-making systems in public services and private companies is often motivated by quantitative measures of efficiency and resource-savings. In the context of public welfare, it is also justified with reference to fairness and equal treatment, based on the assumption that bias and noise will be minimized or eliminated altogether by reducing human intervention in the decision-making process. But the large number of failed pilot projects, along with several scandals across countries pertaining to systems that have actually been put into use, suggests that ideals and reality do not always match all that well. As a first response, a number of high-level ethical guidelines for AI implementation and use have been developed and ratified at international, national, and organizational levels to shape the governance of data-driven systems. Arguably, guideline-based approaches to ethics and justice, too, assume standardization is possible not only for systems, but also for our ethical management and use of such systems. But the people who are actually implicated seem to be largely absent from these top-down and often universalizing discussions of the ethical uses of data, for example in relation to automated decision-making systems. This is problematic, because we know from decades of research about relationships between technology, data, and people that outcomes are not the same for everyone, and may vary substantially between social and cultural contexts. It remains to be seen whether and how systems that are built on clearly defined rules and enhanced standardization leave room for citizens’ differing needs and capabilities.

Developing in-depth empirical accounts of people’s data practices and experiences presupposes a move to radical contextualization. Context matters for how people make sense of data practices, what data practices they perceive as fair and just, and what they are capable of doing with or to them. Context also matters for whether people experience a controlling state or commercial actors as their main adversary. This chapter, therefore, ends by putting forward a programmatic statement on the need for radical contextualization as a way of centring people, and giving voice to people in all their diversity in discussions of data power. In terms of a research agenda this means ‘to consider the specific contexts within which people (professionals as well as private citizens) interface with ADM [and other data-based technologies], the ways in which they make sense or reject them and ultimately develop frameworks for approaching technological continuity and change’ (Lomborg et al, 2023: 14).

Attending empirically to the lived experiences and stories of agentic, ambivalent, and alienated human beings, as demonstrated across this chapter, can help us better understand the social and cultural-contextual dynamics of data power and data-induced empowerment. And it can amplify people’s voices in public debates on data power, which might lessen the burden on individuals when facing data power. In turn, however, it also demands reflection on researcher roles and the nexus between science, community engagement, and activism when working with people front and centre.

Commitment to contextualization entails a shift in what is figure and what is ground as we study data power. How can we centre implicated people and their positions while keeping an analytical eye on the data infrastructures that they act upon? How can we do justice to the cultural contingencies, social exigencies, and historical trajectories that shape people’s practices in the face of data power, without compromising culturally comparative scholarship? People-centred approaches must balance these needs to bring lived experience into dialogue with studies of the platforms and actors who channel, retain, augment, and consolidate data power in pursuit of competitive edge in innovation, population management, and economic gain.

A people-centred and radically contextualized approach to data power will also allow us to tease out the possibilities for agency in the data practices discussed earlier – consent practices, refusal practices, citizen participation (including citizen juries and citizen assemblies), as well as other forms of data activism. Such an approach will allow us to disentangle structural inequalities in current data ecologies while highlighting possibilities for agency and shifts from the bottom up. By extension, this underlines that there is nothing natural about the state of our data-based systems; rather, they are always changeable and in the making, even if such changes require immense engagement and work.

Discussant – Catherine D’Ignazio

The strategies outlined in this chapter are exciting, both because they represent diverse forms and scales of resistance to data power and also because they come from empirical and participatory work with people and communities. As Aristea Fotopoulou has argued, researchers should look to shift their object of research from data, algorithms, and platforms themselves towards the human practices of acquiring, analysing and using data, so that we may ‘reinstate the materiality of data, to think about laboring bodies, invisible human practices, and social relations and activities’ (Fotopoulou, 2019).

One issue that I would like to raise in relation to data activism relates to the diverse data epistemologies employed by activists themselves. In this chapter, the authors have mainly centred on data as it is acquired, stored, analysed, and deployed in models, algorithms, and predictions by state and corporate actors. The epistemological approach to data science embodied by states and corporations is just one approach – and it is one that is predominantly positivist, optimizing, and neoliberal. It results in, as Crooks and Pei state, ‘surveillance, criminalization, discrimination, and extraction’ (this chapter). To their list, I would add ‘grotesque accumulation’ because ultimately this approach is about wealth hoarding for those on top and the operationalization of scarcity and inequality for everyone else.

But as prior work has shown, this is not the only epistemology of data. In their work on data activism, Stefania Milan and Lonneke van der Velden (2016) call attention to the important role of data activists who function as ‘producers of counter-expertise and alternative epistemologies, making sense of data as a way of knowing the world and turning it into a point of intervention. They challenge and change the mainstream politics of knowledge’ (Milan and van der Velden, 2016: 63–64). In recent years, alternative epistemological approaches to data have been flourishing: data feminism, indigenous data sovereignty, Data for Black Lives, environmental data justice, QuantCrit, and queer data, to name a few. These approaches posit that there are other ways of using (or refusing) data in the service of co-liberation.

For the past four years, I have been working on a participatory research and design project called Data Against Feminicide, in which my colleagues and I work with feminicide data activists in the Americas. These are groups – of academics, journalists, activists, nonprofits, concerned individuals, mothers, sisters, aunties, and families – who painstakingly document cases of fatal gender-related violence and use those data for a variety of political demands and impacts. The information ecosystem in which they work is deeply biased: states publish little to no information about feminicide, even where laws do exist. The state regularly misclassifies the killings of gender and racial minorities as accidents and suicides. The media – which often end up being activists’ main sources of information – are racist, misogynist, transphobic, and victim-blaming.

And yet, activists persist in assembling carefully curated spreadsheets and databases. They seek humanizing photos of killed women. Some groups provide direct services and accompaniment through the justice system to families. Other groups stage collective memorials with empty chairs or empty shoes, using aggregated absence as an aesthetic approach. Still others use their data to gain audience with the state or to influence and reframe toxic media narratives about this violence.

As I have reflected on this work with my collaborators and the activists themselves, I have written about just how profoundly the activists’ epistemological approach to working with data diverges from the mainstream positivist approach (D’Ignazio, 2024). Feminicide data activists centre care, memory, and justice. They are often from the communities from which they draw their data – they are women, Black women, indigenous women, Latin American women, trans women, survivors, mothers, family members, community members. Rather than using data to ‘solve’ a problem, they use data to remake and reframe the problem of feminicide. They challenge the idea that gender-based violence is a personal problem and they reframe it as a political problem, a public problem, a structural problem. Feminicide data activists are deeply aware of the biases and limitations of their data because they are, themselves, the data producers. They talk about their role as caring for the data and the people and lives represented therein.

Does producing carefully documented databases of feminicide cases ‘solve’ feminicide? No, and no activist would imagine that it does. It is an imperfect informatic tactic in a deeply asymmetrical environment. But this production of information does participate in a broader constellation of efforts that are working towards the restoration of rights, the healing of communities, and the longer-term work of structural transformation. One thing emphasized to me over and over again by activists is that this work is not about counting the dead, it is a defence of life itself.

I offer these thoughts on data epistemologies so that we do not forget to think differently about our information and our technologies. As with all tools, their origins do not preclude their appropriation, their reclamation, and their reimagination. As Paola Ricaurte frames it, there are ‘possible alternative data frameworks and epistemologies that are respectful of populations, cultural diversity, and environments’ (Ricaurte, 2019). These may help us work towards data practices in the service of life, living, and vitality.

Discussant – Dan McQuillan

By assembling and analysing a vivid canvas of people’s actual practices, this chapter makes a valuable contribution to a critique of data power. I have tried to respond diffractively; that is, through a constructive approach to the differences present within the text and between the text and my own perspectives. For brevity this is presented as a series of statements; some of which, I hope, may resonate with the reader.

  • It is clear from the chapter that datafication is an attack on the poor and the marginalized.

  • Datafication renders social relations as abstractions for distanced and indifferent manipulation.

  • Datafication is not a shift of power relations away from an acceptable norm but an intensification of existing injustices.

  • Any claims that data are being collected to fulfil people’s needs are a distraction and diversion.

  • Datafication is degradation.

  • We do not need a stake in the datafied society, we need to consciously and explicitly resist it.

  • The extractive and centralizing logics described in the chapter reduce the space available for a livable life.

  • Optimization and efficiency replace relationality with resource extraction.

  • More data under datafication means more precarity and austerity.

  • Datafication is part of material structures that burn through energy, create emissions, and deplete water resources, all while claiming to be a solution to the climate crisis.

  • Calling for resistance is to recognize what is at stake; we resist in order to exist.

  • A resistant framing includes people’s existing practices but points beyond them.

  • It is not an attempt to impose a programme but to challenge the datafied foreclosure of the future.

  • Subjects in a datafied society are shaped by data power; practices of resistance restore different subjectivities.

  • Resistant subjects do not simply have viewpoints but standpoints; valid forms of knowing that are embodied and situated, not abstract and distant.

  • Where datafication drives precaritization, resistance responds with mutual aid.

  • Where datafication leads to exclusion and oppression, resistance develops solidarity.

  • Resistance is also a call for academic commitment rather than complicity; as the chapter demonstrates, research can remain rigorous while asking questions that are resistant.

  • Resistance is more than collective refusal; it is a commitment to possible alternatives.

  • Resistance is ‘one no, many yeses’.

  • Resistant responses to data power are those that develop counter-power.

  • Consent becomes meaningful under conditions that Illich (1973/1975) called conviviality: where it enables autonomous action by means of tools least controlled by others.

  • Anything other than conviviality merits collective refusal.

  • Resistant refusal develops not only data literacy but self-organization literacy, critical feminist literacy, decolonial literacy, and so on.

  • Resistance to datafication is a social movement, or rather, a movement that is part of other social movements where injustices are becoming datafied.

  • Forms of ‘consultation’ and ‘participation’ that do not transform power relations in favour of these movements are actually modes of assimilation.

  • Datafication is a slippery opponent because it facilitates fake empowerment, with proposals to participate in our own datafication and ‘control’ our own data in the name of datafication-for-good.

  • As Ruha Benjamin (2019) says: ‘Demanding more data on subjects that we already know much about is … a perversion of knowledge in which the hunt for more and more data is a barrier for acting on what we already know.’

  • Datafication is the new ‘Society of the Spectacle’, but its very pervasiveness creates opportunities for intersectional resistance.

  • Where so many are affected in so many dimensions, resistance means making connections across different contexts of datafication.

  • This resistance can learn from previous movements against repressive technologies, such as the Luddites (see, for example, Binfield, 2004), who were not anti-technology but anti-automatization, and who defended their autonomy through militant community mobilization.

  • Resisting datafication is not about rejecting technology, but a matter of resolving for ourselves which technologies and which subjectivities will emerge together in response to which material problems.

  • In other words, resistance to datafication is the development of a prefigurative technopolitics.

  • Resistance to datafication is about responding differently to the problems that are currently being datafied, in ways that are both collective and technical.

  • Whereas the constant demand for data always implies an existing deficit, not only of knowledge but also of capacity, resistance to datafication builds on our collective strengths and resilience.

  • Like the workers who developed the Lucas Plan, it asks: what do we already know, what can we already do that can become transformative?

  • Resistance to datafication is this search for new socialities and new tools, for practices that reclaim the common good.

  • Datafication is rooted in fossil fuel modernity and the reduction of all beings to a standing reserve.

  • Our resistance is not only the careful contextualization of existing dystopias, but the multiplication of alternative futures.

Notes

1 Alphabetical author order

2 Discussant

3 Facilitator

4 Discussant

References

  • Acquisti, A., Adjerid, I., Balebako, R., Brandimarte, L., Cranor, L. F., Komanduri, S., et al (2017). Nudges for privacy and security: Understanding and assisting users’ choices online. ACM Computing Surveys (CSUR), 50(3): 1–41.

  • Ahmed, S. (2023). The Feminist Killjoy Handbook. Allen Lane. Retrieved from: www.penguin.co.uk/books/454793/the-feminist-killjoy-handbook-by-ahmed-sara/9780241619537

  • Allen, J. S. (2021). There’s a Disco Ball between Us: A theory of black gay life. Duke University Press.

  • Amoore, L. (2020). Cloud Ethics: Algorithms and the attributes of ourselves and others. Duke University Press.

  • Arnstein, S. R. (1969). A ladder of citizen participation. Journal of the American Institute of Planners, 35(4): 216–224.

  • Benjamin, R. (2016). Informed refusal: Toward a justice-based bioethics. Science, Technology, and Human Values, 41(6): 967–990.

  • Benjamin, R. (2019). Race After Technology: Abolitionist tools for the New Jim Code. Polity.

  • Binfield, K. (ed) (2004). Writings of the Luddites. Johns Hopkins University Press.

  • Bucher, T. and Helmond, A. (2018). ‘The Affordances of Social Media Platforms.’ In: J. Burgess, T. Poell, and A. Marwick (eds) The SAGE Handbook of Social Media, pp 233–253. SAGE.

  • Cate, F. H. (2006). The failure of fair information practice principles: Consumer protection in the age of the information economy. Retrieved from: https://papers.ssrn.com/sol3/papers.cfm?abstract_id=1156972

  • Cohen, J. E. (2019). Between Truth and Power: The legal constructions of informational capitalism. Oxford University Press.

  • Coleman, G. (2017). From internet farming to weapons of the Geek. Current Anthropology, 58(Supplement 15): S91–S102.

  • Crawford, K. (2021). Atlas of AI. Yale University Press.

  • D’Ignazio, C. (2024). Counting Feminicide: Data feminism in action. MIT Press.

  • Davis, J. L. (2020). How Artifacts Afford: The power and politics of everyday things. MIT Press.

  • Dencik, L., Hintz, A., Redden, J., and Treré, E. (2019). Exploring data justice: Conceptions, applications and directions. Information, Communication & Society, 22(7): 873–881.

  • Draper, N. A. and Turow, J. (2019). The corporate cultivation of digital resignation. New Media & Society, 21(8): 1824–1839.

  • Emami-Naeini, P., Dheenadhayalan, J., Agarwal, Y., and Cranor, L. F. (2021). An informative security and privacy ‘nutrition’ label for internet of things devices. IEEE Security & Privacy, 20(2): 31–39.

  • Eubanks, V. (2017). Automating Inequality: How high-tech tools profile, police, and punish the poor. St Martin’s Press.

  • European Commission (2016). Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation). Retrieved from: https://eur-lex.europa.eu/eli/reg/2016/679/oj

    • Search Google Scholar
    • Export Citation
  • Fotopoulou, A. (2019). ‘Understanding Citizen Data Practices from a Feminist Perspective: Embodiment and the Ethics of Care.’ In: H. Stephansen and E. Trere (eds) Citizen Media and Practice: Currents, connections, challenges, pp 227242. Taylor & Francis/Routledge.

    • Search Google Scholar
    • Export Citation
  • Gray, J. (2018). Three aspects of data worlds. Krisis: Journal for Contemporary Philosophy, 1: 417.

  • Habib, H., Pearman, S., Wang, J., Zou, Y., Acquisti, A., Cranor, L. F., et al (2020). ‘It’s a scavenger hunt’: Usability of websites’ opt-out and data deletion choices. Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems, pp 112.

    • Search Google Scholar
    • Export Citation
  • Halsema, F. (2022). Raadsinformatiebrief: Recente media-aandacht voor de Top400-aanpak. Gemeente Amsterdam. Retrieved from: https://open.overheid.nl/documenten/ronl-8f2884ba05c31a13f2a7e463932502d81289fde7/pdf

    • Search Google Scholar
    • Export Citation
  • Hargittai, E. and Marwick, A. (2016). ‘What can I really do?’ Explaining the privacy paradox with online apathy. International Journal of Communication, 10: 37373757.

    • Search Google Scholar
    • Export Citation
  • Heeks, R. and Shekhar, S. (2019). Datafication, development and marginalised urban communities: An applied data justice framework. Information, Communication & Society, 22(7): 9921011.

    • Search Google Scholar
    • Export Citation
  • Helpful Places (nd). Digital Trust for Places & Routines. Retrieved from: https://dtpr.io

  • Hintz, A., Dencik, L., and Wahl-Jorgensen, K. (2019). Digital Citizenship in a Datafied Society. Polity Press.

  • Hintz, A., Dencik, L., Redden, J., Trere, E., Brand, J., and Warne, H. (2022). Civic Participation in the Datafied Society: Towards Democratic Auditing? Research Report. Retrieved from: https://datajusticelab.org/wp-content/uploads/2022/08/CivicParticipation_DataJusticeLab_Report2022.pdf.

    • Search Google Scholar
    • Export Citation
  • Illich, I. (1973/1975). Tools for Conviviality. Fontana.

  • Jansen, F. (2021, August). Critical is not political: The need to (re) politicize data literacy. Seminar.net, 17(2).

  • Jansen, F. (2022). Top400: A top-down crime prevention strategy in Amsterdam. PILP. Retrieved from: https://pilpnjcm.nl/wp-content/uploads/ 2022/11/Top400_topdown-crime-prevention-Amsterdam_v2.pdf

    • Search Google Scholar
    • Export Citation
  • Katzenbach, C. and Ulbricht, L. (2019). Algorithmic governance. Internet Policy Review, 8(4): 118.

  • Kennedy, H. and Moss, G. (2015). Known or knowing publics? Social media data mining and the question of public agency. Big Data & Society, 2(2): 111.

    • Search Google Scholar
    • Export Citation
  • Lannerö, P. (2012, 27 January). Previewing online terms and conditions: CommonTerms alpha proposal. Retrieved from: http://commonterms.org/commonterms_alpha_proposal.pdf

    • Search Google Scholar
    • Export Citation
  • Lehtiniemi, T. and Haapoja, J. (2020). Data agency at stake: MyData activism and alternative frames of equal participation. New Media & Society, 22(1): 87104.

    • Search Google Scholar
    • Export Citation
  • Lomborg, S., Kaun, A., and Hansen, S. S. (2023). Automated decision-making research: Towards a people-centered approach. Sociology Compass, 17(8): e13097. https://doi.org/10.1111/soc4.13097

    • Search Google Scholar
    • Export Citation
  • McDonald, A. M. and Cranor, L. F. (2008). The cost of reading privacy policies. I/S: A Journal of Law and Policy for the Information Society, 4(3): 543568.

    • Search Google Scholar
    • Export Citation
  • Milan, S. and van der Velden, L. (2016). The alternative epistemologies of data activism. Digital Culture & Society, 2(2): 5774.

  • noyb. (2023, 21 July). Projects | noyb.eu. Noyb – None of Your Business. Retrieved from: https://noyb.eu/en/projects

  • Obar, J. A. (2022a). A policy complexity analysis for 70 digital services. Retrieved from: www.biggestlieonline.com/policy-complexity-analysis-2019/

    • Search Google Scholar
    • Export Citation
  • Obar, J. A. (2022b). Unpacking ‘the biggest lie on the internet’: Assessing the length of terms of service and privacy policies for 70 digital services. Retrieved from: https://papers.ssrn.com/sol3/papers.cfm?abstract_id=4293363

    • Search Google Scholar
    • Export Citation
  • Obar, J. A. (2023, March). Consumer Privacy Protection Act could lead to fines for deceptive designs in apps and websites. The Conversation. Retrieved from: https://theconversation.com/consumer-privacy-protection-act-could-lead-to-fines-for-deceptive-designs-in-apps-and-websites-196019

    • Search Google Scholar
    • Export Citation
  • Obar, J. A. and Oeldorf-Hirsch, A. (2018). The clickwrap: A political economic mechanism for manufacturing consent on social media. Social Media + Society, 4(3): 114.

    • Search Google Scholar
    • Export Citation
  • Obar, J. A. and Oeldorf-Hirsch, A. (2020). The biggest lie on the internet: Ignoring the privacy policies and terms of service policies of social networking services. Information, Communication & Society, 23(1): 128147.

    • Search Google Scholar
    • Export Citation
  • Obar, J. A. and Oeldorf-Hirsch, A. (2022). Older adults and ‘the biggest lie on the internet’: From ignoring social media policies to the privacy paradox. International Journal of Communication, 16: 47794800.

    • Search Google Scholar
    • Export Citation
  • OPC (Office of the Privacy Commissioner of Canada) (2020, August). PIPEDA fair information principle 3 – consent. Retrieved from: www.priv.gc.ca/en/privacy-topics/privacy-laws-in-canada/the-personal-information-protection-and-electronic-documents-act-pipeda/p_principle/principles/p_consent/

    • Search Google Scholar
    • Export Citation
  • OPC (Office of the Privacy Commissioner of Canada) (2021, August). Guidelines for obtaining meaningful consent. Retrieved from: www.priv.gc.ca/en/privacy-topics/collecting-personal-information/consent/gl_omc_201805/

    • Search Google Scholar
    • Export Citation
  • Pangrazio, L. and Sefton-Green, J. (2020). The social utility of ‘data literacy’. Learning, Media and Technology, 45(2): 208220.

  • Pateman, C. (1970). Participation and Democratic Theory. Cambridge University Press.

  • Peled, N. (2022). ‘Waarom staat mijn zoon op die lijst?’ Groene Amsterdammer. Retrieved from: www.groene.nl/artikel/waarom-staat-mijn-zoon-op- die-lijst

    • Search Google Scholar
    • Export Citation
  • Pop Stefanija, A. (2023). ‘Power Asymmetries, Epistemic Imbalances and Barriers to Knowledge: The (im)possibility of knowing algorithms.’ In: S. Lindgren (ed) Handbook of Critical Studies of Artificial Intelligence, pp 563672. Edward Elgar Publishing.

    • Search Google Scholar
    • Export Citation
  • Pop Stefanija, A. and Pierson, J. (2023). Algorithmic governmentality, digital sovereignty, and agency affordances: Extending the possible fields of action. Weizenbaum Journal of the Digital Society, 3(2): 130. https://doi.org/10.34669/WI.WJDS/3.2.2

    • Search Google Scholar
    • Export Citation
  • Redden, J. (2022). ‘Data Harms.’ In: Dencik, L., Hintz, A., Redden, J., and Trere, E., Data Justice, pp 5972. SAGE.

  • Reidenberg, J. R., Breaux, T., Cranor, L. F., French, B., Grannis, A., Graves, J. T., et al (2015). Disagreeable privacy policies: Mismatches between meaning and users’ understanding. Berkeley Technology Law Journal, 30(1): 3968.

    • Search Google Scholar
    • Export Citation
  • Ricaurte, P. (2019). Data epistemologies, the coloniality of power, and resistance. Television & New Media, 20(4): 350365.

  • Roosen, M. (2020). What SyRI can teach us about technical solutions for societal challenges. Global Data Justice, 20 February. Available at: http://globaldatajustice.org/2020-02-20-roosen-syri/

    • Search Google Scholar
    • Export Citation
  • Sander, I. (2020). What is critical big data literacy and how can it be implemented? Internet Policy Review, 9(2).

  • Smith, G. (2009). Democratic Innovations: Designing institutions for citizen participation. Cambridge University Press.

  • Srnicek, N. (2017). Platform Capitalism. John Wiley & Sons.

  • Taylor, L. (2017). What is data justice? The case for connecting digital rights and freedoms globally. Big Data & Society, 4(2): 114.

    • Search Google Scholar
    • Export Citation
  • TOS;DR (Terms of Service; Didn’t Read) (2023). Frontpage. Retrieved from: https://tosdr.org

  • Viljoen, S. (2020). Democratic data: A relational theory for data governance (SSRN Scholarly Paper ID 3727562). Social Science Research Network. https://doi.org/10.2139/ssrn.3727562

    • Search Google Scholar
    • Export Citation
  • Vonk, G. (2014). Repressive welfare states: The spiral of obligations and sanctions in social security. European Journal of Social Security, 16(3): 188203.

    • Search Google Scholar
    • Export Citation
  • Young, I. M. (2011). Justice and the Politics of Difference. Princeton University Press.

  • Ziewitz, M. and Singh, R. (2021). Critical companionship: Some sensibilities for studying the lived experience of data subjects. Big Data & Society, 8(2). https://doi.org/10.1177/20539517211061122

    • Search Google Scholar
    • Export Citation
