Algorithmic Violence in Everyday Life and the Role of Media Anthropology
Veronica Barassi
Introduction
On a warm evening in 2018, I sat down in a crowded restaurant in the heart of West Los Angeles.
I had arranged to meet Cara, one of the parents involved in my project on the datafication of
children. That evening we started talking about her experience of surveillance in everyday life, from her use of social media to home technologies like Amazon Echo. Rather than focusing on the issue of surveillance, however, Cara immediately shifted the discussion to tell me how angry and irritated she felt about being sent targeted ads because she had been profiled as a single 50+ woman. She told me that she felt stereotyped and belittled by those practices of digital profiling. Although it was true that she was single and in her fifties, she said that this aspect did not define her because ‘There is so much more to me as a person.’
I met Cara for the first time in 2016, when I launched the Child | Data | Citizen project. It was just one year after I had my first daughter, and I suddenly realized that most of
the families that I met in my daily life shared multiple, almost unimaginable data traces of
children. I launched the project because I was curious to understand how families negotiated
and interacted with the systematic, relentless, and worrying practices of data tracking of their
children. At that time, I was living in London and found myself immersed in the ethnographic
reality of datafied families. But shortly after, my husband was relocated for work to Los Angeles.
Hence I started studying the datafication of family life in both cities dealing with two homes,
two health and education systems, and of course two very different data environments.
For three years (2016–2019) I documented how it felt to live in a world where not only I but also my family and my children were being datafied. I became a participant observer in my own
intercontinental life and started writing fieldnotes about how it felt to have to sign Terms and
Conditions, even if I knew that my consent was somehow coerced and certainly not informed.
I started to observe what other parents were doing in parks, school meetings, children’s parties,
and playdates. This auto-​ethnographic research enabled me to start tackling the lived experience of the datafication of children from a parent’s perspective. In both cities I worked with
families with children between 0 and 13 years of age, whose personal information online –​in
both countries – is governed by the Children’s Online Privacy Protection Act (1998). I carried out 50 semi-structured interviews and eight months of digital ethnography of eight families’ ‘sharenting’ practices, which involved a weekly analysis of the pictures, news, and updates they posted on their social media profiles (most parents posted regularly on Facebook and Instagram, and some on YouTube), as well as observation of how people interacted with the information posted.
Over the last several years the field of media anthropology – which I define as the field built by those scholars and research communities who rely on anthropological knowledge and ethnographic practice to study media and technological processes – has been
preoccupied with the rapid rise of algorithmic logics and data technologies in everyday life.
Some have focused on the rise of big data as a new form of knowledge and meaning construction (Boellstorff, 2013; Boellstorff and Maurer, 2015), others have analysed the powerful
discourses associated with algorithmic logics in culture (Dourish, 2016; Seaver, 2017) or the
multiple ways in which people were negotiating with data technologies in everyday life and
their data narratives (Pink, Lanzeni, and Horst, 2018; Dourish and Cruz, 2018). These works
are extremely insightful because they draw on classical anthropological theory to shed light on
the cultural complexities that have defined the techno-​historical transformation of our times.
In this chapter I want to add to these debates by focusing more specifically on algorithmic
profiling and its impacts on parents. I will draw on some of the ethnographic data that I gathered
during the Child | Data | Citizen project. The chapter will
demonstrate that we can no longer talk about ‘tech-​surveillance’ in everyday family life without
dealing with the question about ‘algorithmic profiling’, and explore how algorithmic profiling
in everyday life makes people feel belittled and objectified, and is often experienced as a form of
violence. I also want to demonstrate that the experience and understanding of algorithmic profiling varies immensely if the participant comes from a privileged position or one of inequality,
this is because algorithmic profiling impacts groups differently.
Over the last decades we have seen many scholars –​outside of the field of media and
digital anthropology –​who have explored and analysed the relationship between surveillance,
algorithmic profiling, and social inequality (Barocas and Selbst, 2016; Madden et al., 2017;
Eubanks, 2018). In the last few years we have also seen scholars referring to concepts such
as data violence (Hoffmann, 2018, 2020) or algorithmic violence (Onuoha, 2018; Safransky,
2020; Bellanova et al., 2021) to explain how algorithms and data feed into specific forms of
violence. Although insightful, these works seem to ignore anthropological theory and its key
understanding that bureaucratic processes play a role in yielding symbolic and structural violence. In this chapter I would like to draw on this theory of bureaucracy and symbolic violence
(Appadurai, 1993; Herzfeld, 1993; Gupta, 2012; Graeber, 2015) to shed light on the relationship between algorithmic profiling and bureaucratic processes and to reflect on the impact of
algorithmic violence in everyday life.
Tech-​Surveillance and the Question about Profiling
One evening in 2018, I drove to Mabel’s house. Mabel worked as a manager in the entertainment industry. She was a single mom and lived in a beautiful house in Pasadena, Los Angeles
with her son –​who at the time of the interview was six years old –​two dogs and a cat. As she
opened her front door, I noticed the camera right above my head. She picked up her phone,
laughed, and told me: ‘Look the app just informed me you are here.’ Mabel’s security camera
was connected to her phone and would inform her if there was movement at the front door
or at the back door of her house. When we sat down in her beautiful living room overlooking
the garden, I asked her if she was ok with me recording her. Influenced by years of ethical
research practice, I told her that if –​at any point in the interview –​she wanted me to turn off
my device she just needed to let me know. During the interview, I asked Mabel to talk about
all the ways in which she thought that her son’s data and her own data were being surveilled.
She started talking about all the data that was being tracked on her social media accounts, her
security system, her fitness apps, and of course Alexa, the voice-​operated virtual assistant of
Amazon that is included in the home hub Amazon Echo. As she mentioned the ‘wake word’
Alexa, the Amazon Echo turned on and started recording our interview. I immediately thought
how paradoxical that simple fact was for ethnographers who are committed to preserving the
anonymity of their participants in the interviews. Mabel noticed it too; she laughed and added:
M: You see we have Alexa, we are surveilled all the time and she records everything. I read
that she also records without the wake word.
V: Does that bother you?
M: No I am not concerned, my life is very boring, but also I am not stupid enough to buy
into the promise of it. So for instance if Alexa surveilles me to decide that I like blue,
I won’t go out and buy blue items.
When Mabel said that she was not stupid enough to buy into the ‘promise of it’, she was
referring to algorithmic profiling: the business logic behind the technologies – which inhabit the homes of many families in the UK and US – whereby the analysis of users’ data traces can be used to predict their desires and future behaviors and sell them new products.
During the Child | Data | Citizen project, which, as mentioned in the introduction, took place between 2016 and 2019 in London and Los Angeles, I walked into many different living
rooms, which varied in style, wealth, and location. Some were in wealthy neighborhoods,
others were in very poor ones. Those living rooms reflected what anthropologist Arturo
Escobar (2018) calls the ‘pluriverse’, which is defined as ‘a place where many worlds fit’.
I too observed a pluriverse of experiences in London and Los Angeles. In my multi-​sited ethnography, I worked with parents who came from a variety of cultural, ethnic, and national
backgrounds. The parents were extraordinarily diverse not only in terms of ethnicity (e.g.
Asian, Latinos, Indian, Black, Indigenous, Multiracial, White), but also in terms of cultural
and national heritage (e.g. Afghani, Mexican, Brazilian, Indian, German, Italian, Hungarian,
Icelandic, Zimbabwean, and Scottish, among others). I also made a genuine attempt to reach parents from different class backgrounds, interviewing parents who worked in low-income jobs (as nannies, cleaners, buskers, or in administration) as well as parents who worked in high-income jobs (as lawyers, film producers, journalists, and marketing professionals). I also came across a plurality of family situations that challenged heteronormativity or the structure of the ‘nuclear family’: I interviewed gay parents, divorced parents juggling complex living arrangements, and single mothers who had chosen to adopt a child. Some of the living
rooms I only visited once, but others became familiar spaces in the unfolding of my project, and
I would return to them over and over again.
Despite the extraordinary variety of settings, locations, and backgrounds, many of the parents
that I worked with shared Mabel’s experience and described a wide range of surveilling technologies in their homes: from social media to apps and virtual assistants. They also talked about
how their children were being surveilled and tracked not only at home, but by a multiplicity
of other technologies and data collection practices that they encountered in their everyday life
such as their online school platforms or the data gathered through their doctor’s office. Some
parents, unlike Mabel, were really concerned and tried to limit the number of technologies
they used in their homes, as they did not want to expose themselves and their children to daily
surveillance.
In 2017, for instance, I found myself sipping coffee in a small and cosy living room in South London with Alina, the mother of two children aged one and four, as she described how everyday life had changed, as well as her feeling that there were no alternatives. Alina had moved
to London from Germany five years before our interview. She had been pregnant or a full-​
time mom since she had arrived in the UK and lived in a low-​income neighborhood in
South-​East London. Alina was particularly worried about the issue of surveillance; she had
been brought up in East Germany. Even though she was very young in 1989, she knew of
the surveillance tactics of the former German Democratic Republic (DDR). As we sipped
our coffee, she started to tell me how uncomfortable she felt with the increased surveillance
in everyday life:
It feels that you have to give up a lot of information to actually get a service; you don’t
really have an alternative. Think about the car insurance you cannot really not buy.
The lack of alternative is concerning, but also the lack of security. It’s everywhere,
they are starting with health care, they are starting with the chips in things, which it
could be a good thing but they can be misused. There are cameras everywhere; you
can’t really escape this transformation.
During my research I met many parents who, like Alina, when talking about surveillance would often relate a lack of alternatives and a feeling of powerlessness. Like Mabel had done, Alina related her worry about the ways in which ‘data was being used to make assumptions and build profiles about her or her children’ and added:
It’s scary. It’s a new fear of our lives. In the 1980s during the Cold War you would
always be worried about an atomic bomb, now you have to be worried about these
things together with other things. It’s too much. You should really stop thinking
about it.
Although Mabel and Alina saw the issue of surveillance in different ways and with different
degrees of worry, both of them could not discuss the issue without thinking about how data
traces were being used to profile them and their children.
A few months after interviewing Alina, I interviewed Dan. His living room was in Central
London, in a large modern flat conversion of an old school. The modernist and minimalist
furniture contrasted with the children’s toys, books, and items of clothing that were scattered
around the room on chairs, sideboards, and an old trunk. Dan was a stay-​at-​home dad. He used
to work in IT for a digital marketing company, but when he was made redundant, he decided
to support his wife’s career and take care of the house and kids. During the interview Dan told
me –​like Mabel and Alina had done –​that he was very well aware of the fact that his family was
being surveilled through a variety of technologies. He also told me that he worried that data
that was being collected from his children today would be used by artificial intelligence (AI)
systems in the future to determine key aspects of their lives including whether they would get
access to a job, a rental accommodation, or a university degree, and then added:
I’d like to think that we can change how it is done, but the world is a big marketing
and profiling machine. I’d like to think that I can protect my children from that, but
I don’t think I can do it.
My research thus led me to the conclusion that it was impossible to explore the issue of tech-surveillance in family life without shedding light on the ways in which families were affected by, and negotiated with, algorithmic profiling. In particular, this chapter
moves beyond prior scholarly emphases on ‘big data’ to understand the violence that occurs in
everyday lived experiences. It concludes by calling for deeper exploration of the connection
between anthropological theory on bureaucracy and the structural violence of algorithmic
profiling.
‘There is so much more to me as a person’: Everyday Negotiations with
Algorithmic Profiling, Human Reductionism, and Inequality
The evening that I interviewed Cara and she told me how she was being profiled as ‘50+ and single’, she said that this annoyed her because ‘there was so much more to her as a person’. She also felt that the data trackers on the internet were like ‘gossipers’, and added:
When others talk about you, when people seem to infer something about you on
the basis of a certain information or rumor, then that is wrong, it feels like gossiping.
When I get targeted for a search I have done on Google it feels exactly like that; like
someone has been gossiping about me.
If Cara talked about data trackers as gossipers, Amy, another friend I met through the project,
talked about the meanness of algorithmic profiling and the fact that data companies seemed to
judge and define people on the basis of their weaknesses. She told me, for instance, that she
was trying to lose weight and found it demeaning and unforgiving that every time she went
on Facebook she was offered a new diet or plus-​size clothing. She explained that she knew
she was overweight and was trying to lose weight, but the fact that online data trackers kept
reminding her of this hurt her feelings. She told me, as Cara had done, ‘that there is more to
her as a person’ than her weight.
The fact that Cara and Amy used the same sentence (‘there is more to me as a person’)
reveals a fundamental aspect of the experience of algorithmic profiling: the problem of human
reductionism.
Media anthropologists have long argued that one of the problems behind the rapid rise in the use of big data and algorithms to profile individuals is the belief that ‘data is raw’ and objective and that large amounts of personal data ensure a good understanding of people’s practices, beliefs, and desires. In fact, this is very far from being true. There is no such thing as raw data, because data collection itself involves processes of narration and framing (Gitelman, 2013;
Boellstorff and Maurer, 2015; Dourish and Cruz, 2018). Boellstorff (2013) has demonstrated
that, in anthropology, the debates about data not being raw have been influenced by Levi-​
Strauss’ distinction between raw, cooked, and rotted data (Levi-​Strauss in Boellstorff, 2013: para
52) and Geertz’s reference to the ethnographic algorithm to discuss the work of interpretation
that comes from processes of data collection (Geertz in Boellstorff, 2013: para 56). In addition to the fact that data is neither raw nor objective, most of the data collected from people today is systematically taken out of context, and thus the gathering of large amounts of personal data is not necessarily an indicator of quality, especially when we are trying to understand humans (Boellstorff, 2013). Although not a media anthropologist, Costanza-Chock (2018)
has discussed this problem at length and has argued that human identity and experience are
violated and belittled by binary data systems and computer reductionism as they do not take
into account the variety and complexity of human existence.
Our technologies are not designed to take into account human variety and complexity.
A key example, which is close to the heart at the time of writing, can be found in COVID-19 contact-tracing apps. In her fascinating work, Milan (2020) has shown that most of these
apps are based on a ‘standard’ experimental subject that hardly allows for exploring the role
of variables such as gender, ethnicity, race or low income. Both Costanza-​Chock (2018)
and Milan (2020) show how the roots of this reductionism stem from design practice itself.
It is for this reason that the anthropologist Escobar (2018) has advanced a new vision for
design theory, one that takes into account the complex and intersectional pluriverse we
live in.
It is by understanding the human reductionism implicit in processes of algorithmic profiling that we can shed light on why – in everyday life – these processes are perceived as belittling, with people like Cara and Amy arguing that ‘there is so much more to them as a person’. One aspect that surprised me during my research, however, was the multiple ways in which people negotiated and resisted algorithmic reductionism. Cara told me,
for instance, that she would ‘play the algorithm’ and that many times she would consciously
choose not to like someone’s post on Facebook, even if she liked it, because she realized that
if she did not like things, her news feed became much more ‘democratic’ and open to chance
rather than likes. She also told me that she often tried to create a ‘happy day on Facebook’ for
herself. She had different tricks to do this: either she would start liking the photos of animals
posted by her friends, and she would get bombarded with cute animal feeds on Facebook, or if
she was having a bad day, she would do a web search for a ‘2-​bedroom house in Fiji’ over and
over again ‘just to be targeted with beautiful advertising of amazing places’.
What emerged clearly from my research was that, as I have also mentioned elsewhere (Barassi, 2020), there was a fundamental difference between the ways in which families in a position of inequality and families in a position of privilege thought about algorithmic profiling and surveillance. Mabel, for instance, talked about algorithmic profiling only with reference to targeted ads,
whilst Mariana, a Mexican immigrant who worked as a cleaner and lived in Los Angeles with
her four children, told me during the interview that:
You have to be aware of the technologies, because, you are also checked by the government, when you pass the border, they check it and they can push you back. We
are being checked by everyone, insurances, doctors, police, everyone knows what we
do as a family, where we go, what we eat.
Mariana was particularly worried about how immigration enforcers used that information to make entry decisions, and related the story of her sister, who had been refused a visa because border control had seen from her profile that she had too many family members in the US and doubted that she was just visiting.
Lina, who had migrated from Latin America to the UK over ten years earlier, shared the same worry. She lived in a small apartment on the top floor of a housing estate in one of the most deprived
areas of South London. She shared the apartment with her two daughters, an eight-​year-​old
and a teenager, and her husband. Both she and her husband were highly educated and aware of
the transformations that were taking place when it came to data surveillance. In the interview,
Lina told me:
When I think about all this surveillance I feel as if I were an object, like I was being
constantly objectified. We do not have a choice, you don’t have privacy, you don’t
have anything. I feel as if I am being belittled, minimized, and invaded. I feel little
–​how else can I explain it? I feel that it is too big for me, I can’t fight it. I can’t defend
myself. I am completely powerless. I feel as if I am being used, because they could do
whatever they want with your data and turn it against you.
These findings are not new or surprising. In fact, over the last five years different researchers, outside the field of media anthropology, have shown that marginal communities are more exposed to the injustices of tech-surveillance and algorithmic profiling (O’Neil, 2016; Barocas and Selbst, 2016; Eubanks, 2018). What is becoming clear is that data technologies
and automated systems are not equal or fair, and the experience of data harms depends on
one’s position in society. This emerges clearly in the work of the legal scholar Gilman (2012)
who shows that the poor are more exposed to privacy intrusions by government surveillance
and other agents, and that current privacy law does not address the disparity of experience.
Marginal communities are more exposed to privacy intrusion and data harms, because in their
everyday life they are subjected to systemic surveillance and discrimination. In addition to
this, as Madden et al. (2017) have rightly argued, poor and marginal communities are exposed
to ‘networked privacy harms’, because they are held liable for the actions of those in their
networks and neighborhoods.
Yet in the data something more emerged: algorithmic profiling, because of its reductionism
and intrusiveness, was often perceived as a form of violence especially by people like Lina and
Mariana who came from a disadvantaged position in life. Over the last few years some scholars,
outside of anthropology, have focused on the notion of violence when reflecting on the impact
of data technologies and algorithmic logics. Hoffmann (2018) argues that when we think about
the inequality of algorithmic profiling and automated systems we need to talk about ‘data violence’ to understand the many ways in which these systems reinforce existing forms of structural violence against marginal communities and the poor. In contrast to Hoffmann (2018,
2020), during my research I often referred in my notes to the analytical and methodological
idea of algorithmic violence instead of data violence to describe why people like Lina and Mariana
felt violated and harmed by algorithmic profiling. I understood algorithmic violence, defined
by Onuoha (2018), as the violence that an algorithm or automated system inflicts on people
which –​like other forms of violence –​encompasses everything from micro-​occurrences to life-​
threatening realities. It can materialize itself in political economic structures, as Safransky (2020) shows with research on smart cities, or as Bellanova et al. (2021) portray in numerous examples such as drone attacks planned by tracking mobile phones, anti-riot police using face recognition, AI systems used for warfare and international politics, immigration agents using Twitter,
and algorithmic profiling used for security intervention. Although in the understanding of
algorithmic violence I believe it is pivotal to focus on structures of power, I also believe that it
is essential to explore how algorithmic violence has become, for people like Lina and Mariana,
but also for Mabel and Amy, an everyday sensory reality. In my theoretical and analytical use
of the term algorithmic violence, therefore, I am more concerned with the way in which this
violence was experienced and negotiated on a daily basis.
There is something more that sets my approach apart from current positions on algorithmic violence. In fact, I was surprised to notice that – although Onuoha (2018) briefly mentions the work of Graeber on bureaucracy – none of the articles I read mention anthropological theory when they suggest that it is important to refer to the social sciences in understanding the violence of our algorithmic cultures. This lack of engagement with anthropological theory means that they overlook the fact that algorithmic logics are tightly linked with bureaucratic processes and hence with symbolic and structural violence as understood in anthropology.
As we shall see below, the anthropological literature on bureaucracy and symbolic violence
(Appadurai, 1993; Herzfeld, 1993; Gupta, 2012; Graeber, 2015) is pivotal if we really want to
understand the violence of data and algorithmic logics in everyday family life.
Algorithmic Violence, Bureaucracy, and the
Role of Anthropology
We cannot understand the rise of big data and the cultural logics of algorithms without
considering a key economic transformation that happened over the last few decades, which
has transformed our cultures and institutions globally. In her work, Zuboff (2015, 2019), for
instance, talks about the rise of a new economy of surveillance capitalism. She argues that it was
Google that played a fundamental role in the emergence of surveillance capitalism, when in
2002 the company discovered behavioral surplus. The company, according to Zuboff (2019), played a role very similar to the one that the Ford Motor Company and General Motors played in the establishment of industrial capitalism. This is because, according to Zuboff (2019), Google
has not only introduced a new economic logic which revolved around data extraction, accumulation, and analysis, but the discovery of behavioral surplus has affected human practices and
behaviors, restructured institutions, and transformed everyday life.
In anthropology this change has been theorized by David Graeber (2015). Graeber never discussed the ‘turn to data’ as the rise of a new economic model, or as the emergence of a new age of surveillance capitalism as Zuboff does. His analytical eye did not focus on disruption and novelty, but rather on the dialectical relationship between continuity and change. In his collection of essays, The Utopia of Rules: On Technology, Stupidity, and the Secret Joys of Bureaucracy (2015), he shows that what paved the way for today’s environment is a structural transformation of corporate bureaucracy away from workers, towards shareholders, and eventually towards the financial structure as a whole. This led to a double movement of a sort. On the
one hand, corporate management became more financialized; on the other hand, the financial sector became more corporatized, and as a result the investor and executive class became
indistinguishable, and hence numbers, measures, and bureaucratization became associated with
value production.
One of the most fascinating aspects of David Graeber’s theory of transformation in corporate bureaucracy is that he shows how this led to a broader cultural transformation whereby
bureaucratic techniques (e.g. performance reviews, focus groups, and time allocation surveys)
that were developed in the financial and corporate sector invaded different dimensions of
society –​education, science, government –​and eventually pervaded every aspect of everyday
life (Graeber, 2015: 19–​21).
Graeber believed that we had seen the establishment of a ‘culture of evaluation’. He argues
that much of what bureaucrats do is to ‘evaluate things’ as ‘they are continually assessing,
auditing, measuring, weighting the relative merits of different plans, proposals, applications etc.’
and of course constantly evaluating human beings (Graeber, 2015: 41). This culture of evaluation, he believes, is not only the product of financialization but the continuation of it since
‘what is the world of securitized derivatives, collateralized debt obligations, and other such
exotic financial instruments but the apotheosis of the principle that value is ultimately a product
of paperwork’ (Graeber, 2015: 42).
These processes of evaluation and documentation function as modern rituals. Anthropologists
have long studied human rituals and focused precisely on those symbolic acts or phrases that
defined social reality, for example a phrase such as ‘I pronounce you husband and wife’.
According to Graeber (2015: 49–​50), in our societies, documents and the bureaucratic process
function as rituals because they make things socially true. For example, we are not citizens of a
nation if we do not have a passport, nor are we experts on a subject without a diploma, among many other examples.
Around the 2000s, this bureaucratic process that is so fundamental to our societies was
quickly digitized and given over to computers. In addition, at the turn of the 2000s something
else happened. On the one hand, thanks to the advent of new technologies like social media
or apps, the amount of personal information that could be collected, correlated, and used to
profile people increased dramatically. For example, on a single day in 2019, according to one
study, 350 million photos were posted on Facebook and 500 million tweets sent (Crawford,
2021: 106). On the other hand, developments in big data and artificial intelligence have led to an expansion of the profiling technologies used by governments into all dimensions of everyday life (Elmer, 2004; Kitchin, 2014).
In the last decade, profiling technologies such as predictive analytics have started to be used
to gather as much data about an individual as possible from different sources (e.g., family history, shopping habits, social media comments) and aggregate this data to make decisions about
individuals’ lives. These technologies are used everywhere. Banks use them to decide on loans,
insurance companies use them to decide on premiums, and recruiters and employers use them
to decide whether or not a person is a good fit for a job. Even the police and courts use them,
to determine if an individual is a potential criminal or if there is a risk that an offender will
repeat a crime.
Algorithmic profiling, like any form of bureaucracy, is defined by forms of symbolic violence, because it pigeon-​holes, stereotypes, and detaches people from their sense of humanity.
As Herzfeld (1993) would argue, bureaucracy is based on the ‘social production of indifference’
by which bureaucrats insulate themselves from social suffering. Yet there is something more
at stake. Bureaucratic systems emphasize numbers and rationality over individual lives and the
unpredictability of human experience. They have often been used as tools of social oppression
and control. Appadurai (1993), for example, demonstrated that in the British colonial imagination the numbers and classifications of population censuses were used as a form of control and
imposition of a colonial and racist ideology.
In his work on bureaucracy, Graeber (2015) was particularly concerned with the relation between bureaucracy and violence. In The Utopia of Rules he draws on the feminist anthropological literature and on a rereading of the concept of structural violence to argue that the bureaucratization of everyday life is always built not only on symbolic violence, but also on some ‘threat’ of physical violence. The threat of physical violence, he believes, is present everywhere, but we have become so used to it that we no longer see it. It is embodied in the many security guards, cameras, technologies, and enforcers entering different areas of social life, from schools to parks and public spaces, who are there to remind us that we have to stick to the rules or have the right papers. The violence of bureaucratization can be perceived not only as the threat of physical violence but also as ‘a near-total inequality of power between the bureaucratic structure and individuals’ (Graeber, 2015: 59–60).
According to Graeber (2015), historically the everyday experience of bureaucratic violence
is different for the poor or marginal communities, because they have constantly been exposed
to continued surveillance, monitoring, auditing, and to the lack of interpretative work of the
bureaucratic machine. As Gupta (2012) shows, the violence of the bureaucratic machine is not
arbitrary in the sense that it does not affect everyone in the same way. This is because through
classifications, rules, and systems the bureaucratic machine reinforces the structural inequality
of a given society. Gupta’s ethnographic work focused on India and the postcolonial state; one example that he uses is the fact that any application submitted by a woman in a bureaucratic office needs to indicate the name of her father or husband, a simple requirement that not only
reinforces and institutionalizes the patriarchal order but also normalizes heterosexual relations
(2012: 26).
As media anthropologists, when we think about the rise of algorithmic violence we cannot
fail to engage with the anthropology of bureaucracy because it clearly shows us that structural
violence and human suffering are a fundamental aspect of our data-​driven societies.
Conclusion
Over the last several years the field of media anthropology has explored the impact of algorithmic
logics and data technologies in everyday life. Whilst much needed attention has been placed on
the powerful discourses associated with algorithmic logics and data in society (Dourish, 2016;
Seaver, 2017; Boellstorff, 2013; Boellstorff and Maurer, 2015), media anthropological research
on how data technologies, flows, and narratives are experienced and negotiated in everyday
lives is still limited (Pink, Lanzeni, and Horst, 2018). In this chapter I decided to focus on these
processes of negotiation to demonstrate that we can no longer talk about ‘tech-​surveillance’
in everyday family life without dealing with the question about ‘algorithmic profiling’, and
explore how algorithmic profiling in everyday life makes people feel belittled and objectified,
and is often experienced as a form of violence. In the last few years we have also seen scholars
referring to concepts such as data violence (Hoffmann, 2018, 2020) or algorithmic violence
(Bellanova et al., 2021) to explain how algorithms and data feed into specific forms of violence.
Although insightful, as this chapter has shown, these approaches fail to engage with anthropological theory and hence with the understanding that there is a clear interconnection between
algorithmic profiling, bureaucratic processes, and symbolic and structural violence. The aim of
this chapter was to shed light on this relationship and how this is experienced in everyday life.
Yet there is much more work to do. At present the research on algorithmic violence,
including mine, is based on US and western-​centric understandings of data inequality. The
anthropology of bureaucracy and structural violence teaches us that, although bureaucratic
processes may be similar in different cultural contexts, they sustain and amplify culturally specific inequalities and injustices. Media anthropologists have much work to do when it comes to
understanding algorithmic violence in everyday life. They can shed light on the multiple and
complex ways in which algorithmic logics intersect with local understandings of violence and
on context-​specific processes of meaning construction and negotiations.
References
Appadurai, A. (1993) ‘Number in the Colonial Imagination.’ In Orientalism and the Postcolonial Predicament: Perspectives on South Asia (pp. 314–339). Philadelphia: University of Pennsylvania Press.
Barassi, V. (2020) Child Data Citizen: How Tech Companies Are Profiling Us from Before Birth. Cambridge, MA: MIT Press.
Barocas, S. and Selbst, A. D. (2016) Big Data’s Disparate Impact (SSRN Scholarly Paper ID 2477899). Social Science Research Network.
Bellanova, R., Irion, K., Lindskov Jacobsen, K., Ragazzi, F., Saugmann, R., and Suchman, L. (2021) ‘Toward a Critique of Algorithmic Violence.’ International Political Sociology, 15(1): 121–150. https://doi.org/10.1093/ips/olab003.
Boellstorff, T. (2013) ‘Making Big Data, in Theory.’ First Monday, 18(10). http://firstmonday.org/ojs/index.php/fm/article/view/4869.
Boellstorff, T. and Maurer, B. (Eds.) (2015) Data: Now Bigger and Better! Chicago, IL: Prickly Paradigm Press.
Costanza-Chock, S. (2018) ‘Design Justice, A.I., and Escape from the Matrix of Domination.’ Journal of Design and Science, MIT. https://jods.mitpress.mit.edu/pub/costanza-chock/release/4.
Crawford, K. (2021) Atlas of AI: Power, Politics, and the Planetary Costs of Artificial Intelligence. New Haven: Yale University Press.
Dourish, P. (2016) ‘Algorithms and Their Others: Algorithmic Culture in Context.’ Big Data & Society, 3(2): 1–11. https://doi.org/10.1177/2053951716665128.
Dourish, P. and Gómez Cruz, E. (2018) ‘Datafication and Data Fiction: Narrating Data and Narrating with Data.’ Big Data & Society, 5(2): 1–10. https://doi.org/10.1177/2053951718784083.
Elmer, G. (2004) Profiling Machines: Mapping the Personal Information Economy. Cambridge, MA: MIT Press.
Escobar, A. (2018) Designs for the Pluriverse: Radical Interdependence, Autonomy, and the Making of Worlds. Durham: Duke University Press.
Eubanks, V. (2018) Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor. New York: St. Martin’s Press.
Gilman, M. E. (2012) The Class Differential in Privacy Law (SSRN Scholarly Paper ID 2182773). Social Science Research Network.
Gitelman, L. (2013) ‘Raw Data’ Is an Oxymoron. Cambridge, MA: MIT Press.
Graeber, D. (2015) The Utopia of Rules: On Technology, Stupidity, and the Secret Joys of Bureaucracy. Brooklyn: Melville House.
Gupta, A. (2012) Red Tape: Bureaucracy, Structural Violence, and Poverty in India. Durham: Duke University Press.
Herzfeld, M. (1993) The Social Production of Indifference. Chicago: University of Chicago Press.
Hoffmann, A. (2018) ‘Data Violence and How Bad Engineering Choices Can Damage Society.’ Medium. https://medium.com/s/story/data-violence-and-how-bad-engineering-choices-can-damage-society-39e44150e1d4.
Hoffmann, A. L. (2020) ‘Terms of Inclusion: Data, Discourse, Violence.’ New Media & Society, 23(12): 3539–3556. https://doi.org/10.1177/1461444820958725.
Kitchin, R. (2014) The Data Revolution: Big Data, Open Data, Data Infrastructures and Their Consequences. 1st edition. Los Angeles, CA: Sage.
Madden, M., Gilman, M., Levy, K. and Marwick, A. (2017) ‘Privacy, Poverty, and Big Data: A Matrix of Vulnerabilities for Poor Americans.’ Washington University Law Review, 95(1): 53–125.
Milan, S. (2020) ‘Techno-solutionism and the Standard Human in the Making of the COVID-19 Pandemic.’ Big Data & Society, 7(2): 1–7. https://doi.org/10.1177/2053951720966781.
O’Neil, C. (2016) Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy. New York: Crown Archetype.
Onuoha, M. (2018) Notes on Algorithmic Violence. https://github.com/MimiOnuoha/On-Algorithmic-Violence.
Pink, S., Lanzeni, D. and Horst, H. (2018) ‘Data Anxieties: Finding Trust in Everyday Digital Mess.’ Big Data & Society, 5(1): 1–14. https://doi.org/10.1177/2053951718756685.
Safransky, S. (2020) ‘Geographies of Algorithmic Violence: Redlining the Smart City.’ International Journal of Urban and Regional Research, 44(2): 200–218.
Seaver, N. (2017) ‘Algorithms as Culture: Some Tactics for the Ethnography of Algorithmic Systems.’ Big Data & Society, 4(2): 1–12. https://doi.org/10.1177/2053951717738104.
Zuboff, S. (2015) ‘Big Other: Surveillance Capitalism and the Prospects of an Information Civilization.’ Journal of Information Technology, 30(1): 75–89.
Zuboff, S. (2019) The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power (1st edition). New York: Public Affairs.