
Reading Surveillance through a Gendered Lens: Some Theory

Dr. Anja Kovacs - February 2017
 
 


In the introduction to this site, I explained why we decided to look at surveillance through a gendered lens. But once you have decided to do so, how do you gender surveillance? Two different yet connected frameworks have been especially valuable for our work. In the following two sections, I will share those theoretical insights, and the questions they force us to ask. In the final section, I will then look at the implications of these questions, and of the practice of gendering surveillance, for the struggle against the harms of surveillance more broadly. As will become clear, if we are to counter the harms of surveillance effectively, fighting for stronger privacy protections alone will not be enough.

1. The Two Dimensions of Surveillance

First, it is crucial to realise that surveillance is done for two different reasons. The first is to monitor what you have done or are currently doing - that’s what happens, for example, when law enforcement has a strong suspicion that you might be engaging in criminal activity. This dimension is usually the one referred to when we discuss surveillance as a tool to maintain security. It is what most people have in mind when they say, ‘you have nothing to fear if you have nothing to hide’. It is also the dimension that most frequently gets the spotlight when we talk about human rights violations relating to surveillance - the Snowden revelations, for example, are all about violations involving this dimension of surveillance.

Many democratic societies had, over the years, developed a fairly stable consensus about the circumstances in which, and the extent to which, monitoring is acceptable - and this consensus generally held that domestic scrutiny, at least, should be quite specific, in terms of both who is watched and when (Lyon 2003). However, the Internet and digital technology have now shattered this consensus almost everywhere in the world. As governments engage more and more in mass surveillance, often simply because they can, human rights activists argue, among other things, that they are criminalising everyone. Even those who are not suspected of any crime now fall under the gaze of the state, and this is contrary to international human rights law. Since the Snowden revelations, this part of the debate has received a fair amount of attention.

However, there is a second dimension to surveillance practices, one that doesn’t get nearly as much attention: surveillance can also shape what you will do in the future. That is because surveillance can incentivise certain kinds of behaviour, and discourage others. In that sense, as Jasbir Puar has noted, surveillance is pre-emptive: it seeks to control now, so that it can avoid having to repress later (Puar in conversation with West 2014). And it is also productive, because it actually makes people do certain things. Just think of how carefully many of us tweak how we are perceived online, for example - indeed, very few of us are immune to the disciplinary power of the collective cybergaze (’That picture? No, don’t put up that picture!?! I look horrible in it!!’). The disciplinary dimensions of surveillance - perhaps most eloquently expounded by Michel Foucault as early as 1975 (Foucault 1995) - pre-date the big data era; the fascist political systems of the twentieth century, for example, amply exploited this potential, creating subjects who both disciplined themselves and closely watched and scrutinised others (and, all too often, reported them to the authorities).

But with big data, the possibilities for shaping people’s behaviour have become ever more comprehensive. Perhaps you remember, for example, how Facebook noticed that all of us had started to share fewer personal updates - which are so important to Facebook’s business model - and then, in June 2016, announced that it had tweaked its algorithm in an effort to get us to share more again (Wagner 2016). That is the power that big data has to shape what we do.

There are even more insidious examples, in the form of what is known as ‘social sorting’: sorting people ‘into categories, assigning worth or risk, in ways that have real effects on their life-chances’ (Lyon 2003: 1). For example, more and more large companies use personality tests to sort through the large number of job applications they receive, even though plenty of studies have shown that such tests are a highly unreliable indicator of job performance. Yet if you don’t ‘crack’ that test, there is no future for you in these businesses. In this way, opaque algorithms that use proxies for what they claim to measure, and that have inadequate-to-no feedback loops, increasingly end up deciding our fate at crucial junctures in our lives (O’Neil 2016).
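To make this mechanism concrete, here is a minimal, purely illustrative sketch in Python of proxy-based sorting with a hard cut-off and no feedback loop. The features, weights and threshold are invented for illustration and do not describe any real screening system.

    # Purely illustrative sketch of proxy-based 'social sorting' in hiring.
    # All features, weights and the cut-off below are invented; they do not
    # describe any real screening system.

    def personality_score(answers: dict) -> float:
        # Score an applicant from test answers that act as proxies
        # (e.g. self-reported sociability) rather than measuring job
        # performance directly.
        weights = {"sociability": 0.5, "optimism": 0.3, "conformity": 0.2}
        return sum(weights[k] * answers.get(k, 0) for k in weights)

    def screen(applicants: list, cutoff: float = 4.0) -> list:
        # A hard threshold: anyone scoring below the cut-off is rejected
        # outright, and no feedback about actual job performance ever
        # flows back into the model.
        return [a for a in applicants if personality_score(a["answers"]) >= cutoff]

    applicants = [
        {"name": "A", "answers": {"sociability": 5, "optimism": 4, "conformity": 3}},
        {"name": "B", "answers": {"sociability": 2, "optimism": 5, "conformity": 5}},
    ]
    print([a["name"] for a in screen(applicants)])  # only 'A' gets through

Because the score never observes how rejected applicants would actually have performed, the system has no way of discovering that its proxies are wrong - which is precisely the missing feedback loop O’Neil describes.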

Moreover, surveillance might operate in a more and more dispersed manner in the digital age, but it has certainly not become more democratic: who receives discipline and punishment, and who is deemed worthy of pleasure and intimacy, remains distributed in deeply uneven ways (Puar in conversation with West 2014). This is perhaps even more so where surveillance takes the shape of social sorting: we are not even all subjected to the same gaze now, but are gazed at differently depending on which box the algorithm has decided to slot us into. And where the assumptions on which these algorithms are based are biased, these ‘weapons of math destruction’, as Cathy O’Neil has called them, frequently serve only to further entrench inequalities (O’Neil 2016).

There are three further points to take away, then, from this discussion of surveillance’s avatars in the digital age. First, surveillance does not only take the form of watching and screening bodies and identities. It is also about identifying, tracking, monitoring, tabulating and analysing data (Monahan 2009; Puar in conversation with West 2014). Realising this is crucial if we are to fully grasp the harms of surveillance in the digital age.

Second, these surveillant effects are also produced by many systems that do not have surveillance as their primary goal, such as the data-gathering machines of the big Internet companies. The uses and effects of those machines are nevertheless of a surveillant nature (Monahan 2009).

And third, whether or not you are put under surveillance is less and less within your control. In the past, not committing any crimes could get you a long way. But today, your mere presence in a place, or simply having provided one or more data points, is enough for you to be drawn into yet another network of control. Unless you stop participating in modern public life altogether, surveillance is now very hard to escape.

2. Three Intersections of Gender, Technology and Surveillance

How does gender fit into surveillance in its dual dimensions? Torin Monahan’s (2009) conceptual categories of gender and technology design help to get a sense of the different ways in which this is possible. Monahan argues that there are three ways in which technologies have gendered outcomes. The first one is through body discrimination, which happens when technology privileges a certain type of person over others, effectively treating everyone who does not fit the norm as deviant. As a consequence, the outcomes that these technologies trigger are far less predictable for some people than for others.

For example, the whole-body imaging technologies that are now used in many airports around the world to screen passengers are often represented as objective and neutral - after all, for these technologies colour does not exist. Yet as the aim of using such technologies is precisely to police non-normative bodies, some people are far more likely than others to be treated as a potential threat, and thus to be singled out for secondary screening (Magnet and Rodgers 2012). Bodies that may be rendered ‘deviant’ apparently include those of obese people, who could supposedly hide weapons ‘between folds of fat and flesh’! (Jen Phillips quoted in Magnet and Rodgers 2012).

The second way in which technologies have gendered outcomes is through context or use discrimination. As Monahan explained in earlier work:

'Because technologies are underdetermined, meaning that they take on values from the context of their use, existing conditions of inequality inflect technologies and technological systems, reproducing unequal social orders' (Monahan 2005).

And so, when existing social relationships are already patriarchal, ‘then surveillance (and other) technologies tend to amplify those tensions and inequalities’ (Monahan 2009). When it comes to whole-body imaging technologies, that means that transgender people, for example, are also at a heightened risk of being treated as ‘deviant’ and subjected to additional screenings - and, as a consequence, possibly of being outed in public without their consent (Magnet and Rodgers 2012). The case studies we present on this site offer many additional examples of context or use discrimination.

The third form is discrimination by abstraction, in which we are reduced to data points in databases, disembodied and denuded of our social context: abstract representations of the world - or at least of what those in control of the data consider important in it. It is this that facilitates the control at a distance that is so typical of modern surveillance systems - just think of how central data gathering is to the development and management of so-called ‘smart’ cities, supposedly so much smoother and more efficient in their functioning than the cities most of us actually live in today.

But all too often, people’s social inequalities and experiences are not adequately reflected in the data. The fact that in India, as elsewhere, gender-disaggregated data is unavailable in so many cases makes that all too clear. And things may be even worse when your individual data record is compared with the averages in a database, as many weapons of math destruction do: in those cases, the fact that the inequalities you suffer from might not be purely individual but have a collective dimension can simply disappear from view. And as existing structural inequalities are invisibilised, they end up being further entrenched (O’Neil 2016).
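A purely hypothetical numerical sketch, again in Python, of how comparison against averages can turn a collective inequality into an individual ‘fact’; the salary-prediction rule and all figures are invented for illustration only.

    # Hypothetical illustration only: a model that predicts an applicant's
    # 'expected salary' from the historical average of people 'like' them.
    # All figures are invented.

    historical_average_salary = {"men": 60_000, "women": 48_000}  # encodes a structural pay gap

    def expected_salary(gender: str) -> int:
        # The individual expectation is derived purely from the group
        # average, so the collective disadvantage silently becomes an
        # individual 'fact' about this particular applicant.
        return historical_average_salary[gender]

    print(expected_salary("women"))  # 48000: the pay gap reappears as her 'expected' worth

Nothing in the output signals that the lower figure reflects a structural inequality rather than anything about the individual concerned; that collective dimension has simply disappeared from view.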

The result is that we are sometimes treated as no more than the sum of a few data points, our data taking precedence over who we are as human beings - such as when a person is pulled aside at an airport merely because of their name, their nationality or the colour of their skin.

3. The Implications of Gendering Surveillance: Moving Beyond Privacy towards Social Justice

As we hope the case studies on this site will show, women’s exact location in webs of power and subordination profoundly impacts the extent and nature of their experiences of surveillance. Yet there are important connections in how these experiences are produced. As the above discussion makes clear:

  • surveillance is about relations of power and domination.
  • surveillance almost always reinscribes existing power equations - because it generally aims to control, even eliminate, those who ‘deviate’ from the norm (though this needn’t necessarily be the case: sometimes marginalised people have successfully managed to reverse the gaze, or to appropriate a dual-use technology for their own benefit).
  • surveillance is pre-emptive, productive as well as repressive in its functioning, such that its effects can take many forms.

This means, then, that the effects of surveillance are never merely personal; they are structural. Or, put another way, because of the central role of surveillance in policing and reproducing power relations, surveillance is ‘not merely a matter of personal privacy but of social justice’ (Lyon 2003: 1). For those interested in fighting back against the harms of surveillance, this insight has important consequences. Fighting for strong privacy protections - while necessary - will not be sufficient. Beyond this, fighting back against the harms of surveillance also requires, for example, transparency and accountability on the part of those who develop and implement powerful technologies. It requires limiting, to the extent possible, the possibilities for manipulation of data and algorithms. And it requires grounding technologies once again ‘in social context, embodiment, and place’ (Monahan 2009: 299). Technologies of control at a distance facilitate the naturalisation of inequalities. It is only by putting technologies back into social contexts, and the webs of power relations that underlie them, that the promise of surveillance as empowering - which safety apps, for example, like to claim - can ever possibly be realised.

In this way, gendering surveillance redefines the arguments and strategies that human rights defenders will need to deploy to fight back against the harms of surveillance. If gender has always already been surveilled, bringing it into the heart of the debate on surveillance in the digital age will allow us to move that debate forward in new and profoundly empowering ways.

References

Foucault, Michel (1995). Discipline and Punish: The Birth of the Prison. New York: Vintage Books.

Lyon, David (2003). Introduction. In David Lyon (Ed.), Surveillance as Social Sorting: Privacy, Risk and Digital Discrimination. London: Routledge.

Magnet, Shoshana and Tara Rodgers (2012). Stripping for the State: Whole Body Imaging Technologies and the Surveillance of Othered Bodies. Feminist Media Studies, 12(1): 101-118.

Monahan, Torin (2005). Globalisation, Technological Change, and Public Education. New York: Routledge.

Monahan, Torin (2009). Dreams of Control at a Distance: Gender, Surveillance and Social Control. Cultural Studies ↔ Critical Methodologies, 9(2): 286-305.

O’Neil, Cathy (2016). Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy. UK: Allen Lane.

Wagner, Kurt (2016). Facebook Is Cutting Traffic to Publishers in Favor of User-Generated Content. Recode, 29 June, http://www.recode.net/2016/6/29/12053800/facebook-news-feed-algorithm-change-publisher-traffic.

West, Lewis (2014). Jasbir Puar: Regimes of Surveillance. Cosmologics Magazine, 4 December, http://cosmologicsmagazine.com/jasbir-puar-regimes-of-surveillance/.

 
 
 