Cloud Computing and Concluding Thoughts

Cloud computing was week 9’s topic, but I wanted to write about it in my course review post because I think it captures the main themes of the course. I’m also in a period of post-election blues and feeling hopeless and scared about the future of this country and the rest of the world. David Lametti’s “The Cloud: Boundless Digital Potential or Enclosure 3.0?” was quite a difficult read for me: his fear about the potential for corporate enclosure seemed to outweigh his hope or optimism that the cloud could facilitate a digital commons. This quite accurately reflects my own optimism/pessimism “balance” (or lack of…) about alternative politics at this point in time.

Moreover, in his assessment of how a digital commons might be established, he came down in favour of “a direct form of government control” (i.e. regulation) [229]. I am not against this idea: I merely find it problematic in that it does not solve the corporate/state problem I wrote about in blog 5, whereby governments and corporates are increasingly colluding. For it to work (and be trusted by its users), it would require governments that are not complicit in corporate capitalism and state/corporate surveillance. Unfortunately, most governments (including our own) do not have this independence from corporatism or global surveillance, nor are they taking bold steps to establish such independence. I’m left thinking that structural changes to our political institutions need to take place before a digital commons could be established by the government and trusted by its users. Perhaps the mutually reinforcing relationship between the technological apparatus and the political apparatus actually tilts in favour of the political apparatus.

Lametti’s article reads as a good reminder to be wary of technological solutionism, and particularly of who it is that claims to be offering such solutions. Evgeny Morozov has warned us against the solutionism of Silicon Valley, for example, as he positions its innovations within a wider neoliberal move to turn personal data into a form of payment (“the information economy”). Perhaps the most important take-away from this course, then, is to critically interrogate the political ideologies embodied in our technologies, rather than attaching ourselves to technophobia or techno-scepticism.

My Suggestions for Improving Auckland City

The brief for this week’s blog was to take photos of ubiquitous media encounters around the city. I completely forgot to do that. So, instead I’ve decided to do some drawings based on ideas for how Auckland City might be improved through technologies (gotta love a bit of utopian technological solutionism). I haven’t thought too much about how realistic or practical these ideas are. I wanted to use the creative part of my brain and that required putting the ever-restricting-but-totally-important critical thinking part on mute. Also, please excuse the lack of talent/understanding of depth and perspective: I can’t draw very well, but wanted to give it a go anyway. (Click on the photos to enlarge.)

1. Recycling bins with digital readers that only allow the right kind of recycling through the slots (preventing people from putting rubbish, or the wrong kind of recycling, into the bins). My Mum suggested a mechanical fist should pop out and punch in the face anyone who tries to put rubbish in them. This, to me, seems like a workable suggestion from Ma.


2 (above). AT Hop card-activated bike rentals. Speaks for itself, really. A small charge for bike rental upon “tagging on” that gets credited back to your account when you return the bike and “tag off”.
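Since I’m in solutionist mode, here’s how the deposit logic might work, sketched in a few lines of Python. Everything here – the class, the function names, the $5 figure – is invented for illustration; it’s obviously not how AT Hop actually works:

```python
# Playful sketch of AT Hop bike rentals: a refundable deposit is
# charged when you "tag on" and credited back when you "tag off".

DEPOSIT = 5.00  # invented figure for illustration


class HopCard:
    """A hypothetical stand-in for an AT Hop card's stored balance."""

    def __init__(self, balance):
        self.balance = balance


def tag_on(card):
    """Charge the refundable deposit and unlock a bike."""
    if card.balance < DEPOSIT:
        raise ValueError("insufficient balance for bike deposit")
    card.balance -= DEPOSIT


def tag_off(card):
    """Credit the deposit back when the bike is returned."""
    card.balance += DEPOSIT


card = HopCard(balance=20.00)
tag_on(card)   # balance is now 15.00 while the bike is out
tag_off(card)  # balance is back to 20.00 once the bike is returned
```

The refundable deposit is what nudges people to actually return the bikes rather than abandoning them.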

3. Weather-sensitive bus shelters. Waiting in bus shelters is currently a very uncomfortable experience and could be preventing people from taking the bus (making it easier for them to choose the comfort of the polluting private motor vehicle instead). Bus shelters therefore need to be much more luxurious. I forgot to add a capacity for internal heating in winter months. A crucial mistake on my part. Future sketches will include solar-powered heating facilities.

Bus Shelters

4. Separated cycleways marked by holographic barriers. Holographic barriers would avoid the need to put in physical barriers (like curbing or posts) between cyclists and motorists, but would still look opaque enough to motorists that cyclists will be protected. Holograms should really come in the coolest array of colours possible. Alternatively, they could also be made to look like plants or other “natural” barriers.

Holographic separated cycleways

5. Public art spaces with digital tool capability. I really like the idea of lots of public spaces where anyone can come and make art. I think it would be cool if these spaces allowed people to use both ‘traditional’ art and digital art methods. The spaces would therefore need to have some sort of digital interface capacity. Ideally, once the digital artwork was complete, the tool in use (e.g. Adobe Photoshop) would retract (somehow?) so it could no longer be visible. I can hear whispers about the potential for corporate enclosure through these digital tools (Hello, Adobe!), but I’m ignoring this rude, interruptive behaviour from the part of my brain that’s meant to be quiet today.

Public art spaces

6. Most importantly of all suggestions, the dogspeak app: an app for people who try to avoid human contact as they move through the city but are more than happy to speak to other people’s doggies. I have not copyrighted this ingenious-yet-absolutely-doable invention yet, so if I see this on the market in coming weeks or months, I’ll know that one of you stole it!

Dogspeak app

Potential for Intimacy in Lateral Surveillance?

In his 2013 book “State Violence and the Execution of Law,” Joseph Pugliese describes two different relationships between sensor operators and drone technology. The parenthetical relationship emphasises the instrumentality of drone warfare, including its ability to distance the sensor operator from their technology and from the victim(s) of drone attacks. As earlier blogposts of mine have suggested, I want to reject the idea that the work of the drone operator is easy because the killing is “technologically mediated” [1].

The prosthetic relationship he describes is much more nuanced and could help explain high rates of PTSD. Prosthesis describes the blurring of the (imagined) “boundary” separating human from technology, whereby the human body is extended through the technology in use. The drone operator’s joystick and controls therefore become an extension of his/her arms and hands. Pugliese acknowledges cyborg theory, referencing Donna Haraway’s progressive vision of cyborgs in particular, but wants to re-code the term to “evidence its violent assimilation and co-option” into the “phallogocentric, militaristic and instrumentalist authorities it was designed to contest” [2].

This is where I think Pugliese is too quick to dismiss the potential intimacy between drone operator and victim (and the problem intimacy creates for processes of de-humanisation). In one example, he argues that the operator’s heat-vision monitor (which “merely radiates signs of life”) encourages a clinical relationship between operator and victim, regardless of the prosthesis between human and technology. The heat signature is supposedly an example of antiseptic mediation because it is not a representation of life that we recognise from traditional “unmediated” interaction with others.

Pugliese overlooks how surveillance fosters intimacy, especially in an age of increasing lateral surveillance between friends, love interests and family. Drone operators are usually tasked with surveilling their victims before assassinating them. This activity, although “top-down” in that it is carried out by the military, would be difficult to separate from the peer-to-peer surveillance of everyday life in a ubiquitous media environment. The heat signature, rather than being an antiseptic representation of life, is simply a different “type” of information that connotes life: much like a ‘favourite’ or a ‘retweet’ on Twitter might re-affirm a friendship without any “traditional” conversational forms of interaction taking place.

It occurred to me* that drone operators who develop PTSD might be similar to Wiesler’s character in The Lives of Others: perhaps they started their careers convinced by their country’s military ideology and patriotism but, through surveillance, developed intimacy and empathy for the very people they were supposed to consider enemies. Like drone operators, Wiesler had different “types” of information about those he was surveilling, but was still able to see their humanity and empathise with them.

The Lives of Others


*I’m surprised at how long it took me to make this connection considering The Lives of Others is one of my all-time favourite movies.


[1] Joseph Pugliese, State Violence and the Execution of Law: Biopolitical Caesurae of Torture, Black Sites, Drones (Oxon: Routledge, 2013): 191.

[2] ibid., 204.

Un-American Surveillance and the “Big Brother” Metaphor

“Big Brother” is a useful term in surveillance debates because it provides us with a well-known point of comparison within popular culture to easily communicate our concerns about government control. At the very least, its extremity incites people to be alarmed about the issue rather than quickly dismissing it as a necessary security measure – as is the case with: “if you have nothing to hide, you have nothing to fear”. However, the metaphor is problematic because it can be so easily dismissed by those on the pro-surveillance side as polemical. It connotes the authoritarian governance of 1984, whereby freedom of choice is severely limited and citizens’ lives are controlled exclusively by the government (and not, importantly, by corporations). This doesn’t truthfully describe the power structures of today’s [neo]liberal democracies, nor does it provide us with an accurate representation of how they use surveillance. It is therefore all too easy for those on the pro-surveillance side to frame it as an irrelevant comparison.

1984

This comparative disjuncture is likely what led Steven Rambam to suggest that surveillance is un-American in the video Luke showed us on Monday.* Surveillance absolutely is American, and there’s a reason it gains support from both the political right and the centre-left in the United States (and here in NZ!). We therefore need to use language nuanced enough to illustrate how liberal-democratic nation-states rely heavily upon surveillance technology.

My research on drone technology has been quite an eye-opener on how surveillance is a key component of the nation-state’s apparatus in providing “security” in the “chaos” of world relations.** The surveillance apparatus is part and parcel of the state’s prison apparatus. Both are used to distinguish citizens from non-citizens (you need only look at prisoners’ lack of voting rights to see how citizenship is tied to good, lawful behaviour). James Forman, in his article “Exporting Harshness”, writes that the prison complex has normalised state violence and surveillance and exported them to wherever the nation is fighting at its borders. With this in mind, the development of drone technology has an almost inevitable quality to it: their mobility makes drones perfect for securing the United States’ ever-shifting and contradictory borders, thereby separating citizens from non-citizens as the US sees fit. Drones have the capacity to be both police and military: they can surveil domestically (and are currently used on the US borders with Mexico and Canada) and internationally (protecting US interests overseas).

Border drones


*Interestingly, a little research reveals that Steven Rambam is the CEO of a large private investigative firm that works with the state to track down people wanted for crimes.

**This is quite a realist, cold-war-politics conception of international relations, but it has a lot of currency in postmodernity. Trust is hard to establish (or does not need to be established?) when new, potentially de-stabilising information comes to light with such speed and accessibility.


“Every Move You Make: US to Adopt New Biometric Surveillance System?”:

James Forman Jr., “Exporting Harshness: How the War on Crime Helped Make the War on Terror Possible,” New York University Review of Law and Social Change, vol.33 (2009): 333.

America’s Drone Wars and Technologies of Violence

Obama ISIS announcement

[ U.S President Barack Obama: “I want the American people to understand how this effort will be different from the wars in Iraq and Afghanistan. It will not involve American combat troops fighting on foreign soil. This counterterrorism campaign will be waged through a steady, relentless effort to take out ISIL wherever they exist, using our air power and our support for partners’ forces on the ground.” ]

Last Thursday, Obama outlined his plans for a ‘low-risk’ war, whereby drones will be used exclusively to target and assassinate ISIS rebels in Iraq and Syria. Despite Obama’s assurances, it’s questionable whether drone wars are low-risk, and for whom they can be considered such. The large numbers of civilian casualties during the “war on terror” suggest drone strikes will be high-risk for the civilians of Iraq and Syria (the latter country, especially, is precariously referred to as an ‘intelligence black hole’). Domestically, US soldiers may be spared physical harm in a drone war, but research shows they are at high risk of psychological harm – especially Post-Traumatic Stress Disorder. This is not robot warfare: humans are an integral, but hidden, part of drone technology.

For my research essay, I’ve been looking into the ways technologies of violence are used to reinforce nation-state boundaries. New technologies may be celebrated for their networking capability and globalising effect, but the converse of this is that they provide nation-states with greater opportunities to assert their power and maintain their boundaries on the international stage. The “Othering” of non-citizens is vital to the legitimacy of the nation-state, and this can be achieved through war technologies. The technical-scientific apparatus of Nazi Germany and of the Khmer Rouge regime in Cambodia was a fundamental part of their processes of Othering.

I’m curious as to whether physical distance, coupled with the computer interface, makes the process of Othering easier for drone operators. Suvendrini Perera argues that enemies are made ‘non-human’ by ‘fetishistic devices such as the monstrous, the bestial and the racially abject’ [1]. The bird’s-eye view operators have of their victims doesn’t lend itself to humanisation. Moreover, while allowing drone operators to dehumanise their victims, drone technology could provoke operators to feel powerfully transhumanised (made ‘super human’ by the duality of human and machine).

Drone operators have, however, referred to an intimacy that develops while surveilling the “enemy” before killing them. Denying the humanity of a person while watching him/her interact with family, or carry out mundane day-to-day activities, would be near impossible. It is therefore feasible that the “enemy” is more human to drone operators than to combat troops. Likewise, the duality of the drone operator and his/her equipment could be a relationship the mind wants to reject. I’m reminded here of Wikus in District 9: in the MNU lab, he is forced to use his alien arm to power technologies of violence. This is a deeply traumatic experience for him, and later he tries to abject himself of his arm by cutting it off. Perhaps for a drone operator there is the same disconnect between mind and body at the point where body and machine meet.

District 9



[1]: Suvendrini Perera, “Dead Exposures: Trophy Bodies and Violent Visibilities of the Nonhuman,” Borderlands 13, no. 1 (2014): 4, accessed September 14, 2014,

Obama’s war on ISIS statement:

Corporate Control and/or The Automated State


Does new technology provide us with the opportunity to imagine, and realise, alternative politics? It is often lauded by techno-utopians as having this potential (“computerisation will set you free”), but to what extent is that actually the case? [1] To me, we seem to be on track towards an increase in corporate control and state power. I apologise for being so bleak, but I would argue that we are firmly in the bind of T.I.N.A. (“There Is No Alternative”) politics. New technology presently functions to further entrench corporatism or statism, and in many cases both (inasmuch as the nation-state has become an enabler of corporate capitalism*). What kind of utopia is that?

David Golumbia argues that techno-utopians, far from imagining a utopian alternative politics, are often better described as “cyberlibertarians”. Their underlying ideology does more to entrench already-existing right-wing politics (which privilege individual freedoms and meritocracy) than it does to establish new ways of “doing” politics. I think Golumbia’s critique is spot-on, but it neglects to mention the other side of the coin: the ways in which the nation-state can boost its power through new technologies.


Golumbia is concerned that new technologies, and the unthinking cyberlibertarian reaction to them, dismantle the nation-state. I would argue that what we’re seeing is a dismantling of the most beneficial aspects of the nation-state (its redistributive/welfare components) and a consolidation of its worst aspects: the rigidity of state borders, the control of those within them, and the processes of ‘Othering’ those outside of them. A utopian alternative politics, to me at least, would not only offer an escape from corporate capitalism, but would additionally look beyond the nation-state as the ultimate decider of what is, and is not, legitimate use of violence. Does the nation-state need to remain our distributive and organisational unit? Being in this corporate/state bind is, to me, a dystopian reality.

IDF soldier uses Instagram; Brazil ‘Robocop’ uniform

This week’s readings have been helpful for my research essay on drone technology. New technologies are often praised for their ability to transcend spatio-temporal borders, thereby globalising the body politic. However, surveillance and defence technologies are being developed, and enveloped, by the state apparatus to re-entrench nation-state borders. The work of Zygmunt Bauman and Hannah Arendt exposed an intrinsic link between modernity, technology and genocide. I think my research essay will argue that this link remains even in our supposedly “postmodern” times.


* The “state vs corporate” dichotomy is a false one considering the ways in which the state enables corporatism (and vice versa: providing campaign funding, etc.), particularly through providing tax breaks for corporates (or completely neglecting the problem of corporate tax evasion despite blatant awareness of it). Good article here: 


[1]: David Golumbia, “Cyberlibertarianism: The Extremist Foundations of ‘Digital Freedom’,” (Sept. 2013): 1. 

Screen-capped news stories: 1) 2)

BNZ: “We’re in Love with Niklas Luhmann!”

“Money is neither good nor bad, it’s what you do with it,” says a white, middle-aged man in a business suit. The idea of money as an empty signifier, the exchange of which is merely an “event” within the economic system, is the premise of an advertising campaign launched by the Bank of New Zealand (BNZ) in 2012. [1] Studying BNZ’s views of the campaign further, I’m led to believe they would love Luhmann’s systems theory so much, they’d probably want to exchange within the “intimacy system” with him. [2]

Trouble is, there’s a reason why the narrator of the BNZ advert is white, middle-aged, male and wearing a suit, just like there’s a reason why the set he walks through at the end of the advert is not an over-populated, derelict slum but a middle-class neighbourhood. Money is laden with political signification. The economic is the political; the political is the economic. You either have money (and therefore political power) or you don’t.

It is not my intention to offer a knee-jerk reaction to Luhmann’s systems theory. Luhmann doesn’t deny that systems, such as the political system and the economic system, can interact. He argues that “structural coupling” occurs whereby discrete systems “irritate” or “resonate with” each other. [3] This means that it is still possible within Luhmann’s theory to think of systems as interrelated. Structural coupling is quite compelling: perhaps we could consider any sign of influence of one system over another strictly in terms of irritation and resonance. However, at what point does differentiating between systems become arbitrary? If the relationship between politics and the economy is one of continuous irritation and resonance, is it still accurate to think of these systems as discrete? Why didn’t Luhmann refer to an autopoietic “politico-economic system” instead?

Apologies, Neal, I like Luhmann but I might be the humanist who comes “kicking and screaming” into class on Monday. [4] I must be addicted to “self-administered ideological opium”. [5]


[1] Hans-Georg Moeller, Luhmann Explained (Chicago: Open Court, 2006): 6.

[2] ibid., 7.

[3] ibid., 38.

[4] Dr Neal Curtis said (on 11/08/2014) that he always has at least one humanist who comes into class kicking and screaming.

[5] Moeller, 63.

Video link again:

Heidegger and Drone Operators

I can’t say I’d thought much about the thinghood of things until reading Heidegger this weekend. This, I realise now, is rather unfortunate: my curiosity about the question of being has dwindled as I’ve gotten older. This week’s readings (as difficult as they were) not only reawakened a sense of wonderment at existence and the everyday, but also got me thinking ahead to my research essay.

Drone operators:

I recently found out that US drone (UAV) operators develop Post Traumatic Stress Disorder at the same rate as combat troops, despite their geographical distance from Iraq and Afghanistan. [1] Furthermore, as you can see in the photo, drone technologies do not appear to be particularly immersive or frictionless: from the operator’s end, the technology looks rather outdated. One would think geographical distance and digital friction would combine to make it easier, psychologically, for drone operators to carry out their work than it is for combat troops.

Drone operators

Heidegger’s ideas might help us to understand why that is not the case. While the interfaces the drone operators “use” may appear to be simply ‘an equipment’ or a tool to outsiders looking in, for the operators themselves they become more than that through their interaction with them. Heidegger writes that there is no such thing as ‘an equipment’ or ‘a tool’ because any equipment belongs to a ‘totality of equipment’. [3] What looks like distinct pieces of equipment to us looking at the photo – four separate screens, a main screen, mouse, joystick, keyboard, headset and chair – all work together as a totality in order to surveil and assassinate victims in Iraq and Afghanistan. The operator’s concern therefore ‘subordinates itself to the “in-order-to”’. [4] Taking Heidegger’s ideas into account challenges popular ways of thinking about immediacy, e.g. geographical proximity or the ‘immersive’ qualities of the interface.

This idea is only in its early stages, but I also think that the moving between ‘ready-to-hand’ and ‘present-at-hand’ could help to explain why operators experience high levels of PTSD. [5] I imagine moments where the equipment moves from being ‘ready-to-hand’ to becoming ‘present-at-hand’ could be highly anxiety-inducing when it comes to technologies of killing.

So, this week has been productive! Not only do I sort of understand Heidegger now but also I am fairly decided on my research essay topic.



[2] Martin Heidegger, Being and Time (not sure what version of this book we’ve been given, Neal!): 97.

[3] ibid., 98.

[4] Paul Dourish, “‘Being-in-the-world’: Embodied Interaction,” Where the Action Is (Cambridge: MIT Press, 2001): 109.

Image is taken from the NY Times article linked above.

The Interfaces of ‘White Bear’

Many TV critics have written that the ‘White Bear’ episode of Charlie Brooker’s series Black Mirror offers a scathing commentary on voyeurism and surveillance. It also has a lot to offer our study of interfaces. Can we even begin to discuss voyeurism or surveillance without first talking about what is meant by “interface”?

Multiple and Shifting Interfaces

In this week’s reading, Alexander Galloway argues that the “dazzling array of various interfaces” in Rockwell’s ‘Triple Self-Portrait’ completely consumes the viewer’s visual attention [1]. Viewers’ concern is primarily with what is “within the content of the image” (its diegetic space) and not with what is happening in the metatext [2]. The interface of the magazine cover itself is therefore forgotten. Rockwell’s image has an uncanny similarity to the unsettling “non-sequitur” moment of ‘White Bear’ [3]:

‘White Bear’ reveal scene


In this scene (on the right), it is revealed to the protagonist, Victoria, that actors have constructed her day of horror, while a participatory live audience not only watched her pain, but also found it pleasurable to do so. Using Galloway’s definition of interface as the “state of being on the boundary”, an “effect” or a “process”, I think there are 9 interfaces we can be certain of in this image [4]:

(This exercise might work best if you keep looking back up at the image!)

1) That between ‘real’ Victoria and her on-screen image of herself.

2) That between the diegetic audience and the ‘real’ Victoria.

3) That between diegetic audience and the on-screen image of Victoria (that ‘real’ Victoria is also viewing).

4) That between the diegetic audience and the images of Victoria (that ‘real’ Victoria CANNOT see).

5, 6 and 7) The same processes as mentioned in 2, 3 and 4 but instead substitute the non-diegetic audience in for the diegetic audience.

8) That between non-diegetic audience and diegetic audience.

9) That between non-diegetic audience and the screen they are viewing this disturbing episode of Black Mirror on.

Much like Rockwell’s ‘Triple Self-Portrait’, I think this scene functions to distract the non-diegetic audiences’ visual attention away from acknowledging 5, 6, 7, 8 and 9 as interfaces. All of the metatext therefore becomes an “edge” while the diegetic space is a “centre”. The non-diegetic audiences are therefore not encouraged to reflect on the act of their own looking.

However, there could be a number of problems with my diagnosis. Does the fact that you can see the backs of the diegetic audience members make this scene distinct from Rockwell’s artwork? Could this change what works as “edge” and what works as “centre”? And, if this conflict between “centre” and “edge” cannot be resolved, might we best think of some of these elements as “intrafaces” instead?


[1] Alexander Galloway, The Interface Effect (Cambridge: Polity, 2012), 34.

[2] ibid., 36.

[3] Johanna Drucker, “Humanities Approaches to Interface Theory,” Culture Machine 12 (2011): 5.

[4] Galloway, 33.



The Excitement and Anxiety of Ubiquitous Computing

‘Turning on’ and ‘logging in’ will soon be outdated activities for many of the world’s citizens. Increasingly, it is no longer accurate to consider ourselves occupants of a ‘real world’ who, by switching on our digital devices or logging into our favourite social media platform(s), enter into mediated spaces or the ‘virtual world’. We are now, as the saying goes, ‘always on’.

Adam Greenfield argues that this new technological reality (in his words, a ‘paradigm shift’ to post-PC computing) requires us to think of media as everyware and everything as mediating [1]. The proliferation of microprocessing and network capability into previously unnetworked objects has changed our relationship with technology to one marked by immersion, invisibility and, therefore, inescapability.

This ‘transformation’ seems overstated for now: day-to-day, I continue to be faced with clunky interfaces whose unfriendly design causes more friction than immersion. However, it certainly seems that this is the direction technological development is headed. Andrew House, President and Group CEO of Sony Computer Entertainment, used the language of transformation (or paradigm shift) in his PlayStation 4 announcement. He stated that the PS4 “represents a significant shift from thinking of Playstation as merely a box or a console” [2]. The two-hour-long announcement repeatedly used phrases such as ‘immersive experiences’, ‘Playstation ecosystem’, ‘architecture’, ‘network’, ‘integration’, ‘fluidity’ and ‘seamless interconnection’ [3]. Two of the selling points were the PS4’s ability to download new games even when the main power is off and its ‘always on’ video compression and decompression systems [4].

This paradigm shift may conjure up dystopian images of a technological takeover of natural life (think: the ‘machines vs. humans’ premise of many science fiction texts). However, a reaction of this kind might be too dramatic: is it really so frightening that in the near future I could use my phone to check that I have turned the oven off at home? At the same time, we should be cautious of the way this transformative language embraced by corporations (like Sony) frames new technology as naturally occurring, often using a progress narrative to do so. This language takes away from a powerful argument against corporate capitalism: that commercialism creates needs.

[1] Adam Greenfield, Everyware: The Dawning Age of Ubiquitous Computing (San Francisco: New Riders, 2010): 16.

[2] PS4 Announcement – 10 Minute Highlight:

[3] ibid.

[4] ibid.

Alex Edney-Browne (1726989)