Global judgements and ideas.

Wisdom of the Algorithmic Boss, or a Robot Army of Redressers?

Striphas comments that ‘algorithm is a less obvious keyword by means of which to make sense of culture today’ compared to keyword possibilities such as ‘information’ and ‘crowd’ (2015: 403). The wisdom emerging from algorithmic happenstance may be an elite outpouring that encapsulates the ‘best of what has been thought and said’ (Arnold, 1869/1993: 190, cited in Striphas, 2015: 406), which is what defines culture in the Arnoldian sense. Algorithmic trends can surprise, as in the case of #AmazonFail in 2009. In this embarrassing episode, a romance author could not find sales rankings for gay romances on Amazon and thought it peculiar. On contacting the help line, the author discovered that Amazon’s electronic filters classified these books as ‘adult material’. Thousands of Twitter users commented that day on the discriminatory filtering system, a rampage that was seen to unseat the Prince of Peace that Easter (Striphas, 2015). Another example is seen in the quirky surprises that developments in artificial intelligence have offered. A ‘teen girl’ chat robot was released by Microsoft in March 2016 and permitted to chat with the public. Very quickly, uncensored chats began to inform her own words because, of course, computers can only restate what has been stated to them. The bot went from moody teen to offensive, Hitler-endorsing incest promoter. In that case, the ‘wisdom’ of the public was one of dark cynicism as well as humour; Microsoft’s corporate public relations stunt became a soft hackers’ paradise.

Given these recent anecdotes, it is provocative to ask whether an algorithm really should be trusted to replace a traditional manager. Crowd-sourcing and shared working platforms such as peer-to-peer networks were once thought of as democratic spaces where all could play a role in producing the workplace and owning the means and mode of production. Against the hopes of early collaborators in newly digitalised economies, these spaces have been privatised and largely co-opted. Algorithms are the outcomes of surges, such as the ranking of workers who take more contracts on Mechanical Turk or the shuffle of a subscription or service request in the case of Uber. The hierarchy that is well known in the regular employment relationship is symbolically altered into a technological, seemingly chance-driven system, and the determinant of employability is reduced to a technological synapse. Inequality is obscured by these abstract moments in selection systems, allowing backend actors such as the Uber CEO to reap the vast majority of profits from the business model (though Uber’s losses are well known) while leaving those performing the service to, in Uber’s case, sleep in their cars. In February 2017, Uber CEO Travis Kalanick yelled at an Uber driver in one of his cars when the driver (who was secretly filming the encounter) asked why the company was raising standards and dropping prices. Kalanick insultingly told the driver that ‘some people don’t like to take responsibility’. Or, in the case of the Amazon megalith, clients’ takings rise as they race to the bottom to find the cheapest and fastest service provider; providers whose lives are made increasingly precarious (not to mention increasing the market share of Amazon itself). As Karatzogianni and Matthews (2016) state, we are seeing the ‘use of digital commons for ideological purposes’.


Robot Army of Redressers and the Privacy Strip Show

Strava, RunKeeper, Sports Tracker, Polar Beat, Endomondo, and Ghostracer are all fitness apps that track and store data about cyclists and runners. Users are encouraged to share their data with other runners and cyclists on active feeds that resemble the feed in Eggers’ novel The Circle. In this novel, the Circle is a firm in the not-so-distant future (which resembles Google), where employees are not only encouraged but expected to take part in the company’s social media engines (not very well disguised versions of Facebook and Twitter). Mae, the book’s protagonist, fails to join in the ‘socials’ within the first few weeks and is called in to speak to Gina, the CircleSocial manager who is investigating why she did not join. When Mae explains that she has been busy settling in to the job and has not had time for extracurricular activities, Gina does not seem impressed. Gina points out that ‘communication and community come from the same root word, communis, Latin for commons, public, shared by all or many’ (95). Gina asks, how can communication be extracurricular? As the tale develops, Mae is drawn into a world where the Circle requires constant and complete exposure from those in the inner (the workers in the firm) and outer (everyone else) ‘circles’. She begins to follow as many feeds, post as many comments and friend as many others as possible, and watches the numbers of smiles and ‘zings’, visits to her site and the like, with pleasure. She watches the recording of her blood pressure on her wrist and is ‘thrilled’ when it increases alongside the increase in numbers. Of course, Mae also watches her steps and other health data. The job that Mae does is customer service, and she is explicitly judged on customers’ rankings of her service.
Communication technologies in the world of corporate feedback loops consider communication to be quantifiable with an ‘unhindered instrumental power’ (Haraway, 1991: 164), so in call centre service work such as the type in this fictional account, communication’s units are considered indisputable factors for analysis.

In the Circle, there is an acceleration of techniques to penetrate all of workers’ lives, where ‘privacy is theft’. This becomes a meme that playfully inverts the anarchist Proudhon’s declaration that property is theft; people are asked to set up cameras inside their homes and to record and broadcast their daily lives in a manner resembling the reality television programme Big Brother in its most macabre sense. Those who refuse to be a part of this ever-invasive society are soon chased down by drones, a pursuit that is live-streamed and intended to be amusing and a bit of fun, an assumption not shared by those being chased. In fact, this drone activity directly leads to the suicide of one of the characters. Of course the book is fiction, but the extremes of quantified life as it dominates the qualified in the story are disturbingly convincing. The Circle dystopia is one where all of work and the experiences of life are immaterial. It is a world that Karatzogianni and Matthews (2016) liken to Harman’s zombie capitalism, where we see profit declining (as in the case of Uber) based on over-accumulation of things we can no longer even touch.

In a new dystopia like the Circle described above, people in (and even outside) any kind of collective are at risk of becoming the new ‘informers’ without intending to do so. Historically, revolutionaries were appealed to directly to become moles or informers, as described by E. P. Thompson. Thompson writes about what he called the ‘army of redressers’ in the late 1700s and early 1800s, who met clandestinely at night across Yorkshire, Birmingham, Bristol and London to ‘expose fraud and every species of Hereditary Government’ and to discuss how to ‘lessen the oppression of Taxes, to propose plans for the education of helpless infancy, and the comfortable support of the aged and distressed… to extirpate the horrid practice of war’ (Thompson, 1963/2013). To ensure they really were revolutionaries, attendees of these underground meetings were asked to say yes to the following three questions: (1) Do you desire a total change of system? (2) Are you willing to risk yourself in a contest to leave your posterity free? (3) Are you willing to do all in your power to create the Spirit of Love, Brotherhood and Affection among the friends of freedom and omit no opportunity of getting all the political information you can? (Thompson, 1963/2013: 418–19). The Home Office began to seek out its own sources of information, including previously active reformers who were in need of money or the ‘casual mercenary volunteers attempting to sell information by the “piece”’ (Thompson, 1963/2013: 534). In those days, spying and surveillance were a very different matter from what we see today. The state set out to identify who was a threat and targeted them through various tactics, including trying to divide and destroy by paying for revolutionary information. Now, just by being online, someone may inadvertently give away information about another person or event that could be used by a blame-seeking state.
Data can now be used to criminalise in ways that would make Foucault turn in his grave. While online, we give data about ourselves to the state for free, usually without any intention of revealing clandestine information.

Now the wisdom of the online crowd becomes a new army of redressers, where technological accident becomes the determinant of redress. Alongside the revolutionary movements from two centuries ago mentioned above, it is appropriate here to consider the Luddite movement. The Luddites were framework-knitters who saw the introduction of new weaving machinery as a direct threat to their work. Many jobs were precarious during this historical period, and these working-class groups resorted to violence, destroying machines in a manner Hobsbawm (1952) called ‘collective bargaining by riot’. The weavers claimed that management were using machines not to replace their work immediately (though that was feared as an eventuality), but to defend exploitative management practices. Though the time period is different and the machinery antiquated by today’s standards, the Luddites’ fears were similar to those we hold today: that technology is being (or could be) used for bad governance and management practices, and that machines could replace jobs. Like the army of redressers, these individuals feared change that would take power out of people’s hands altogether in a very undemocratic way. Research must now continue these investigations, identifying where algorithms and other technologies are being used to make unqualified work-related decisions, and identifying how workers and citizens could be protected.

This excerpt is taken from my latest book, The Quantified Self in Precarity: Work, Technology and What Counts.

This entry was posted on August 21, 2017 by .