
Unintended Consequences: The Dark Sides of Quantifying Selves

This paper was commissioned by the Sustainable Society Network+, funded by the EPSRC’s Digital Economy programme and led by Imperial College London, to outline a call for research funding in the area that Dr Moore and Dr Piwek identified at the following event. The paper follows a one-day sandpit workshop, ‘Quantified Self in a Sustainable Society’, held at Somerset House, London, on 13 November 2014. The event explored ways in which Quantified Self technology creates opportunities to increase sustainability, and aimed to stimulate dialogue between academics, practitioners and industry so that researchers can contribute to a future research funding agenda on the Quantified Self. For more information, please see

Please email me if you would like to cite this paper.

Dr Phoebe Moore, Dr Lukasz Piwek

A key intention for people involved in the Quantified Self (QS) movement is to get to know aspects of themselves that would not otherwise be knowable. Getting to know our autonomic selves through self-tracking has, however, become an increasingly lucrative business, and corporate vendors and branding gurus are finding ways to use our data for corporate profit. Headline stories about the invasion of privacy and insecure data sit alongside the rise of technologies that allow surveillance in new and creative ways, requiring a renewed dialogue around the potential consequences of a ‘big brother’ society, the changing meaning of privacy and the control of personal data.

The ‘future normal’ involves more and more tracking devices (Ramirez, 2013), and this movement is accelerating rapidly. As QS participants generate increasing amounts of data, should the companies providing the means to do so be more tightly regulated to avoid ethically questionable activities? Or does further regulation lead to increased unwanted surveillance and obstacles to innovation? At a micro level, are there specific features of contemporary everyday life that have led to a drive toward personalised health care, self-management and self-monitoring? Is the rolling back of public spending on health care, or the fragmenting of real-time communities, forcing individualised experiences? Research is needed that operates at the interface between social and scientific disciplines, looking critically at the unintended consequences and dark side of the QS movement and drawing on two overarching themes: a) health and wellbeing and b) security and ethics.

Description of the call

People gather data about inputs to their bodies, states of mind or arousal, and mental and physical performance using wearable and other self-tracking technologies. Technologies are used to measure the relationship between room temperature and work rate, identify the level of pollutants in the air, count calories, track babies’ health and sleep, and monitor personal habits. Methods include first-person digital ethnographies, lifelogging, diarising, and the self-tracking of mental and physical activities as well as the recording of surroundings.

QS principles and solutions are becoming an influential force in people’s wellbeing, health management, personal development and behaviour change. While many tracking activities are lively and empowering, the corporate owners of apps and wearable devices openly sell millions of anonymised GPS-tracked activities, as in the case of the Strava Metro project.[1] As personal data becomes more fine-grained, it is rapidly becoming a more valuable asset, so one can envision ‘dark side’ consequences of data sharing by companies, including invasive marketing targeting, violation of privacy, identity theft and socio-psychological issues such as self-fulfilling prophecies or attribution bias. Research that looks critically at these emerging issues is necessary to identify the full impact that aspects of QS are having, and will have, on people and society.


Interest in self-quantification probably started in the 1970s, but the term ‘quantified self’ only entered our cultural lexicon in 2008 (Lupton, 2013: 26). Coverage in periodicals increased most dramatically in 2012 and 2013 and continues to grow. Kevin Kelly and Gary Wolf, editors at Wired magazine, launched the Quantified Self initiative in San Francisco in 2007. In publicity, Kelly stressed that ‘real change will happen in individuals as they work through self-knowledge… of one’s body, mind and spirit… a rational [path]: unless something can be measured, it cannot be improved’ (Kelly, 2007). Kelly called for projects that would, for example, discuss personal genome sequencing, lifelogging, measuring chemical body load counts, self-experimentation, location tracking, digitising body information, sharing health records, psychological self-assessments and medical self-diagnostics. Since these groundbreaking beginnings, people have organised QS meet-ups in, according to the QS website, 134 countries. Very successful QS Europe conferences have been organised in Amsterdam, attracting enthusiasts from all walks of life. Alongside this extremely popular social movement, the development of the digital landscape has in many cases accelerated beyond users’ ability to manage how their personal data is shared, distributed, stored and used. There is therefore great urgency in identifying and addressing these issues.

Literature Review

Few academic publications explicitly address the unintended consequences and dark sides of QS. Lupton has identified issues around ‘function creep’ in self-tracking realms in the areas of ‘selfhood, citizenship, biopolitics and data practices and assemblages’ (2014). Till indicates that specific aspects of self-tracking and the gamification of everyday practices can lead to valuable work going unpaid (2014). Whitson (2013) warned that the gamification trend, of which QS is a part, ignores the ‘messy actualities… in intrusive user monitoring’. Moore (2015) provides an early intervention in emerging debates, querying to what extent self-quantification is symptomatic of workplace transformations in cognitive, informational arenas of production. She argues that too much focus on the cognitive dimensions of labour in the new world overlooks the enduring physicality of labour, and notes that QS has a dual function: quantification of the self has empowering potential in the workplace and possibilities for enhanced autonomy, but unintended consequences come into play with its monitoring and surveillance aspects, including blurred lines in health and safety and in ways of measuring productivity and wellbeing. Joinson and Piwek (2015) further contribute with cutting-edge discussions of the potential of digital solutions and tracking technology for facilitating behaviour change interventions.

Always-on connectivity, low-power sensors and powerful microprocessors mean that self-quantifying technology brings new potential for personal analytics, self-diagnosis and the development of preventive interventions in healthcare (Swan, 2009, 2012, 2013; Topol et al., 2015; Bravata et al., 2007). Other existing literature focuses on technical issues surrounding the reliability and validation of self-tracking devices (e.g. Lee et al., 2014; Case et al., 2015). There is significant potential to harness the big data generated by QS communities in areas of population health, such as physical activity, diet, tobacco use and exposure to pollution, as well as to facilitate the discovery of risk factors for disease at population, subpopulation and individual levels. By improving the effectiveness of interventions, captured data could help people achieve healthier behaviours in healthier environments (Barrett et al., 2013). In an opinion piece published in the New England Journal of Medicine, the authors discuss problems surrounding regulation of the health app market in the US (Cortez et al., 2014), referring to a number of unregulated health smartphone apps that pose a real risk to patients, such as the Pfizer Rheumatology Calculator app, which generated scores for tender and swollen joints in patients with arthritis that were too high or too low by as much as 50%, and the Sanofi-Aventis diabetes app, which miscalculated insulin doses. Cortez et al. point out risks related to compromised patient privacy and legal liability for injuries that may be caused by the use of ‘faulty’ apps, and highlight legislative suggestions for a risk-based classification that the FDA in the US could use to properly regulate apps.

In terms of privacy and unintended data disclosure, risks have been indirectly suggested in recent studies examining the way people use digital technology. Security issues related to QS have mainly received attention in the popular media, with reports highlighting vulnerabilities in communication protocols between devices (Curtis, 2014) or the ‘grey practices’ of companies sharing consumer data for sophisticated profiling (Venkataramanan, 2014). Indeed, we already know that tiny digital traces of continuous mobile activity can reveal our identity (De Montjoye et al., 2013) and consumer preferences (Kohne et al., 2005). We can make inferences about user contexts (Lathia et al., 2013), physical activities (Lane et al., 2011), and emotions and stress (Rachuri et al., 2011) using data from smartphone sensors. Moreover, we can make highly accurate predictions about the personality traits of users by analysing the linguistic content of text messages (Holtgraves, 2011) or by examining the average time users spend on phone calls or messaging (Ehrenberg et al., 2008). Daly (2015) writes about self-quantification and the law, querying to what extent regulation can bring leverage to questions concerning health information.
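The re-identification risk noted above can be illustrated with a toy sketch. The data below is entirely hypothetical (randomly generated traces, not the datasets used in the cited studies); it simply demonstrates the mechanism behind De Montjoye et al.’s ‘unicity’ finding that a handful of coarse spatio-temporal points is often enough to single out one individual in a large dataset:

```python
import random

# Hypothetical toy dataset: each 'user' is a set of (cell_id, hour) points,
# standing in for coarse location pings. De Montjoye et al. (2013) report
# that in real mobility data, ~95% of people are unique given just 4 points.
random.seed(0)
users = {
    u: {(random.randrange(50), random.randrange(24)) for _ in range(30)}
    for u in range(200)
}

def unicity(users, k, trials=500):
    """Estimate the fraction of k-point samples that match exactly one user."""
    unique = 0
    for _ in range(trials):
        u = random.choice(list(users))
        # Pick k points observed about this user (e.g. leaked by an app)
        points = set(random.sample(sorted(users[u]), k))
        # Count how many users' traces contain all k points
        matches = sum(1 for trace in users.values() if points <= trace)
        unique += (matches == 1)
    return unique / trials

for k in (1, 2, 4):
    print(f"{k} point(s): {unicity(users, k):.2f} of samples identify one user")
```

Even in this crude simulation, unicity climbs steeply with the number of known points: one point usually matches many users, while four points almost always isolate a single individual, which is why ‘anonymised’ trace data can be far less anonymous than it appears.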

Rationale for the call

Research will build on existing debates in science and technology studies, political economy, cybersecurity, psychology and sociology that look at the impact of monitoring and surveillance on people. Research should update these arguments to look at self-tracking, identifying how new technologies and unprecedented uses reveal an unexplored arena of big data capture linked to the usage of sensor technologies.

Scope of the call

Research is invited that can provide insight and analysis in the following key areas:

Health and Wellbeing

The first theme, Health and Wellbeing, focuses on the way that self-quantifying technologies are used to achieve higher levels of fitness and health. While related technologies, such as lapel cameras suggested for Alzheimer’s patients, have been used in hospitals for some time, the personalised use of health apps is difficult to regulate (Cortez et al., 2014). Furthermore, the increase in the use of individualised health-related technologies could reflect a number of social issues, including decreasing investment in health care and other social spending. Why are people ‘turning their bodies into medical labs’ (Yangjingjing, 2012)? The first dimension of this theme concerns the social implications of the rise in individualised health technologies and the difficulties of, and attempts at, regulation.

The second aspect of the health and wellbeing theme concerns the use of QS wearables in workplaces and the implications for health and safety and happiness at work. According to ABI Research, more than 13 million wearable fitness tracking devices will be incorporated into employee wellbeing and wellness programmes between 2014 and 2019 (Nield, 2014). Tracking for workplace wellbeing remains problematic, however, if it does not capture other impacts of specific management approaches in the workplace, such as the depression or anxiety that a range of epidemiologists and work psychologists have shown to emerge in environments with overly invasive surveillance. The link between wellbeing and productivity can be seen, for example, in the application of Theatro, a voice-driven wearable armband. Motorola armbands have been used in Tesco and Amazon warehouses to generate productivity data, and Sociometric Solutions captures employees’ physical movements and interactions in workplaces.

Security and Ethics

While QS technologies provide relatively unsophisticated data about such aspects of activity as steps walked or calories consumed, the potential for corporate data accumulation is rising. This could become an increasingly serious problem, and regulators are slow to respond to the dynamic digital landscape. To what extent is the undisclosed use of personally produced data protected, and to what extent does it put users at risk? There is a growing recognition of the personal data vulnerability and network security risks (Symantec, 2014) that these technologies introduce. ‘Digital traces’, GPS locations and patterns of activity are used to pinpoint users’ identities, whilst companies such as Fitbit openly market data to third parties. While people technically agree to the release of personal data when they accept the terms and conditions set out in often very long end-user licence agreements (EULAs), it can be argued that fully conscious consent is rare. ‘Anonymisation’ is an increasingly relative concept in the digital legal arena. There is no standard EULA for wearable and self-tracking devices, but there are various laws at national and supranational levels, such as in the EU. Should EULAs be universal? Will people who do not agree to an EULA be prevented from using some technologies? Is there, indeed, much opportunity to opt out? Specific products are now entering the market to address some of these concerns, such as Apple’s ResearchKit,[2] and the implications of these debates and product shifts are enormous.

Such questions put the spotlight on the security and ethics dimensions of these practices. People may feel that their opportunities for making intentional choices around self-disclosure and privacy are becoming increasingly rare, as ‘fine print’ and conditions of usage become ever more opaque. Access to, and ownership of, personal data have thus become an issue, not least as we see the rise in invasive marketing and profiling. Is there a possibility of regulating this new world of diminishing privacy, transformed data ownership and corporate use of what we may once have considered private information about ourselves? Who is responsible for creating the ‘ethical body’ (Morrison, 2015)? How is QS data stored and interpreted? This theme therefore seeks research on shifting expectations and considerations of what is reasonable and can be expected with regard to personal privacy, corporate social responsibility and accountability, the diminishing availability of regulation, and related questions of security and ethical responsibility surrounding the social practices of quantifying the self.

Suggested call criteria

Grants will last from one to three years and should include at least one European or international partner. Individuals at all career levels are welcome to apply, but teams must include a balance of career levels and must involve individuals from at least two universities. The involvement of SMEs is encouraged. Dissemination plans must indicate stakeholder impact. Funding will not be made available for product innovation or development.


[1] See

[2] See

Author information:

Dr Phoebe Moore: Senior Lecturer in International Relations, Politics and Law, Middlesex University, London.


Dr Lukasz Piwek: Research Fellow – Behavioural Change, Research Lab, Business School, University of the West of England, Bristol.





This entry was posted on June 15, 2015.