
The 'double edge' of pervasive personalisation

Shutterstock

Anyone who uses online shopping, social media or even email is voluntarily – though not necessarily consciously – experiencing the influence of personalised technologies. The “profiling” which leads to advertisements for certain products stalking you across the web can be a nuisance. But recent news stories about Cambridge Analytica sourcing personal data from Facebook raise deeper concerns about people’s autonomy, and the security of democratic processes.

Personalisation shapes people’s experiences of politics, commerce, and personal and social life. It is a powerful, technologically enabled principle that harvests information about our choices and preferences to tailor our engagement with objects, institutions, services and one another.

Of course, this can help us to make everyday decisions more quickly – to be more “ourselves”. Personalisation often appears to empower individuals: to help them feel competent, in control and confident in the tastes and opinions that distinguish them from others.

But all too often, there is a “double edge” to personalisation, which goes unrecognised by users – until they are confronted with its effects. Personalisation has raised questions about privacy, personal responsibility and consent; it has destabilised power relationships between designers, manufacturers and consumers; and it has disrupted discourse between political parties and citizens. It also has the potential to compromise understanding of what it means to be a person, and a moral agent.

In control, or under control?

Today, encounters with personalisation begin from birth. Smart dummies such as Pacif-i use Bluetooth technology to monitor a child’s temperature and record when a parent last administered medication, along with the dosage. These data are sent to parents’ phones and can be shared with medical practitioners. Later in life, personalised medical interventions can use genetic analysis to optimise treatments and predict health outcomes, including the likelihood of future illness.

Having this information might seem like a good thing: Pacif-i gives parents the ability to keep a close eye on their child’s health, while personalised medicine gives users the chance to adjust their habits to minimise the onset of a disease. But these personalised interventions may not be so welcome if the people concerned are denied health insurance because such detailed data were disclosed to an insurance company.

Likewise, the NHS’s Personal Health Budget offers a personalised approach to social care by allocating spending choices to patients. Controlling their own care may help people to feel empowered. But research from our book on personalisation shows that this benefit may be unevenly distributed by class and education, and that the requirement to choose and manage treatment may place a greater burden on patients and their families.

The echo chamber effect in social media may stunt political discourse through personalised prompts which link us only to “people like us”. The self-affirming experience of liking something on Facebook fosters our feeling of belonging and being included. But, at the same time, Facebook – and indeed the companies which harvest data from Facebook – can also predict identifying traits which we choose not to disclose.

Need a hand? Shutterstock.

Innovations in production technology, such as 3D printing, offer consumers a combination of physical fit and personality. Some producers even give amputees the chance to choose the final finish for their prosthetic limbs. Nowadays, you can not only customise apparel using open design platforms, but also “borrow” someone else’s personality by, for example, personalising a famous footballer’s t-shirt. But this mass customisation destabilises the relationship between producers and consumers, making it harder for manufacturers to plan for surges in demand and spikes in sales.

Personalisation for the people

With the latest digital assistants and “genius” platforms, such as Apple’s Siri, Google Assistant and Amazon Echo, we can control devices with that most personal of characteristics – our voice. Yet these virtual personal agents have voices of their own, too, and there is a danger that such systems could end up with too much power to personalise what they offer us.

This would limit users’ options rather than expanding them. But more importantly, if a machine seems too much like a person – if it is too sociable – then there is a danger we may come to think of it as an agent capable of independent decision making, when in fact it is simply a tool for carrying out our own agendas.

Over the last few weeks, numerous companies have updated their privacy and data sharing policies to make them more straightforward for users to understand, and to make opt-out options easier to find. But these documents remain lengthy and complex, and not everyone will be able to make sense of them. The fact that the UK Information Commissioner was unable to access Cambridge Analytica’s files before Facebook did shows that social media giants have the upper hand when it comes to how our personal data are handled.

Facebook recently moved responsibility for 1.5 billion users from its international HQ in Ireland to California, placing them beyond the reach of fines of up to 4% of its global turnover under the new European data protection rules that come into force on May 25, 2018. Clearly, we are not all in this together yet. The public is kept in the dark about the ways data are manipulated to influence our choices, attitudes and behaviours. If personalisation is to be a benefit – and not a curse – users must be given a stronger voice to influence the ways their personal data are shared.

This article was originally published on The Conversation. Read the original article.


Dr Tom Fisher receives funding from AHRC, EPSRC, Defra, ACE. He is affiliated with the Design Research Society.

Iryna Kuksa does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.