How persuasive technology is shaping our decisions – and how companies are using it.
Persuading fellow humans is as old as humanity itself. What is very new is that we have now created machines that are doing it for us – whether for profit, power or the common good. And these systems operate pervasively, globally and with access to vast quantities of data on our behavior patterns and social networks.
Think otherwise? Consider the technology you have so far used today. Over breakfast, you checked in on Facebook and were served a number of ads that were specifically targeted to you. In fact, maybe you used your iPhone – a device that you love because it is so pleasant to use and for which it is so easy to order new apps (how convenient that Apple already has your credit card number on file). You may not have noticed, but you have been very consistently and skillfully persuaded not only to use these products and services but to depend on them.
But wait, there’s more! During the course of your day, did you stop to consider that the real factor behind the overwhelming success of these two technologies may be neither their simple, elegant design nor the unprecedented degree to which they enable a connected lifestyle? It is, simply, this: they have become a habit in your daily life – yes, perhaps even an addiction.
Be honest now: how many times have you looked at Facebook today? Or checked your e-mail on your iPhone?
Start Your Desire Engines
Nir Eyal, who blogs about the intersection of psychology, technology and business, has written about what he calls the “desire engine.” According to Eyal, the degree to which a company can utilize habit-forming technologies will increasingly decide which products and services succeed and which will fail.
Eyal breaks down the mechanisms of addictive technology into “internal triggers” that cue users to take action or use the service without the need for further marketing or other external stimuli. Using the product becomes the user’s own innate desire. Creating these internal triggers comes from mastering the “desire engine” and its four components: trigger, action, variable reward, and commitment. Not surprisingly, many of the basic psychological mechanisms are familiar from studies of gambling addiction – or from rats pushing levers to receive treats in the so-called Skinner box.
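Eyal’s four-phase loop can be caricatured in a few lines of code. The sketch below is purely illustrative – the class and variable names are invented here, and the threshold at which an external trigger becomes an internal one is an assumption; Eyal describes a design pattern, not an algorithm.

```python
import random

# Hypothetical sketch of the "desire engine" loop:
# trigger -> action -> variable reward -> commitment.

class User:
    def __init__(self):
        self.habit_strength = 0   # how ingrained the internal trigger is
        self.invested = []        # accumulated commitment (posts, friends, data)

def run_cycle(user, rng):
    # 1. Trigger: external at first (a push notification); internal once the
    #    habit is strong enough (boredom itself cues the app). The threshold
    #    of 5 is an arbitrary illustrative value.
    trigger = "internal" if user.habit_strength > 5 else "external"
    # 2. Action: the simplest behavior done in anticipation of a reward.
    action = "open_app"
    # 3. Variable reward: an unpredictable payoff, as in a Skinner box.
    reward = rng.choice(["new likes", "friend's photo", "nothing"])
    # 4. Commitment: the user's investment raises the service's future value
    #    and strengthens the habit.
    if reward != "nothing":
        user.invested.append(reward)
        user.habit_strength += 1
    return trigger, action, reward

rng = random.Random(42)
user = User()
for _ in range(20):
    run_cycle(user, rng)
```

The point of the variable reward is exactly the intermittent reinforcement that makes slot machines – and news feeds – so compelling: if every pull paid out the same, the loop would lose its grip.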
Creating a billion dedicated – or better yet, addicted – users is, of course, what any web startup dreams of. Succeeding in this may be great for the business, but not necessarily for the consumer. Even benign but persuasively designed technologies can be very hard to “break out” of. How many people have you heard of who have actively used Facebook or an iPhone for a few years and then stopped?
“The thing that makes all this so complicated is that we’re dealing not only with individual solutions and companies, but entire business ecosystems whose value depends directly on the number of people that are actively revisiting and contributing to the service,” notes Harri Oinas-Kukkonen, Professor of Information Systems at the Department of Information Processing Science, University of Oulu, and co-author of the recently published book Humanizing the Web: Change and Social Innovation.
Oinas-Kukkonen points out that on the other hand, we clearly also need persuasive systems. “In this age of the social Web, solutions simply must be designed persuasively if they are to have any chance of success.”
The Power Amplifier
With new technologies come new powers. This also means that companies need to pay attention to the way that they influence and persuade consumers. As more and more businesses operate entirely in the immaterial realm, the focus of corporate social responsibility is in many cases also shifting away from traditional metrics such as environmental sustainability.
In a recent essay, security expert and author Bruce Schneier noted that customers are increasingly seeking redress and judgment in the court of public opinion on the Internet, augmenting or even bypassing the traditional legal process in search of a more favorable hearing.
“The court of public opinion is an alternate system of justice,” Schneier writes. “Arguments are measured in relation to reputation. If one party makes a claim against another that seems plausible, based on both of their reputations, then that claim is likely to be received favorably. Reputation is, of course, a commodity, and a loss of reputation is the penalty this court imposes.”
Technology amplifies the marketer’s ability to persuade, but also customers’ capacity to take action in the face of wrongs – whether real or perceived. Conversely, building a reputation for responsibility and trustworthiness will directly impact a company’s bottom line. Having a customer’s credit card number on file for one-click purchases is the ultimate sign of trust and privilege, granted only to those who have earned it.
Choice Without Awareness
N. Craig Smith is the INSEAD Chaired Professor of Ethics and Social Responsibility. In a recent paper co-authored with Daniel G. Goldstein, Principal Researcher at Microsoft Research, and Eric J. Johnson, the Norman Eig Professor of Business at Columbia Business School, he points out an interesting historical arc in persuasive technology.
In 1957, Vance Packard’s book The Hidden Persuaders caused a sensation by claiming that marketers could manipulate consumers through subliminal advertising (e.g., by flashing a single-frame image of a soft drink on screen during a movie). In the subsequent five decades Packard’s claims have been mostly debunked, but technology and the science of psychology have caught up.
Smith and his co-authors note that “careful empirical research has identified a host of psychological and environmental manipulations that would be exceedingly difficult for consumers to detect or resist. In short, it is possible to influence consumer choice without awareness – and in some quite dramatic ways.”
The methods of influence range from changing the default selection on a web form to sophisticated solutions that provide customized persuasion tactics for every individual user. But one thing is certain – persuasive technology is everywhere, affecting your choices and decisions.
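The default-selection effect mentioned above is simple enough to put in numbers. The toy model below is an assumption-laden illustration: the 90% keep-the-default rate and the even split among active choosers are invented figures for the sake of the example, not measured ones.

```python
# Hypothetical illustration of how a pre-selected default sways outcomes.
# keep_default_rate=0.9 is an assumed figure, not empirical data.

def signups(n_users, default_opt_in, keep_default_rate=0.9):
    """Count newsletter opt-ins when most users keep whatever is pre-selected."""
    keep = int(n_users * keep_default_rate)   # users who never touch the checkbox
    choose = n_users - keep                   # users who actively decide
    active_opt_ins = choose // 2              # assume active choosers split evenly
    return (keep if default_opt_in else 0) + active_opt_ins

print(signups(1000, default_opt_in=True))   # pre-checked box -> 950 opt-ins
print(signups(1000, default_opt_in=False))  # unchecked box   -> 50 opt-ins
```

Under these assumptions, flipping one checkbox default moves the opt-in count from 50 to 950 – the same “choice,” a very different outcome, and nothing the user is likely to notice.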
How, then, should we – as decision-makers and consumers – approach all this? What should we make of machines that persuade, influence or nudge us towards some predetermined behavior goal?
Suddenly, we need to apply ethical reasoning to behavior that has been part of our human existence for thousands of years, but is now done by machines of our own creation. This conceptual shift has left even professional ethicists and philosophers scratching their heads.
Should consumers and users always be made aware of persuasive efforts? This may be far from realistic. Is it ethical to use persuasive technology to change people’s behavior for their own good? Perhaps – but who defines that good? Any attempt to tackle these questions quickly descends into a philosophical, practical and ethical maze with no exit in sight.
Some even question the usage of the term ‘persuasive’ itself. “To me, the word ‘persuasion’ requires that the user be aware of the attempt to influence them,” says Harri Oinas-Kukkonen. “If it is not transparent, it should be called something else.”
“There are basically two main options of how to approach the ethics of persuasive technology,” adds Craig Smith. “The first is a ‘consequentialist’ approach; in other words, to look at what is the best outcome. The second, very different viewpoint is to focus on the nature of the decision being made. For example, in a consumer context, it is generally agreed that people should have the freedom to choose. The problem is that there are many techniques that are very successful in guiding choice without people being aware of them. In that case, the consumer cannot be said to have full autonomy.”
Harri Oinas-Kukkonen proposes approaching behavior-changing technology less from the philosophical tradition and more from the perspective of systems design, focusing on the practical ways of addressing ethical questions. However, he points out that software development in itself is so complicated that it may be too much to expect that developers also take into account a host of ethical dilemmas when designing their solutions.
Persuasion for Sale
Social scientist and researcher Maurits Kaptein works at the cutting edge of persuasive technology research. With his colleague Dean Eckles, he has done some fascinating research in a field called persuasion profiling. This means collecting data on not only what preferences an individual user or consumer has, but also what kind of persuasion they most respond to.
As an example, online bookstores routinely serve up customized recommendations based on the customer’s purchase and browsing history. Kaptein and Eckles noticed that as individuals, we react differently to various kinds of persuasion. Some of us respond to recommendations from authoritative sources, while others will be more apt to grab a hot, limited-time-only discount deal.
Remarkably, though the actual product or service may differ widely, the type of persuasion to which we respond remains very consistent across domains. So the same angle that will lead to that impulse buy at the online bookstore (“12 of your Facebook friends have bought this book!”) could shift your allegiance to a political candidate (“Everyone in your neighborhood is voting for so-and-so”).
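A persuasion profile of this kind can be sketched as a simple tally per user and per strategy. This is only a caricature under stated assumptions: the three strategy names are common labels from the persuasion literature, and the greedy pick-the-best-rate rule is a stand-in – Kaptein and Eckles use more sophisticated statistical estimation, and nothing here reflects their actual implementation.

```python
from collections import defaultdict

# Hypothetical sketch of persuasion profiling: track how one user responds
# to each persuasion strategy and serve the historically most effective one.

STRATEGIES = ["authority", "scarcity", "social_proof"]

class PersuasionProfile:
    def __init__(self):
        self.shown = defaultdict(int)      # times each strategy was tried
        self.converted = defaultdict(int)  # times it led to a purchase

    def record(self, strategy, success):
        self.shown[strategy] += 1
        self.converted[strategy] += int(success)

    def best_strategy(self):
        # Greedy rule: try every strategy at least once, then pick the one
        # with the highest observed conversion rate.
        untried = [s for s in STRATEGIES if self.shown[s] == 0]
        if untried:
            return untried[0]
        return max(STRATEGIES, key=lambda s: self.converted[s] / self.shown[s])

profile = PersuasionProfile()
profile.record("authority", False)     # expert review shown, no sale
profile.record("scarcity", True)       # "only 2 left!" shown, sale
profile.record("social_proof", False)  # "12 friends bought this", no sale
profile.record("scarcity", True)
print(profile.best_strategy())         # prints "scarcity"
```

The cross-domain consistency described above is exactly what makes such a profile valuable as a commodity: once learned at the bookstore, the same record could in principle be reused anywhere else.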
On a psychological level, this is, of course, nothing new. Any used car salesman or real estate broker will flexibly adjust their sales pitch for each customer – and can even relay tips on the best tactics to colleagues. But now this information can be collected, aggregated, stored and sold automatically, pervasively and on an almost unimaginable scale.
If the ethical dilemmas of persuasive technology are enough to make your head spin, this is where it really gets interesting. In the age of persuasion profiling, a company’s core business model might well include not only selling a product (e.g., books over the Internet), but also data on what kinds of persuasion individual customers have best responded to. Since persuasion profiling works best when the user is not aware of it, it does not seem likely that companies would be in a hurry to disclose the use of such technologies. But should we even care? Is there really a problem?
“The first time Google and Yahoo served up customized advertisements in their e-mail clients, people were outraged,” notes Maurits Kaptein, who is also an entrepreneur with PersuasionAPI, a company offering solutions for persuasion profiling. “Now, it’s standard practice and people actually appreciate it.”
Kaptein continues: “Currently people have too little knowledge of what these technological solutions are about. Once there is a deeper understanding, the ethical view might well change.”
As for the ethical questions of selling people’s persuasion profiles? “So far, in all of our conversations with our clients, the issue has never come up. Not that I think that’s necessarily a good thing,” Kaptein says.
Here to Stay
What, then, does the future hold? It is not difficult to conjure up visions of Big Brother using persuasive technology to manipulate us to evil ends in some dystopian future. But right now, we may just have our hands full in coming to terms with the latest and cleverest online marketing ploy – or resisting the urge to check our mobile e-mail for the 18th time within the last hour.
On the positive side, persuasive technology can also be used for purposes that are slightly more clear-cut in their ethical implications. Examples could include mobile apps that help quit smoking – or the use of persuasive technology in public health campaigns to nudge people towards healthier lifestyles. In fact, in some cases our inborn capacity to be steered, persuaded – yes, manipulated – could even be for the better.
“People are often assumed to be rational decision-makers, when such is clearly not the case,” says N. Craig Smith. “There may actually be merit to having a certain degree of paternalistic technological intervention to help us make better choices.”
PROFILE MAGAZINE 2/2013, page 21