This blog is an excerpt from the book “Behavioural Economics For Business” by Phil Slade
 
The path of persuasion is paved with cognitive bias
 
From the earliest recordings of human history, people have been manipulating situations and using powers of persuasion to gain advantage in a ‘survival of the fittest’ world. Often these persuasive techniques unintentionally tap into unseen influences on our judgements and decisions, called cognitive biases. In recent years cognitive bias (and its manipulation) has earned a notorious reputation for its role in poor decision-making and unethical behaviour. However, is the manipulation or utilisation of cognitive bias always a bad thing? Can accessing and leveraging cognitive bias actually lead to better outcomes? And if there is a line between skilful marketing and unethical profiteering, where is it, and who should determine its position? This article explores these questions, and considers how becoming aware of our own cognitive biases may help us all make better decisions every day.
 
 
 
 
A revolution in economic thinking
 
In 2007–08, the global financial crisis (GFC) fundamentally redefined the way we looked at financial markets and human behaviour more generally. Until that point, economic theory had been largely based on the assumptions of European philosophers and merchant traders from the 1700s, who proposed that human beings make rational value appraisals based simply on the usefulness (utility) of an item to the individual. Utility theory, as it came to be known, was embraced by economists around the world and underpinned western economies for centuries.
 
 
The GFC was a financial tsunami that upended traditional economic theory. The assumption that people are completely rational economic decision-makers was exposed as a fallacy. Ignoring the roles emotion and instinct play in our decision-making was a major oversight, and one that would have devastating repercussions.
 
 
The unseen psychological influences on our judgements are called cognitive biases, and if you are a living, breathing human being, then you have cognitive biases. This is a point picked up in the best-selling book Nudge by Richard H. Thaler and Cass R. Sunstein. They argue that when facing an important choice, the operative question is not whether or not to bias someone’s decisions, but in which direction you should ‘nudge’ them. This concept has been embraced in the UK, where the government’s Behavioural Insights Team develops ‘nudges’ to make residents healthier, wealthier and safer.
The take-home message here is that whoever presents the choices will inevitably bias decisions in one direction or another, whether the ‘nudge’ is intentional or not. In this way, influencing other people’s decisions through cognitive bias is neither good nor bad – it is completely unavoidable. The issue becomes one of ethical manipulation of cognitive bias – which is problematic if you are relying on something ‘feeling good’ as your moral compass.
 
 
The reason is that one of the great wonders of cognitive bias is its ability to make irrational decisions feel completely rational in the moment you make them. A decision feels ‘right’ because it reinforces our beliefs about how we control and make sense of the world around us. It aligns with the implicit assumptions, or ‘mental heuristics’, we have built up over time, so we are more likely to trust it as correct. The study of irrational decision-making in economic contexts is the basis for the ground-breaking field of behavioural economics.
 
 
Behavioural economics, developed by Amos Tversky and Daniel Kahneman in the 1970s, highlights the pivotal role automatic heuristics and cognitive biases play in our decision-making processes. In the wake of the GFC, their research offered key insights that revolutionised markets, governments and economic management globally.
 
 
A key element of this field of study has been the close linking of cognitive bias with bad, irrational decision-making. This has had the largely unintentional effect of ‘demonising’ cognitive bias and framing it as a key barrier to rational thought. In a way, the pendulum has swung from ‘utilitarian’ unawareness of cognitive bias pre-GFC to slight paranoia about the ‘evils’ of cognitive bias post-GFC.
 
 
Through short narratives, this article attempts to show that cognitive bias is neither good nor bad – it just is. We all have our own cognitive biases that influence the decisions we make every day. Further, we often manipulate others’ cognitive biases in order to convince people of our point of view, win competitions, or turn situations to our own advantage.
 
 
The key insight is to avoid intentionally or unintentionally manipulating others’ cognitive biases in order to profiteer from irrational decisions and unfair behaviours, and to become aware of our own cognitive biases in order to make better decisions and promote rational thought.
 
 
What is cognitive bias?
 
 
 
Daniel Kahneman and Amos Tversky described the human mind as working with two different systems, simply named System 1 and System 2.
 
 
System 1 is the faster system, and describes cognitive processes that are automatic, unconscious and habitual. It conserves cognitive energy for novel and threatening situations. When you ride a bike, for example, you are not consciously thinking about your balance, or about what your hands or legs should be doing. Once these behaviours are learned they become automatic, freeing our brains to focus on the road ahead and where danger or novel situations might lie. Indeed, the only way we can coordinate everything needed to ride a bike is to engage System 1 thinking. The moment we start actively ‘thinking’ about how to ride, we can start to lose balance and our performance drops dramatically.
 
 
System 2 is a slower, rational system and consumes much more cognitive energy and attention than System 1. It is a more conscious system and is what we engage when we need to navigate complex, novel or dangerous situations. For instance, if I asked you to add 1+1, you can quickly and easily rely on your System 1 thinking to seemingly instantly come up with an answer. However, if I asked you to add 249.3 + 94.8 you would need to pause and think about it for a little while before you came up with an answer. Here you are engaging your System 2.
 
 
Cognitive bias is simply a term used to describe the learned mental patterns that exist in our System 1. However, it would be a mistake to think that engaging System 2 overrides System 1 thinking. Subconsciously, System 1 has a large influence on our System 2 thinking. This is why cognitive bias, which is fuelled by our emotions, is so important to understanding why we often make decisions that seem irrational to an observer, or are not at all in our best interests.
 
 
Over the last 10 years, insights from behavioural economics have shown that cognitive biases drive decision-making. In particular, the emotional system of the brain becomes front and centre in economic behaviour whenever:
  • the decision relates to primary biological and social needs (e.g. food, safety, inclusion and financial security);
  • the decision-maker is under stress; or
  • product features seem familiar.
 
 
Importantly, behavioural economic research shows the value given to rational (System 2) processes is largely driven by an individual’s emotional (System 1) beliefs.
 
 
A quick review of the scientific literature identifies more than 170 different cognitive biases, with more being discovered and studied every day. Rather than looking at each bias individually, this article looks at real-life scenarios and analyses the role cognitive bias plays in the decisions being made.
 
 
The following stories will hold more meaning for you if you ask yourself these questions:
  • Would I react differently in these situations and if so, what would I do?
  • Does the character’s behaviour in the story resonate with me, or do I find it completely foreign (or frustrating)?
  • If not myself, do I know someone else who might act like this, and how do I relate to them?
 
Instinct and experiential heuristics
 
Some of my favourite memories growing up were of sitting in the passenger seat of my father’s truck as we headed off to the farm and into town with a full load of wheat after a recent harvest. This was a trip we had done many times over the years, and on one trip I can quite clearly remember my father recalling that he had a sense something was not quite right ahead on the road. He couldn’t quite make sense of it, but it just didn’t feel right. We slowed down from 100km/h to 80km/h to make sure we had some distance between ourselves and the car in front – a 23-tonne fully loaded grain truck can be hard to stop quickly.
 
 
All of a sudden, the caravan three cars ahead blew a tyre and veered violently across the road, narrowly missing another car as the driver struggled to keep it under control. The traffic screeched to a halt. My father slammed on the brakes and we felt the full thud-thud-thud of the braking system kicking in as the extra momentum of the full load pushed us closer to the car in front. The truck stopped amid a thick smell of brakes and burnt rubber – inches away from a small hatchback whose driver had a very, very clear view of the ‘Kenworth’ badge in their back window. We were all safe. My father’s automatic and seemingly irrational instinct to slow down had saved the day.
 
 
I would argue this is an example of cognitive bias in all its ‘survival instinct’ glory.
In a similar vein, Klein and Clinton-Cirocco explored this type of instinctive decision-making in their famous 1985 study, which was prompted by a peculiar story in a local newspaper. The story interviewed a senior fire fighter who had responded to a ‘typical’ fire in a New York apartment block. When his team entered the apartment he felt very uneasy, and despite no obvious signs of danger, immediately ordered his team out of the building. Seconds after they exited there was a massive explosion and most of the building collapsed. No doubt the implicit and seemingly irrational decision by the experienced fire fighter saved the lives of his entire crew.
 
 
What was happening here? Based on many interviews and observational experiments, their findings suggested that every time someone experiences a situation, their memory builds a picture of what is ‘typical’. People with lots of experience in a particular situation can create a very accurate picture, or mental heuristic, of what is typical. Even though this picture is inaccessible to declarative memory, the brain seems to compare new situations against the well-developed heuristic automatically and with great efficiency. The brain then assumes that anything ‘atypical’ is dangerous and should be treated with extreme caution.
 
 
Without even realising it, when the senior fire fighter entered that room his memory did an instant comparison with the heuristic of a ‘typical’ situation built up over time, and alerted him to ‘atypical’ danger – even though he did not know specifically what that danger was. His instinct was crying out to him: “Warning! Atypical situation encountered! Get out!” While the development of mental heuristics over time may arguably draw on the rational functions of the brain, his decision in the moment was not based on thinking through the situation rationally. Indeed, if the senior fire fighter had called on another fire fighter to co-assess the situation and come to a rational decision, the consequences may have been dire.
 
 
In my father’s case, it was his thousands of hours’ experience driving and observing ‘typical’ driver behaviour while hauling a heavy load. He did not know what he was noticing, but whatever the atypical cues were, they were strong enough to alert him to impending danger to be avoided at all costs. The cognitive bias to treat novel or atypical situations with caution triggered a decision to drive cautiously, and in the absence of any direct or quantifiable ‘rational’ information, it was a very good decision.
 
 
One of the recent positive trends in the OH&S departments of large organisations is to take this ‘avoidance of danger’ cognitive bias and use it to design safer environments. Indeed, what it takes to encourage people to make better, safer decisions in potentially dangerous environments is currently the subject of many studies. Businesses understanding and using cognitive bias to create safer workplaces is a very positive recent development. In this way, cognitive bias is not a good or bad thing, it just is. The recognition and manipulation of that bias is not ‘evil’ at all; it simply recognises the way humans make decisions, and designs systems that reflect that knowledge for positive outcomes.
 
 
The loyalty of fuel
 
 
 
Another instance of cognitive bias in everyday decision-making is the example of John James. John is an account manager for a large advertising agency in the city, and he is married to Mary – a very precise person who values attention to detail and does not suffer fools. John and Mary are taking a well-earned camping holiday to Fraser Island. Road trips for the couple have not always been smooth sailing, as John’s attention to detail and rigour in planning are not always up to Mary’s exacting standards. John and Mary have always been careful with their spending, so despite their comfortable financial status, they are on a tight budget for their holiday. Another quirk of John’s personality is that once he is ‘on a mission’ (such as driving to a holiday destination), any barrier or hurdle to achieving that goal in the most efficient way possible causes him considerable frustration.
 
 
Just before they reach the motorway, they pass their local BP service station. John knows the local station well because he has a BP card that allows him to purchase fuel and services from any BP, and then sends him a single tidy bill at the end of each month. In his early university days the card enabled him to pay for fuel when cash flow was unpredictable. Nowadays, however, the only benefit is that it keeps his expenditure records simpler when tax time comes around. It’s a simple enough arrangement that makes his life easier. Unfortunately, the sight of the service station doesn’t prompt him to check the fuel level – a very regrettable omission, as it would turn out.
 
 
As they enter the on-ramp for the motorway, the fuel warning light and alert sound go off.
 
 
Now John is presented with three choices:
  • Exit at the next opportunity and frustratingly circle back to the local BP, risking the ridicule of his wife for not checking something so fundamental.
  • Exit at the next opportunity and use a rival service station that won’t accept the fuel card.
  • Take a chance and keep going until he reaches a large BP roadhouse about 30 minutes up the motorway (saying nothing to his wife and hoping it all goes well).
 
Experiencing acute stress and cognitive load as he attempts to conceal his concern from Mary, as well as increased time pressure from the ‘ticking clock’ of the fuel gauge, he relies on his System 1 to make a decision.
 
 
At this moment he has at least two biases that will inform his decision.
  • A pain-avoidance bias which seeks to avoid any pain that might arise from:
    • confrontation with his wife; or
    • frustration of a deviation to the planned journey.
  • A short-term bias where the use of the fuel card doesn’t feel like he is spending money out of his immediate budget, allowing for more available funds for his holiday (a concept learnt from his Uni days).
 
With the influence of these cognitive biases, John decides the worst option is actually the best choice. He decides to conceal the situation from his wife and take the chance he will get to the BP 30 minutes down the road without running out of fuel.
 
 
This decision seems rational to John because his System 1 naturally searches for the option that aligns with his biases, and when his actions align with his natural cognitive biases it feels like the right choice. It is a decision-making process that backfires on John when, 20 minutes later, he runs out of fuel 5km short of the BP service station – a service station whose fuel is 5% dearer than anywhere else.
 
 
From a behavioural economics perspective, it is interesting to consider the role the fuel card played in this decision.
 
 
This fuel card is not a loyalty card. John gets no extra fly-buys, discounts or other commercial incentives for using it, and there is no interest charged on the amount if the account is paid on time. In his early university days, it merely enabled him to pay for fuel when cash flow was temporarily sparse. Now, however, the only benefit is that it simplifies his paperwork a little at tax time – a small benefit that still seems to hold a very strong influence on his behaviour. Anything that reduces pain is seen as very desirable by System 1.
 
 
This behaviour has also been reinforced by years of avoiding other service stations in order to find a BP – behaviour that is rewarded each month by a hit of dopamine when John receives his fuel activity statement. A touch of Pavlovian conditioning, together with some cognitive bias, and we have the makings of a bad decision.
 
 
With this in mind, however, it would be hard to argue that the role the BP card played in his decision was unethical; it was just good commercial practice. Did it play on John’s cognitive bias to benefit BP commercially and disadvantage John both practically and financially in this particular instance? Absolutely. Is it also addressing a functional need that has high utility from John’s perspective? Undeniably.
 
 
The same cannot be said for many loyalty cards and credit schemes that trigger numerous cognitive biases to entice people to spend more than they need in the moment and lock them into high-interest loan repayment schemes. Some companies’ loyalty schemes span completely different business lines, using cards that let customers purchase goods more cheaply in one line (e.g. a supermarket) in order to encourage them to spend more in another (e.g. a service station). Here we can see it is not the cognitive bias itself that is bad, but the unethical manipulation of that bias to disadvantage the consumer, and the lack of consumer awareness of the biases influencing their decisions.
 
 
Cognitive load
 
 
 
 
As highlighted in Kahneman’s book, and reflected in much of the behavioural science literature, one of the key functions of cognitive bias is to reduce the load on our brain, which has only a finite cognitive capacity. The research suggests we are hard-wired to search for simple solutions to problems so we can conserve mental energy for tasks that require more thought. This is a bias well known to marketing professionals, who exploit it by advertising with features such as ‘with only two clicks you can be covered for life’ or ‘buy a house/exercise bike/holiday in three simple steps’.
 
 
This bias is also manipulated to ill effect when crucial details relating to a product are hidden amongst long, complex and confusing terms and conditions. Various companies have been accused of profiteering from unethical manipulation of this cognitive load bias, with a court case currently underway in Australia against a large hire car company over its insurance product.
 
 
The hire car company was accused of misleading customers about how much they were liable for in the event of damage to hire cars. The issue lay in the fine print of the terms and conditions, which included charging customers a damage liability fee if their hire car was damaged or stolen – irrespective of whether they were at fault. The company’s advertisements proclaimed that the insurance premium was capped, but the terms and conditions contained exclusions that saw some people paying up to five times more than what they understood the cap to be.
 
 
Of course, the company defended its practice by stating it was clearly outlined in the terms and conditions, and that an explanation of the customer’s full liability was available on its website. While this was all true, you do not have to be an expert in behavioural economics to know that people are extremely unlikely to go through the finer detail of multiple car insurance policies to compare and contrast them in order to make the best decision – particularly at a busy airport, under significant time pressure to get to where they are going. People will take in as much information as they can absorb in any moment; once they reach their cognitive capacity they will presume that what they see is all there is, and use System 1 heuristics to make a decision.
 
 
Details are included in lengthy terms and conditions statements because companies know that if they make the product feel too complex up front, customers won’t buy it. Simply making products simpler isn’t a solution either, because people’s needs and situations are diverse, which means a product needs a level of complexity and flexibility to be functionally useful to a broad cross-section of consumers.
 
 
The constant challenge for the financial institutions is how to get crucial information about their products to customers in a way that doesn’t significantly contribute to cognitive load. Having information available on a website or buried in complex terms and conditions is no longer proof of transparency. People can be completely blind to important information when experiencing high cognitive load, and companies that manipulate this bias to create profit are acting unethically.
 
 
Much research has been done on cognitive load, confirming that under high cognitive load we rely much more on our mental heuristics and cognitive biases to make decisions – which, as discussed in this article, can lead to poor decision-making. Daniel Kahneman, in his book Thinking, Fast and Slow, describes this as the brain’s way of directing its limited attention and processing power to the situations where it is needed most. Daniel Siegel, another prominent psychological author, explains it as ‘flipping your lid’, describing high cognitive load and stress as short-cuts to irrational and instinctive behaviour. Interestingly, research shows that while we need a certain level of cognitive load to be engaged and motivated, too much is counter-productive and often leads to poor decision-making processes.
 
 
From an organisational perspective, ensuring people at all levels of the organisation are aware of the behavioural implications of high cognitive load is critical in order to get the best outcome for the individual, the organisation, and the customer. Knowledge of the cognitive biases that might be triggered under high cognitive load is a crucial element of:
  • high performance teaming
  • ethical marketing of products
  • effective performance management
  • individual motivation
  • creative innovation
  • successful organisational networks
  • effective leadership and communication
  • healthy organisational cultures.
 
Highly stressful and cognitively demanding situations are unavoidable in today’s constantly changing business environment. Accurate knowledge of cognitive load and the effective application of this knowledge is crucial to maximising performance and increasing health and well-being.
 
 
Unintentional manipulation
 
Most of us at some point have had the experience of being caught short when needing to pay for something at the shops. I have a good friend, Jenny, who described to me an experience she had recently with a service station attendant.
 
 
After filling her car, Jenny joined the long queue at the service station to pay for her fuel, with an armful of ice creams intended as bribes for her children in the back seat of the car. It was a public holiday and there was only a single attendant serving the many people eager to get back on the road. After some time shuffling forward in the queue, Jenny impatiently reached the front of the line and presented the attendant with her credit card to pay the $107.54 account. The credit card didn’t work. Jenny asked the attendant to try it again. Still no luck. Jenny looked behind her at the sea of frustrated faces, eyes piercing and cursing her credit card incompetence.
 
 
In a slight panic Jenny searched her pockets and found $100 in cash. Her other cards were back in the car. She could leave the ice creams, but she really wanted them, and if she stepped out of the line she might have to queue up all over again. With a pleading and desperate look she offered the attendant the cash and explained she could go to the car, get the other $7.54 and be back in one minute. Jenny gave another pleading look. The attendant looked at the angry mob behind her. The tension built. “Don’t worry about it, $100 is fine.” Jenny thanked the attendant profusely and slunk out of the shop, avoiding eye contact with anyone. She had received the ice creams and some of her fuel for free… was this bad?
 
 
Jenny wasn’t considering it at the time, but she was unintentionally and unfairly manipulating the service station attendant’s cognitive bias. The only solution Jenny offered the attendant was one where the pressure from the impatient people in the line would build while she ran outside to retrieve her other cards. Any service-quota KPIs the attendant was trying to achieve as part of their job would compound the pressure coming from the angry mob. The calculation in the attendant’s head was that the extra $7.54 was not worth the added pressure from the impatient customers, and the potential barrier to whatever quota they might be working towards.
 
 
Was there unethical behaviour? Unintentionally, yes there was. Of course, if Jenny decided this was a good personal business model and targeted under-staffed long-queued service stations on hot public holidays in order to get cheap fuel from service station attendants under pressure, then that would constitute intentional unethical behaviour. However, the fact the manipulation was unintentional doesn’t absolve her of responsibility. In a small way, Jenny was profiting through unfair manipulation of the attendant’s cognitive bias toward avoiding complication and conflict. This is particularly unfair in light of the fact that most attendants have to make sure their till total matches the value of the transactions. If it doesn’t then the difference is deducted from their pay slip. In this sense, the attendant on minimum wage was personally paying for the shortfall.
 
 
As an aside, if you still think you are not susceptible to your own cognitive biases, ask yourself: what was the age or sex of the attendant in Jenny’s story? Nowhere in the story is the age or sex of the attendant mentioned, and your instinctive answer will reveal your own biases.
 
 
Looking at the issue of manipulation more broadly, organisations need to examine intentional and unintentional manipulation of markets and target customers in order to identify ethical behaviour. They need to assess their products and ensure they deliver value ethically while also maximising the product’s marketability. Behavioural economics provides a lens through which to identify cognitive bias and how it is being used, and regulators around the world are starting to apply these principles to identify unfair treatment of consumers. However, the conversation needs to be about more than just compliance and box-ticking.
 
 
Behavioural economics provides the construct through which organisations can have a mature conversation about ethics.
 
 
In conclusion
 
All human relationships and communication are saturated with techniques to influence others, and most of them are benign. At a certain point, though, these methods can become unethical – treating people unfairly and potentially profiteering through the exploitation of people’s cognitive biases.
 
 
We should all be aware of how cognitive bias can influence our decisions in order to make better ones. There are many situations where cognitive shortcuts are helpful, but many others where undermining rational thought leads to poor decision-making. An awareness of the heuristics used in judgement and decision-making allows for an increased understanding of what is actually influencing a customer’s decision-making process. It is up to organisations to use this knowledge to deploy techniques that help people make better decisions about their products and services.
 
 
Cognitive bias is not bad, it is just a result of the way our brains have evolved over time to stay alive and cope with the complexity of our environment. Knowledge and awareness of these limitations and processes is the first step towards us all elevating our humanity and making better decisions every day.