The Sean Ellis Test

Sean Ellis ran a consulting company, 12in6, that specialized in helping startups through their growth transition stage (post Product/Market Fit). As a condition of taking on a client, he conducted a qualitative survey across a sample of the company’s users to determine whether their product had achieved Product/Market Fit.

The key question on the survey was:

How would you feel if you could no longer use [product]?

  • Very disappointed
  • Somewhat disappointed
  • Not disappointed (it isn’t really that useful)
  • N/A – I no longer use [product]

If you find that over 40% of your users are saying that they would be “very disappointed” without your product, there is a great chance you can build sustainable, scalable customer acquisition growth on this “must have” product.

This 40% benchmark was determined by comparing results across hundreds of startups. Those above 40% were generally able to scale their businesses sustainably; those significantly below 40% always seemed to struggle.
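Scoring the survey is simple arithmetic. Here’s a minimal Python sketch using hypothetical response data (one common convention, assumed here, is to exclude lapsed ‘N/A’ users from the denominator):

```python
from collections import Counter

def pmf_score(responses):
    """Return the share of active users answering 'very disappointed'.

    Respondents who no longer use the product ('n/a') are excluded
    from the denominator -- a common convention, adjust to taste.
    """
    counts = Counter(responses)
    active = sum(counts.values()) - counts["n/a"]
    return counts["very disappointed"] / active

# Hypothetical survey results for 100 respondents
responses = (["very disappointed"] * 45 +
             ["somewhat disappointed"] * 30 +
             ["not disappointed"] * 15 +
             ["n/a"] * 10)

score = pmf_score(responses)
print(f"{score:.0%} of active users would be very disappointed")
```

Here 45 of 90 active users answer ‘very disappointed’, putting this hypothetical product at 50% – comfortably over the 40% benchmark.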


Posted in process, research, strategy | Leave a comment

3 types of reasoning: Deductive, Inductive and Abductive

Deductive reasoning is the traditional form of reasoning you’ll be familiar with from pure maths or physics. You start with a general hypothesis, then use evidence to prove (or disprove) its validity. In business, this type of thinking is probably how your finance department plans its budget, e.g. to generate this much profit we need to invest this much in staff, this much in raw materials and this much in buying attention.

Inductive reasoning works in the opposite direction to deductive reasoning, using observation and experimentation to derive a general hypothesis from a set of specific observations. In business, inductive reasoning is often the preserve of the customer insight and marketing teams, e.g. we believe our customers will behave this way, based on a survey sample of x number of people.

By comparison, abductive reasoning is a form of reasoning where you make inferences (or educated guesses) based on an incomplete set of information in order to come up with the most likely solution. This is how doctors come up with their diagnoses, how many well-known scientists formed their hypotheses, and how most designers work. Interestingly it’s also the method fictional detective Sherlock Holmes used, despite being misattributed as deductive reasoning by Sir Arthur Conan Doyle.

Abductive reasoning is a skill, and one that can be developed and finessed over time. It’s a skill many traditional businesses fail to understand, preferring the logical certainty of deductive reasoning or the statistical comfort of inductive reasoning. Fortunately, that’s starting to change, as more and more companies embrace the “design thinking” movement.


Posted in process, psychology, strategy | Leave a comment

Scarcity Effect

Worchel, Lee, and Adewole (1975) asked people to rate chocolate chip cookies. They put 10 cookies in one jar and two of the same cookies in another jar. The cookies from the two-cookie jar received higher ratings—even though the cookies were exactly the same! Not only that, but if there were a lot of cookies in the jar, and then a short time later most of the cookies were gone, the cookies that were left received an even higher rating than cookies that were in a jar where the number of cookies didn’t change.

The Scarcity Effect

[From Neuro Web Design by Susan Weinschenk.]

Posted in psychology, research | Leave a comment

How do humans survive the automated revolution?

What should my daughter do when she grows up? The short answer, of course, is whatever she likes. It’s her life to live and despite the fact she’s not even two years old yet, she’s already showing such a strong will that I suspect she’ll do the opposite of whatever I suggest.
Nonetheless, I can’t help but wonder what the class of 2017 might be doing in 20 years. According to studies and expert opinions, the job market is about to change. A lot.
I’m sure you’ve seen estimates that 47 per cent of jobs in the US are susceptible to computerisation (think AI, self-driving cars and robotics), and then there’s the map that shows how soon-to-be-redundant roles like truck driver are among the most common jobs in America.
‘But I’m not a truck driver!’ I hear you say. Maybe you work in marketing and think you’re too important to be made obsolete by software. Time to think again. Every type of employment will be changed by this automated revolution. What’s more, machines can do your job faster and more reliably, without stopping for coffee breaks or demanding sleep, salary or pension plans.
With that in mind, what skills should we be developing in ourselves and our children to ensure we meet the workplace demands of the future?
It seems obvious that routine and repetitive tasks are the easiest to automate. This means flexibility and adaptability are key skills to develop – embracing new ideas, but also being more collaborative than a computer can be. Software and machinery tend to excel at specific tasks, but aren’t so good at switching between specialised physical and cognitive activities.

Exceptional social skills that result in meaningful personal interactions are likely to remain key in tomorrow’s workplace

It’s a given that tech and software skills will grow, but maybe roles will become more managerial, handling different AIs as much as writing new code. And what about the arts? Creativity is often lauded as the most human of skills and we get quite twitchy about computers playing at it. That said, machine learning has already created a Dutch master and is producing increasingly compelling music. However, by definition creativity involves not only novelty, but also value – and it is here where computers will fall short for the foreseeable future.
Chatbots are frequently described as the future of customer service and are likely to replace call centre operators eventually. However, because we see the world through human eyes and feel it with human hands, robots and software are unlikely to be able to provide genuine empathy or deal with the nuance of human needs and emotions. Exceptional social skills that result in meaningful personal interactions are therefore likely to remain key in tomorrow’s workplace.
Perhaps the unifying themes here are having empathy and feeling, demonstrating genuine creativity and being adaptable in a way that a machine cannot. We already have a cultural obsession with authenticity, illustrated by the value we place on craft and ‘real-world experiences’. It feels like an attempt to find something of substance to cling to in a world full of superficiality. Will this be even more important when most things are made by computers and robotics?
At the heart of these authentic products and experiences is always a story, and I wonder if this is a key area for the future. Machines can’t go trekking in the wilderness, discover a rare fruit and sell it at home as a new cocktail with an interesting back-story. So perhaps it’s not just the creative collaborators who will thrive in the future, it’s also the storytellers.
In 20 years my daughter’s current Duplo tower building obsession may have led her to a career in architecture, where her role will involve managing various AIs that command a robotic army of welding and cement bots. Her main skills will be in the creative direction of the build, negotiation with stakeholders and selling a tangible experience to potential customers. Then again she might be trekking in Nepal. Whatever the case, we need to be equipping the next generation with the right skills and readying ourselves for what could be the biggest change in modern history.

Posted in Uncategorized | Leave a comment

Coming together to do great things is the secret to design success

In this week’s Digital Distractions, senior UX designer James Reece – a member of the BIMA Hot 100 – explains how breaking down boundaries produces the best digital results…

Have you heard the one about the UX designer, front-end dev and copywriter who went into a bar? They had a friendly pint together, chatted pleasantly about work and nothing untoward happened.

Not laughing? Well, you clearly aren’t an old UX hand, as the notion of such a peaceful carry-on ever happening would have elicited snorts of incredulous mirth around here just a few years ago.

Back then, UX design, editorial and technology teams were siloed and distant, working largely in isolation with minimal collaboration. It was a model that stemmed from the era of industrial product design – where every detail had to be nailed down before sending it off to be manufactured. After this point, change could not happen because it was far too costly – a design change made to a physical product could mean that new manufacturing tools had to be created or factory staff needed to be retrained.

But the nature of digital means that magic can happen where various disciplines intersect. With increasingly complex technical possibilities and ever-inflating consumer expectations, a holistic approach to digital projects is key to ensure the best possible results. This is because today, more than ever, technology is the experience.

Much of my best work as a designer has been produced when working on ideas closely with developers. For example, the design of an app interface went way beyond what I could have conceived alone when a developer took my idea and gave it steroids, manipulating data to create a more dynamic version of the design on the fly. When different mindsets collaborate, lateral thinking flows.

Concepts such as agile, minimum viable product and continual improvement show that, in digital, change is good and to be embraced rather than avoided

It took the digital world a while to realise that it didn’t have to follow the old model and, as our industry has matured, we’ve learned to love one of the fundamental benefits of our digital medium: it has a fluidity that doesn’t exist in industrial design. Ideas can be tested and iterated with marketplace feedback quickly and regularly. Concepts such as agile, minimum viable product and continual improvement show that in digital, change is good and to be embraced rather than avoided.

This contemporary thinking has led to a collaboration between skillsets that is becoming more important than ever, as the challenges of today involve working with concepts such as machine learning, chatbots, VR and the integration of complex data sources. User interfaces are disappearing and leaving technology itself to be the experience, without being masked by a design facade. Services such as Google Assistant and Amazon Echo are examples of a technology-first, design-second world – and in this world, collaboration is the watchword.

As an example, let’s look at creating a chatbot interface. Here, technology is the beating heart of the service. Next, UX design plays the role of ensuring that the experience of the technology feels human and matches the user’s needs. Editorial teams then devise the technology’s ‘personality’ – Google hired a team of writers from places such as A-grade storytellers Pixar and the satirical website The Onion to help make their Assistant feel funny and empathetic. Visual designers add the delightful details – for example, tell the Google Assistant you’re bored and it serves you up a fun game to kill some time.

We all know that technology isn’t getting simpler, so it’s key that digital teams truly understand it in order to devise the right solutions. But to do this we must collaborate: with UX design, editorial and technology experts all working as one to create a technology-led experience, the kind that makes consumers feel weak at the knees.

In the words of Jony Ive: “You have to deeply understand the essence of a product in order to be able to get rid of the parts that are not essential.” In digital, only through collaboration of disciplines can we achieve this level of understanding.

Oh, and obviously having a pint in the pub is usually more fun if you’ve got a couple of like-minded colleagues with you…

Posted in design, process, UX | Leave a comment

Design is only a hypothesis

A genius of the Renaissance reminds us why we should always test and improve our designs.

“Art is never finished, only abandoned.”

Leonardo da Vinci

Now I’m not saying that we are Renaissance geniuses. I’m not even saying we are artists. But isn’t it telling that the biggest artistic genius of all acknowledges that perfection can never be achieved?

When it comes to design, however, sometimes there’s an expectation that, as design experts, we should get everything right first time. And maybe we should. With all the facts on the table, a clear brief, great insight into the audience, and a talented design team who are armed to the teeth with design theory and cutting-edge tools, we should hit the bullseye first try.

But the truth is, even with all that, design is often slightly off the bullseye. Occasionally it’s not even on the board. Should we be surprised by this? Not if we acknowledge that design is only ever a hypothesis rather than a definitive answer.

The initial design is just the first step to getting it right

A design is a hypothesis. It’s the designer’s best guess, based on the information and resources they had available to them at the time. But it’s unlikely that they really had all the information, as there are so many variables to consider (from the known unknowns right through to unknown unknowns).

Many other factors can steer a design’s course: feedback from clients and colleagues, time and budgetary pressures and so on. So when your design is finally ‘done’, it’s unlikely that it will be 100 percent perfect.

This is why testing, refining and iterating design is so important.

Don’t stop until it’s good enough

If a design is a hypothesis, then, by definition, it is the ‘starting point for further investigation’ rather than the finished article. It needs to be tested and improved until it performs as well as it possibly can in meeting its objectives. Is it communicating what it should be? Is it allowing users to achieve their goals effectively? Test the hypothesis and find out, adjust the design, test again, and keep repeating until you hit diminishing returns.

Is this all starting to sound more like science than creativity? Well, in many ways, design is the marriage of science and creativity, which brings us back to Leonardo (of Da Vinci fame).

Posted in design, process | Leave a comment

The final word on ‘the fold’ debate

There is a piece of feedback that strikes fear in the heart of any designer: “But what about the fold?”. For many people in the design community, this issue is dead and buried. For clients, less so. But what’s the truth? Are designers being too dismissive of this problem? Are clients misinformed? As with many things in life, the truth is not as clear-cut as either side might wish to believe.

Here’s the final word on the debate. For now.

What is the fold?

The ‘fold’ is an old concept from newspaper publishing, which held that the highest-value content should appear on the top half of the newspaper’s front page. This is because newspapers were typically displayed for sale folded in half, with only the top half visible to the passing customer.

It is used in the context of web design to refer to the area of the page that’s visible without any scrolling. It is assumed that the same logic applies – if you can’t see what you are looking for straight away you might leave, particularly as people don’t tend to scroll. But what’s the truth?

Do people scroll?

One of the arguments against the fold being a problem on web pages is that people scroll, so we don’t need to be too concerned about what they see on first view. This is true – evidence shows time and time again that people do indeed habitually scroll down when they visit a webpage, so we shouldn’t be concerned that people won’t scroll (*see caveat below though!).

Is there even a fold?

Today, websites are viewed on many devices, with huge variations in screen size. Many people think that this variation has made the fold debate obsolete. However, despite this variation, responsive web design actually means that all devices end up seeing very similar variants of the header area of a webpage.

* When is the fold a problem?

There is an occasion when people do not scroll. This is when the user incorrectly perceives there to be nothing more to see on the page than what is currently visible in their browser viewport. This is rare, but is sometimes the result of design cues that discourage scrolling – stark horizontal lines at the bottom of the screen (a ‘false floor’), or designs that perfectly ‘snap’ to the browser’s viewport.


The fold is not an impenetrable barrier. People will scroll down*.

We do not need to try to put everything above the fold. However, we need to communicate value as quickly as possible, to keep the user on the page and stop them from leaving.

So what they see above the fold is still important – as it sets expectations for the rest of the page. For example, a page headline that successfully communicates the value of staying on the page is incredibly important as it will help stop people leaving the page and encourage them to explore by scrolling down.

People do scroll, but they also do bounce / exit if they don’t quickly see value or aren’t where they thought they were.

The further you put something down the page, the fewer the people who will see it.

This is a numbers game where drop-off rates rise the further you get down the page – something scrollmaps in Hotjar (an analytics tool) show us quite clearly. Therefore page hierarchy is important – typically, high-value content should be at the top, lower-value content at the bottom.
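You can plot this drop-off curve yourself from raw analytics data. A toy Python sketch – the scroll-depth readings here are hypothetical, not Hotjar’s actual export format:

```python
def scroll_dropoff(max_depths, checkpoints=(25, 50, 75, 100)):
    """Given each visit's maximum scroll depth (as a % of page height),
    return the share of visitors who reached each checkpoint."""
    total = len(max_depths)
    return {cp: sum(depth >= cp for depth in max_depths) / total
            for cp in checkpoints}

# Hypothetical data: one max-scroll-depth reading per visit
depths = [100, 80, 60, 55, 40, 30, 25, 20, 10, 5]

for cp, share in scroll_dropoff(depths).items():
    print(f"{cp:>3}% down the page: seen by {share:.0%} of visitors")
```

With this sample data only 10% of visitors reach the bottom of the page – exactly the shape of curve that argues for putting high-value content first.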

Does this mean that ‘long’ pages are a problem? No – interesting content is what keeps someone on a page, and keeps them scrolling down. For example, our new Hospital Bag checklist page for C&G Baby Club has 17min+ page times from PPC.

A page should be as long as it can be without losing the interest of, and relevance to, the audience. Pages are only too big when they become slow to load – but even then clever tech can negate this problem.

  • So does the fold matter? Yes, in the sense that it’s important to make a great first impression, and to let people know where they are.
  • Do people scroll? Yes, unless you trick them into thinking there’s nothing more to see.
  • Can a webpage be too ‘long’? Only when we’ve run out of things to keep it interesting or relevant.

You’ve made it this far

Now, let’s promise to never talk of the fold again, and instead focus on making web pages awesome for our users. Thank you.

Posted in analytics, design, psychology, research, UI, Usability, UX | Leave a comment

Stop saying that word ‘Millennial’


Barely a day goes by without a new report claiming to provide insights into the surprising habits of so-called ‘Millennials’.

Articles such as ‘How Millennials are changing the face of retail’, ‘Millennials and mobile: what marketers need to know’, and ‘Over 80% of millennial generation see mobile TV content as essential’ pepper the marketing press. A search of the Econsultancy website returns 13,900 results for ‘Millennial’, suggesting that there’s a lot to discover about this generation.

Get closer

Why the obsession with ‘Millennials’? Anyone who works in marketing needs to understand their target audience if they want to communicate effectively with them, and businesses see young people as both a huge opportunity and a tricky challenge. Understanding them, and appealing to them, is key.

However, the problem with obsessing about ‘Millennials’ is that it risks creating distance between businesses and their audiences, rather than a deeper understanding. Occasionally it is useful to talk about a generation as a whole (‘baby boomers’, ‘Gen X’, etc.), but the downside of doing so is that it involves describing huge swathes of the population and their habits with the broadest of brushstrokes. Herein lies the danger: ‘Millennials’ don’t actually exist, but people do.

3 reasons to stop saying ‘Millennial’

1. It creates distance, not empathy

‘Millennial’ conjures up a caricature of a person who spends all day on Snapchat sharing selfies whilst watching PewDiePie and Zoella on YouTube. Sure, these things are popular, but the cliché of the ‘Millennial’ seems to get more absurd with each article shared within the marketing industry, each adding another layer of veneer to the myth of the ‘Millennial’.

If we believe everything we read about this generation, they become so distant from the rest of the population that you might wonder if they are even human at all. To illustrate this, someone’s made an amusing Chrome Extension that replaces the word ‘Millennial’ with ‘Snake People’ in web pages, exposing the absurdity of journalism that describes a generation as being so different to the rest of us that they may as well be aliens.

2. Most people don’t even know what ‘Millennial’ means

Within the marketing community, ‘Millennial’ seems to have become a byword for people born around the year 2000; ‘digital natives’ who grew up with the web and smartphones; or, in the worst circumstances, simply ‘people younger than me’. However, according to Wikipedia, ‘Millennials’ or the ‘Millennial Generation’ actually refers to Generation Y, ‘the demographic cohort following Generation X’, with ‘birth years ranging from the early 1980s to the early 2000s’ – so a ‘Millennial’ could be 35 or they could equally be 15.

This lack of definition makes it a dangerous phrase as it could be interpreted in wildly different ways. Someone in their mid-thirties and someone in their teens are so likely to have such different behaviours and worldviews that it’s not particularly useful to talk about them in the same breath.
We need to be more specific about who we are describing if we want actionable insights.

3. Assumptions are dangerous

Judging by the number of articles available, there is an insatiable appetite for information about ‘Millennials’. Skimming these stories, you can build up a picture of the ‘Millennial’ generation, but this linkbait-fuelled myth is likely to be wrong. A while back I attended a great talk by Dan Healy, User Experience Consultant at Nationwide Building Society.

He recalled the tricky task of recruiting 11- to 17-year-olds to use Nationwide’s new FlexOne young person’s current account. His research with this audience debunked many of the clichés of what you might think ‘Millennials’ want and how you should communicate with them, and highlighted the dangers of making assumptions about their behaviours. For example, something as seemingly dated as signing a form with a pen to open a first bank account was seen by young people as a rite of passage and a key moment in becoming an adult, not a symptom of a bank out of touch with young people or technology.

Because we’ve created the myth of the ‘Millennial’, a species oh-so-different to the rest of us, confirmation bias comes into play. We are willing to believe the crazy articles we read as it confirms our existing belief that ‘they’ are different from the rest of ‘us’. This helps explain the huge amount of demand for articles about the habits of ‘Millennials’ online, and the competition to make more and more outlandish claims about this generation. This Millennial Insight Generator mocks this, by generating randomised ‘insights’ about their habits, ranging from the banal to the outrageous.


Don’t be that guy

My hunch is that there are a lot of (middle-aged) marketing professionals out there who are panicking, as they feel out of touch with younger audiences, and are willing to lap up the myth of the ‘Millennial’. The generation coming through are important to every business. They are the new workforce, the new consumers, the new parents. We must connect with them or risk failure. However, let’s do that by speaking to and understanding them, not by reading link-bait hyperbole online.

Being more specific about who we are talking about (e.g. ’17–21-year-old British females in full-time education looking to buy their first car’) results in tangible insights that just aren’t possible when you use blanket terminology such as ‘the Millennial generation’. My advice to any victim of the ‘Millennial Bug’ is to get out of your office and open up your ears – hang out with people who aren’t like you (or your colleagues!). Commission proper, targeted research with your customers and understand them. User research doesn’t have to cost the earth – there really is an option for every budget – and you might just be surprised by what you learn.

Posted in research, social media | Leave a comment

Google just open-sourced AI – 5 ways this changes everything


We’ve said before that the future of UX is Artificial Intelligence because of the huge possibilities it opens up, but for most projects the benefits of AI have been out of reach, due to the (not insubstantial) cost and technical implications. So, last Wednesday, when Google announced the Cloud Vision API, there was excitement among the more geekily inclined Real Adventurers.

Suddenly huge AI potential is at our fingertips, for any project.

What is Google Cloud Vision API?

Google is seemingly on a mission to open up AI and related technology, such as Deep Learning, to the masses, through initiatives such as TensorFlow, the fruit of the brilliantly named Google Brain team. The Brain team have flexed their collective cerebral cortex once more and given us Cloud Vision, an API that allows your app or website to understand the content of images.

In Google’s words, ‘it changes the way applications understand images’.

For the less technical among you, an API is essentially a service that sends data back and forth over the Internet. Google’s APIs allow anyone to access the power of their cloud-based supercomputing. Want your app to include maps or directions? Use the Google Maps API. Want to harness the capabilities of Google Search on your website? No problem, use the Search API.

The Cloud Vision API allows your website or app to send images to Google’s servers and, in turn, receive data that describes the content of the image. So send it a photo of a beach, and its computer vision technology will analyse it and tell you that it contains a palm tree and a sun, almost instantaneously.
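Mechanically, a request is just JSON containing a base64-encoded image and a list of the features you want detected. Here’s a minimal Python sketch of building that body (the endpoint and field names follow Google’s documentation at launch – check the current docs before relying on them):

```python
import base64
import json

def label_request(image_bytes, max_results=5):
    """Build the JSON body for Cloud Vision's images:annotate endpoint,
    requesting label detection for a single image."""
    return {
        "requests": [{
            "image": {"content": base64.b64encode(image_bytes).decode("ascii")},
            "features": [{"type": "LABEL_DETECTION",
                          "maxResults": max_results}],
        }]
    }

# The body would then be POSTed (with an API key) to:
#   https://vision.googleapis.com/v1/images:annotate
body = label_request(b"<raw bytes of beach.jpg would go here>")
print(json.dumps(body, indent=2))
```

The response comes back as JSON too, listing labels (‘palm tree’, ‘beach’) with confidence scores.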


Anyone who has used the brilliant Google Photos app will have experienced the clever tech behind the Cloud Vision API. Google Photos automatically categorises the thousands of photos on your phone (the ‘average’ person takes 1,800 photos a year with their phone), allowing you to sort through your pictures in new ways. For example, Google Photos will automatically group all the photos of your cat together, so you can ‘paw’ over them at your leisure. This is only possible because the software knows what a cat looks like – and that’s the key to the power of the Cloud Vision API – it allows software such as websites and apps to ‘see’.

An example of Google Photos' automatic categorization



5 reasons this technology is awesome:

1. Object recognition

The biggest deal has to be the potential uses for object recognition in photos. See something you’d like to buy? Point your camera at it, and then find it for the cheapest price online. Maybe you have a healthy eating app – point your camera at a food item and see its likely nutritional values. Is the food safe to eat in pregnancy? The possibilities are vast and the experience should feel effortless for the user.

2. Facial detection

The API allows you to detect multiple faces within an image, along with key facial attributes such as emotional state or whether the person is wearing headwear. Google has made it clear that the API can’t personally identify faces, due to privacy concerns, so it can’t be used for personalisation – but the ability to detect people and their emotions still has great potential. Automated chat AI could harness this capability and respond differently based on the user’s likely emotional state. Support lines could prioritise queries from the most irate customers – or make them wait so they can cool down. You decide!

3. Make your products faster and more useful

There’s a tendency to think big with these new bits of tech, but I think it’s a good idea to think small as well. AI-powered micro-interactions could be an opportunity to make an interface more useful, faster and a delight to use. For example, if your coffee-themed app has a ‘share your latte art’ feature, why not suggest photos of latte art from the user’s phone or computer, instead of making them wade through all of their photos looking for them? Developers could also use the API to add metadata to their image catalogues to make it easier for people to find what they are looking for.

4. Moderation

Moderation isn’t at the top of many people’s lists, but it remains important. On a large community or user-generated content project it can be a costly overhead, particularly if it means people trawling through millions of images looking for photos that break guidelines or terms and conditions.
The Cloud Vision API can tap into Google’s SafeSearch functionality and flag photos with inappropriate content (e.g. pornographic or violent). It can also detect popular product logos in photos, potentially useful in scenarios where logos or brands aren’t allowed in a competition entry. The API even has the ability to detect text within images, along with automatic language identification – another potentially useful tool in the moderation of user-driven content.

5. Beyond apps and websites

Google has made it clear that this technology isn’t just for websites and apps. Drones, robots, automated cars and the whole Internet of Things can benefit from being able to see and understand what they are looking at. A robot could approach someone smiling, but avoid someone aggressive (but let’s avoid building Robocop please).
Sony is already using the technology to process millions of pictures being taken by its Aerosense drones.

The boring (but important) bit

As always, we need to remember and respect people’s right to privacy and data protection, and let them know that their content will be processed by or stored on Google’s cloud servers. But if we use it in the right way, and lead with the benefits, it should be a no-brainer. Google also needs to unveil the pricing plan for the product – it’s likely to start off free and then have tiered pricing for ‘enterprise’-level access to the API (the more you use it, the more likely it is you will have to pay for it).

Get inspired!

We are only just getting started with AI. As more of its power becomes publicly available through APIs and open source software, consumers’ expectations will start to change. What seems pie in the sky today will be commonplace tomorrow. It’s an exciting time to be working in this industry.

Check out this video from the Google Cloud team to get those ideas flowing…

Posted in AI, apps, Futurology, mobile, photography, technology, UI | Leave a comment

8 tricks of psychology for better customer experiences

At The Real Adventure Unlimited, we design for people. Every day, we plan, design and develop products and services that people interact with. In order to do this effectively, a deep understanding of the audience is key. By tapping into the psychology of our users, we can make work that is not only effective for our clients, but also offers a delightful experience for our customers.

Human psychology is a big, often daunting topic, but for those involved in design, strategy, sales or development, the importance of a basic understanding of how we think and why we act in certain ways can’t be overstated. In this article I list eight quirks of human thinking and behaviour that can help create magical customer experiences.

This blog post is a written version of a learning lunch that I put on at The Real Adventure Unlimited office recently. These lunches are great opportunities for sharing knowledge with each other, and clients.

Cognitive biases

Cognitive biases are nuggets of easy-to-understand psychology goodness. They explain fascinating human tendencies to think and act in certain ways. They are the result of our evolution, culture, and environment. Our brains are hardwired with automatic responses to situations, which speed up problem-solving and decision-making, so we need not consider common situations afresh every time we encounter them. As well as helping us, they can hinder us, as sometimes these biases prompt us to act in seemingly irrational ways. This means they are open to abuse by evil forces in marketing, but we prefer to use them for good, to give the customer a better experience. After all – no matter how smart we think we are, we’re all susceptible to cognitive biases.

There are many cognitive biases, with more being discovered all the time – Wikipedia lists them in the hundreds. Here are just eight cognitive biases and how we can use them in our work.

1. Goal-gradient effect

A coffee shop loyalty card with 12 boxes, two of which are pre-stamped, will be filled more quickly than a card with 10 boxes and no pre-filled stamps. They both require the customer to collect 10 stamps, so why? It’s because the first card gives the customer the illusion of progress, and progress is motivating. The goal-gradient hypothesis says that people accelerate their behaviour as they near their goal.

This effect was identified in 1934 by Clark L. Hull, who found that rats running in a maze would run faster the closer they got to their food. Knowing about this cognitive bias is useful in several ways:

  • The shorter the distance to the goal, the more motivated people will be to reach it
  • Even the illusion of progress is motivating
  • People focus more on what’s left than what’s completed
  • People enjoy being part of a reward programme
  • When starting customers on a loyalty scheme, giving them a head start will help them move through it quicker
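The pre-stamped card can be sketched in a few lines. This is a minimal illustration, not anything from the research itself – the function name and numbers are just the coffee-card example above:

```python
def perceived_progress(total_boxes: int, pre_stamped: int, earned: int = 0) -> float:
    """Fraction of the loyalty card the customer sees as complete."""
    return (pre_stamped + earned) / total_boxes

# Both cards need 10 earned stamps, but the pre-stamped card never
# shows 0% progress, so the goal feels closer from day one.
endowed = perceived_progress(total_boxes=12, pre_stamped=2)  # 2/12 ≈ 0.17
plain = perceived_progress(total_boxes=10, pre_stamped=0)    # 0/10 = 0.0
```

The same trick works for progress bars: starting a sign-up flow at “step 1 of 4 complete” is more motivating than starting at zero.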

2. Choice paradox

Is more choice better? You may assume that people like to be given lots of choice, but it turns out there’s a limit to how many options humans like to cope with. What’s more, giving customers too much choice could hurt your business.

In 2000, psychologists Sheena Iyengar and Mark Lepper conducted an experiment, selling jam at a food market stall. On one day they offered a choice of six jams, and on another day a choice of 24 jams. Consumers were 10 times more likely to buy when offered only six options than 24; and, intriguingly, they reported greater buying satisfaction. Other studies have shown that in speed dating, you are more likely to select a match when choosing among six potential dates rather than ten. It seems that the more choice people are given, the more likely they are to fail to make a decision at all.

Some people have since pointed out that Starbucks has 80,000+ drink choices, and yet they seem to be doing alright for themselves. While this may be true, I’d point out that Starbucks shows only a small number of choices on their menu, but they allow the customer to customise their choice as they see fit. Choice is hardest for those who are undecided and most paralysing for the customer when all the choices are displayed at once.

People are happiest when they feel in control, and having choice is a big part of that, but we must remember not to paralyse our customers with too many choices, as they will vote with their feet and go elsewhere.

3. Aesthetic-usability effect

The aesthetic-usability effect is a bias whereby users perceive more aesthetically pleasing designs to be easier to use than less aesthetically pleasing designs, even if in reality they aren’t. Aesthetically pleasing designs have a higher probability of being used, reminding us of the importance of investing in great visual design. If we want better engagement with our work, then we need to ensure it’s as aesthetically pleasing as possible to the target audience.

We know that design that is pleasing to the eye gives our customers greater confidence in their ability to use (or learn to use) our products. This is no secret to anyone who works in design; brands such as Apple have used this cognitive bias to build billion-dollar companies. As the legendary designer and psychologist Don Norman says, ‘Attractive things work better.’

By investing in great design and coupling it with consumer insight, we can apply the aesthetic-usability bias to features that appeal to the target audience, and ensure new products and services are successful.

4. Anchoring effect

The anchoring effect is a cognitive bias that describes the tendency for people to rely too heavily on the first piece of information offered (the ‘anchor’) when making decisions.

An example of the anchoring effect can be seen in credit card statements, where you are invited to make a ‘minimum payment’ each month. Research has shown that without these suggested minimum payments, credit card debts are paid off much quicker, with the customer paying as big an instalment as they can. But it is in credit card companies’ interests for customers to stay indebted to them for as long as possible, so they’re deliberately anchoring their customers with the suggested payment. This tempts customers to pay off a small amount, rather than allowing them to make up their own minds on how much to pay each month.

Anchoring is often employed by charities, which provide suggested donation amounts, such as £5/£20/£100 options. The higher donation suggestion (£100) works as an anchor, so the donor is likely to choose the middle donation amount (£20), rather than the lowest one. Restaurant menus use similar tactics, using a single high-cost item on the menu as an anchor to encourage customers to spend more on food.

Studies have shown that anchoring is very difficult for people to avoid, so decision-makers need to remember that customers will be anchored by pricing structures and other information, whether you intend them to be or not.

5. Surprise heuristic

Whenever we speak to users of our CRM programmes, it’s often the unexpected or unusual aspects of the programme that they remember most vividly, and speak about most positively. The careline you can call 24/7 and get great support from; the pack a mum received in the post containing bubble bath, suggesting she take a break from her busy day and pamper herself. People react well to surprises, so we need to remember to build them into our work. What’s better than getting a gift in the post from a loved one, when bills and pizza menus are all you usually find on your doormat?

Like many cognitive biases, our tendency to react well to surprises comes from our evolution – in order to survive as a species it’s served us well to seek out and explore new opportunities, to go exploring over the horizon and find new resources. Punctuating the monotony with a pleasant surprise can be the key to creating a great, memorable customer experience.

6. Social proof

The other weekend I was in Weston-super-Mare – the ‘jewel’ of Somerset. I was with some friends, and we fancied getting fish and chips. Faced with a bewildering choice of chippies on the seafront, and a lack of local knowledge, we instinctively went to the one with the longest queue. Meanwhile, the chip shop next door had virtually no queue and might have had better food on offer. Why didn’t we just go to the one with the shortest queue?

The reason for our irrational behaviour is our desire for social proof (aka ‘informational social influence’). When humans are undecided, we tend to follow the patterns of others. Again, this is innate; following others is an evolutionary safety mechanism. By drinking from the wrong stream, we might get ill. If we go to where others get their water, we’ll probably be ok.

Building social proof into our digital experiences can be a powerful way to tempt the undecided to act – as the final trigger in motivating someone to make a purchase. ‘Our bestseller’, testimonials, likes and reviews are common ways of providing social proof, but harnessing the power of social networks such as Facebook can give a more powerful way of doing this – knowing that 10 of your close friends bought an item can be a more powerful signal than reading 200 reviews.

7. Peak-end rule

Rather than judging an experience in its entirety, humans have a tendency to judge an experience by its peaks – its most intense points and its end. So even if your two-week holiday sat on a beach in the Caribbean was pleasant but uneventful, having your luggage stolen on the way home may cause your long-term memory of the holiday to be negative. This may seem illogical and, like much human behaviour, it probably is, but it’s something we should consider when designing customer experiences.

At Disneyland, waiting times for attractions are deliberately overestimated, so when customers reach the end of a long queue, they are pleasantly surprised that it didn’t take as long as they were expecting. Suddenly, the tediousness of queueing becomes a positive experience.

Often our customers have to do things that they might not necessarily want to do (like queuing, or registering on a website), but they will quickly forget these niggles if we can create positive peaks and ends that punctuate the experience. Instead of allowing a CRM programme for new mums to tail off, why not end it on a positive note by giving them a book that shows their journey from pregnancy to having a toddler? This positive experience at the end will leave a lasting good memory of the brand, encouraging them to recommend the brand to others or use it again. We should always consider where we can create peaks in a customer journey, or how to end it on a positive note.
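One simple way to model the peak-end rule is to score an experience as the average of its most intense moment and its final one, ignoring everything in between. This is just an illustrative sketch (the scoring function and example numbers are my own, not from the research):

```python
def peak_end_score(moments: list[float]) -> float:
    """Approximate a remembered experience as the average of its most
    intense moment (positive or negative) and its final moment."""
    peak = max(moments, key=abs)  # the most intense moment, good or bad
    end = moments[-1]
    return (peak + end) / 2

# A pleasant beach holiday (+6 throughout) that ends with stolen
# luggage (-8) is remembered as strongly negative, despite two weeks
# of good moments.
holiday = [6, 6, 6, -8]
```

Note what the model implies: the remembered score is unaffected by how long the pleasant stretch lasted, which is exactly the ‘duration neglect’ the research describes.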

8. Scarcity heuristic

The scarcity heuristic is a bias that places value on an item based on how easily it might be lost, especially to competitors. Scarcity can lead to snap decisions to buy, hence supermarkets selling out of bread due to ‘panic buying’ during a perceived time of shortage. In our office, similar behaviour is triggered by a company-wide email with the subject line ‘CAKE!’ on someone’s birthday. As you might expect, this defensive behaviour is born out of our survival instinct.

As we can’t help but value scarce items, shops have been using scarcity as a way to drive sales for as long as anyone can remember. ‘While stocks last’, ‘This week only’ and ‘Last one!’ are all seen on the high street, but scarcity is also a powerful weapon in digital. Invitations to join services such as Gmail, Google+ and, more recently, the ad-free social network Ello have deliberately been made scarce to encourage demand for them, and increase the value placed on them.

Travel websites such as Expedia make much use of scarcity signifiers such as ‘Only 1 room left’, later sending emails reminding you that you have ‘Only 2 days left to review’. It’s worth remembering that such signifiers add to the cognitive burden for the reader, and need to be shown only at the right time, as a nudge to motivate them to act. Scarcity can be a powerful ally – but it can just as easily annoy your audience.

Jerry Springer’s final thought

So there you have it, eight quirks of human behaviour that we can use to create better customer experiences. But before you start rubbing your hands and considering how to use these cognitive biases to trick your customers, remember that, in the long run, customers are likely to form a negative impression of a brand if they feel they are being deceived. Let’s not forget, we’re all vulnerable to the power of cognitive biases.

However, if we use an understanding of psychology to create a good experience for customers, success will follow.

So, until next time, take care of yourselves… and each other.

Posted in psychology, strategy | Leave a comment