4 Mental Models to Help You Conduct Voice of Customer Research

100 years ago, companies competed on features. The right feature was enough to sell a product, no questions asked. 

50 years ago, companies competed on price. As manufacturing exploded, the best bang for your buck is what sealed the deal. 

Today, brands compete on customer experience. With trailblazing brands like Google and Netflix raising the bar with frictionless experiences, the table stakes are raised for everyone yet again. 

Brands now sell on who they imply you are. 

The only way to keep tabs on changing customer perceptions, wants, and needs is by gathering customer data.

Why is Voice of Customer Research so powerful?

Research has the power to turn the competitive tide to your advantage by giving you valuable and effective decision support when it comes to where and how to improve your customer experience.

My years as a staff research scientist at the University of Texas taught me to stay on a clear line of inquiry and to craft a solid hypothesis. Both skills work as well in the marketing world as they did in the world of conservation and research science.

Good research takes skill and practice, but also a solid understanding of the principles at play. If you’re looking to get started, here are four mental models I love that can help you get in the right mindset for this kind of research, especially when it comes to voice of customer data.

Mental Model #1

This model, from physicist Richard Feynman, speaks to why humans might think they know something (the name) yet often don’t actually know the underlying reasons and details (the something).

Feynman illustrates the point by saying:

See that bird? It’s a brown-throated thrush, but in Germany it’s called a Halzenfugel, and in Chinese they call it a Chung Ling and even if you know all those names for it, you still know nothing about the bird. You only know something about people; what they call the bird. Now that thrush sings, and teaches its young to fly, and flies so many miles away during the summer across the country, and nobody knows how it finds its way.

As humans we often use language to obfuscate the fact that we don’t actually understand ‘the something’. 

This concept relates to the Jobs to be Done framework, which helps identify real user needs over what users might say their needs are. For example, Harvard Business School marketing professor Theodore Levitt said, “People don’t want to buy a quarter-inch drill. They want a quarter-inch hole!” But if you ask users what they want, their first answer will still probably be that they want to buy a drill.

This idea explains why we often fall short when it comes to communication, marketing, and gathering voice of customer data. Often, we’re not selling what we think we’re selling. Kodak didn’t sell film. The record industry didn’t sell records. 

To show this model in action, I’ll share three examples.

Example: Open-Response Data Processing 

One of our ecommerce clients solicited open responses with the question: Is there anything holding you back from making a purchase today?

The top reason that website visitors stated was price. But price is just a name; our mental model allows us to dig a little deeper. What people say and what they feel is not always the same thing.

Looking closer at price, we saw three separate trends:

  • Comments about price and product 
  • Comments about price and shipping
  • Comments about affordability

In other words, some people were connecting product value with uncertainty about success, others didn’t like the shipping piece, and a third group simply didn’t have the funds available to complete the purchase.

This illustrates how the way that we code and parse things matters. A price objection is just the name of something. It's a rationalization, not necessarily the reason.
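To make this kind of coding concrete, here is a minimal sketch in Python. The theme names and keyword lists are my own invented illustrations, not an actual codebook; real open-response coding is usually done by human analysts, with keyword rules only as a first pass.

```python
# Hypothetical sub-themes behind a generic "price" objection.
# Keyword lists are illustrative assumptions, not a real codebook.
THEMES = {
    "price_vs_product": ["worth", "value", "quality for the price"],
    "price_vs_shipping": ["shipping", "delivery"],
    "affordability": ["afford", "budget", "expensive right now"],
}

def code_response(text):
    """Return every sub-theme whose keywords appear in a response."""
    text = text.lower()
    return [theme for theme, words in THEMES.items()
            if any(w in text for w in words)]

responses = [
    "Not sure it's worth the price",
    "Shipping costs push the price too high",
    "I can't afford it this month",
]
for r in responses:
    print(r, "->", code_response(r))
```

The point of splitting one label into sub-themes is that each sub-theme suggests a different fix: value messaging, shipping offers, or payment options.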

Example: Native Deodorant 

For several years I worked on Native Deodorant, a pricey natural deodorant now owned by Procter & Gamble (acquired during our optimization program work with them).

Native Deodorant is a high-end convenience product. Its original headline copy read “Deodorant to stay fresh and clean.”

In Jobs to be Done lingo, do I hire deodorant at $12 a stick to stay fresh and clean? No! I can get that far cheaper. I hire it because I care about the products and ingredients I use on my skin. Thus we changed the headline to “Deodorant, without the chemistry experiment,” which fits much better with what their customers want.

The something was not deodorant, fresh, or clean; those are names. The something is the feeling I get when I buy this deodorant: that I’m a person who cares what they put on their body. This is ‘the something’ we needed to convey in the messaging.

Example: Nanit Baby Camera 

Another D2C ecommerce brand, Nanit, sells a $300 baby camera with all the bells and whistles.

See this product detail page, pictured below: a sea of text and feature names without a clear value proposition. That’s not so bad on its own; after all, product detail pages are supposed to have details.

The problem is that we landed here from a retargeted Instagram ad after visiting the site just once. Considering that journey, the issue is that above the fold it’s just a lot of names of things, but no ‘something’.

But scroll down, and you start to see ‘the something’. An engaging photo of happy parents, along with numbers that actually mean something to the consumer. 10 hours of sleep! 92% efficiency! A sleeping baby in 12 minutes! I want all this for my baby.

It's powerful because it provides a rationale for the purchase, laying the groundwork for why you need a baby camera, and not just any camera, a Nanit camera. With this data, customers now have the questions to which the camera provides answers. Super powerful.

Mental Model #2

This mental model, courtesy of Viktor Frankl, the author of Man's Search for Meaning, defines knowledge as experience plus sensitivity. Frankl’s book is all about finding the right answers to life’s various problems. The equation helps you stay aware that just because you have experience with something (or someone) doesn’t mean you ‘know’ that thing or person.

To define what’s meant by sensitivity, let me start with an example. 

Someone on Twitter says: “All digital marketing is a waste of time.”

But do they really know all digital marketing is a waste of time? Are they basing this ‘knowledge’ on one campaign they ran three years ago for a niche brand, over a one-month period, with one creative?

Have they tried all of the various digital channels, creatives, and targeting combinations relevant to your specific brand and audience? 

Probably not. In short, they may not have adequate breadth of experience with digital marketing to make such a sweeping claim.

Rethinking Best Practices

This mental model shows how we need to be more strategic and thoughtful when it comes to both our web tactics and our research tactics. What works well in one instance cannot be generalized to all situations, businesses, or customers.  

Let's say that you used some best practice ecommerce guidelines to set up a well-converting homepage selling couches. Does this mean that you can successfully use the same principles to sell coffins, or a fast-moving consumer good?

No, because the tactics from the first situation are not sensitive to the new market, to different users’ wants, needs, and buying journeys, or to the environment.

Example: A website design 

Recently, I reviewed the result of a website design study. The team who sent it to me stated that the findings indicated the website was perceived as dated, boring, and unfinished.  

Looking at the findings above on the left, which of the five research methods listed on the right would you assume were implemented? 

In this case, the research team used user testing.

But are the above bullet points reflective of real, generalizable knowledge? Can we rely on these conclusions in order to take steps to improve the design?

User testing is intended to observe user behavior; it is not a technique sensitive enough to support summative conclusions like those we see above on the left. Some red flags are the use of words like “some” and “most,” and a claim of 70% when the total sample was nine people.

Needless to say, I gave the research team some critical feedback regarding the misleading set of conclusions that they drew from using this research method, and the tiny sample size. 
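To see why a percentage drawn from nine participants is a red flag, here is a quick sketch using the Wilson score interval for a proportion. The 7-of-9 split is my own illustrative assumption standing in for the reported "70%"; the actual study counts weren't broken out.

```python
import math

def wilson_interval(successes, n, z=1.96):
    """95% Wilson score confidence interval for a proportion."""
    p = successes / n
    denom = 1 + z**2 / n
    center = (p + z**2 / (2 * n)) / denom
    spread = (z / denom) * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return center - spread, center + spread

# 7 of 9 participants (~78%), an assumed stand-in for "about 70%"
low, high = wilson_interval(7, 9)
print(f"95% CI: {low:.0%} to {high:.0%}")  # roughly 45% to 94%
```

An interval spanning roughly 45% to 94% means the data is consistent with anything from "a minority" to "nearly everyone," which is exactly why a bare "70%" from nine users shouldn't drive design decisions.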

Mental Model #3

Not only do we need to know when a design or a conversion tactic is appropriate, but we also need to assess how we apply our research techniques. We need clear goals.

That’s where our third mental model comes in: An answer is meaningless until you have the question. 

This mental model is about not getting stuck in reams of data and drawing random conclusions, but instead being crystal clear about your goals. While we need data to help us make informed decisions, answers without questions lead to information that can be hard to act on or of little value to your business goals.

The VoC Playbook

While Voice of Customer (VoC) research can span a wide range of applications, my team is interested in improving digital experiences in order to increase long-term profit and growth.

We’ve got a playbook in place - one that’s helped us optimize customer experiences for hundreds of diverse businesses. Our playbook includes some guiding principles on what types of research methods to use depending on what questions you have. 

With voice of customer research, we’re guided by questions about customers such as:

  • Who are they? 
  • What are their perceptions? 
  • What are their behaviors? 
  • What are their motivations? 
  • What are their anxieties? 

So to uncover user personas, we conduct surveys; to understand perceptions, we use benchmarking or moderated user research. There are, of course, myriad other models, research methods, and data sets you can leverage, but this is a great starting point.
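One way to make a playbook like this explicit is to encode the question-to-method mapping directly. The pairings for personas and perceptions come from the text above; the rest are my own plausible assumptions, not the team's actual playbook.

```python
# Question -> research methods. Only the first two rows are stated
# in the article; the remaining pairings are illustrative assumptions.
VOC_PLAYBOOK = {
    "who are they": ["surveys"],
    "what are their perceptions": ["benchmarking", "moderated user research"],
    "what are their behaviors": ["analytics review", "user testing"],
    "what are their motivations": ["customer interviews", "open-ended surveys"],
    "what are their anxieties": ["exit surveys", "session replays"],
}

def methods_for(question):
    """Look up which research methods address a given question."""
    return VOC_PLAYBOOK.get(question.lower().rstrip("?"), [])

print(methods_for("What are their perceptions?"))
# ['benchmarking', 'moderated user research']
```

Writing the mapping down, in code or in a shared document, forces the team to start from a question rather than from whatever data happens to be on hand.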

At a higher level, this model represents the concept of leveraging an explicit process to contextualize different types of data observed from many different angles or methods. 

Mental Model #4

This brings us to the fourth mental model: The true method of knowledge is experiment.

After data is collected, we need to experiment in order to determine how sensitive it is across environments. This means we move from a state of raw data to a set of information to a final sense of knowledge: 

Data > Information > Knowledge

Until you apply a scientific process of analysis to data, you remain on the edge of knowledge - not in the possession of it. The leap from information to knowledge is where experimentation lives.

Example: Native Deodorant

To illustrate this process in action, we’ll revisit the example of Native Deodorant.

Their original product page looked like the photo below. It looked clean and focused.

Our voice of customer research showed that users agreed it was an attractive, usable website. The reviews were strong.

But objections were also present. There were comments about things like the effectiveness and texture of the product, and complaints about the coconut scent. 

Our attempt to form knowledge about these objections led to a few months of dead ends. We started by focusing on the objections themselves, trying to mitigate them. Then we moved to motivational elements, building trust via social proof and messaging to reassure people about effectiveness. Lots of failed tests, all of it.

But then we re-examined our research findings and focused on the scent related comments. While users had said they loved the array of unique scents, on the main page only coconut was visible. And well, some users had told us about their specific distaste for coconut.

I wish I could tell an inspirational story about how everything just clicked with the data alone, but it took some creative experimentation to solve the puzzle. 

The problem with scent proved to be the key. After adding a selection of color swatches, similar to those on a retail clothing site (truthfully, inspired by Bombas, but with all that research data in mind), we tested it, and it was a winner.

This change did move the needle on conversions a bit, but the most meaningful impact was on the average order value. With an increased awareness of the different scents, customers began to buy more deodorant at a time.
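Calling a change "a winner" presupposes a statistical check behind the scenes. Here is a minimal sketch of one common way to test a conversion-rate lift, a two-proportion z-test; all counts are hypothetical, since Native's real experiment numbers aren't public, and this is not necessarily the exact test the team ran.

```python
import math

def two_proportion_ztest(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for a difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # two-sided p-value via the normal CDF (erf)
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical counts: control 4.8% vs. swatch variant 5.6% conversion
z, p = two_proportion_ztest(conv_a=480, n_a=10_000, conv_b=560, n_b=10_000)
print(f"z = {z:.2f}, p = {p:.4f}")  # z around 2.5, p around 0.011
```

Note that a test like this covers only the conversion rate; evaluating the shift in average order value would need a separate test on order revenue (for example, a t-test on per-order values).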

This catapulted us into even more ingenious experiments surrounding sample packs, subscription options, personalized upsells, and cross-sells. 

While experimentation is only one form of data collection, it's one of the best tools we have for avoiding data traps. Without it, our research data couldn't have been turned into knowledge about what would really work for Native's customers.


To recap the mental models for research that I’ve covered, remember:

  • What people say and what they actually mean can be very different. 
  • Knowledge requires the application of sensitivity. You might not know something as well as you think you do, so use research or experimentation to check whether that knowledge applies to your specific situation. 
  • Be careful about data traps. To avoid them, have specific research plans and goals.
  • Be mindful that data is only the first step toward knowledge; experimentation (which measures sensitivity) is what moves you from information to knowledge.
