Experimentation / CRO

Experimentation and Testing Programs acknowledge that the future is uncertain. These programs focus on getting better data to product and marketing teams to make better decisions.

Research & Strategy

We believe that research is an integral part of experimentation. Our research projects aim to identify optimization opportunities by uncovering what really matters to your website users and customers.

Data and Analytics

90% of the analytics setups we’ve seen are critically flawed. Our data analytics audit services give you the confidence to make better decisions with data you can trust.

How to create an optimized experimentation strategy using the ResearchXL model

Experimentation shouldn’t be left to chance; if you’re committed to data-driven customer experiences, you need a clear and well-researched testing strategy. Emma Travis, strategist at Speero, explains what the ResearchXL model is and how it sets you up for experimentation success.

A good customer experience optimization (CXO) strategy relies on A/B testing, but testing shouldn’t be random. While it’s tempting to dive in with one of the myriad available tools, don’t start testing until you have clarity and insight into your users’ FUDs (fears, uncertainties and doubts) and the current barriers in your user experience. This means following a thorough research methodology that allows you to begin testing with confidence by focusing on solving genuine problems for your users. This crucial phase will help you identify both your customers’ needs and the conversion points that serve them (and your business).

Beyond trends and best practices

Relying on best practices, competitor insights, global trends, or even intuition to guide your testing could see you ploughing resources into unprofitable areas. For example, if your goal is to boost inquiries, you might instinctively focus on your contact page, when you should in fact be testing two steps earlier to help drive more traffic to that page in the first place.

Your website is not a one-size-fits-all destination; your customers and prospects have their own specific sets of problems, wants and needs, and context is key. By following the ResearchXL process, you’ll not only be able to set strong foundations, you’ll understand your audience better than ever before: when, where and how they want to connect with you, and why they might drop off.



The ResearchXL framework

After over ten years in optimization, our team designed an optimization research process to help you create better test hypotheses: the ResearchXL framework. This framework has been proven across many industries and business models, and it helps you determine where to focus your experimentation efforts.


Here are the eight steps that comprise the ResearchXL method.

Step one: UX heuristic analysis

The UX heuristic review is primarily designed to assess the user experience against a set of industry standards, or “guidelines”. This activity gives a top-level view of where the key issues and areas of opportunity are on any website. These assessments should be carried out by experienced analysts, and the feedback triangulated with statistical data points (e.g. to what extent do heatmaps support the heuristic analysis findings?).

You'll need to structure a walkthrough of the customer journey, record issues and opportunities, and grade elements against a set of criteria such as:

1. Relevance: Does the page meet the user’s requirements? Does it answer their questions?

2. Clarity: Does the site clearly communicate everything the user needs to know?

3. Value: How does your customer benefit from landing on this page?

4. Friction: Is anything causing doubt or uncertainty? How can this be simplified?

5. Distraction: Does anything detract from the action you want your customers to take?

The UX heuristic review is more effective when carried out by multiple researchers individually, who then combine their findings and aggregate their scores. Insights can also be discussed as a group in a workshop setting, which can be a great way to increase engagement from wider teams.
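If it helps, here’s a minimal sketch of what that aggregation could look like in code; the data shape and the 1–5 scale are hypothetical, not the output of any particular tool:

```typescript
// Minimal sketch: aggregate heuristic scores from several reviewers.
// The data shape (reviewer, element, criterion, score 1–5) is hypothetical.

type HeuristicScore = {
  reviewer: string;
  element: string;            // e.g. "homepage hero", "checkout form"
  criterion: "relevance" | "clarity" | "value" | "friction" | "distraction";
  score: number;              // 1 (poor) to 5 (good)
};

function aggregateScores(scores: HeuristicScore[]): Map<string, number> {
  const totals = new Map<string, { sum: number; count: number }>();
  for (const s of scores) {
    const key = `${s.element} / ${s.criterion}`;
    const entry = totals.get(key) ?? { sum: 0, count: 0 };
    entry.sum += s.score;
    entry.count += 1;
    totals.set(key, entry);
  }
  // Average across reviewers so one person's outlier doesn't dominate.
  const averages = new Map<string, number>();
  for (const [key, { sum, count }] of totals) {
    averages.set(key, sum / count);
  }
  return averages;
}
```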

Any robust UX heuristic review should also include a technical analysis. Before testing anything, you should eliminate all technical bugs and glitches, to ensure your customers aren’t getting put off by slow-loading sites or glitch-ridden checkout processes. Carry out these key checks:

Cross-browser and cross-device testing

Not everyone browses on their phone or uses the latest browser. Use Google Analytics’ Browser and OS reports (under Audience > Technology) to see the traffic and conversion rates for each browser and version. You can use services such as browserstack.com to view your site across a wide range of browsers and devices.

Speed checks

Nothing puts off users like slow load times. Under Page Timings in Google Analytics, switch on the 'comparison' toggle to identify the slowest pages, then pass them through Google PageSpeed Insights. This should tell you exactly what's causing your pages to lag.
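If you want to automate that speed check across your slowest pages, you can also query the public PageSpeed Insights API directly. Here’s a rough sketch; the response fields shown are simplified, and you may need to pass an API key (via the key query parameter) for anything beyond light use:

```typescript
// Rough sketch: query the public PageSpeed Insights API for a page's
// Lighthouse performance score. Requires Node 18+ (built-in fetch).

async function pageSpeedScore(url: string): Promise<number> {
  const endpoint =
    "https://www.googleapis.com/pagespeedonline/v5/runPagespeed" +
    `?url=${encodeURIComponent(url)}&strategy=mobile`;

  const res = await fetch(endpoint);
  if (!res.ok) throw new Error(`PageSpeed request failed: ${res.status}`);

  const data = await res.json();
  // Lighthouse reports the performance category score on a 0–1 scale.
  return data.lighthouseResult.categories.performance.score * 100;
}

// Usage: log the score for a hypothetical slow page found in Analytics.
pageSpeedScore("https://www.example.com/checkout")
  .then((score) => console.log(`Mobile performance score: ${score}`))
  .catch(console.error);
```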

Step two: Analytics analysis

This step involves two stages:

  1. Configuration audit

Since you’ll be making decisions based on the data within Analytics, it is of the utmost importance that the data is accurate. Check your Analytics account for data gaps and broken tracking, and remove any reports you don’t need. Make sure it’s collecting data that delivers the insights you’re interested in, and fix any issues that could impact reporting. Confirm data is being collected as expected by checking all key settings, ideally with an analyst or developer who is experienced with Analytics setups. It’s also important to check that success metrics are agreed and configured. Oftentimes there are goals and events set up by former team members that are no longer needed, so some housekeeping is recommended to ensure there’s no overlap or confusion.

  2. Data analysis

Once you’re happy that the data is accurate and you’re clear on which metrics you’re analysing, you can begin analysing the data. Analytics is about far more than hits and users. It can tell you what your customers are doing (and not doing), which content they engage with, how they move through your site, and ultimately where your site isn’t performing. Dig through the reports and look out for any interesting anomalies in the data that could point to an area of opportunity. It’s often helpful to have the website open while you’re reviewing Analytics, to help you better understand the data and begin hypothesising about the why.

Of course, Analytics gives you the ‘what’, but not the ‘why’ - that’s where the qualitative ResearchXL activities come in.
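If your property is on GA4 and you’d rather pull this kind of engagement data programmatically than through the reporting UI, a rough sketch using the GA4 Data API’s Node client might look like this; the property ID, chosen metrics and row limit are illustrative, and credentials are assumed to be configured via Application Default Credentials:

```typescript
// Rough sketch: pull top pages with sessions and bounce rate from a GA4
// property via the Data API Node client (@google-analytics/data).
// Property ID, metrics and limit are illustrative; auth is assumed to be
// configured via Application Default Credentials.
import { BetaAnalyticsDataClient } from "@google-analytics/data";

const client = new BetaAnalyticsDataClient();

async function topPages(propertyId: string): Promise<void> {
  const [response] = await client.runReport({
    property: `properties/${propertyId}`,
    dateRanges: [{ startDate: "28daysAgo", endDate: "today" }],
    dimensions: [{ name: "pagePath" }],
    metrics: [{ name: "sessions" }, { name: "bounceRate" }],
    orderBys: [{ metric: { metricName: "sessions" }, desc: true }],
    limit: 20,
  });

  for (const row of response.rows ?? []) {
    const page = row.dimensionValues?.[0]?.value;
    const [sessions, bounceRate] = (row.metricValues ?? []).map((m) => m.value);
    console.log(`${page}: ${sessions} sessions, bounce rate ${bounceRate}`);
  }
}

topPages("123456789").catch(console.error); // hypothetical GA4 property ID
```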


Step three: Mouse-tracking analysis

Heat maps give you invaluable insight into how your audience interacts with your site; coupled with your Analytics data, they help build a clearer picture of where the key opportunities lie.

Here are three common types of mouse-tracking report you can use:

1. Click maps

Click maps show you where your audience clicks on your website. You'll see which areas are heavily clicked and which parts of your website might be going unnoticed. Click maps also help highlight when users are clicking on something that is not clickable, which is highly likely to cause frustration.

2. Scroll maps

A scroll map shows you how far your customers are scrolling down a page. If users are dropping off before they get to read important content, you may want to test an alternative page structure or optimize the content.

3. User session replays

A user session replay tool records a snippet of a customer interacting with your website. You can watch where they navigate and click and uncover areas of potential confusion or frustration. Hotjar, SessionCam, and Clicktale are all great tools you can use.

Step four: Polls

On-site polls can help give actionable insights behind the mouse-tracking data. They can be configured in numerous ways, from exit polls to pop-ups which appear after a certain amount of time or after a particular behaviour has occurred.

It’s important to carefully consider what you’re asking and where and when you’re asking it. Asking users what is holding them back from making a purchase five seconds after they arrive at your website is unlikely to yield actionable insights (users probably don’t know the answer to that yet). Ideally, you want to gather insight from users who are on the “cusp” of conversion, which is where exit-intent polls come in. Exit intent means the tool has detected that the user is about to leave the website based on their mouse movements. This is a perfect time to find out why they are leaving, but be sure to test your targeting to avoid polls being triggered too early. You can also use polls to understand intent by asking users earlier in their journey about their intentions; this could help you distinguish research from purchase mindsets, especially when segmented by device.
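Most poll tools handle exit-intent targeting for you, but as a minimal illustration of the underlying idea, a client-side sketch might look like this; the showExitPoll function and the ten-second dwell threshold are hypothetical placeholders you’d replace with your own poll widget and targeting rules:

```typescript
// Minimal sketch of client-side exit-intent detection. Most poll tools do
// this for you; showExitPoll() is a hypothetical hook into your poll widget.

const MIN_TIME_ON_PAGE_MS = 10_000; // guard against triggering too early
const pageLoadedAt = Date.now();
let pollShown = false;

function showExitPoll(): void {
  // Placeholder: open your survey/poll widget here.
  console.log("Exit poll triggered");
}

document.addEventListener("mouseout", (event: MouseEvent) => {
  // The cursor leaving through the top of the viewport (toward the browser
  // chrome) is the classic signal that the user is about to leave.
  const leavingViewportTop =
    event.relatedTarget === null && event.clientY <= 0;
  const dwelledLongEnough = Date.now() - pageLoadedAt > MIN_TIME_ON_PAGE_MS;

  if (leavingViewportTop && dwelledLongEnough && !pollShown) {
    pollShown = true;
    showExitPoll();
  }
});
```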

In some cases where polls aren’t possible, another great source of data is chat logs. Analysing chat logs gives insight into users’ struggles and can also help with optimizing copy and content by highlighting what questions users are asking.

Step five: Customer surveys

Qualitative email surveys are also a great way to get inside your customers’ heads. Send an email survey with open questions (not yes/no) to new customers, aiming for at least 200 responses to give you enough strength of signal within your analysis. Response rates depend on many factors; at a 5% response rate, for example, you’d need to email around 4,000 customers to reach 200 responses. If you’re concerned about reaching that number, consider offering an incentive such as a gift card or a discount on their next purchase.

These email surveys can be seen almost as the opposite of exit polls: these are people who have recently converted, versus those who have just chosen not to. As such, the data and insights from customer surveys tend to uncover more about motivations, as well as highlighting opportunities via questions such as “why did you choose us?”

You can also gather insight into your customers’ motivations by speaking to the people who talk to your customers every day: your sales and customer service teams. This can give you a deeper understanding of the problems you help your customers solve and the reasons they pick you over competitors.

Step six: User testing

User testing gives you a glimpse into users’ experience of interacting with your website. There are a few different ways to conduct user testing, including moderated user testing, which involves running one-to-one research sessions. However, remote user testing is a great place to start and can be extremely eye-opening despite being relatively low cost.

Tools such as Userlytics, Usertesting and UserZoom give you access to a panel of research participants you can select based on specific criteria. You can then set questions, scenarios and tasks for these participants to complete while their screen and voice are recorded. Participants are encouraged to use a method called “concurrent think-aloud”, where they talk through their experience as they go. Feedback and behaviour can then be observed and analysed to reveal issues or sticking points: what’s not where they expect it, and where aren’t things clear?

Combined with heuristics, analytics data and mouse tracking, insights from user testing can begin to answer some of the questions about why users are dropping off from certain pages or interacting with the website in a particular way.

Step seven: Copy testing

Your copy is one of the most important aspects of your website. It helps you get found, provides users with information, guides them through the site, and communicates your brand. It’s not easy to get data to help you assess whether you’ve got the right balance, but copy testing can help.

Copy testing is similar to user testing in some ways, but rather than focusing on website functionality, it focuses on content and messaging. Tools such as Wynter allow you to upload screenshots of key pages and gather feedback from research participants. Again, it’s important to think about the type of people you want to gather feedback from: the more relevant they are to your genuine target audience, the better.

We suggest asking users:

1. To articulate their understanding of the messaging

2. What, if anything, is unclear

3. What action they feel they’re being asked to take

4. What, if anything, confuses them

5. What selling points are most attractive

Step eight: Collating your insights and taking action

By now you should have a long list of issues, insights and opportunities. It’s easy to feel overwhelmed at this point, but there are a few ways you can organize everything you’ve found into an actionable optimization strategy.

To tackle your list of issues, we suggest first sorting them into five categories:

1. Instrument: something that needs fixing in your analytics reporting, such as streamlining tags or adding events

2. Just do it: no-brainer problems that require little effort

3. Test: an issue that could reveal an uplift in conversions or a change in behavior

4. Hypothesize: ideas without a clear or obvious fix that you can brainstorm hypotheses and create test plans for

5. Investigate: an issue that requires further testing or digging

Now that you have your categories, assign priority levels using a prioritization framework, such as the PXL test prioritization framework. When ranking the issues, consider the following (a simplified scoring sketch follows the list):

1. Ease of implementation: Time-consuming tasks can be worthwhile, but might delay your quick-wins.

2. Potential: How big would the uplift be once you implement the change? You’ll want to prioritize those aspects with the greatest growth potential.
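As a simplified illustration of that ranking step (this is not the full PXL specification, and the idea names, scales and scores below are hypothetical):

```typescript
// Simplified, illustrative prioritization sketch (not the full PXL spec):
// score each test idea on ease and potential, then rank by the total.

type TestIdea = {
  name: string;
  ease: number;       // 1 (months of dev work) to 5 (copy/CSS tweak)
  potential: number;  // 1 (cosmetic) to 5 (core conversion step, backed by data)
};

function prioritize(ideas: TestIdea[]): TestIdea[] {
  return [...ideas].sort(
    (a, b) => b.ease + b.potential - (a.ease + a.potential)
  );
}

// Hypothetical backlog drawn from the research phase.
const backlog: TestIdea[] = [
  { name: "Rewrite checkout error messages", ease: 5, potential: 4 },
  { name: "Rebuild navigation IA", ease: 2, potential: 5 },
  { name: "Swap hero image", ease: 5, potential: 1 },
];

console.log(prioritize(backlog).map((i) => i.name));
// → ["Rewrite checkout error messages", "Rebuild navigation IA", "Swap hero image"]
```

The full PXL framework breaks “potential” down into more specific, mostly binary criteria, but the principle of scoring each idea and ranking the backlog is the same.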

Use your project management skills to organize tasks into a spreadsheet or project planner, and assign tasks and timelines to responsible parties, such as your development team for the “just do it” (JDI) items.

Now that the ResearchXL methodology has given you clarity on what your site and customers need, you can make the necessary changes and set focused goals that will set you up for a successful, high-uplift experimentation programme.

If you’d like to discuss our ResearchXL model, or run through some of the challenges you’re facing in optimising your tests, drop me an email; I’d be happy to have a chat.

If you found this blog interesting and you’d like to collaborate with us on insightful content that brings value to peers and partners, drop Morteza an email.
