The fact is, most visitors to your website won’t convert. You’ll always have those who were never going to find what they needed from you, but what about the ones who just needed a little more hand-holding? The users from your target audience who would have signed up for your free trial, filled in your form, or even purchased from you, but didn’t feel like they were in exactly the right place.
This is why you can’t rely solely on visitor drop-off numbers as a success measure. Doing so could result in “spaghetti testing” and spending precious resources on fixes or content that don’t actually help. Instead, get straight to the why and tackle underlying problems head-on. What are the friction points in your funnel?
Users have fears, uncertainties, and doubts, or FUDs, that prevent your website from converting them into leads or customers. If you can pinpoint those FUDs, you can tailor messaging and content to resolve the most common conversion obstacles. Luckily, you can uncover your users’ anxieties the “easy” way: just ask them.
Targeted, customized on-site surveys and polls at various stages of the user journey help you easily capture customer anxieties. With these insights you’ll be able to answer questions such as:
- “What’s holding a user back from action?”
- “What outstanding questions do users have?”
- “Why are users dropping out from this page?”
Exit polls reach out to users looking to leave your website. Checkout page polls touch base after purchase. Even analysis of online chat logs and feedback mechanisms can offer key insights into your user journey. Before we cover the range of survey tools you can use, you’ll first need to ensure your on-site surveys have a clear purpose.
Pre-Survey Decisions
There are three critical steps to follow in the planning stages of your survey.
Step 1: Determine the top-level goal of your survey. Is there a specific page in your funnel you are looking to gather insight into? Or are you running more “exploratory” research aimed at identifying the pages with the biggest opportunities? This will help you with the steps that follow.
Specific goal example: “Understand why users are exiting the website from the cart page.”
Although identifying FUDs is the most typical use case for on-site surveys, there are many others too. For example, you can use on-site surveys to understand user intent by asking a simple question about the purpose of the visit. Another example is exploring and gathering data to support product development; check out this example from Hotjar’s site.
The possibilities are endless, so don’t forget to consider creative ways you could use on-site surveys to gather data to support a wide range of business decisions.
Step 2: Determine which pages your survey will run on. If your goal is specific to a page, you can skip this step. If your research is more exploratory in nature, consult your website analytics to determine where to focus your efforts. Look for high-traffic pages that aren’t performing as well as you’d expect. Also consider the pages that are important to your funnel: product pages and cart pages are prime examples.
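To make this concrete, here is a minimal sketch of how you might shortlist pages once you’ve exported page-level data from your analytics tool. The data shape, thresholds, and numbers below are purely illustrative and not tied to any particular analytics product:

```typescript
// Hypothetical shape of a page-level analytics export (e.g. a CSV you pulled yourself).
interface PageStats {
  path: string;
  sessions: number;        // traffic over the reporting period
  conversionRate: number;  // e.g. 0.021 for 2.1%
}

// Flag high-traffic pages converting below a benchmark you choose,
// such as the site-wide average conversion rate.
function findSurveyCandidates(
  pages: PageStats[],
  minSessions: number,
  benchmarkConversionRate: number
): PageStats[] {
  return pages
    .filter(p => p.sessions >= minSessions && p.conversionRate < benchmarkConversionRate)
    .sort((a, b) => b.sessions - a.sessions); // biggest opportunities first
}

// Example: pages with 5,000+ sessions converting below a 2% benchmark.
const candidates = findSurveyCandidates(
  [
    { path: "/pricing", sessions: 12000, conversionRate: 0.012 },
    { path: "/blog/some-post", sessions: 800, conversionRate: 0.001 },
    { path: "/product/widget", sessions: 7500, conversionRate: 0.031 },
  ],
  5000,
  0.02
);
console.log(candidates); // → only "/pricing" qualifies in this made-up data
```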
Step 3: Determine how and when users will trigger your survey. Timing, targeting, and your approach are key to getting meaningful responses from the users that matter to you. There are a range of considerations here. Firstly, device: each device works best with a different trigger. On desktop, you have the option to trigger a survey based on exit intent, e.g. when a user moves their mouse outside the main area of the browser window (indicating they are about to leave the website). This isn’t an option on mobile devices (since there is no mouse!), so time-based triggers are a good solution. Again, your analytics data can help here, via average time-on-page figures. Remember, you’re looking to gather insight from users who have FUDs, so setting a time-based trigger at or slightly above the average time on page will help you target users spending longer than average on the page.
You can also use Google Tag Manager (GTM) to be more specific about who you target, e.g. by user type (new vs. returning), country, etc.
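Most survey tools handle triggering for you, but if you’re wiring up a custom trigger, the underlying logic looks roughly like this. It’s a simplified sketch: showSurvey() is a placeholder for whatever your survey tool exposes, and the 45-second threshold is an example figure you’d replace with your own average time-on-page data:

```typescript
// Placeholder for whatever your survey tool exposes to display the survey.
declare function showSurvey(): void;

let surveyShown = false;

function triggerOnce(): void {
  if (surveyShown) return;
  surveyShown = true;
  showSurvey();
}

// Desktop: exit intent - fire when the cursor leaves through the top of the
// viewport, which usually means the user is heading for the tabs or close button.
document.addEventListener("mouseout", (event: MouseEvent) => {
  if (!event.relatedTarget && event.clientY <= 0) {
    triggerOnce();
  }
});

// Mobile (no mouse): fall back to a time-based trigger set at or slightly above
// the average time on page for this URL.
const AVERAGE_TIME_ON_PAGE_MS = 45_000; // replace with your own analytics figure
if (/Mobi|Android/i.test(navigator.userAgent)) {
  window.setTimeout(triggerOnce, AVERAGE_TIME_ON_PAGE_MS);
}
```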
The Pros & Cons of Post-Conversion Surveys
Sometimes our clients express concerns about the impact of running on-site surveys on the user experience (and ultimately on conversion rate). While this is a valid concern, the insights gathered, and the improvements that follow, tend to outweigh any negative short-term impact. That said, most tools have an option to specify that users should only see the survey once, even if they don’t reply. Similarly, avoid setting up multiple surveys throughout your website. If you want to gather insight about different pages, either set up targeting rules to exclude users who have already seen another survey (traffic permitting) or simply run the surveys one at a time.
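The exact mechanism varies by tool, but the idea behind “only show the survey once” and “exclude users who have already seen another survey” can be sketched with a simple browser-storage flag. The storage key and function names below are illustrative, not taken from any specific product:

```typescript
const SURVEY_SEEN_KEY = "onsite_survey_seen"; // illustrative storage key

// Returns true if this visitor has already been shown any on-site survey.
function hasSeenAnySurvey(): boolean {
  try {
    return window.localStorage.getItem(SURVEY_SEEN_KEY) !== null;
  } catch {
    return false; // storage unavailable (private mode, etc.)
  }
}

// Call this whenever a survey is displayed, whether or not the user replies.
function markSurveySeen(surveyId: string): void {
  try {
    window.localStorage.setItem(SURVEY_SEEN_KEY, surveyId);
  } catch {
    /* ignore storage errors */
  }
}

// Before triggering a second survey elsewhere on the site:
if (!hasSeenAnySurvey()) {
  markSurveySeen("cart-page-exit-poll");
  // ...show the survey here
}
```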
If you’re still concerned about running on-site surveys within your main funnel, you could opt for a post-conversion survey. KLM Airlines had tremendous success with their post-checkout survey executed through Usabilla. In fact, their response rate was so high (about 70%) that the team thought it must be a bug in the Usabilla system.
After purchasing a ticket on KLM, users were given a quick opportunity to provide input on their experience. There’s a visual metric for rating their overall experience and a few additional questions about potential areas of friction.
Remember, post-checkout surveys will usually yield more positive responses, so as you process the data, keep in mind that these are the attitudes of successful customers rather than those who abandoned their carts or exited the site. Either way, it’s a quick win: you’re still going to learn about your users’ general fears, hesitations, and anxieties.
In general, post-checkout surveys will ask questions about the checkout experience, for example:
- “What almost prevented you from purchasing?”
- “Did you have any questions?”
- “Did you have any issues with using the website?”
You can also use a checkout survey to check in about user motivation, with a question such as, “What was your goal today, and were you able to accomplish it?”
These types of questions are very insightful, and there’s no cost to your conversion funnel because you’re targeting customers after purchase, so you can get away with asking more questions.
Writing Your Questions
The type of questions you ask depends entirely on your goal and what you’re hoping to get out of the activity. There are a few options that can be used in different circumstances, but using a combination of methods tends to be the most effective.
Explicit, single-response questions
Explicit questions are bold and straightforward. They can be an effective way to get users’ attention and motivate them to interact with you, since they appear to be quick and easy questions to answer. For example, “Is there anything holding you back from [the desired action]?” Another similar option is, “Is there anything that might prevent you from [the desired action]?”
However, the issue with these examples is that they are closed questions and therefore limited in the insights they will generate. Think about what you want and need out of the survey, and consider whether a “yes”/“no” answer will get you there.
If you’re going for this method, make sure you give respondents who answer “yes” an opportunity to elaborate, and encourage them to do so via the survey copy. Based on the theory of consistency, users who answered that first question are more likely to continue.
If you’re less bothered about gathering quantitative data on the split between “yes” and “no” answers, you could use a free text field instead of radio buttons, with the same question.
As you can see, there are countless options, and while there are some logical ways to approach this, there are no guarantees. If you’re struggling to hit the required response rate, or you’re not gathering actionable insights, experiment with phrasing your question differently or tweaking the targeting slightly. Just be careful about changing things too much midway through, as you run the risk of invalidating the research study.
Scales and ratings
If you’re mainly interested in quantitative data, scale and rating questions can be an effective method. For example, “How well is this page working for you?” or “How would you rate our support pages on a scale of 0-10?” Again, this relates back to your research goals, and make sure you always provide an additional free text field to gather supporting qualitative data too.
Implicit questions
Implicit questions are passive and broader. This approach is popular because the questions are more open, so you’ll get rich, qualitative insights. These types of questions can be useful when your research goal is more specific, for example when you want to understand whether your messaging resonates with users, e.g. “What is your understanding of x, y, z?” However, implicit questions can also be used for more exploratory research: “What questions are you not finding answers to?” gently probes for areas where fears or concerns are not being addressed.
Determining Sample Size
As with all research methods, thought needs to go into sample size. There are different factors to consider here too, but mainly it’s about traffic levels. We tend to see a response rate of around 1-2% for standard “pop-over” on-site surveys. So if you’re hoping for 1,000 responses but only get 5,000 website visitors each week, you’ll be waiting a while to hit your sample size.
When we run on-site surveys with our clients as part of our exploratory research phase, ResearchXL, we aim for between 200 and 300 elaborative responses (from the visitors who choose “Yes”) to be confident in the strength of signal within the analysis (i.e. 95% confidence that the strength of signal is valid within ~5-10% of what is reported; here is an article on stats for reference). Once we hit this volume, we stop the survey and conduct the analysis.
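As a rough back-of-envelope check on those numbers, here is a sketch using the standard normal-approximation margin of error for a proportion (assuming the worst-case 50/50 split) alongside the time-to-sample arithmetic from above; the inputs are illustrative:

```typescript
// Margin of error for a proportion at ~95% confidence (normal approximation).
function marginOfError(sampleSize: number, proportion = 0.5, z = 1.96): number {
  return z * Math.sqrt((proportion * (1 - proportion)) / sampleSize);
}

// Rough estimate of how long it takes to reach a target sample size.
function weeksToTarget(
  targetResponses: number,
  weeklyVisitors: number,
  responseRate: number // e.g. 0.01-0.02 for a standard pop-over survey
): number {
  return targetResponses / (weeklyVisitors * responseRate);
}

console.log(marginOfError(250).toFixed(3));    // ≈ 0.062, i.e. roughly ±6% at n = 250
console.log(weeksToTarget(1000, 5000, 0.015)); // ≈ 13 weeks at a 1.5% response rate
```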
There are several techniques that can raise your response rates, but there is a trade-off. Screen-filling surveys can seem aggressive and may disrupt the customer experience, but they do tend to elicit better response rates. Here’s an example from Ticketmaster: a closed-ended poll triggered specifically when a user leaves checkout before purchase to go to another page.
This survey takes over the full page, requiring the user to exit manually to continue browsing. The disruptive placement and the ease of closed-ended answers lead to much higher response rates, ranging from 5-15%.
Outlining Your Survey Scope
Creating a scope document for your survey is a key way to ensure team communication and understanding. The scope document can be simple or long-form, depending on your needs.
On the left is a long-form example, geared towards education and discussion. It details the goals, segmentation, targeting, and content planning.
The document on the right is an alternative which can be an effective tool to push out quick awareness about the poll, its specs and goals, and to keep relevant teams in the loop. It’s a fast, easy read that touches on all the main points of the survey plan.
Getting Actionable Data from Open-Ended Answers
Analyzing open-ended answers isn’t as straightforward as analyzing closed questions, but it’s very much worth it when you start digging into the insights. The key is to identify themes by going through the responses and tagging each one based on its theme. For example, “The delivery time ‘from 10 working days’. It is too long.” would be tagged with something like “delivery lead-time”. Every time a similar comment comes up, we add a “1” to the “delivery lead-time” tally. This lets you understand the scale of each insight and gives you quantitative data too, which really helps when presenting back to clients and stakeholders.
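If you’d rather tally themes with a small script than a spreadsheet, the counting step is straightforward once the tagging is done. A minimal sketch, assuming each response carries one or more manually assigned theme tags:

```typescript
// Each open-ended response, manually tagged with one or more themes during review.
interface TaggedResponse {
  text: string;
  themes: string[]; // e.g. ["delivery lead-time"]
}

// Tally how often each theme appears so you can rank insights by scale.
function tallyThemes(responses: TaggedResponse[]): Map<string, number> {
  const counts = new Map<string, number>();
  for (const response of responses) {
    for (const theme of response.themes) {
      counts.set(theme, (counts.get(theme) ?? 0) + 1);
    }
  }
  return counts;
}

const counts = tallyThemes([
  { text: "The delivery time 'from 10 working days'. It is too long.", themes: ["delivery lead-time"] },
  { text: "Couldn't tell whether my size was in stock.", themes: ["stock availability"] },
  { text: "Delivery estimate only appeared at checkout.", themes: ["delivery lead-time"] },
]);

// Sort themes by frequency for reporting back to stakeholders.
console.log([...counts.entries()].sort((a, b) => b[1] - a[1]));
```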
At Speero we’ve built a template for coding open-ended data sets from a range of survey types, which helps us go from hundreds of individual responses to a consolidated data set like this:
You can see that this revealed the impact that product availability and poor communication of stock levels are having on the user experience.
Key Takeaways
- On-site surveys are a valuable tool to discover user anxieties, fears, uncertainties, and doubts.
- Set a clear goal for your survey and think about its purpose - is it exploratory or more specific? This will help you plan accordingly.
- Choose where and when you trigger your survey carefully - use analytics data to make informed decisions.
- Design your questions around your goal and objectives: what type of data do you want to gather? Experiment with alternative copy, question types, etc.
- Decide the length of time you need to run your survey. Base your timeframe on the amount of traffic your site gets and the ideal sample size for your purposes.
- Create a scope document outlining the specifics of your survey to give your team visibility.
- Use a “tagging” method to code your open-ended responses to give you qualitative and quantitative outputs.