Experimentation / CRO

Experimentation and testing programs acknowledge that the future is uncertain. These programs focus on getting better data into the hands of product and marketing teams so they can make better decisions.

Research & Strategy

We believe that research is an integral part of experimentation. Our research projects aim to identify optimization opportunities by uncovering what really matters to your website users and customers.

Data and Analytics

90% of the analytics setups we’ve seen are critically flawed. Our data analytics audit services give you the confidence to make better decisions with data you can trust.

Pricing Page Optimization: How to Order Pricing Plans [Original Research]

How do you order your pricing page: Cheap-to-expensive? Expensive-to-cheap? Randomly?

For this study, we manipulated the pricing page for a survey tool, SurveyGizmo, to see if there are different patterns of user perception and preference (choice of plan) for various layout designs.


The pricing page is a crucial part of the sales funnel for a business. It's where the customer sees the details about what they get and what they have to pay.

There is a very common way to present pricing plans: cheap-to-expensive, left-to-right. According to a review of 250 SaaS websites by process.st, 81% of the companies on the 'Montclare 250' that list prices order them from cheapest to most expensive.

It's a logical way to present things...but is it data-driven, and does it lead to more conversions?

In this study, we attempt to answer that question by testing different pricing plan layouts to get data on how layout affects user perception.

To gather a wide range of data, we implemented task scenarios, eye-tracking, and post-task survey tools to understand how users generally consume information on a pricing table, and how the design of the pricing table may influence which plans they end up choosing.

How to Order Pricing Plans

Data Collection Methods

We used the pricing page for a survey tool (SurveyGizmo) as the research subject. Since many people (outside the optimization world) probably haven't heard of SurveyGizmo, they would be unbiased toward the site.

However, the idea of surveying many people, and therefore the scenario we'd present to participants, is easily understood. Here is a screenshot of the original pricing page. (Note: If you check out the current pricing table on SurveyGizmo, you'll notice it's already been changed, just a month after our test!)

Screenshot of the original SurveyGizmo pricing page, plans ordered cheapest to most expensive from left to right (cheapest-first).

Here are two variations, the primary alternative (expensive-to-cheap) and a variant that mixes the plans, just for comparison:

Screenshot of the modified pricing table for SurveyGizmo, ordered most expensive to cheapest from left to right (expensive-first).
Screenshot of the modified pricing table for SurveyGizmo, ordered randomly (mixed).

Our methods included: a task that participants were prompted to complete, eye-tracking equipment to analyze viewing patterns, and a post-task survey question. For the task, we wanted participants to examine each plan's features closely, so we asked them to find features that only 2 of the 4 plans offered (chat support and analysis tools).

We asked:

Imagine you own a medium-sized business and are in need of an online survey tool that offers chat support and analysis tools. The next screen will show pricing options for a survey tool. Browse and compare options to choose a package that fits your needs.

After viewing the web page, we asked which plan they would choose. Using the task and the post-task survey question, we wanted to find out whether the layout of plans affects how users consume information on the table, and if so, whether that difference influences the plan they would ultimately choose.


Eye-Tracking Results

Here is an eye-tracking map animation for each plan layout:

Pricing plan eye-tracking gif across pricing plan variations. Left: cheap-to-expensive plan order; Middle: mixed plan order; Right: expensive-to-cheap plan order.

Notice that the general pattern is the same for all variants: people tend to start viewing the middle of the page and then gaze slightly left.

Here's another view of the variations showing the average order in which people viewed each area of interest (the pricing plans and features offered).

SurveyGizmo pricing plan page variations with indication of the order of first viewing for each plan.

Again, this tells us that the same overall pattern of viewing the pricing table exists, but keep in mind the plans are in different orders, so people are seeing the plans in different orders.


Participants processed the table the same way regardless of how the plans were ordered. They noticed the two plans listed first (in left-to-right order) and spent the most time on those two, regardless of which plans or prices they were. The first two plan slots got the most attention.

Let's look at some numbers: Because we have 5 areas of interest (each plan and the list of features) across 3 variations, there are lots of possible comparison combinations.

However, we're going to focus on viewing patterns of the 2 most expensive plans (Pro & Enterprise) as this gives a 'price-anchor' benchmark among the variations.

Here are our summary stats for the PRO and ENTERPRISE pricing plans:

Summary eye-tracking stats for the Pro and Enterprise plan areas of interest.

There was a significant difference among layouts in mean time to first fixation [F(2,136) = 12.6881, p < 0.001] and total time fixating [F(2,136) = 7.419, p < 0.001].
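For readers who want to run this kind of comparison on their own eye-tracking data, a one-way ANOVA like the one above can be computed with scipy. The numbers below are invented for illustration; they are not the measurements from this study:

```python
from scipy import stats

# Hypothetical time-to-first-fixation samples (seconds) for the
# Pro/Enterprise areas of interest under each layout.
# Illustrative values only -- not data from this study.
cheap_first = [2.1, 2.8, 3.0, 2.5, 3.4, 2.9]
mixed = [1.7, 2.0, 2.4, 1.9, 2.2, 2.6]
expensive_first = [1.1, 1.4, 1.0, 1.6, 1.3, 1.2]

# One-way ANOVA across the three layout groups
f_stat, p_value = stats.f_oneway(cheap_first, mixed, expensive_first)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
```

With real data, each list would hold one value per participant, and a significant p-value would indicate that at least one layout differs from the others (a post-hoc test would then identify which pairs differ).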


Participants spent more time reading about the expensive plans, and fixated on them sooner, when those plans were placed on the left side of the table and listed first.

Survey Results

Remember, we asked:

"Considering your needs, which package would you choose?"

Percentages of participants who chose each plan for each of the three page variations. Note: the mixed variation had the Pro plan listed first.

We see that the mixed and expensive-to-cheap variations had a higher number of people choosing the PRO package. Note the mixed variation had the Pro plan in the first slot (furthest left) and the expensive-to-cheap variation had the Pro plan in the second slot.


Participants chose more expensive packages more often when those packages were listed first, i.e., furthest left in left-to-right order.


Our results were obtained from study participants who didn't actually need to purchase a survey tool. This issue is the Achilles heel of user testing: Hypothetical situations don't always translate into real-life situations.

There is a possibility that individuals who actually were shopping for a survey tool would view the pricing table differently.

However, we tried to control for this with a proper 'scenario' that captured accurate motivation. With a proper sample size, we saw differences in responses among groups, which provides supporting evidence about what works and what doesn't for pricing table order and layout.
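As a rough guide to what a "proper sample size" means for a test like this, the standard two-proportion sample-size formula can be computed directly. The baseline and target take-rates below are assumptions for illustration, not figures from this study:

```python
from scipy.stats import norm

# Assumed take-rates for the expensive plan -- illustrative values only.
p1 = 0.20            # baseline rate (cheap-to-expensive layout)
p2 = 0.30            # hoped-for rate (expensive-to-cheap layout)
alpha, power = 0.05, 0.80

# Standard formula for comparing two independent proportions:
# n = (z_{1-a/2} + z_{power})^2 * (p1(1-p1) + p2(1-p2)) / (p1 - p2)^2
z_alpha = norm.ppf(1 - alpha / 2)
z_beta = norm.ppf(power)
n_per_group = ((z_alpha + z_beta) ** 2
               * (p1 * (1 - p1) + p2 * (1 - p2))
               / (p1 - p2) ** 2)
print(f"~{n_per_group:.0f} participants per variation")
```

Smaller expected differences between layouts drive the required sample size up quickly, which is why underpowered usability tests so often produce inconclusive plan-preference numbers.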

While providing participants with a pre-defined task scenario provides uniform motivation, it may also limit our interpretation of the results, since preferences for a certain plan could be biased by our particular scenario prompt. It's even possible that the differences in plan preference we observed would be more pronounced (or entirely different) if we had given participants a different task (e.g., asking them to look for survey features available with every plan).

Also, note that results are limited to one type of product with a particular price range. We're curious to see how these results apply across different product types (e.g., Subscription vs. SaaS, informational/digital products vs. physical products) and at different price points (i.e., cheap products vs. high-end expensive products).


While most SaaS pricing pages list plans from cheapest to most expensive, we found that users were more likely to prefer more expensive plans when they were laid out the opposite way: most expensive to cheapest.

No matter what layout we presented, users spent the most time looking at the first two pricing plans listed in a left-right order. So, if the cheapest plan was presented first, that's what they spent the most time looking at. Same idea for the most expensive plan.

While user testing does not always translate to behavioral reality (what people say isn't always what they do), these results suggest that you should order your pricing plans from high to low. In any case, it's worth an A/B test.
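If you do run that A/B test, a simple way to check whether the difference in expensive-plan take-rate is significant is a chi-square test on the 2x2 table of outcomes. The counts below are invented for illustration, not results from this study:

```python
from scipy.stats import chi2_contingency

# Hypothetical A/B test results -- invented counts, not study data.
# Rows: variation; columns: chose an expensive plan vs. a cheaper plan.
table = [
    [40, 160],  # cheap-first:     40 of 200 picked an expensive plan
    [65, 135],  # expensive-first: 65 of 200 picked an expensive plan
]

chi2, p_value, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p_value:.4f}")
```

A small p-value here would mean the layouts' take-rates genuinely differ; with counts this lopsided the test comes out significant, but your real traffic will decide.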
