Experimentation / CRO

Experimentation and Testing Programs acknowledge that the future is uncertain. These programs focus on getting better data into the hands of product and marketing teams so they can make better decisions.

Research & Strategy

We believe that research is an integral part of experimentation. Our research projects aim to identify optimization opportunities by uncovering what really matters to your website users and customers.

Data and Analytics

90% of the analytics setups we’ve seen are critically flawed. Our data analytics audit services give you the confidence to make better decisions with data you can trust.

How ClickUp Was Finally Able to Trust Its Data

Olga Osadcha is the Director of Growth Products at ClickUp, B2B software for everything — from project management to docs, reporting, whiteboards, and more. Before ClickUp partnered with Speero, the team faced several challenges:

  • The testing program had no clear or standard process; it was a mess.
  • The program was heavily tied to and dependent on their A/B testing tool.
  • They couldn’t trust their A/B testing tool.
  • Data was often lost, which made them constantly question results and lowered test velocity.

The Challenge

During test validation, ClickUp continuously ran into issues and unexplainable mysteries with its A/B testing tool. ClickUp validates its tests in a few ways. The first is AA tests.

The second is checking whether the traffic matches between the analytics and testing tools. ClickUp would often lose a percentage of users in the tool without knowing why.

“Was it decay failing? Or other conditions? It was a constant mystery why a certain percentage of traffic was lost. This would slow down our testing velocity because now, we had to wait for longer to have the needed sample size.”

Olga Osadcha
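
The traffic cross-check Olga describes is often formalized as a sample ratio mismatch (SRM) check: compare the per-variant user counts the tool reports against the allocation you configured (or against your analytics counts) and flag gaps too large to be chance. A minimal sketch in Python, using illustrative numbers rather than ClickUp's actual data:

```python
# Sample ratio mismatch (SRM) check: does the observed variant split plausibly
# match the configured allocation? A tiny p-value means users are being lost or
# misallocated somewhere between assignment and logging.
from scipy.stats import chisquare

def srm_check(observed_counts, expected_ratios, alpha=0.001):
    """observed_counts: users recorded per variant (e.g. from the testing tool).
    expected_ratios: the allocation you configured (e.g. [0.5, 0.5])."""
    total = sum(observed_counts)
    expected = [r * total for r in expected_ratios]
    stat, p_value = chisquare(f_obs=observed_counts, f_exp=expected)
    # True => likely SRM; investigate before trusting any results from this test.
    return p_value, p_value < alpha

# Illustrative numbers only: a 50/50 test where the tool reports noticeably
# fewer users in one arm than the configured split would predict.
p, is_srm = srm_check(observed_counts=[50_400, 48_100], expected_ratios=[0.5, 0.5])
print(f"p-value={p:.4g}, sample ratio mismatch={is_srm}")
```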

The third validation was to run several AA tests simultaneously and ensure the splits were orthogonal:

  • One control and one treatment group in test 1. 
  • One control and one treatment group in test 2.
  • Then split the combination of both equally.

For a long time, ClickUp’s experimentation team ran this type of validation through their tool without realizing that whoever was bucketed into a treatment group in one test would always end up in the treatment group of the second one. At the same time, the control was a shared holdout group across all of these tests.
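
That failure mode typically appears when every concurrent experiment reuses the same randomization instead of salting the assignment hash with an experiment identifier. A minimal Python sketch of the difference, with illustrative hashing logic and IDs rather than the actual tool's internals:

```python
# If every experiment buckets users with the same hash (no per-experiment salt),
# a user's bucket in test 1 fully determines their bucket in test 2. Salting the
# hash with the experiment ID restores independent (orthogonal) splits.
import hashlib

def bucket(user_id: str, experiment_id: str, salted: bool = True) -> str:
    key = f"{experiment_id}:{user_id}" if salted else user_id
    h = int(hashlib.sha256(key.encode()).hexdigest(), 16)
    return "treatment" if h % 100 < 50 else "control"

users = [f"user-{i}" for i in range(10_000)]

for salted in (False, True):
    treat_1 = sum(bucket(u, "exp-1", salted) == "treatment" for u in users)
    overlap = sum(
        bucket(u, "exp-1", salted) == "treatment" and bucket(u, "exp-2", salted) == "treatment"
        for u in users
    )
    print(f"salted={salted}: {overlap}/{treat_1} of exp-1's treatment users are also in exp-2's treatment")
    # Unsalted: 100% overlap, i.e. the correlated assignment described above.
    # Salted:   roughly 50% overlap, i.e. the two splits are independent.
```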

ClickUp wasn’t able to see this happening. The tool simply wasn’t configured to run several concurrent tests correctly. Despite this, its teams relied on the tool and its data to plan the next experiments and generate new ideas.

“When we were analyzing the results, interpreting the data, and coming up with the next steps and validation… we constantly wondered… Does this actually work? Or is this just noise? Bad data?”

Olga Osadcha

The Solution

Speero helped ClickUp solve all of this by decoupling the entire experimentation program from the tool, allowing ClickUp to establish the program from the ground up.

This way, ClickUp developed a good habit of launching tests and interpreting them in a way that made sense. Ultimately, they were able to actually trust the results.

With the program decoupled from the original tool, ClickUp brought in a different tool that worked better for the team and its setup. The new tool also let them automate the manual processes, which wasn’t possible before the decoupling.

“Even though we have a tool we can trust now, the CRO program itself doesn’t rely solely on that tool and the tool doesn’t dictate what we do. Basically, what we have now is a robust foundation and the new tool helped us speed up and be more efficient.”

Olga Osadcha

ClickUp now loves its new tool. Why? They trust it. They aren’t losing traffic allocated to experiments. Even when they expected the new tool to fail… it didn’t.

“We ran pretty extensive tests with it, and it worked well in situations where the prior tool would absolutely fail.”

Olga Osadcha

The benefits go beyond the tool. These days, all of their data lives in a warehouse rather than in the tool, so they can analyze the results on both sides. A lot less guesswork. A lot fewer reruns and manual tests.
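
With assignments and outcomes in a warehouse, results can be recomputed independently of whatever the testing tool reports. A minimal sketch of that warehouse-side cross-check, using hypothetical tables, column names, and toy data rather than ClickUp's actual schema:

```python
# Recompute experiment results straight from warehouse data, independent of the
# testing tool's own reporting. Table and column names here are illustrative.
import pandas as pd

# In practice these frames would come from warehouse queries, e.g.
# SELECT user_id, variant FROM experiment_assignments WHERE experiment_id = 'exp-1'.
assignments = pd.DataFrame({
    "user_id": [1, 2, 3, 4, 5, 6],
    "variant": ["control", "treatment", "control", "treatment", "control", "treatment"],
})
conversions = pd.DataFrame({"user_id": [2, 4, 5]})  # users who converted

joined = assignments.merge(conversions.assign(converted=1), on="user_id", how="left")
joined["converted"] = joined["converted"].fillna(0)

summary = joined.groupby("variant")["converted"].agg(users="count", conversions="sum")
summary["conversion_rate"] = summary["conversions"] / summary["users"]
print(summary)
# Comparing this warehouse-side summary with the tool's dashboard is the
# "analyze the results on both sides" cross-check described above.
```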

“I had a great experience working with Speero. We had a healthy cadence of check-ins and continuous progress. We had a good machine where we were communicating and making progress weekly.”

Olga Osadcha

The Results

  • A tool-agnostic experimentation program, configured correctly.
  • Automated processes instead of manual ones, with far fewer reruns and manual tests.
  • Traffic isn’t lost in the tool or during validation.
  • Results they can trust.

Key Takeaways

1

Validate the data and results coming out of the tool.

Cross-check with the backend, run AA tests, and make sure your WHOLE team trusts the data.

2

Create consistency in the program.

How? With rituals and systems that always iterate forward, toward progressive improvement. Helpful blueprints: Strategic testing roadmap / Results vs Actions / Goal of testing program.

3

Adapt quickly if needed.

If you have constant issues with a testing tool, don’t be afraid to switch it out. Helpful blueprints: XOS toolstack / Client v Server side testing.

Let’s kick things off.
Tell us what you’re looking for
