
Cisco’s Experimentation Journey: From “Outsourced” to “Center of Excellence”

I’m the Web Experience Manager for Cisco in EMEAR, and for the past three and a half years I’ve been working to build our conversion optimization capabilities as a business.

Here’s our story of bringing experimentation into a $51 billion company with 71,000 employees spread across 96 locations worldwide.

Shifting our focus from tools to users

As with most businesses before they become data-driven, we had a blind obsession with certain tools that we believed would solve all our customer experience woes.

One example: we introduced a new marketing automation tool to scale up the number of forms and landing pages our marketing team could create in different languages. But having the tool almost became the end goal in itself, rather than a means of improving the user experience.

It wasn’t until we ran user interviews and usability testing that we realized our users had serious issues with these forms. Within the form templates, we included only a few bullet points explaining the product and the form’s purpose, along with an image.

Using behavioral analytics, we discovered 12% of users were clicking on the non-clickable image—a behavior indicative of searching for additional information or frustration. And in the research, users rightly told us they didn’t want to input their details because they didn’t have enough context about what the form was for. 
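For illustration, here’s a minimal sketch of how dead clicks like this might be captured in the browser. The event name and collection endpoint are hypothetical placeholders, not our actual setup:

```typescript
// Minimal dead-click tracker: reports clicks on images that have no
// interactive ancestor, i.e. "non-clickable" from the user's point of view.
// The event name and /analytics/events endpoint are hypothetical.
document.addEventListener("click", (event) => {
  const target = event.target as HTMLElement;
  if (target.tagName === "IMG" && !target.closest("a, button")) {
    navigator.sendBeacon(
      "/analytics/events", // hypothetical collection endpoint
      JSON.stringify({
        event: "dead_click", // hypothetical event name
        src: target.getAttribute("src"),
        page: window.location.pathname,
        timestamp: Date.now(),
      })
    );
  }
});
```

Counting dead clicks against total image views is what surfaces a figure like the 12% above.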


While we as a business thought these forms were great, really scalable, and going to help us generate more inquiries, we started to realize our customers felt very differently. 

So we tested it. We added a video and written testimonials about the product, as well as a sticky form that moved with the user down the page. We also created more contrast between the page content and the form. The variant led to a 42% uplift in form submissions and improved secondary metrics as well.
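To show the kind of check that sits behind a result like this, here’s a rough two-proportion z-test sketch. The visitor and conversion counts are invented to produce a 42% relative uplift; they are not our actual test numbers:

```typescript
// Two-proportion z-test for an A/B test on form submissions.
// All counts below are illustrative, not real test data.
function zTest(convA: number, nA: number, convB: number, nB: number): number {
  const pA = convA / nA;
  const pB = convB / nB;
  const pPooled = (convA + convB) / (nA + nB);
  const se = Math.sqrt(pPooled * (1 - pPooled) * (1 / nA + 1 / nB));
  return (pB - pA) / se; // |z| > 1.96 is significant at the 95% level
}

// Hypothetical: 5,000 visitors per arm, 4.0% -> 5.68% conversion,
// i.e. the same 42% relative uplift.
const z = zTest(200, 5000, 284, 5000);
console.log(`z = ${z.toFixed(2)}`); // ~3.91, comfortably significant
```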

Before and after: the original Cisco form page and the winning variant.

We realized we shouldn’t do things just because they’re easier or because a predefined process or tool is already in place. Instead, we needed to start by understanding our users’ wants and needs, then work out how to meet them.

We found it particularly enlightening to get in front of real users and hear from them directly or, better still, to watch them struggle to complete tasks in session replays to really understand where the issues were.

Figuring out who owns optimization 

Outsourcing CRO—a false start 

When I first joined Cisco, we were looking to appoint an agency to support us in a few areas within digital, such as social listening and SEO. As for conversion optimization, we weren’t starting from scratch; we had a few people around the company running experiments and optimizing areas of the site, but it was a long way from where we wanted to be.

It therefore made sense to outsource optimization to the same agency until we could build that muscle internally. But rather than catapulting the business forward in experimentation, the engagement became a false start.

There were various delays in kicking off the optimization activity, but we eventually started working with them a few months later. However, by this time, an internal MIO team (Management, Insights, and Optimization) had been formed and was pegged to take over the running of our optimization work. 

Creating the first internal CRO role 

This new team redeployed a web analyst to take on the role of CRO and create test pipelines for each of our journeys.

While the web analyst was strong at uncovering insights through quantitative analysis, there were still gaps in finding testing opportunities through more qualitative data sources. Unfortunately, this setup failed to gain any real testing momentum, and after trying it for a while we went back to the drawing board.

In my opinion, a conversion rate optimizer needs to be a marketer at heart. They need to understand the technical side of a website, the components that are available and appropriate for different use cases, UX, design systems, and persuasive momentum: how people travel through the website.

It's much more than just understanding the quantitative data; they need to understand the human element, too. 

Building a dedicated CRO function 

Around the same time, there were more changes to the team setup. This time, the changes affected the digital managers, who originally sat at a country level and managed their own country sites and localized content.

That structure meant digital managers were focused mostly on execution. So we brought them into the regional teams to tap into their expertise. They became more aligned with the overarching strategy and provided resources where they were needed most.

Three of these former digital managers were brought into the Web Experience team, and along with a number of specialists within the team (UX, SEO, PM), they supported the roll-out of globally standardized journeys across EMEAR and were responsible for the optimization of those journeys.

It was at this point that we hired an experienced conversion optimizer, and the fortunes of our optimization work immediately began to change. Having the right team in place meant we went from running three experiments to running 38 over a comparable nine-month period.

We also saw an increase in the ratio of successful tests, and improved both primary and secondary metrics across all of our testing activity.

The team was operating as a Center of Excellence, with cross-functional specialists coming together to identify issues, build a test pipeline for each journey, execute the tests, and create a feedback loop to bake learnings into future experiences.

Functional vs. cross-functional teams.

The journey team meetings became a hotbed of test ideas that could be validated and refined by the conversion optimizer. And we moved away from a scattergun approach to testing toward strategic test plans more closely aligned with our business priorities.

As is often the case, timing played a key role in taking the next step. Within just a few months of this cross-functional team being in place in EMEAR, there was a drive to improve how we did optimization and experimentation globally across Cisco. I led the “People” track of this project and set about discovering how optimization was done in other regions, identifying gaps and proposing a structure that allowed us to scale our testing efforts.  

Workshopping the future

With our new global directive for optimization, we set out our goals:

  1. Double the number of tests we run per year.
  2. Establish an experimentation process.

To achieve these global goals, we needed to work out a team structure that would enable us to have the right capabilities in the right places. 

Initially, we considered three models:

  1. Centralized;
  2. Decentralized;
  3. Center of Excellence.

Cisco organizational models for optimization.

After getting consensus from those working on different global and regional teams, we settled on the Center of Excellence (CoE) model that had worked so well in EMEAR.

The CoE would be federated in its structure. The idea was to have the right skills and expertise in this virtual CoE to support:

  1. Optimization on globally standardized journeys;
  2. Tests on country sites to optimize for regional and local variations.

We then workshopped all the areas we’d need to set up and run a CoE. I created presentation decks to help set the scene for what this could look like—who the players were, what they might be doing at a high level, and what the ideal CRO process could look like.

Our workshop aim was to identify the skills and expertise we needed (and when). We used the RACI framework (Responsible, Accountable, Consulted, Informed) to clarify which roles we already had and where we needed to hire or upskill. The framework also helped us agree on who would be responsible for what going forward.
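To make the framework concrete, a RACI matrix is simply a mapping from each activity to the roles that are Responsible, Accountable, Consulted, or Informed. The sketch below uses invented roles and activities, not our actual framework:

```typescript
// Toy RACI matrix: each activity assigns roles one of Responsible,
// Accountable, Consulted, or Informed. Roles and activities are invented.
type Raci = "R" | "A" | "C" | "I";

const raciMatrix: Record<string, Record<string, Raci>> = {
  "Ideate and prioritize tests": {
    "CRO specialist": "R",
    "Web Experience Manager": "A",
    "UX designer": "C",
    "Country teams": "I",
  },
  "Build and QA test variants": {
    "Developer": "R",
    "CRO specialist": "A",
    "UX designer": "C",
  },
  "Analyze and share results": {
    "Web analyst": "R",
    "CRO specialist": "A",
    "Country teams": "I",
  },
};

// Sanity check: every activity needs exactly one Accountable role.
for (const [activity, roles] of Object.entries(raciMatrix)) {
  const accountable = Object.values(roles).filter((v) => v === "A").length;
  if (accountable !== 1) {
    console.warn(`${activity}: expected 1 Accountable, found ${accountable}`);
  }
}
```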

We had 14 people across Digital, Analytics, Content, and Marketing attend the workshop and used Miro to work remotely together on the same document. 

The Cisco workshop’s RACI framework.

After the workshop, I removed duplicate suggestions and looked for commonalities to create a framework we could use. The final RACI framework below is based on the workshopped version, plus several rounds of iteration with relevant stakeholders.

Cisco final RACI.

Now we knew where we wanted to go. Next, we needed to take stock of where we were, so we could create a roadmap taking us from point A to B. 

Creating an experimentation program roadmap 

People across our business have taken Speero's Experimentation Maturity Audit, which evaluates our position in the following four areas: 

CXL experimentation audit.

The audit is helping us:

  • Benchmark our current capabilities. 
  • Get term and scope alignment. By comparing individual scores across the team, we can assess how aligned we are on language and the scope of experimentation. 
  • Find gaps. Seeing which areas are perceived as lagging behind others helps set priorities and plan the activities that take us from our current state to the desired state.
  • Direct conversations around what we prioritize. 

Example of experimentation program maturity audit findings.
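As an illustration of the alignment point above, comparing each person’s score per area helps separate genuine capability gaps from disagreement over what the terms mean. The areas and 1-5 scores below are invented:

```typescript
// Compare individual maturity-audit scores per area. A wide spread within
// one area often signals term/scope misalignment rather than a capability
// gap. Area names and scores are invented for illustration.
const auditScores: Record<string, number[]> = {
  "Strategy & culture": [3, 4, 3, 4],
  "Data & tooling": [2, 2, 3, 2],
  "Process & governance": [1, 4, 2, 5], // wide spread: align on definitions first
};

for (const [area, scores] of Object.entries(auditScores)) {
  const mean = scores.reduce((sum, s) => sum + s, 0) / scores.length;
  const spread = Math.max(...scores) - Math.min(...scores);
  console.log(`${area}: mean ${mean.toFixed(1)}, spread ${spread}`);
}
```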

Using the insights we gained from the audit, we set out the roadmap to allow us to focus on the areas that will have the biggest impact on leveling up our optimization expertise within the business. 

We’re still in the process of fleshing out the details of the roadmap, so I’m not able to share any plans at this stage, but suffice it to say that the next 12 months are going to be equal parts challenging and exciting.

The EMEAR Center of Excellence has shown us that this kind of model can work well at Cisco when underpinned by the right skills and resourcing. Creating a global optimization CoE brings more complex challenges than a regional one, but I’m confident we’re on the right path.

Conclusion

While we are still early on in our journey, we’ve made good progress. I’ve realized that not everything needs to be perfect; to make progress, you just need something to get started. 

Finding the right tools and partners can help, but an open and questioning mindset is needed. Applying such a mindset across the business takes considerable time and effort, so start small in specific departments that are closely aligned to the concept of experimentation, such as digital or marketing teams.  

Anyone who’s worked in an enterprise-scale organization will appreciate the complexities involved in introducing new ways of working, so change management will be key. Be realistic when you plan your goals and roadmap.

The tendency in large organizations is to try to boil the ocean and come up with a detailed plan upfront, which doesn’t really align with some of the key tenets of experimentation—get started, fail fast, learn insights, iterate, and optimize constantly. 

I’m incredibly excited and proud of where we are as a team and am looking forward to where we can take experimentation—and the impact it will have for our users and customers. 
