A conversation with Speero CTO Silver Ringvee and Kameleoon’s Chief Product Officer Frederic De Todaro
One of the biggest drivers of success in customer experience optimization is how well the program leverages data from its entire martech stack. So when should you build your own custom integrations vs. using those provided by the tools themselves?
Our CTO, Silver Ringvee, and Kameleoon’s Frederic De Todaro chatted about the value of custom vs. plug and play integrations, aiming to find out when companies should consider one over the other.
The bottom line: most experimentation programs can rely on plug and play integrations, but more advanced or sophisticated testing often requires custom integrations.
The good news is that the future seems bright for optimization-friendly integrations and solutions.
The debate: custom vs. plug and play integrations
Silver
Hi Frederic, congrats on the Must-have integrations piece. You bucked convention and bucketed integrations into three groups. Would you do it again?
Frederic
Yes. Most large companies who are serious about their experimentation program need something beyond Google Analytics and Google Optimize. I am not saying they need to go out and buy the most advanced solutions, but they should have data analytics, UX/DX analytics, and the “make your life easier” group of tool integrations in place.
You’re the one who explained one-size fits all integrations aren’t always helpful. Are you still willing to defend that position? Surely, you wouldn’t advise a business just getting started with A/B testing to go out and build something custom?
Silver
No. For those just starting out, the plug and play option can work. But despite this option being the easiest to implement, we found that only 23% of businesses surveyed in our 2021 Experimentation Benchmark Report strongly agreed that they had their analytics tools integrated with other marketing and advertising tools.
And 50% of respondents disagreed completely with the statement. Given those low percentages, it appears that even though plug and play integrations are the easier option, most businesses aren’t using them.
We also tend to work with relatively mature test-and-learn programs, which often need more customized solutions.
Because we’re helping businesses create long-term growth and profit, we need to be able to measure the long-term impact of an experiment. This includes metrics like customer lifetime value, retention, and churn. To accomplish this, you’ll need a custom solution that integrates data warehouses, data lakes, Customer Data Platforms (CDPs), Customer Relationship Management (CRM) systems, and other backend technologies.
We find that as businesses’ experimentation programs mature, they run into limitations with their plug and play integrations. For example, you might need to change the format in which the data is transferred or stored (e.g. custom dimensions vs. events). That’s when I begin to recommend custom JavaScript, a data layer, or API solutions over plug and play integrations.
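To make the data-layer approach concrete, here is a minimal sketch of pushing an experiment event into a GTM-style `dataLayer` array. The event and property names are illustrative assumptions, not any specific tool’s API:

```javascript
// Minimal sketch of a GTM-style data layer push for an experiment event.
// Event and property names here are illustrative, not a specific vendor's API.
const dataLayer = globalThis.dataLayer || (globalThis.dataLayer = []);

function trackExperiment(experimentId, variantId) {
  // Push a structured event instead of relying on a fixed custom dimension,
  // so downstream tools (analytics, CDP, warehouse) can consume one payload.
  dataLayer.push({
    event: "experiment_viewed",
    experiment_id: experimentId,
    variant_id: variantId,
    timestamp: Date.now(),
  });
}

trackExperiment("homepage-hero-test", "variant-b");
```

Because you own the event shape, switching a destination from custom dimensions to events becomes a mapping change rather than a re-instrumentation.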
Frederic
Sure, but in many cases I still believe it makes complete sense to have plug and play integrations, even for more mature programs.
A good example of this is our native Google Analytics (GA) bridge. Not enough custom dimensions? No problem, you can switch to Events at the click of a button. Only want to send one call per session? Do it. Running multiple Google Analytics accounts on the same website? No problem either, and so on.
To me, our Google Analytics integration is a perfect example of what a native bridge should do and be: flexible enough to make your customers' lives easier and work with all kinds of custom setups.
It’s not that I’m not a believer in custom integrations built on public APIs. It’s just that I don’t think they’re as helpful as most experimentation and personalization programs believe. As your survey highlights, most programs haven’t scratched the surface of what they can do with plug and play integrations.
Silver
Yeah, a custom integration should only be considered if there’s a specific use case you can’t solve with existing options, since custom builds come with overhead and maintenance costs.
But almost every business we work with has metrics or logic it wants to define that are specific to its business, which plug and play integrations can’t handle. That means you end up with data that doesn’t work for the business situation, the use case, or the questions the business wants to answer.
This can lead to situations where data in different tools doesn’t add up. Quite often, though, this doesn’t mean there’s something wrong with the setup itself; rather, the tools use different logic to calculate certain metrics. This can cause people to distrust the data, and the whole experimentation program can be called into question.
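A toy illustration of that mismatch: the same raw events produce two different "conversion rates" depending on whether a tool counts by session or by user. The data below is invented for illustration:

```javascript
// Same raw events, two common-but-different conversion-rate definitions.
// The events are made up purely to show why numbers diverge across tools.
const events = [
  { user: "u1", session: "s1", converted: false },
  { user: "u1", session: "s2", converted: true },
  { user: "u2", session: "s3", converted: false },
];

const uniq = (key) => new Set(events.map((e) => e[key])).size;
const convertedBy = (key) =>
  new Set(events.filter((e) => e.converted).map((e) => e[key])).size;

// Tool A counts converting sessions over total sessions: 1 / 3
const sessionRate = convertedBy("session") / uniq("session");
// Tool B counts converting users over total users: 1 / 2
const userRate = convertedBy("user") / uniq("user");
```

Neither tool is broken; they simply answer different questions, which is exactly why teams stop trusting dashboards that disagree.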
I’d also really like to see companies being able to own their data more, as opposed to working with black-box systems and aggregated metrics.
Frederic
I agree. That’s why we offer public APIs for creating custom integrations with developer tools like Git and Visual Studio. Developers know their martech stack and what they need to do, and they have the skills to build what they need.
We also have a Custom Data Value Bridge and APIs to make it simple for developers to build their own custom integrations.
Silver
Looks good. But there are still a few things to keep in mind when using standard API integrations. Standard data integrations generally let you:
- Create, read, update, and delete records (CRUD).
- Get a snapshot of the data at a specific point in time, in a specific format.
If you need real-time data or data in a different format, you’ll often need a developer to build a custom integration.
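As a sketch of what that snapshot-style access typically looks like, here is a hypothetical REST pull. The endpoint path and response fields are assumptions for illustration, not a real vendor’s API:

```javascript
// Hedged sketch of a "standard" integration: a CRUD-style snapshot pull.
// Endpoint and fields are hypothetical; fetchImpl is injectable for testing.
async function fetchExperimentSnapshot(baseUrl, experimentId, fetchImpl = fetch) {
  const res = await fetchImpl(`${baseUrl}/experiments/${experimentId}`);
  if (!res.ok) throw new Error(`HTTP ${res.status}`);
  // What you get back is a snapshot at call time, in the vendor's fixed
  // format; streaming updates or reshaped data need custom work on top.
  return res.json();
}
```

The point is the shape of the contract: one request, one point-in-time payload, one format. Anything beyond that is where the custom-integration effort starts.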
It’s also worth bearing in mind that for many use cases, integrations should send data into a warehouse-type solution. Otherwise, these plug and play integrations can keep creating data silos rather than breaking them down, because they can’t always integrate with other business systems. That’s fine if you only have a basic setup and a small off-the-shelf tool stack, but otherwise you’ll likely be adding fuel to the data silo fire.
Frederic
In my mind, with public APIs you’re really only limited by your resources and imagination. I like the Zapier model, as it lets the developer community build (and share) their own integrations.
Combined with automation APIs, for example, you can set and act on triggers when a given event occurs. That could be something as basic as sending a Slack notification when an experiment starts or finishes, or alerting a team member about a KPI trend.
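A sketch of that Slack trigger, assuming a standard Slack incoming-webhook endpoint; the event shape and field names are invented for illustration:

```javascript
// Sketch of a "notify Slack when an experiment finishes" trigger.
// The event shape is an assumption; Slack incoming webhooks accept a
// JSON body with a "text" field posted to the webhook URL.
function buildSlackNotification(event) {
  return {
    text: `Experiment "${event.name}" ${event.status}. ` +
          `Winner: ${event.winner ?? "none yet"}`,
  };
}

async function notifySlack(webhookUrl, event, fetchImpl = fetch) {
  await fetchImpl(webhookUrl, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(buildSlackNotification(event)),
  });
}
```

Keeping the message-building separate from the HTTP call makes the trigger easy to test and to repoint at another channel later.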
I saw a use case where a company wanted to launch experiments when it saw, via its custom analytics solution, a spike in traffic captured due to a TV ad airing. In this case, the analytics solution could link to Zapier and then to Kameleoon by using the APIs of both tools.
Silver
Agreed, the above model provides less tech-heavy options, compared to fully custom solutions that require managing some sort of infrastructure (Extract, Transform, Load (ETL), data pipeline, etc.). But most plug-and-play solutions don’t go this far. We’re still in the phase where having some flexibility is considered a good thing.
I'm a strong believer that there's no point collecting a load of data if you can't use it. If that means you need custom integrations to make it useful, you shouldn't hold back.
If you’d like to learn more about custom vs. plug and play integrations or how to get the most out of your martech stack when it comes to A/B testing, check out our data analytics services.