What If Your Optimization Team Could Answer Their Own Data Questions?
We built an Optimizely Opal custom tool that gives your testing team direct access to analytics data, and here's why it matters more than how we did it.
Last week, I was in Denver at an Optimizely event at VF Headquarters, surrounded by optimization practitioners, brand leaders, and, as it turned out, Optimizely's Chief Product Officer. A lot of the conversations kept circling the same theme, just framed in different ways:
"We know what we want to test, but we can't get the data to prioritize."
"Our analysts are buried. I need a simple number and it takes three days."
"The testing team lives in Optimizely. The data lives in Adobe. And there's a wall between them."
And these aren't new complaints. But what struck me was who was saying them: not junior practitioners, but directors and VPs running programs at scale. The bottleneck between analytics data and optimization decisions is a structural problem, and it's costing these teams velocity they can't afford to lose.
During the event, a conversation about Opal custom tools sparked an idea we'd been circling for a while. For the clients we have who use Optimizely for testing and Adobe for analytics, what if we just removed the wall?
The Bottleneck Few People Talk About
Here's a stat that should bother every analytics leader: among the brands we've studied, analysts spend between half and two-thirds of their time on ad-hoc data requests. Not building models. Not uncovering insights. Not designing measurement frameworks. Pulling numbers for other people.
And most of those requests are one-time, single-use. Someone asks, "What's the conversion rate on this page?" An analyst opens Adobe Workspace, builds a freeform table, applies the right segment, checks the date range, exports it, and sends it over. Fifteen minutes of work for a number that gets glanced at once and never referenced again.
Multiply that across every test idea, every executive question, every "can you just pull..." request, and you start to see the real cost. It's not just the analyst's time; it's the decisions that don't get made while they're busy answering the easy questions.
Gartner's research backs this up: fewer than half of data and analytics teams effectively provide value to their organizations. Not because the people aren't talented, but because the structure forces talented people to do low-leverage work.
For optimization teams, this bottleneck is especially painful because every hypothesis begins with a question about current performance. If answering that question takes days instead of minutes, your entire testing pipeline slows down.
Adobe Knows This. They Built a Solution. It's Not for You.
At Summit 2025, Adobe announced the Data Insights Agent, a conversational AI interface where you ask business questions in plain English and get instant visualizations. It's exactly the right idea. Ask a question, get an answer, skip the Workspace learning curve.
There's one problem: it's only available in Customer Journey Analytics.
If you're on traditional Adobe Analytics, which is still the majority of Adobe's installed base, you don't get it. No natural language querying. No AI-assisted analysis. No conversational interface. You get Workspace, and you get the queue of people waiting for someone who knows how to use it.
This isn't meant as a criticism of Adobe's strategy. CJA is where they're investing. But it leaves a gap for the thousands of organizations that are years away from a CJA migration or may never make one.
Opal Can Fill That Gap Today
If you're an Optimizely customer, you already have access to a platform that can bridge this divide. Opal's custom tools framework lets you connect the AI layer to any external system with an API. And Adobe Analytics has an API, a pretty good one actually.
The Adobe Analytics Reporting API 2.0 can do almost anything Workspace can. Pull metrics and dimensions. Apply segments. Break down data by any combination of variables. It can even enumerate its own available dimensions and metrics, which means an AI tool can learn what questions are possible to ask before it asks them.
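To make that concrete, here is a rough sketch of what a single Reporting API 2.0 query looks like: one metric, broken down by one dimension, over a date range. This is not Adobe's official SDK, and the company ID, report suite ID, and segment ID below are hypothetical placeholders; the payload shape follows the documented `/reports` endpoint.

```python
import json

# Hypothetical identifiers for illustration only; real values come from
# your Adobe company configuration, report suite, and segment library.
COMPANY_ID = "examplecoid"
RSID = "examplersid"

def build_report_request(metric_id, dimension_id, start, end, segment_id=None):
    """Build a Reporting API 2.0 /reports payload: one metric broken down
    by one dimension over a date range, optionally filtered by a segment."""
    payload = {
        "rsid": RSID,
        "globalFilters": [
            {"type": "dateRange", "dateRange": f"{start}/{end}"},
        ],
        "metricContainer": {
            "metrics": [{"columnId": "0", "id": metric_id}],
        },
        "dimension": dimension_id,
    }
    if segment_id:
        payload["globalFilters"].append(
            {"type": "segment", "segmentId": segment_id}
        )
    return payload

# "Visits by mobile device type for June," filtered by a hypothetical segment.
req = build_report_request(
    "metrics/visits",
    "variables/mobiledevicetype",
    "2025-06-01T00:00:00.000",
    "2025-07-01T00:00:00.000",
    segment_id="s300000000_example",
)
# POST this payload to https://analytics.adobe.io/api/{COMPANY_ID}/reports
# with an OAuth bearer token and x-api-key header.
print(json.dumps(req, indent=2))
```

The companion endpoints that list available dimensions, metrics, and segments are what let a tool learn this vocabulary for your specific report suite before it ever constructs a query.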
We connected these two things. We built an Opal custom tool that translates natural language questions into Adobe Analytics API calls and returns conversational answers. No Workspace. No analyst queue. No context switching.
An optimization manager can type, "What's the conversion rate for mobile visitors on the checkout page this month?"
And get back an answer right inside the platform they already use every day.
This Changes Three Things
1. Speed to insight becomes speed to test
When the optimization team can self-serve basic analytics, the ideation cycle compresses dramatically. You go from "I think this page underperforms" to "this page converts at 2.3% versus the 4.1% category average and here's a hypothesis" in minutes, not days.
That acceleration compounds. Faster ideation means more tests in the pipeline. More tests means faster learning. Faster learning means better results.
2. Analysts get to do analyst work
This is the part that gets misunderstood. Democratizing data access isn't about replacing analysts with automation or AI, far from it. Real data democratization frees our brilliant analysts to do the higher-value work they're capable of but often can't get to, because they're buried answering one-off requests for metrics.
When 60% of an analyst's time is freed from "can you pull this?" requests, that time goes to the work that actually moves the business forward: attribution modeling, segmentation strategy, experimentation design, anomaly detection. The hard problems. The ones that require judgment, context, and creativity, the things AI can support but can't replace.
The best analytics teams I've worked with are the ones where analysts spend their time on the highest-leverage problems. Self-service tooling is a big part of how you get there.
3. Your data quality becomes visible
This is the unexpected benefit, and it connects directly to something I wrote about last year in "AI as a Forcing Function for Organizational Maturity."
When only analysts interact with your data, they quietly work around the inconsistencies. They know that "purchase" and "Purchase" and "purchase_complete" all mean the same thing. They know which segments are reliable and which ones have edge cases. They've memorized the tribal knowledge.
When you open that data up to a broader team through a conversational interface, the mess becomes visible. People ask questions that surface naming inconsistencies, undocumented segments, and metrics that don't mean what everyone thinks they mean.
That sounds like a problem. It's actually a gift. It creates the organizational pressure to fix what analysts have been working around for years: to document, standardize, and govern the data layer properly. Because now the tools require it.
What This Looks Like in Practice
I'm deliberately not turning this into a tutorial; we've published step-by-step guides for building Opal custom tools if you want to go that route. What matters here is the pattern, not the plumbing.
The tool works like this:
1. A user asks a question in natural language inside Opal
2. Opal routes the question to our custom tool
3. The tool interprets the question, maps it to available Adobe Analytics dimensions and metrics, and constructs an API request
4. The Adobe Analytics Reporting API returns the data
5. The tool formats the response into a conversational answer and sends it back to Opal
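Steps 3 and 5 are where the interesting work happens, so here is a deliberately tiny sketch of them. The alias tables are toy stand-ins for what a real tool would build by enumerating the API's available dimensions and metrics, and the IDs are illustrative, not guaranteed Adobe identifiers.

```python
# Toy vocabulary; a real tool would populate this by calling the Reporting
# API's own dimension/metric listing endpoints. IDs are illustrative.
METRIC_ALIASES = {
    "conversion rate": "metrics/conversionrate",
    "visits": "metrics/visits",
    "revenue": "metrics/revenue",
}
DIMENSION_ALIASES = {
    "page": "variables/page",
    "device": "variables/mobiledevicetype",
}

def interpret(question):
    """Step 3: map a natural-language question onto known metric and
    dimension IDs. Naive substring matching stands in for the LLM layer."""
    q = question.lower()
    metric = next((mid for name, mid in METRIC_ALIASES.items() if name in q), None)
    dimension = next((did for name, did in DIMENSION_ALIASES.items() if name in q), None)
    return {"metric": metric, "dimension": dimension}

def format_answer(question, rows):
    """Step 5: turn API rows of (dimension value, metric value) pairs
    into a short conversational reply."""
    top = max(rows, key=lambda r: r[1])
    return f'For "{question}", the top result is {top[0]} at {top[1]:,.1f}.'

intent = interpret("What were visits by device last month?")
reply = format_answer(
    "visits by device",
    [("Mobile Phone", 51234.0), ("Tablet", 8120.0)],
)
```

In production, the interpretation step is handled by the model behind Opal rather than string matching; the point is that the tool's job reduces to vocabulary mapping on the way in and plain-language formatting on the way out.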
The entire round trip takes seconds. The user never leaves Optimizely. The analyst never gets interrupted.
We've been running this internally and with select clients. "I can just *ask*?"
Yes. You can just ask.
The Bigger Picture
What we built is one tool connecting two platforms. But the pattern is bigger than that.
The organizations that will lead over the next few years are the ones that connect their tools into a system where data flows to wherever decisions are being made without requiring a human intermediary for every question.
Opal's custom tools framework makes this possible today for anything with an API, your CDP, your data warehouse, your CRM, your analytics platform. The Adobe Analytics integration is a starting point, not an endpoint.
Industry forecasts suggest that by the end of the year, 40% of analytics queries will be made using natural language. The shift is already happening. Will your team build the muscle now or scramble to catch up later?
Where to Go From Here
If you're an Optimizely customer using Adobe Analytics, you're sitting on an opportunity most organizations haven't realized exists. The tools are available. The APIs are mature. The gap between your optimization team and your analytics data can be closed today, not someday.
Want to build your own Opal Agent? Start with our technical guides:
- Building Your First Optimizely Opal Custom Tool
- Building Custom Optimizely Opal Tools: A Complete Guide
Want to explore what's possible for your program? Book a strategy call and we'll walk through what a connected optimization-analytics workflow looks like for your stack.
Jason Thompson is CEO of 33 Sticks, a strategic analytics partner for organizations navigating complexity. For two decades, he's worked at the intersection of data and judgment, and he still gets unreasonably excited about a well-structured API response.