Why We Created "You Are The Analyst" — And What It Teaches About Real-World Analytics
There's a particular kind of confidence that comes with being a junior analyst. You've mastered the tools. You can write beautiful SQL queries. Your dashboards are clean, your calculations correct. You present your findings with certainty because, well, the numbers don't lie.
Except when they do.
i was that analyst once. Overly confident, moving fast, eager to demonstrate competence. i'd build an analysis, verify the math was right, and present it to leadership without pausing to consider: What am i missing? What assumptions am i making? What questions should i be asking that i'm not?
More importantly, i didn't think to find collaborators who could poke holes in my thinking before it went public. i worked in isolation, treating analysis as a solo performance rather than a collaborative investigation. The result? i presented flawed conclusions with complete confidence, learning only later, sometimes painfully so, where my thinking had gone wrong.
That experience planted a seed that eventually grew into You Are The Analyst, a weekly game where we present realistic business scenarios filled with analytical errors and invite the community to spot the problems. But this isn't just about catching mistakes in someone else's work. It's about developing the instincts, questions, and professional courage that separate competent analysts from trusted advisors.
The Wisdom of Not Knowing
Here's what we've observed watching experienced analysts over the years: They don't know, and that's exactly why they're so good.
When a seasoned analyst reviews a dashboard or sits in a meeting where confident conclusions are being drawn, they're not thinking "yes, that's right" or "no, that's wrong." They're thinking:
How was this calculated?
What's the source for these numbers?
What time period are we looking at, and why?
What are we not seeing in this view?
What assumptions would have to be true for this conclusion to hold?
They have questions. Lots of them. Not because they're skeptical by nature, but because experience has taught them how many ways analysis can mislead — often unintentionally.
The inexperienced analyst presents with certainty. The experienced analyst presents with carefully calibrated uncertainty: "Here's what I'm confident about, here's what needs more investigation, and here's the risk if we proceed with current information."
This shift, from knowing to questioning, doesn't happen through reading textbooks or taking courses. It happens through repetition: encountering scenarios where the obvious answer was wrong, where clean data was too clean, where attribution models created false narratives, where tight timelines led to costly mistakes.
The problem? Those lessons usually come through painful real-world failures. You present flawed analysis. Leadership makes a bad decision based on it. You spend months rebuilding credibility.
What if we could compress that learning curve? What if analysts could practice encountering flawed scenarios, asking the right questions, and making judgment calls in an environment where the stakes are low but the learning is real?
That's why You Are The Analyst exists.
How the Game Works (And Why It's Not Really a Game)
Every Friday, we post a realistic business scenario on LinkedIn. It might be a post-campaign performance report, a quarterly business review, a pricing analysis, or dashboard results.
The scenario includes:
Context: Industry, company type, what's at stake
Data or claims: Numbers, charts, executive summaries
A decision point: Budget allocation, strategy recommendation, board presentation
And embedded within that scenario are 5-8 analytical errors:
Math that doesn't reconcile
Calculations that look right but aren't
Attribution models that inflate performance
Missing context that changes conclusions
Survey methodology that introduces bias
Statistical comparisons that mislead
Assumptions treated as facts
We invite the community to play analyst: What's wrong with this analysis? What questions would you ask before signing off?
Then something remarkable happens.
The Multiplier Effect of Community Learning
During last week’s game, which featured a cruise line's promotional campaign analysis with a suspicious 745% ROAS and channel attribution that didn't add up, over a dozen people participated. But they didn't just submit answers like students taking a quiz. They investigated together.
Early players caught the obvious errors: "Wait, these channel totals don't match the overall bookings." That established safety: others are finding issues too, so I'm not being overly critical.
Mid-thread, players built on each other's observations: "And if the total is wrong, then the paid social attribution percentage is inflated..." The collaborative investigation deepened.
Later players brought strategic concerns: "Even if the numbers were right, are we considering that these discount customers may never return at full price?" Someone who didn't catch a single math error raised the most important business question.
This is the multiplier effect that makes the community format essential.
When you read a case study alone, you're limited to your own perspective. You might catch 3 of the 8 errors and assume you've done well. You move on.
When you investigate with a community:
You learn from approaches you wouldn't have thought of: "I focused on the math, but they questioned the data source itself. I should add that to my checklist"
You feel safe asking questions: "Those rounded numbers feel estimated, not measured" gets validated by three other analysts who had the same instinct
You see expertise modeled in real-time: A senior analyst doesn't just say "this is wrong"; they articulate why it matters and what data they'd request to verify
You build confidence through collective validation: When multiple people spot the same red flag you noticed, you internalize: My skepticism isn't paranoia. It's competence.
i'm not the only teacher in this game. Every player who shares their thinking is teaching something: a question they ask, a pattern they recognize, a way they'd communicate the concern to leadership. The learning compounds.
What Players Actually Learn (Beyond Error-Spotting)
On the surface, You Are The Analyst teaches people to spot flawed data. But that's not what makes it valuable. The real learning happens in three deeper dimensions:
1. Pattern Recognition That Accelerates Judgment
In last week’s scenario, several players noticed something subtle: The clicks (186,750) exactly matched the landing page visitors (186,750). Not 186,742 and 186,755. Exactly the same.
"This looks too clean," one game player wrote. "In my experience, these numbers never match perfectly. Are we sure these are measured independently?"
That's pattern recognition, the instinct that perfect alignment might indicate a data collection issue, not a perfectly efficient campaign. It's the kind of "spidey sense" that experienced analysts develop over years of encountering real-world messiness.
Here's what's powerful about this: by explicitly calling out these patterns, we make tacit knowledge teachable. New analysts don't have to wait five years to develop that instinct. They see it modeled, practice recognizing it, and add it to their mental library of red flags.
Other patterns the game reinforces (a minimal sanity-check sketch follows the list):
Suspiciously round numbers often mean estimation, not measurement
Attribution percentages that sum beyond 100% reveal multi-touch counting issues
Conversion rates that "feel high" deserve calculation verification
Survey results presented without methodology should trigger questions about bias
"40% higher than average" claims require knowing: average of what? over what period?
2. The Soft Skills No One Teaches Explicitly
One of our scenarios involved a CMO asking for sign-off on a campaign analysis by 9am the next morning, despite handing it over at 6pm the night before. The analyst (the player) notices multiple errors but faces professional pressure: the CMO seems confident, the board meeting is imminent, and pushing back might feel like "being difficult."
Multiple players responded with professional boundary-setting language:
"I won't sign off on analysis I haven't properly validated"
"This timeline doesn't allow for thorough review"
"We need dashboard access and 48 hours, or we flag this as preliminary"
"This is not my emergency"
These players weren't learning to calculate conversion rates. They were practicing something more important: how to protect your professional integrity when facing pressure to move faster than rigor allows.
This is the hidden curriculum of You Are The Analyst:
Risk articulation: Connecting analytical errors to business consequences ("If the board approves $2.5M based on this wrong attribution...")
Executive communication: Prioritizing which issues to lead with when you only have 10 minutes
Collaborative investigation: Knowing when to loop in colleagues before finalizing
Calibrated skepticism: Distinguishing between healthy questioning and obstructionist cynicism
These are the skills that determine whether you become a trusted advisor or remain an order-taker. And they're almost never taught in analytics courses.
3. Understanding Cascading Consequences
Here's what makes the game's errors meaningful: they connect.
In that cruise campaign scenario:
The channel breakdown totals were wrong (5,420 bookings listed, but only 3,420 actual)
Which meant the paid social attribution was inflated (84% vs. actual 53%)
Which led to wrong conclusions about channel performance
Which informed a flawed recommendation to scale paid social in Q1
Which could result in $2.5M budget misallocation
Players learned to trace the domino effect. It's not just that the math is wrong. It's that wrong math compounds into wrong strategy, which leads to real financial consequences.
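To make the cascade concrete, here's the arithmetic with one hypothetical per-channel figure. Only the totals (5,420 vs. 3,420) and the shares (84% vs. 53%) come from the scenario; the paid social count is an illustrative number that reconciles them:

```python
# Illustrative reconstruction; only the totals and shares appear in the scenario.
paid_social_claimed = 2_873   # hypothetical bookings the report credits to paid social
channel_sum = 5_420           # all channel claims added together (double-counted)
actual_bookings = 3_420       # what the booking system actually recorded

reported_share = paid_social_claimed / actual_bookings  # 2,873 / 3,420 ≈ 84%
credit_share = paid_social_claimed / channel_sum        # 2,873 / 5,420 ≈ 53%

print(f"reported: {reported_share:.0%} vs. share of claimed credit: {credit_share:.0%}")
```

One unreconciled total, and a $2.5M recommendation quietly inherits a 31-point attribution error.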
One player commented, "I used to think my job was just getting the numbers right. Now I realize my job is preventing leadership from making million-dollar decisions based on faulty logic."
That's the shift we're trying to create.
Why This Resonates with the Analytics Community
You Are The Analyst isn't solving a new problem. Every analyst has experienced:
Being handed flawed analysis and asked to "just sign off"
Spotting errors that others missed or ignored
The uncomfortable moment of telling someone senior their data is wrong
Pressure to work faster than thoroughness allows
What the game does is validate that these situations aren't unique to you. They're endemic to the field. And more importantly, there are better and worse ways to navigate them.
When an analyst writes "I've been in exactly this situation at work" and others reply with their own versions, something shifts. The isolation breaks. You realize:
You're not paranoid for questioning clean data. You're experienced.
You're not being difficult by asking for more time. You're being responsible.
You're not alone in valuing rigor over speed. There's a community that shares your standards.
The game creates belonging through shared pattern recognition. And that belonging gives analysts permission to trust their instincts, ask hard questions, and push back on pressure to compromise quality.
What Organizations Should Learn From This
If your analysts aren't regularly questioning data, pushing back on tight timelines, or flagging methodological concerns, you might not have highly skilled analysts. You might have a culture problem.
The behaviors You Are The Analyst rewards (asking "how do you know?", verifying calculations before presenting, requesting time for proper validation) are the same behaviors that prevent costly mistakes in real organizations.
But many workplaces implicitly discourage these behaviors:
"Just give me the number" (discourages questioning)
"I need this by end of day" (prevents thorough review)
"The CEO wants to see growth" (creates pressure to spin data)
The game demonstrates what good analytical rigor looks like in practice. It shows that experienced analysts don't just execute requests. Instead, they:
Validate sources before trusting them
Verify calculations even when they "look right"
Ask about methodology, not just results
Connect tactical errors to strategic consequences
Communicate uncertainty alongside findings
Protect stakeholders from making decisions on insufficient data
If you want analysts who catch million-dollar mistakes before they reach leadership, you need a culture that rewards this kind of deliberate, questioning approach.
The game can't fix your culture. But it can help your analysts practice the behaviors you should be encouraging.
The Paradox of Learning Through Play
There's something counterintuitive about using a "game" to teach serious professional skills. Shouldn't analytical rigor require serious formats like certifications, structured courses, or technical training?
Maybe. But here's what we've learned: play creates psychological safety that enables deeper learning.
When the stakes are low, experimentation is high. Players will try unconventional observations, ask "dumb" questions, or admit confusion: all things they might not risk in a real work setting where their competence is being evaluated.
When failure is expected (the scenario is supposed to be flawed), risk-taking is encouraged. You're rewarded for skepticism, not penalized for it.
When the format is engaging, attention is sustained. Players spend 15-30 minutes crafting thoughtful responses to a LinkedIn post; that's time they might not invest in a mandatory training module.
The game works like a flight simulator: pilots can practice catastrophic failures without dying. Analysts can practice catching errors, pushing back on flawed timelines, and articulating concerns without the career risk of getting it wrong in real life.
What Success Actually Looks Like
We measure success not by how many errors players find, but by how their approach evolves:
Early players (first few weeks playing the game):
Focus on obvious math errors
List issues without prioritizing
Engage as "students taking a test"
Experienced players (sustained participation):
Ask probing questions before declaring errors
Connect analytical flaws to business consequences
Engage as "consultants advising a client"
Reference earlier games: "This reminds me of that attribution issue from a few weeks ago..."
The ultimate success indicator? When a player messages us something like, "I used this framework at work this week and caught a similar issue before it reached leadership."
That's when we know the pattern recognition has transferred from game to practice.
The Bigger Vision: Cultivating Thoughtful Analysts
You Are The Analyst is part of a larger belief about what analytics should be.
At 33 Sticks, we reject the model of "analysts as order-takers": people who execute requests without question, generate reports on demand, and optimize for speed over accuracy. That's not analysis. That's data retrieval.
Real analysis requires:
Curiosity: Not accepting conclusions at face value, even when they come from leadership
Judgment: Knowing what's "good enough to act on" vs. "needs more rigor"
Integrity: Refusing to put your name on work you haven't validated
Communication: Articulating uncertainty, risk, and limitations alongside findings
Courage: Pushing back when timelines, data quality, or methodology are insufficient
These aren't just nice-to-have soft skills. They're what separate analysts who add value from analysts who add noise.
The game exists to make these qualities teachable. To show that objectivity isn't a mindset you adopt once; it's a series of micro-behaviors you practice repeatedly until they become habit:
Checking calculations instead of trusting them
Asking "how do you know?" instead of accepting claims
Investigating suspiciously clean data instead of celebrating it
Pushing back on timelines that prevent validation
Every scenario is an opportunity to practice these behaviors. Every community discussion reinforces them. Over time, they become instinct.
A Personal Reflection
i started this post talking about my younger self, the junior analyst who was overly confident, worked in isolation, and presented findings without adequate scrutiny.
If that analyst had access to You Are The Analyst, here's what might have been different:
i would have seen experienced analysts model the questions i wasn't asking: Where did this data come from? What's the calculation behind this metric? What are we not seeing?
i would have practiced articulating uncertainty in a safe environment, learning that admitting "i need more time to verify this" is a sign of competence, not weakness.
i would have found collaborators who could reality-check my thinking before it went to leadership, turning analysis from a solo performance into a team investigation.
Most importantly, i would have learned that the wisdom of experienced analysts isn't that they know more; it's that they question more.
That's the shift we're trying to create, one scenario at a time.
An Invitation
If you're an analyst who's ever felt that nagging doubt when reviewing data that "looked right" but felt wrong, then this game is for you.
If you've ever wished you could practice pushing back on flawed analysis without career consequences, then this game is for you.
If you're trying to develop junior analysts but struggling to transfer the tacit knowledge of "what to watch for," then this game is for you.
We're not trying to trick you. We're trying to train the instinct that makes you pause and ask one more question. The instinct that catches mistakes before they become million-dollar decisions. The instinct that transforms competent analysts into trusted advisors.
Every week brings a new scenario. Every scenario is an opportunity to practice the wisdom of not knowing.
The game continues every Friday. Will you play along?
Follow 33 Sticks CEO, jason thompson, on LinkedIn to join the investigation or visit 33sticks.com to learn more about our approach to human-centered analytics.