Using U-Haul’s Growth Report to Spark Better Analytical Thinking

There’s a moment in every analyst’s career when they stop just reading data and start questioning it.

We’re not talking about being cynical or skeptical for the sake of it. We’re talking about curiosity-driven, clarity-seeking, insight-focused critical thinking.

Take U-Haul’s 2024 “Growth States Report.” It ranks U.S. states based on one-way U-Haul rentals, suggesting where people are moving to and from. At a glance, it’s a tidy list with intuitive appeal. But scratch the surface, and it becomes a perfect teaching tool, not because the data is “bad,” but because it invites better questions.

So let’s use this dataset not as something to critique but as something to practice on.

Step 1: What is this data actually measuring?

It’s easy to mistake a dataset for what we hope it tells us. In this case, U-Haul’s report is widely interpreted as reflecting migration trends.

But pause and ask:

  • What behavior does the dataset directly measure? One-way U-Haul truck rentals.

  • What behavior are people inferring from it? Population movement and migration trends.

Gap identified. As analysts, we should always ask: What does this data directly capture, and what assumptions are we layering on top?

Step 2: Who is missing from the data?

Every dataset tells a story, but only from the perspective of those included.

In this case:

  • What about people who used professional movers?

  • What about higher-income families who didn’t rent trucks?

  • What about those who used competitors like Penske or Budget?

This brings up one of the most powerful analyst questions:

Whose story isn’t being told here?

Whether you're analyzing ecommerce trends, employee engagement, or customer satisfaction, this question often reveals the biggest blind spots.

Step 3: What external context would change how we interpret this?

Raw percentages can be misleading. For example, “51.7% of moves to South Carolina were inbound” sounds impressive until you realize it might be based on a relatively tiny number of moves.

So we ask:

  • What’s the total volume?

  • How big is the base we're comparing to?

  • Are these differences statistically meaningful?

Now translate that to your work:

  • Are we seeing a true trend or noise in a small sample?

  • Are results being driven by how the data was collected or by real change in the population?

The goal is not just to see the numbers but to understand their weight.
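
To see that weight concretely, here's a minimal Python sketch using made-up move volumes (the report doesn't publish them) to show how the same 51.7% inbound share carries very different weight depending on how many rentals sit behind it. The volumes and the normal-approximation interval are illustrative assumptions, not figures from U-Haul.

```python
import math

def inbound_share_ci(inbound_moves, total_moves, z=1.96):
    """Normal-approximation 95% confidence interval for an inbound share."""
    p = inbound_moves / total_moves
    margin = z * math.sqrt(p * (1 - p) / total_moves)
    return p - margin, p + margin

# Hypothetical volumes -- the report doesn't publish these numbers.
for total in (500, 50_000):
    inbound = round(0.517 * total)
    low, high = inbound_share_ci(inbound, total)
    print(f"{inbound}/{total} inbound -> ~51.7% "
          f"(95% CI roughly {low:.1%} to {high:.1%})")
```

With a few hundred moves, the interval easily spans 50%, so "majority inbound" may be noise; with tens of thousands, the same share is much harder to dismiss.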

Step 4: Could business mechanics be distorting the data?

This one’s subtle but important.

U-Haul sets prices dynamically based on supply and demand. If there's a truck shortage in Colorado, they might restrict outbound rentals or raise prices. That influences behavior.

So ask:

Are business operations influencing the behaviors we’re analyzing?

This matters when you're looking at things like product usage data (was a feature unpopular or just buried?) or sales metrics (did pricing experiments nudge behavior more than the product itself?).


Step 5: What questions were never asked but should’ve been?

This is where analysts shift from interpreters to advisors.

We can take a dataset and say, “Here’s what it says.”
Or we can say, “Here’s what it doesn’t say and what we’d need to ask to learn more.”

Some questions we might ask in the U-Haul example:

  • How do these trends look when we normalize by population?

  • How have rental volumes changed year over year?

  • How do other migration datasets (e.g., USPS address changes, census data) compare?

  • Are there geographic or seasonal constraints we’re ignoring?

Each of these turns raw data into usable insight.
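
As one example, the first question above, per-capita normalization, is easy to sketch. The move counts and populations below are hypothetical placeholders, not numbers from the report; the point is the shape of the calculation.

```python
# Hypothetical inputs -- neither the move counts nor the rankings
# here come from the U-Haul report; they only illustrate the mechanics.
net_inbound_moves = {"South Carolina": 1_200, "Texas": 9_500, "Vermont": 300}
population = {"South Carolina": 5_373_000, "Texas": 30_029_000, "Vermont": 647_000}

# Normalize: net inbound one-way rentals per 100,000 residents.
per_capita = {
    state: moves / population[state] * 100_000
    for state, moves in net_inbound_moves.items()
}

# Rank by raw counts vs. per-capita rate -- the order can change completely.
print(sorted(net_inbound_moves, key=net_inbound_moves.get, reverse=True))
print(sorted(per_capita, key=per_capita.get, reverse=True))
```

A state that looks mid-pack on raw volume can lead on a per-capita basis, which is often the more honest comparison.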

Final Thought: Every Dataset Is a Doorway, But You Still Have to Walk Through

The U-Haul Growth States Report isn’t “bad data.” It’s incomplete data, like nearly every dataset you’ll ever work with.

The difference between good and great analysts is in how they show up at that doorway. Do they:

  • Accept it at face value?

  • Critique it out of habit?

  • Or step through with curiosity, asking what’s behind it, beyond it, and left out of frame?

At 33 Sticks, we believe the most valuable insights aren’t always inside the data; they’re in the questions you’re willing to ask about it.

Want to Practice This With Your Own Data?

Next time you’re handed a dashboard, a spreadsheet, or a stakeholder report, ask:

  • What is this really measuring?

  • What assumptions are baked in?

  • Who isn’t included?

  • What might be distorting the data?

  • What other data would help complete the picture?

Answering these doesn’t just make your analysis better; it makes your insights trusted.

Jason Thompson

Jason Thompson is the CEO and co-founder of 33 Sticks, a boutique analytics company focused on helping businesses make human-centered decisions through data. He regularly speaks on topics related to data literacy and ethical analytics practices and is the co-author of the analytics children’s book ‘A is for Analytics.’

https://www.hippieceolife.com/