
How I Learned to Measure UX (the Hard Way)

Measuring Impact: How to Assess UX Engineering Outcomes
Last updated on Jan 19th, 2025

There was a time in my career when we'd roll out a shiny new interface and just… hope for the best. We'd have standups where someone might say, "It feels better," or "I think this flow is smoother," and we'd all nod like that meant something.

I remember one project in particular where we'd redesigned a key section of the app. It looked better, no doubt. We had cleaner visuals, fewer steps. We shipped it, high-fived, and waited for praise to roll in. A few days later, the product manager burst in with a fresh batch of analytics and a concerned look on their face. "So… did this actually help users? Because our support tickets just went up."

That was the moment it hit me: we didn't know how to measure what success looked like.

The Mistakes That Taught Me the Most

The first thing I learned? Measuring UX outcomes isn't about having answers—it's about knowing what questions to ask before you start changing things. And I hadn't been doing that.

At one point, we introduced a new onboarding flow that we thought was super intuitive. Turns out, it was just confusing in a new and different way. We were tracking time spent on the onboarding page and saw that people were staying longer, so we assumed that was good. Nope. They were lost.

Eventually we realized: maybe "time on page" isn't always the badge of honor we think it is. Maybe people staying longer means they're struggling, not engaged. So we started asking a different question: How fast can a new user get through this, and do they stick around afterward?
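To make that concrete, here's a rough sketch of the kind of calculation we ended up leaning on instead of raw time on page. The event names, the `OnboardingEvent` shape, and the seven-day retention window are all hypothetical stand-ins for whatever your analytics pipeline actually emits, but the idea is the same: measure how long it takes a new user to finish onboarding, and whether they come back afterward.

```typescript
// Hypothetical event shape; swap in whatever your analytics pipeline emits.
interface OnboardingEvent {
  userId: string;
  name: "onboarding_started" | "onboarding_completed" | "session_started";
  timestamp: number; // Unix ms
}

const SEVEN_DAYS_MS = 7 * 24 * 60 * 60 * 1000;

// Median minutes from starting onboarding to finishing it.
function medianTimeToComplete(events: OnboardingEvent[]): number | null {
  const starts = new Map<string, number>();
  const durations: number[] = [];

  for (const e of events) {
    if (e.name === "onboarding_started") {
      starts.set(e.userId, e.timestamp);
    } else if (e.name === "onboarding_completed" && starts.has(e.userId)) {
      durations.push((e.timestamp - starts.get(e.userId)!) / 60000);
    }
  }

  if (durations.length === 0) return null;
  durations.sort((a, b) => a - b);
  return durations[Math.floor(durations.length / 2)];
}

// Share of users who completed onboarding and showed up again within a week.
function sevenDayRetention(events: OnboardingEvent[]): number {
  const completedAt = new Map<string, number>();
  const retained = new Set<string>();

  for (const e of events) {
    if (e.name === "onboarding_completed") completedAt.set(e.userId, e.timestamp);
  }
  for (const e of events) {
    const done = completedAt.get(e.userId);
    if (
      e.name === "session_started" &&
      done !== undefined &&
      e.timestamp > done &&
      e.timestamp - done <= SEVEN_DAYS_MS
    ) {
      retained.add(e.userId);
    }
  }

  return completedAt.size === 0 ? 0 : retained.size / completedAt.size;
}
```

Neither number is glamorous, but together they answer the question we actually cared about: did people get through, and did they come back?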

The Awkward Reality of "User Success"

Another time, we launched a feature that got a lot of clicks. I mean, a lot. We thought it was a hit. But once we started gathering actual feedback, we found out most users didn't even know what the feature was supposed to do. They kept clicking it trying to figure it out. Oof.

That was a big lesson: just because a feature is popular doesn't mean it's useful.

We started combining usage data with small feedback prompts—just simple, honest questions: Was this helpful? Did this work the way you expected? Hearing from users in their own words changed everything. It stopped us from declaring victory too early.
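The prompts themselves don't need to be fancy. Here's a minimal sketch of the kind of thing you could show right after someone uses a feature, assuming a hypothetical `trackEvent` function that sends the answer to the same pipeline as the click data so the two can be analyzed together.

```typescript
// Hypothetical analytics call; stands in for whatever your pipeline uses.
function trackEvent(name: string, props: Record<string, string | boolean>): void {
  // In a real app this would POST to your analytics endpoint.
  console.log("track", name, props);
}

// Ask one plain question right after a feature is used, and log the answer
// next to the usage event so clicks and sentiment can be joined later.
function askWasThisHelpful(featureId: string): void {
  const helpful = window.confirm("Was this helpful?"); // stand-in for a real UI prompt
  trackEvent("feature_feedback", {
    featureId,
    question: "was_this_helpful",
    helpful,
  });
}

// Usage: call it once the feature's main action finishes.
// askWasThisHelpful("bulk-export");
```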

Speaking in Metrics Without Sounding Like a Spreadsheet

Once we got better at collecting meaningful data, we had to learn how to actually talk about it—without putting people to sleep.

Telling stakeholders "bounce rate dropped by 15%" meant almost nothing. But when we said, "More people are sticking around because the new layout gives them more confidence in the product," that landed. Even better when we added something like, "We're seeing a 15% increase in signups because of it."

The more human we made the story, the more people cared. Because it's not about numbers for numbers' sake—it's about what those numbers mean for real users. Like, how much easier it is for a parent to log a daily report in a childcare app, or how many fewer steps a small business owner takes to get insured.

The Goalposts Move—So Should Your Metrics

We also learned not to get too cozy with any one metric. There was a time we obsessed over page load speed. We were convinced it was the biggest hurdle to user happiness. And it was—for a while.

Then, once we got performance where it needed to be, we started seeing support tickets shift from "this is slow" to "I don't understand what to do next." Suddenly, comprehension was the issue, not speed. So we changed what we measured.
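If you want to catch that kind of shift before it smacks you in the face, even something crude works: bucket support tickets by theme and watch which bucket grows. The `Ticket` shape and keyword lists below are made up for illustration; tune them to your own support language.

```typescript
interface Ticket {
  id: string;
  text: string;
}

// Very rough theme buckets; adjust the keywords to match real tickets.
const THEMES: Record<string, string[]> = {
  performance: ["slow", "loading", "lag", "timeout"],
  comprehension: ["confused", "don't understand", "how do i", "where is"],
};

function countByTheme(tickets: Ticket[]): Record<string, number> {
  const counts: Record<string, number> = { performance: 0, comprehension: 0, other: 0 };

  for (const ticket of tickets) {
    const text = ticket.text.toLowerCase();
    const theme = Object.keys(THEMES).find((t) =>
      THEMES[t].some((keyword) => text.includes(keyword))
    );
    counts[theme ?? "other"]++;
  }

  return counts;
}

// Example: a month of tickets might go from
// { performance: 40, comprehension: 12, other: 8 } to
// { performance: 6, comprehension: 35, other: 9 } after a performance push.
```

When the comprehension bucket overtakes the performance one, that's your cue to retire the old yardstick.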

That flexibility—knowing when to let go of the old yardstick and pick up a new one—is just part of the game now.

What I Know Now (That I Didn't Back Then)

I don't have a perfect system for measuring UX impact. I've definitely shipped features without clear metrics. I've misunderstood what the data was telling me. I've made assumptions that didn't hold up.

But over time, I've learned to slow down and ask better questions before diving into a redesign. I've learned to pair analytics with actual human feedback. And I've learned that the real win isn't just making something look better—it's making it work better, in ways we can see and explain.

UX Engineering isn't just about clean code and smooth animations. It's about outcomes. It's about impact. And the only way to know if we're doing it right is to measure, listen, and keep learning—even after launch.