Overcoming maddening measurement constraints

To measure design in meaningful ways, we first have to overcome some very real constraints. Measuring design’s impact isn’t as simple as flipping a switch (though some tools get you pretty close) and watching data roll in. There needs to be a desire at all levels to include these measurements in the organization’s decision-making fabric—and to keep them there.

You might think, “Oh, we’ve got Mixpanel set up, it’s got this stuff.” Really? All the events are there? All the tables are set up to highlight those specific measures? If so, bravo. But more likely, you still need input from others to make that magically appear and work. This is about appetite, instrumentation, and having the right resources to make sense of the data. And as anyone who’s tried to measure design knows, these constraints can be vast.
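To make “all the events are there” concrete, here’s a minimal sketch of the kind of funnel check that instrumentation buys you. The event names and the sample log are hypothetical; in practice you’d pull event data from your analytics tool’s export API rather than hand-build it.

```python
from collections import Counter

# Hypothetical funnel steps -- yours will differ.
FUNNEL_STEPS = ["Signup Started", "Email Verified", "Signup Completed"]

def funnel_report(events):
    """Count each funnel step and the step-to-step conversion rate."""
    counts = Counter(e["name"] for e in events)
    report, prev = [], None
    for step in FUNNEL_STEPS:
        n = counts.get(step, 0)
        rate = (n / prev) if prev else 1.0
        report.append((step, n, round(rate, 2)))
        prev = n or 1  # guard against a step that was never instrumented
    return report

# Hypothetical event log: 100 starts, 40 verifications, 25 completions.
sample = ([{"name": "Signup Started"}] * 100
          + [{"name": "Email Verified"}] * 40
          + [{"name": "Signup Completed"}] * 25)
print(funnel_report(sample))
```

Even this crude report tells you *where* the leak is. Whether your tool can produce it depends entirely on whether those events were instrumented in the first place—which is the whole point.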

Is there appetite?

Measuring design’s impact requires a true interest from senior leadership in what these insights bring to the table. I’d settle for a ‘lukewarm’ interest if it got some measurements going. It’s not enough to tack on a few metrics or run the occasional survey. Leadership needs to invite design into the landscape of measures that define the organization’s health and success.

We’re talking about:

  • Behavior change in core audiences.

  • The validity of solutions and directional insight toward new ones.

  • Connecting the dots between what’s happening and why it’s happening.

These metrics need to be taken as seriously as those from product, finance, or operations. Design metrics inform future innovation and improvement, shape roadmaps, drive priorities, add evidence, and dispel nonsense. They strengthen the connection between the organization and its audiences.

But appetite isn’t just about leadership; it’s also about everyone else who has to prioritize these measures over their usual work. For many teams, especially those outside of design, it might feel like extra work or even a distraction. In organizations where silos are strong, this can become a power struggle. Design needs to be seen as a collaborative force working toward the bigger picture, but convincing other teams to make room for design measurement often means breaking down deeply ingrained silos—or engaging in some “horse trading” of priorities.

Then, there’s the philosophical aspect. Design brings a blend of qualitative and quantitative data, a “whole picture” view that examines both the what and the why. Many leaders, however, are more comfortable with purely quantitative data—or as I like to call it, half-picture data.

I’m not saying it’s not important to know whether something happened. But if that’s as far as your knowledge goes, is that enough to act? Shifting these deeply embedded perspectives can take time, persistence, and a fair amount of relationship-building—arguably the greatest tool in the design toolbox—to bring design’s voice into the data conversation.

A story of constrained appetite

Let’s make this real: Imagine a signup flow with a significant drop-off between key steps. The product team notices the numbers and flags them as a concern. Marketing chimes in, noting conversion discrepancies. Executives demand answers. But when asked why users are dropping off, the guessing begins.

There’s no built-in mechanism to ask users why they’re leaving. So, teams debate whose theory makes the most sense, launching A/B tests, building new features, and spending resources—all while flying blind. This is incredibly risky (it’s also pretty dumb). You burn more resources guessing than you would conducting one qualitative exercise that directly asks, “Why are you leaving?” Hotjar’s exit-intent surveys, for example.

Not only could that single question provide clarity on the drop-off, but it might also surface insights about the broader end-to-end experience. Without that data, you’re left with conjecture. With it, you’re equipped to take meaningful action. (Imagine me, or maybe yourself, shaking the hell out of marketers, PMs, and executives as I explain this.)
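Here’s a minimal sketch of what that single question yields, assuming hypothetical exit-survey responses that have been coded into short reasons. The point is how little analysis it takes to put the “why” next to the “what”:

```python
from collections import Counter

def top_exit_reasons(responses, limit=3):
    """Tally coded exit-survey reasons, most common first."""
    return Counter(responses).most_common(limit)

# Hypothetical coded responses from an exit-intent survey.
responses = [
    "price unclear", "price unclear", "needed SSO",
    "form too long", "price unclear", "form too long",
]
print(top_exit_reasons(responses))
# [('price unclear', 3), ('form too long', 2), ('needed SSO', 1)]
```

A tally like this, set beside the drop-off numbers, turns a debate over competing theories into a prioritized list of things to fix.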

If we can build an appetite for design’s role in a data-driven organization, then we can start having the same conversations as other teams—like product, marketing, and ops. Until then, it’s an uphill battle.

The instrumentation challenge

Then we have instrumentation, which is an even trickier challenge. As I mentioned in the first article, design teams rarely control the actual buttons and levers. We don’t launch the code or finalize the product. And yet, to measure design’s impact, we need data and tool instrumentation—data that only teams like product, marketing, or engineering can implement.

This reality creates an immediate constraint: if we don’t control the necessary instrumentation, how do we get measurements in place? More often than not, this requires “campaigning” with teams, persuading them to prioritize design-related instrumentation among their many other initiatives.

Depending on the company’s stage and data maturity, you might find yourself in a whirlwind of reactive data fixes and firefights just to get the bare minimum instrumentation in place. If the organization is still debating who owns what in terms of data and metrics, this becomes a tug-of-war. (I like to play “Responsibility” by MxPx during these moments—“Responsibility, what’s that?”)

The constraint of data synthesis and literacy

Collecting data is only part of the equation; someone has to make sense of it. And not all designers are equipped to synthesize data into clear, actionable insights. Many designers find themselves expected to interpret data from a world that’s highly quantitative, which isn’t always their strength. This creates a data literacy challenge, where expectations to “own the numbers” can feel daunting.

Moreover, design teams often lack dedicated analytical support. Marketing teams might have analysts who specialize in furnishing insights that guide strategy and tactics. Design, however, is frequently left to share those resources—or go without entirely. Without proper analytical support, design teams face a significant disadvantage in proving their value through patterns and insights.


So, to review: tactics for overcoming these common constraints

If you’re wrestling with these unfortunately common constraints, here are some practical steps you can take:

    • Educate leadership on design’s strategic role by connecting metrics to business outcomes like revenue, retention, and innovation.

    • Collaborate on shared goals with product, marketing, and others, showing how design metrics support their priorities.

    • Pilot small wins that showcase the power of combining qualitative and quantitative insights to inform decisions.

    • Tell compelling stories with real-world examples of how design has solved problems and saved resources.

    • Align with product/engineering roadmaps to prioritize data instrumentation for critical design measures.

    • Form cross-functional data teams to coordinate efforts and share ownership of measurement goals.

    • Start small by focusing on one high-impact area for instrumentation before scaling.

    • Leverage existing tools like Mixpanel, Amplitude, or Hotjar to jumpstart data collection.

    • Invest in upskilling designers through training in data tools, analysis techniques, and storytelling for insights.

    • Partner with analysts in marketing or product teams to co-analyze data and share insights.

    • Create digestible reports with dashboards or visuals that clearly tie design metrics to business outcomes.

    • Advocate for a dedicated analyst or shared resource to support the design team’s data needs.

    • Celebrate small wins by sharing successes broadly and building momentum for future efforts.

    • Leverage champions within leadership or other teams to amplify the importance of measuring design.

    • Document and share impact consistently to demonstrate how design directly supports organizational goals.


Next up: The Risks of Not Measuring Design

These constraints—appetite, instrumentation, and data synthesis—set the stage for a number of risks. What happens when an organization can’t or won’t measure design? In the next article, we’ll dive into the risks, including what’s at stake for design teams when their contributions are left unmeasured and, ultimately, undervalued.


So, strap in. We’re just getting started.
