Benson
January 15, 2026 · Essay

The thing about measuring UX

Quantitative signals are easy to collect and hard to interpret. Qualitative signals are the opposite. Neither is optional.

There is a version of this argument that gets made constantly in design circles: that metrics reduce the richness of human experience to a number, and that by chasing the number you lose the thing. This is not wrong, exactly. But it is incomplete.

The fuller version is that quantitative signals are easy to collect and genuinely hard to interpret, while qualitative signals are hard to collect and, once you have them, far easier to interpret. You know why someone dropped out of a flow when you watched them do it. You do not know why when all you see is a 34% abandonment rate.

What I have come to believe is that both are necessary not because they complement each other in some abstract sense, but because they answer different questions. Analytics tells you that something is happening. Research tells you why. Conflating the two, or treating one as a substitute for the other, is where organizations get into trouble.

The practical problem is that analytics is cheap and research is expensive. So the temptation is always to push analytics further than it can go: to infer intent from behavior, to treat correlation as explanation. This is where dashboards become dangerous. They create the feeling of understanding without the substance of it.

I don’t have a clean solution. What I do think is that the most important skill in this space is not knowing how to run a query or design a study, but knowing which question you are actually trying to answer before you choose your method.