04 / Design Leadership
December 2025
Thoughts Around Running a Design Function
UX impact is often invisible to the business. The reasons vary, but one recurring pattern is this: businesses tend to operate with a Chicago School, economics-driven mindset in which decisions are grounded in measurable, spreadsheet-based outcomes. Design impact, however, is harder to quantify in those terms, which often makes its value less visible despite its significance.
This piece outlines practical ways to connect design impact to business success and approach IC management with clarity.
When UX is not tracked, its success is reduced to a gut feeling.
This is not a criticism of designers. Most designers care deeply about impact. But caring about impact and instrumenting for it are entirely different skills, and the second one is almost never taught. The result is that design teams consistently underreport their own value, leadership fills the gap with intuition, and resourcing decisions get made on vibes instead of evidence.
In practice, this can be summarized in three areas that make a design function genuinely strategic: tracking UX outcomes in ways that are not fully dependent on product efforts, leading individual contributors in a way that gives them real ownership, and knowing when to zoom in on problems without becoming a micromanager. These may sound like separate problems, but in the end they all shape the effectiveness of UX across the organization.
Part one: Making UX legible to the business
Intuition alone is insufficient for validating design impact, and the answer is not to become a data scientist. Building the minimum instrumentation that makes design decisions traceable to business outcomes already goes a long way.
For designers, tools like Jira or Linear are essential: not only for showcasing impact, but for end-to-end process tracking and a coherent way of working with engineering. A ticket is not just a task; it is a record of a decision and its outcome. Designers who keep their work invisible to the engineering workflow will always struggle to demonstrate value, because that value exists nowhere others can point to.
The data stack that actually matters for a design team is not complicated. It is three things: behavioral data from session recording tools (Datadog RUM, FullStory, Hotjar; pick one and use it consistently), event tracking instrumented at the right granularity, and qualitative synthesis layered on top to explain the numbers. The trap most teams fall into is collecting behavioral data without ever explaining it, or producing qualitative insight that has no quantitative anchoring. Both are incomplete arguments.
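The stack above can be sketched as a tiny join of a quantitative anchor and its qualitative explanation. Everything below is hypothetical for illustration: the event names, users, and session note are not any real tool's schema, just the minimum shape of "a number plus the reason behind it":

```python
from collections import Counter

# Hypothetical event log: one dict per tracked user action. The event
# names ("checkout_opened", "checkout_completed") are invented examples.
events = [
    {"user": "u1", "event": "checkout_opened"},
    {"user": "u1", "event": "checkout_completed"},
    {"user": "u2", "event": "checkout_opened"},
    {"user": "u3", "event": "checkout_opened"},
    {"user": "u3", "event": "checkout_completed"},
]

# Qualitative synthesis layered on top: a session-review note keyed to the
# same funnel step, so the number and its explanation travel together.
notes = {
    "checkout_opened -> checkout_completed":
        "u2 abandoned after the address form errored (session replay review).",
}

counts = Counter(e["event"] for e in events)
opened = counts["checkout_opened"]
completed = counts["checkout_completed"]
conversion = completed / opened  # the quantitative anchor

print(f"checkout conversion: {conversion:.0%}")
print("why:", notes["checkout_opened -> checkout_completed"])
```

The point of keeping both in one artifact is the trap named above: a conversion rate with no explanation, or a session note with no rate attached, is an incomplete argument either way.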
Three inputs, one synthesis layer, one clear output: the minimum viable UX tracking stack
The three metrics that actually matter
Part two: The designer's project journey and what guardrailing actually means
Micromanagement is a red flag. But it is not a single behavior; it is a spectrum of involvement and actions. The word gets thrown around so often that it has lost diagnostic value. Calling something micromanagement without asking what caused the manager to zoom in is like treating a symptom without looking for the underlying condition.
Across many teams, leaders often "zoom in" not from a desire for control, but because they detect a project breakdown that is already signaling risk. The solution is not to stop managing closely altogether. The solution is to treat close oversight as a diagnostic signal, then address the underlying issue directly instead of managing around it.
This intentional zoom-in is guardrailing. Here is the protocol.
A designer's project journey, oscillating within guard rails, with intentional zoom-in moments marking where the manager engages
A designer's project journey is not linear. It oscillates. It loops back. It has moments of clarity and moments of genuine confusion. The guard rails define the space within which that oscillation is healthy and expected. The zoom-in points mark the moments when the oscillation has gone beyond what the designer can self-correct, and when the manager's job is to engage, not to observe.
Three principles of intentional zoom-in
Part three: The two variables that define every designer-manager relationship
Once you accept that guardrailing is the right frame, two variables determine how it plays out in practice. They are simple to name and genuinely difficult to calibrate.
Variable A is the width of the guard rails, and it is set by the expertise, strength, resilience, and accountability of the designer. The wider the guard rails a manager can set, the less involvement is required, because the designer has the range to self-correct. The "micromanagement" problem appears when the manager gets involved while the designer is still moving forward on their own. Not every oscillation is a warning sign. Most are simply part of the work.
- Make goals clear: ambiguity is the enemy of autonomy
- Right actions × time = trust; consistency builds the relationship
- Higher ratio of right actions = experience → promotion path
Variable B is the number of zoom-in points during a project, and it is a simple performance metric for both the designer and the manager. For the designer: are the right tools and knowledge in place to move forward? Are there enough answers to the fundamental questions? For the manager: were goals made clear? Was the right accountability medium established? Is the guard rail set at the correct width for this designer?
- Follow up at both the start and resolution of every problem
- Identify the reason for zooming in: was it a designer gap or a management gap?
- Every individual has their own way of moving forward. Be flexible.
Variable B is the more revealing metric. A high zoom-in count on a project with a senior designer is almost never about the designer: it signals that the brief was unclear, expectations were not set, or the environment created friction the designer could not resolve independently. A high zoom-in count on a project with a junior designer might be exactly the right amount of engagement. The variable is not good or bad in isolation. It is meaningful only in relation to Variable A.
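The calibration can be made concrete with a small decision sketch. The width labels, expected zoom-in counts, and readings below are invented thresholds, not a prescription; the only claim carried over from the text is that the same count reads differently depending on the rail width:

```python
# A sketch of reading Variable B (zoom-in count) relative to Variable A
# (guard-rail width, a rough proxy for the designer's range).
# All thresholds and messages are illustrative assumptions.
def read_zoom_ins(guardrail_width: str, zoom_ins: int) -> str:
    """Interpret a project's zoom-in count given the rail width set for the designer."""
    expected = {"narrow": 6, "medium": 3, "wide": 1}[guardrail_width]
    if zoom_ins <= expected:
        return "within range: normal oscillation"
    if guardrail_width == "wide":
        # Wide rails imply a senior designer, so excess zoom-ins point
        # at the environment, not the person.
        return "investigate the environment: unclear brief or missing accountability medium"
    return "investigate the designer gap: tools, knowledge, or fundamentals"

print(read_zoom_ins("wide", 4))
print(read_zoom_ins("narrow", 4))
```

The asymmetry in the two branches is the whole point: four zoom-ins inside narrow rails is normal engagement, while the same four inside wide rails is a signal worth diagnosing.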
Putting it together: the hands-on manager who isn't in the way
The phrase "hands-on manager" has been corrupted by association with control. In practice, the strongest hands-on managers are deeply context-aware without being directive. They know the state of the work because they looked at it, not because they demanded a status update. They know where a designer is struggling because they asked a strong question at the right moment, not because they reviewed every deliverable in real time.
This is not a passive leadership style. It demands more situational awareness, not less. It requires you to have a strong enough read on each designer's Variable A to know whether a given oscillation is normal or a warning sign. That read is only possible if you have invested time in understanding each person individually: their learning style, their relationship with ambiguity, their tells when they are stuck but too proud to say so.
The compound effect: when tracking and leadership reinforce each other
The reason these three threads (data tracking, guardrailing, and individual calibration) belong in the same piece is that they compound. A team that is well led produces work that is more consistent and more trackable. A team whose work is tracked produces evidence that reinforces good decisions and surfaces problems early, before they become manager-level interventions. And a manager who understands the data is better positioned to set the right guard rail width for each designer, because they know what the work actually produced, not what they hope it produced.
The design leader who has all three running is not doing more work than one who has none of them. They are doing different work: the kind that makes their team's value legible, their designers' growth visible, and their own management decisions less reactive and more grounded.
That is the underlying logic of adaptive leadership in design: not a methodology, not a framework, but a set of habits and calibrations that, taken together, create the conditions for designers to do their best work and for that work to matter where it matters most.
The point is not to eliminate friction, but to make it meaningful. Track UX outcomes so value is visible, set guard rails so ownership stays with the designer, and calibrate support to each person's range. When those three systems work together, designers grow faster, managers intervene with purpose, and the business can finally see what UX is actually changing.

