How to Measure the ROI of Design Thinking

By Lisa Helminiak

November 9, 2016

When clients ask how they should measure the return on investment (ROI) for using design thinking, my heart sinks.

Through experience I know design thinking saves time and money. It helps product teams work together and leads to better product outcomes. But measuring its impact is notoriously difficult.

How do you measure the money you don’t waste building out technology that would have failed to meet user needs? How do you measure the impact of establishing a culture focused on customers, rather than on efficiency and productivity alone?

Fortunately, new research offers frameworks to begin measuring the impact of design thinking in the context of business.

Using Multiple Perspectives to Measure the Impact of Design Thinking

In 2015, a research team associated with Stanford’s lauded d.school surveyed 403 design-thinking practitioners (most from larger, for-profit businesses). Their paper, titled Measuring the Impact of Design Thinking, affirmed that organizations continue to struggle in determining ROI. However, it also found that those most committed to the task recognized that design thinking can’t be measured as a single concept. (Köppen, Meinel, Rhinow, Schmiedgen, Spille, 2015)

The companies surveyed seemed to acknowledge a butterfly effect from design thinking, and practitioners reported attempts to track it from a variety of perspectives:

  • Customer Feedback—customer satisfaction, net promoter scores, response to specific campaigns, usability metrics, client feedback
  • Design Thinking Activities—number of projects, people trained, coaches trained
  • Quick Results—concepts finished, projects launched, projects funded, projects in development
  • Anecdotal Feedback—evaluation forms, qualitative feedback at each stage of the design thinking process, surveys
  • Traditional KPIs—increased sales, ROI per project, and other financial measures
  • Culture—team efficiency, engagement, collaboration, motivation

Linking ROI to Business Drivers for Design Thinking

Earlier this year, Bernard Roth and Adam Royalty, two central figures from the d.school, published a separate paper, titled Developing Design Thinking Metrics as a Driver of Creative Innovation. In it, they suggest ways to move beyond “execution oriented” metrics to those that would track “creative behaviors” instead.

First, they identified three main drivers leading companies to pursue design thinking:

  1. To better understand customers or end users
  2. To protect business share from disruption and startups
  3. To develop more innovative methods and team dynamics

Then they devised new metrics for each driver.

Measuring Empathy

A key tenet of design thinking is gaining empathy with customers/users to discover unmet needs. The idea is that if we better understand needs, we can design better solutions and increase revenue or save money.

Roth and Royalty suggested the following metrics for measuring a project team’s empathy with customers/users (a simple tracking sketch in code follows the list):

  • Track the number of days the team goes between observing or interviewing customers or users (with the goal of reducing time between interactions).
  • Track the number of customers or user interactions over the life of a project.
  • Track interactions back to user personas to measure the diversity of customer or user insight.
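
To make these counts concrete, here is a minimal sketch (mine, not the researchers’) of how a team might tally them from a simple log of customer/user interactions. The data shape and field names are hypothetical.

    # Hypothetical sketch: tallying the empathy metrics above from a simple
    # log of customer/user interactions. Field names are illustrative only.
    from datetime import date

    interactions = [
        {"date": date(2016, 9, 1), "persona": "New Resident"},
        {"date": date(2016, 9, 12), "persona": "Case Worker"},
        {"date": date(2016, 10, 3), "persona": "New Resident"},
    ]

    dates = sorted(i["date"] for i in interactions)

    # Days between consecutive interactions (the goal is to drive these gaps down)
    gaps = [(later - earlier).days for earlier, later in zip(dates, dates[1:])]

    # Total interactions over the life of the project
    total_interactions = len(interactions)

    # Diversity of insight: how many distinct personas have been heard from
    personas_covered = {i["persona"] for i in interactions}

    print(f"Average gap between interactions: {sum(gaps) / len(gaps):.1f} days")
    print(f"Total interactions: {total_interactions}")
    print(f"Personas covered: {len(personas_covered)}")

In practice the log could live in a research repository or a shared spreadsheet; what matters is that the three counts are cheap enough to pull on a regular cadence.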

Measuring Business Value

Another focus of design thinking is creating innovative products and services that add value to organizations.

Roth and Royalty suggested measuring the value and novelty of project outputs on a grid where the vertical axis runs from “Valuable” to “Not Valuable” and the horizontal axis runs from “Novel” to “Not Novel”.

The goal of the measurement is to understand if design thinking projects are perceived as valuable to the company and if they take the company in new directions. (The authors recommend that team members vote anonymously and that scores are averaged to determine grid placement.)
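
As a rough illustration of how that placement could be computed, here is a minimal sketch assuming each anonymous vote scores both axes on a -1.0 to 1.0 scale; the scale and data shape are my own assumptions, not the authors’.

    # Hypothetical sketch: averaging anonymous votes to place a project on the
    # value/novelty grid. The -1.0 to 1.0 scale is an assumption, not the paper's.
    votes = [
        {"value": 0.8, "novelty": 0.4},   # anonymous vote 1
        {"value": 0.5, "novelty": -0.2},  # anonymous vote 2
        {"value": 0.9, "novelty": 0.6},   # anonymous vote 3
    ]

    avg_value = sum(v["value"] for v in votes) / len(votes)
    avg_novelty = sum(v["novelty"] for v in votes) / len(votes)

    quadrant = (
        ("Valuable" if avg_value >= 0 else "Not Valuable")
        + " / "
        + ("Novel" if avg_novelty >= 0 else "Not Novel")
    )

    print(f"Grid placement: value={avg_value:.2f}, novelty={avg_novelty:.2f} ({quadrant})")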

Measuring Innovation

According to Roth and Royalty, an earlier study (Dow & Klemmer, 2011) showed that more iteration leads to stronger prototypes, and stronger prototypes lead to better products. So they proposed two ways to measure how well a team iterates on an idea (a simple counting sketch follows the list).

  • Measure the number of prototype iterations per feature. Measuring per feature is important because it allows for comparison between projects, regardless of the size of the project or feature set.
  • Measure the number of concurrent prototypes. Another study (Dow et al., 2010) suggests that developing prototypes in parallel (rather than in series) results in stronger outcomes.
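
Both counts could be tallied from a simple record of prototype work. The sketch below is a hypothetical illustration with an invented data shape, not a method from the paper.

    # Hypothetical sketch: counting prototype iterations per feature and the
    # number of prototypes developed in parallel. Data shape is illustrative only.
    from collections import Counter

    # Each record notes which feature an iteration touched and which prototype
    # "track" it belonged to (distinct tracks = concurrent prototypes).
    iterations = [
        {"feature": "search", "prototype": "A"},
        {"feature": "search", "prototype": "B"},
        {"feature": "search", "prototype": "A"},
        {"feature": "checkout", "prototype": "A"},
    ]

    per_feature = Counter(i["feature"] for i in iterations)
    avg_iterations_per_feature = sum(per_feature.values()) / len(per_feature)

    concurrent_prototypes = len({i["prototype"] for i in iterations})

    print(f"Iterations per feature: {dict(per_feature)}")
    print(f"Average iterations per feature: {avg_iterations_per_feature:.1f}")
    print(f"Concurrent prototypes: {concurrent_prototypes}")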

Getting Beyond Cost Savings to Show Broader Benefits

One of Azul Seven’s clients, a large county government, began using human-centered design thinking three years ago.

Initially, they focused on traditional measurements of the cost savings created by the more seamless digital content, systems, and infrastructure we helped them build for their eGov initiative. The savings weren’t directly attributable to the design thinking work alone, but the user-centered processes we established clearly contributed to them.

Once they could show the cost savings, the county was able to focus on how design thinking impacted other aspects of their work. Today, they focus less on reporting cost savings and more on team efficiency and satisfaction. In the long term, these metrics are probably better indicators of how well the county provides services to residents. (This is especially pertinent as the county faces the prospect of a smaller workforce following a wave of baby boomer retirements.)

Evolving Metrics that Work for Your Team

Software company Intuit is another example of an organization that’s embedded design thinking deeply throughout its culture and operations.

The 2015 study mentioned above (Köppen, Meinel, Rhinow, Schmiedgen, Spille, 2015) describes how the company evolved a story-based approach to evaluating its effectiveness. Intuit began with some of the metrics mentioned above but found that, as design thinking regularly reframed challenges, the metrics also needed to be reframed. As a result, they now craft narratives that combine qualitative and quantitative information into a broader evaluation of the business benefit.

Measuring the impact of design thinking isn’t easy, but it’s necessary to gain traction for the methods we practitioners see working day in and day out. The takeaway here is to start with an array of metrics that you know you can begin tracking immediately. Then tweak your measurement systems as your projects and processes evolve, always with the goal of proving long-term organizational value.

Want to learn more about how you can start measuring the impact of human-centered design? Or want to share what’s working in your company? Contact us.

Sources:

Köppen E, Meinel C, Rhinow H, Schmiedgen J, Spille L (2015) Measuring the impact of design thinking. In: Plattner H, Meinel C, Leifer L (eds) Design thinking research. Springer, Switzerland, pp 157–170

Roth B, Royalty A (2016) Developing design thinking metrics as a driver of creative innovation. In: Plattner H, Meinel C, Leifer L (eds) Design thinking research. Springer, Switzerland, pp 171–183

Dow SP, Klemmer SR (2011) The efficacy of prototyping under time constraints. In: Design thinking. Springer, Heidelberg, pp 111–128

Dow SP, Glassco A, Kass J, Schwarz M, Schwartz DL, Klemmer SR (2010) Parallel prototyping leads to better design results, more divergence, and increased self-efficacy. ACM Trans Comput Hum Interact 17(4):18
