Sami Relander — March 18, 2019

Service Design 2.0


The philosophy and practice of Service Design is the current de facto standard for designing and developing digital applications and online services. With roughly a 20-year history, Service Design has its roots in many diverse academic and business domains and stands firmly at the intersection of user needs and business requirements. It is a battle-tested methodology, fine-tuned through years of industry best practice.

Service Design consists of a collection of tools that together seek to uncover and answer the most central questions in creating any new, or improving any existing, digital entity. The Service Design toolset includes, for example, focus group interviews, user stories and personas, customer journey maps, user testing, mockups and prototyping.

The problem

Service Design relies largely on qualitative data and methods. Qualitative methods aim to understand and explain the phenomena at hand, while quantitative ones aim to measure those phenomena and test specific hypotheses. The question of qualitative vs. quantitative approaches has long been debated in the behavioral and social sciences. It is widely accepted that both have their applications, and in many cases it's best to combine them for both in-depth insight and certainty of results. However, as we argue in this blog post, it may be necessary to revise our understanding of this traditional division in the light of recent advances in data science and artificial intelligence.

Given Service Design's current reliance on qualitative methods, one could argue that the discipline is rather one-sided, which potentially skews the process and biases results. In its current form and practice, Service Design risks over-emphasizing subjective perceptions and constructing far-fetched narratives from small, non-random samples of test persons. This does not give designers a comprehensive understanding and may lead the design process astray. Furthermore, because of the small sample sizes, the focus often falls on the most typical, mainstream user needs and scenarios, ignoring the more marginal ones that stay under the radar. That is a weak starting point for developing new digital services that should differentiate themselves from the competition by, for example, catching weak signals from the field and catering to latent and emerging consumer needs.

Powerful as it may be, the Service Design toolset isn't complete. To be clear, "completeness" here doesn't refer to the sheer number of tools but to having the most appropriate tools for the challenge at hand. Efficiency (doing the thing right) shouldn't be granted chronological priority over effectiveness (doing the right thing). What we need are completely new tools, not merely fixes or improvements to existing ones.

Analysis of variance

Analysis of variance (ANOVA) tests whether the means of two or more groups in a sample differ significantly.
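As a minimal sketch of what this looks like in practice, the snippet below runs a one-way ANOVA on invented task-completion times for three hypothetical design variants (the data and scenario are ours for illustration, not from any real study):

```python
# One-way ANOVA: do three groups of users differ in average
# task-completion time (seconds)? All numbers are invented.
from scipy.stats import f_oneway

variant_a = [12.1, 11.8, 13.0, 12.5, 11.9]
variant_b = [10.2, 10.9, 11.1, 10.5, 10.8]
variant_c = [12.0, 12.4, 11.7, 12.9, 12.2]

# Null hypothesis: all three group means are equal.
f_stat, p_value = f_oneway(variant_a, variant_b, variant_c)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
```

A small p-value would indicate that at least one variant's mean completion time differs from the others, at which point pairwise follow-up tests could locate the difference.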

The solution

It appears that many are missing a big opportunity by neglecting traditional quantitative methods that have been around for hundreds of years. With the growing abundance of high-quality metric data at our disposal, we have a treasure chest of information that we're overlooking in the design process. To augment the currently one-sided approach to Service Design, we should increase the use of quantitative data and methods. This is a prerequisite for any application of modern artificial intelligence and machine learning based simulation tools. For clarity, we underline that using descriptive statistics (mean, median, mode, standard deviation etc.) to understand the nature of some set of variables is not a quantitative method in the sense that we use the term here.

Our starting point should be multiple regression analysis, the most basic, yet also among the most powerful, of quantitative tools. Conceptually, the idea of several independent variables predicting a single dependent variable takes us from merely analyzing the status quo to actually predicting user actions and business outcomes. This is a fundamental and profound change not only in the data used and methods applied but, more importantly, in our thinking and argumentation.
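To make the idea concrete, here is a minimal ordinary-least-squares fit with two independent variables. The scenario (predicting weekly conversions from ad spend and active users) and every number in it are hypothetical, chosen only to show the mechanics:

```python
import numpy as np

# Design matrix: a column of ones (intercept), weekly ad spend (EUR),
# and weekly active users. All values are invented for illustration.
X = np.array([
    [1.0, 200.0,  50.0],
    [1.0, 340.0,  80.0],
    [1.0, 150.0,  40.0],
    [1.0, 410.0, 120.0],
    [1.0, 280.0,  70.0],
])
y = np.array([24.0, 41.0, 18.0, 55.0, 33.0])  # observed weekly conversions

# Ordinary least squares: find coefficients minimizing ||X @ b - y||^2.
coeffs, residuals, rank, _ = np.linalg.lstsq(X, y, rcond=None)
intercept, b_spend, b_users = coeffs

predicted = X @ coeffs  # model's predictions for the observed weeks
print(f"intercept={intercept:.2f}, spend={b_spend:.4f}, users={b_users:.4f}")
```

Once fitted, the same coefficients can be applied to planned spend and user figures to forecast outcomes, which is precisely the shift from describing the status quo to predicting results.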

After mastering multiple regression analysis there is a long list of other applicable tools: System Dynamics for modelling complex phenomena over time; fuzzy-logic-based inference systems for modelling the inexact, vague terms of ordinary language; the Analytic Hierarchy Process (AHP) for multi-criteria decision making; as well as conjoint analysis, probability modelling, Monte Carlo simulations, self-organizing maps, and more.
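Of the tools listed above, Monte Carlo simulation is perhaps the easiest to sketch. The toy model below propagates uncertainty in three hypothetical inputs (traffic, conversion rate, basket size) into a distribution of monthly revenue; all parameter values are made up for illustration:

```python
import random

random.seed(42)  # fixed seed so the sketch is reproducible

def simulate_month():
    """One random draw of monthly revenue from uncertain inputs."""
    visitors = max(random.gauss(10_000, 1_500), 0)   # uncertain traffic
    conversion = random.uniform(0.01, 0.03)          # uncertain conversion rate
    order_value = max(random.gauss(60.0, 10.0), 0)   # uncertain basket size
    return visitors * conversion * order_value

# Run many simulated months and inspect the resulting distribution.
runs = sorted(simulate_month() for _ in range(10_000))
median = runs[len(runs) // 2]
print(f"median revenue: {median:,.0f}")
print(f"5th-95th percentile: {runs[500]:,.0f} - {runs[9_500]:,.0f}")
```

Instead of a single point estimate, the designer gets a range of plausible outcomes, which is exactly the kind of output that supports simulation-based design decisions.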


We’re at the dawn of a profound shift in how we design digital services. Over the next years we’ll see fundamental changes, as what has worked before won’t be enough going forward.

Thus, service designers should seek to understand and deploy the basic thinking and intermediate-level techniques of mathematics and statistics in Service Design. Not for their own sake or for any process-related instrumental value, but for the simple fact that a richer, more flexible toolset can utilize quantitative data for better design outputs. Absolute, intrinsic value lies only in the end result of happier users or clients.

If current service designers fail to embrace this change, we’ll see their importance slowly erode as far superior simulation-based design philosophies and practices take center stage over the coming years. Denying, dismissing or downplaying the need for Service Design – and service designers – to go through a major update cannot change the fact that it should and will.

In closing, let us emphasize that our argument has not been against the Service Design field in general or against any specific tool or method. However, since much of the current discourse on developing the field centers on improving existing qualitative tools (that use qualitative data), our argument is that this simply isn’t enough. The current Service Design toolset is incomplete and requires completely new tools – not merely fixes or improvements to existing ones. These new tools will use different data that should produce deeper insights and thus better design outputs. We’re not suggesting that existing tools be replaced; we’re merely submitting that their use be augmented and amplified with newer, quantitative ones.

Next up

Throughout 2019 we will provide a number of additional posts that showcase some concrete examples of how Evermade is using these new tools for the benefit of our clients.

Thanks to all of our designers who contributed to this blog post and gave me feedback.

Sami Relander
