Evaluating user data - practical experience


Teams struggle with evaluating user data. More than in any other phase of an innovation project, it is important to proceed analytically and playfully in equal measure - and this is often where the problems start.

The problems manifest themselves in many ways. A few impressions of the synthesis phase, shared by inexperienced team members from past projects:

“I don't know how we should document the interviews. What is too little, what is too much?”

A lot of user data - a lot of inspiration?

Lakshman can understand the teams' concerns. Instead of relying on written documentation, he encourages his teams to engage with user data narratively: team members tell each other the user stories they heard in their interviews - an effective way to share knowledge.


“What I found inspiring in the interviews, my colleague found trivial. How can I objectively determine what is relevant?”

Teams need collective inspiration

The short answer: you can't. Qualitative user data is collected to inspire the team and to open up new ways of thinking. If only one person is inspired, that is not enough.

In her projects, Judith attaches great importance to teams coding their data together. So-called free coding means that the team places individual data points and observations in a meaningful context, e.g. "statements that show how media consumption works as a mood enhancer" - a context that the whole team finds inspiring and relevant.
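To make the result of free coding tangible, here is a minimal sketch in Python that groups interview statements under the codes a team has assigned to them. The statements, the codes and the grouping logic are purely illustrative assumptions, not part of Judith's actual method or data.

```python
# A minimal sketch of what free coding can produce. The interview snippets
# and codes below are hypothetical, not from a real project.
from collections import defaultdict

# Raw observations paired with the code the team assigned to them together
observations = [
    ("I put on a podcast whenever my commute feels dull.", "media as mood enhancer"),
    ("The news app stresses me out before breakfast.", "media as stressor"),
    ("Music helps me get started on boring chores.", "media as mood enhancer"),
]

# Group the observations under their codes so the team can scan each context at a glance
coded = defaultdict(list)
for statement, code in observations:
    coded[code].append(statement)

for code, statements in coded.items():
    print(f"{code}:")
    for statement in statements:
        print(f"  - {statement}")
```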


“We have always looked for a guide for the evaluation. A step-by-step approach. That would have helped us.”

A standard recipe does not work with qualitative data

Moritz Gekeler understands the desire for a clear sequence of steps. However, he knows from experience that it is more effective to adapt the evaluation to the data at hand. The logic behind this is simple: the teams want to create innovations, and an innovation describes an unknown goal. How could a team reach an unknown goal by following a predetermined sequence of steps?

Instead of relying on instructions, practitioners first play with their data - as a team or on their own: they arrange the data in different ways in relation to one another. This works particularly well on digital whiteboards; for larger teams, working with sticky notes is often a good fit.

  • Do promising data points share similar content? At this point, Moritz often forms clusters that quickly bring similar data together.
  • In other cases, Judith sketches a 2x2 matrix for herself when her intuition tells her that some data belong together but differ in intensity (a minimal sketch follows after this list).
  • Data that indicate a chronological sequence can be placed on a time axis, for example in frameworks such as user journeys and service blueprints.
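As a rough illustration of the 2x2 matrix, the following Python sketch sorts observations into quadrants along two assumed dimensions, "effort" and "emotional payoff". The data, the axis labels and the 0.5 threshold are illustrative assumptions, not the axes Judith actually uses.

```python
# A minimal sketch of sorting observations into a 2x2 matrix. The dimensions
# ("effort" and "emotional payoff"), the threshold and the data are
# illustrative assumptions, not taken from a real project.

observations = [
    # (statement, effort 0-1, emotional payoff 0-1)
    ("Curates a weekly playlist for the commute", 0.8, 0.9),
    ("Leaves the radio on in the background", 0.1, 0.3),
    ("Scrolls news feeds before getting up", 0.2, 0.1),
    ("Plans a podcast queue for long chores", 0.7, 0.8),
]

def quadrant(effort: float, payoff: float) -> str:
    """Map an observation to one of the four quadrants of the matrix."""
    horizontal = "high effort" if effort >= 0.5 else "low effort"
    vertical = "high payoff" if payoff >= 0.5 else "low payoff"
    return f"{horizontal} / {vertical}"

for statement, effort, payoff in observations:
    print(f"{quadrant(effort, payoff):26}  {statement}")
```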


“We have the feeling that our users have no problems at all. What now?”

Exciting insights often remain implicit

Users rarely face a problem and remain paralyzed. They do something, even if their behavior is not optimal. It therefore makes sense not only to look for problems, but to take a close look at how users behave and draw conclusions from that.

Rael Futerman uses Activity Theory as a framework for understanding how users act in certain situations, which tools they use, and what their goals are. Rael and his teams often discover new design opportunities when they recognize that individuals have already found makeshift solutions for themselves - solutions that are still hidden from most other people.

Beat and his team take a similar approach. With their jobs-to-be-done framework, they analyze hundreds of details of a topic (e.g. visiting a museum) and in this way identify the different levers for continuously improving the existing experience.

But even these sophisticated frameworks are constantly adapted to new situations and new projects. And adaptation requires alert teams who enjoy engaging with new qualitative user data again and again.
