The Experiment Results Canvas helps you to make sense of the outcomes of a qualitative experiment in a workshop setting.
The Lean Startup methodology tells you, as a startup founder, that you should be running experiments all the time. You should be out validating your idea, your problem-solution fit, and your product-market fit.
In theory, this makes total sense. In practice, however, it can be quite difficult to make sense of the results of your experiments, especially in the early stages when you are running qualitative experiments. First of all, you need a way to collect the information you gathered and share it with your team. Even more importantly, how are you going to draw conclusions from the results?
People tend to focus on the interviews they conducted themselves, and they notice the things they already agree with or have noticed before. This leads to strong confirmation bias, and whatever method you use needs to counteract it.
Initial interviews are necessarily exploratory. At this point, the team can't know in detail what they are looking for, there won't always be a clear script to follow, and the answers will vary widely.
You simply can't take the approach you would use for a large-scale survey and rely on statistics: the low number of responses and the unstructured nature of the answers rule that out. This makes it very difficult to reach a clear 'validated or invalidated' decision. From a statistical point of view, the sample is far too small to prove anything, but at the same time it is a treasure trove of information about your customers.
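To get a feel for why the numbers alone won't help, here is a rough, back-of-the-envelope illustration, assuming the usual normal approximation for a proportion and the 10-interview example used below:

$$\text{margin of error} \approx 1.96\sqrt{\frac{p(1-p)}{n}}$$

With $n = 10$ interviews and an even split ($p = 0.5$), that works out to roughly $1.96\sqrt{0.025} \approx 0.31$, or about ±31 percentage points. If 6 out of 10 interviewees say they have the problem, the 'true' share could plausibly sit anywhere between roughly 30% and 90%, which is exactly why a simple validated/invalidated cut-off doesn't work at this stage.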
So, what can be done? How can we explore qualitative interview results in a workshop setting in a meaningful way? A way that is as objective as possible? How can we extract as much useful information from the interviews as possible?
Homework: First, the results need to be gone over in detail, and that takes more time than you generally have in a workshop, so homework is required. Doing time-consuming and sometimes expensive interviews and then never diving into the results is a complete waste. If the team is prepared to interview 10 people, they should also be prepared to read all the results before going into the workshop. Reading only your own results opens the door to extra confirmation bias.
Framework: Second, a framework is needed to organize the results. The Experiment Results Canvas below was created as one (highly effective) way of doing that.