Don't assume the reason for struggle

When usability testing, it sometimes feels very obvious why the user misunderstood something. But the human mind is a mystery - let's not assume that others' minds work the same way as ours. Always go and unpack the reason for the struggle.
DON'T
- [the UI says performance is bad]
- Participant: "Ok, performance is good"
- Researcher: [Hm... a usability issue. He thought the green meant good performance, but it only means a good confidence level.]
DO
- Participant: "Ok, performance is good"
- Researcher: [Ok, we have a usability issue. He misunderstood it. But why? What made him think that?] "How do you know it's good?"
- Participant: "Well, it says 14% improvement. And the confidence level is green so we are sure the improvement is real."
- Researcher: [So he understood that green meant good confidence. That's not the issue. Why did he think that the 14% was an improvement?] "I see - so green confidence means that the improvement is real. How do you know that the 14% is an improvement?"
- Participant: "It must be an improvement since it is a positive number. If performance went down, we would see a negative number. And what's that funny shape in front of the 14%, I don't know."
- Researcher: "I see, thank you." [Gotcha! He expects to see a minus sign if metrics go down. On top of that, he couldn't recognize our custom-designed downward-pointing arrow as such.]
It might feel awkward to ask participants the (seemingly) obvious, especially when observers are present. To avoid this, it's best to prepare observers for the situation:
"Don't be surprised if I ask the obvious. I do this because I don't want to perpetuate any of our assumptions. I want to hear directly from the participant how they understand things. It's surprising how often our best assumptions prove to be wrong."
A good researcher must have thick skin and a low ego.