When people think about design research, the first target audience that comes to mind is the consumer. However, we often design engagement strategies for audiences inside organizations, i.e., employees. And politics and hierarchy can make conducting research on employees especially challenging.
Joe Kappes (formerly of Cooper) has designed employee tools for a data analytics company. We set out to understand how he recognized and overcame the challenges of working for an internal audience.
CPE: Tell us a little bit about the goal of your project.
Joe: Our goal was to understand the software needs of 18 independent teams, evaluate each team’s custom-designed tools, and finally to design standardized software to unite the company.
What stood out in your team member interviews?
As we began user research, interviewees lamented a range of issues – from clunky software to a lack of coffee. After a few days, something peculiar happened: interview responses started converging. Employees mentioned one particular software issue over and over again, and discussion of organizational change was carefully avoided.
How did you encourage interviewees to speak to their individual needs? And what advice can you give other designers in similar situations?
Those are both great questions. To move forward, we had to make sure our interview methods were conspiracy-proof. I can speak to a couple of the tactics we employed.
Build trust and anonymize responses.
Employees often give canned responses because they fear retaliation. To elicit honest responses, you must build strong rapport and trust with interviewees. This is even more important when someone’s livelihood is on the line. Before you start interviews, gain the client’s approval to anonymize responses, and let them know that you will receive better data if you have earned the employees’ trust. At the start of each interview, assure employees their responses will not be shared verbatim. Then make sure to abide by your promise!
Interview a large set of employees.
To ensure the privacy of employees, interview a large group of individuals and talk with multiple people from each team. In this instance, we had one-on-one conversations with nearly 40 employees. This helped anonymize the responses and decrease the risk of retaliation.
Ask the same question in different ways.
Interviewees were obviously sharing the questions we asked with their colleagues. To overcome this, we reframed the same question with slight tweaks. We found interviewees would change their responses if they were re-asked the question in a different way. As they reframed their answers, they shared more details.
Change questions as much as possible.
We tend to end interviews with an aspirational question, e.g., “If you could wave a magic wand and change one thing to improve your situation, what would it be?” This question prompts the interviewee to reflect on the most important topic covered in the interview, and to discuss it without constraints. Because our subjects had rehearsed responses to the words “magic wand,” we changed the prompt to get at the same idea with a different framing, e.g., “If you had a million dollars to throw at a single problem…” or, “When I leave, I’ll have enough time to solve one problem. What should I focus on?” Even slight variations were enough to draw diverse responses.
Ask for “boring details.”
In normal user testing, subjects will make a few moves, and then pause to make sure you’ve followed their train of thought. However, these subjects knew our scenarios and had practiced executing them quickly. At one point, a subject said, “Next I would fill the spreadsheet with the appropriate data. Look, in this separate window I have done so already. Let us continue…” Instead of moving forward, we asked them to step back and walk us through every boring detail between here and there. By sharing “the boring details,” they moved away from the script.
Ask questions out of order.
It’s easier to tell the truth than remember every detail of a rehearsed story. Mix up the questions in your interview guide to disrupt the expected flow of questions.
By employing these techniques, Joe’s team conducted 40 valid employee interviews. The data empowered his team to ideate and design four innovative products and services, which have since been successfully implemented.
Get practice designing and conducting research interviews in Cooper Professional Education’s Design Research Techniques course, or learn effective tools for improving the employee experience in your organization when you bring our Mastering the Art of Feedback course in-house.