My content strategy responsibilities for MailChimp’s Knowledge Base have recently grown to include user research. As a part of that research, I want to learn who uses the KB and what they find helpful, extraneous, or missing. Then, I’ll translate the findings into actionable personas the KB team can use to stay user-focused when we make decisions.
But where to start?
I faced an uphill battle: much content strategy writing seems to assume the content team already knows who their audiences are or should be. Or, it’s written from an agency perspective that assumes the presence of a user researcher who will answer those questions.
To get started, I grounded myself in user research and usability best practices, thinking through how to apply them to content as I read. Here are a few resources that provided a rich foundation and helped me regroup when I inevitably took a wrong turn:
- Erika Hall’s Just Enough Research
- Leah Buley’s The User Experience Team of One
- Steve Mulder’s The User Is Always Right
- Steve Portigal’s Interviewing Users
After reading up, I developed my research question. My first attempts were too narrow or based on faulty assumptions. After discarding versions like “How do KB users compare to MailChimp’s users?” (too narrow), and “What are users’ opinions about the current KB?” (opinions don’t correlate with needs), I landed on “Who uses the KB?”
Next, I explored research methods that could answer that question.
Vague as it sounds, “actionable understanding” is the main way we define success for the KB. While visits and page sessions are easy to track, there’s no guarantee that one person’s short visit was any less helpful to them than another person’s long one.
People’s goals, reading habits, motivations, and daily lives are too complex and subjective to assign definite meaning to their clicks, scrolls, and exits on a help site. Instead, identifying trends on the KB means getting to know our readers by gathering and analyzing qualitative data.
Enter the survey
Surveys are a great way to capture qualitative data in bulk. I spent a few days drafting questions, and then shared them with a couple of people on our Research team. They gave the draft a great edit, adding and removing questions and encouraging me to think more deeply about my research goal. Novice mistakes are inevitable, and getting your work in front of experienced researchers is invaluable.
I realized I’d made many of the classic survey mistakes: I asked users what they liked, asked them to remember how they did something, and asked too many specific questions about how they used the KB.
But by the time the survey was ready to send, I’d learned which types of behaviors applied to my research, which led me to ask better questions:
- What types of organizations do our readers represent?
- What role does email marketing play in those organizations?
- Where does email marketing fit into people’s work day?
- How experienced are our readers with email marketing in general?
- What proportion of MailChimp’s user base relies on the KB?
Broader questions like those are better suited to a survey. Plus, they give just enough information to segment user types, which helps me select interviewees, and those interviews are where most of my usable data will come from.
Learning through iteration
Creating content and researching it turned out to be similar processes: both are iterative ways to approach your subject. The persona planning process served the same purpose the first draft does for writing—it helped me figure out what I was really trying to accomplish. And iterating on the survey questions helped my approach evolve even further.
What about you? Have you done user research for a help site, or know of any resources on the topic? I’d love to hear from anyone doing similar work.