Strategies for improving technical documentation through user feedback
by David Garcia, Principal Consultant
Ensuring that technical documentation evolves alongside user needs and expectations is a constant challenge. However, by actively engaging with and listening to users, we can not only identify gaps but also gain valuable insight into the pain points in our documentation.
In this article, I'll delve into the specific strategies we employ at TechDocs to gather and integrate user feedback into our documentation process.
Getting feedback before publishing
As document writers, we serve as the first users of our documentation. Before sharing documentation more broadly, conducting an internal review proves invaluable.
Ideally, we seek feedback from three distinct profiles:
- A subject matter expert: to ensure the content is technically accurate.
- A fellow technical writer: to get a second opinion on clarity and adherence to style standards.
- Someone without previous context: to get a fresh perspective on clarity and usefulness. This could be someone in your organization unfamiliar with the specific feature or product being documented.
Timely commitments from reviewers are crucial to avoid schedule delays. If time constraints arise, we prioritize getting feedback from SMEs and technical writers to ensure both accuracy and clarity are addressed before publication.
One last tip: don't forget to factor this review period into your estimates; it's frequently overlooked.
Direct feedback mechanisms
A convenient method for users to express their thoughts is by incorporating a feedback feature directly onto the documentation page.
While providing a link to the GitHub repository may suffice for developer-heavy audiences, it comes with limitations. Not all users are familiar with GitHub, and requiring them to leave the documentation page to provide feedback, without anonymity, may deter some from sharing their thoughts.
That's why we opt to include a feedback tool directly on the page itself. This approach allows for anonymous feedback with minimal steps involved.
For instance, here at TechDocs Studio, we use the PushFeedback widget. Tailored for documentation sites, it enables users to highlight issues directly without navigating away from the page.
Disclaimer: We're the creators of PushFeedback! It's free for open-source projects, and reasonably priced for everyone else :)
Subsequently, we forward the feedback to an issue tracker, ensuring that every suggestion is thoroughly addressed.
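As a rough sketch of that forwarding step, the snippet below turns a feedback item into a GitHub issue via the public GitHub REST API (`POST /repos/{owner}/{repo}/issues`). The shape of the feedback payload and the repository name are illustrative assumptions, not PushFeedback's actual export format:

```python
# Hypothetical sketch: forward a docs feedback item to GitHub Issues.
# The feedback dict's keys ("page", "message") are assumptions about
# the widget's export format, not a documented schema.

def build_issue_payload(feedback: dict) -> dict:
    """Turn a raw feedback item into a GitHub issue payload."""
    page = feedback.get("page", "unknown page")
    message = feedback.get("message", "").strip()
    return {
        "title": f"Docs feedback: {page}",
        "body": f"Page: {page}\n\nFeedback:\n{message}",
        "labels": ["documentation", "user-feedback"],
    }

# Actually creating the issue requires an API token, e.g.:
# import requests
# requests.post(
#     "https://api.github.com/repos/OWNER/REPO/issues",
#     json=build_issue_payload(item),
#     headers={"Authorization": "Bearer <token>"},
# )
```

Labeling these issues (e.g. `user-feedback`) makes it easy to review them in a batch during a docs triage session.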
User interviews
Direct engagement offers deep insights that might be overlooked through other feedback channels.
Whenever we get the opportunity, we aim to have a conversation, either through video or in person, with users who've recently worked with a specific part of our documentation. Such interactions provide us a firsthand glimpse into their journey, the challenges they face, and the sections they found particularly helpful.
However, organizing these discussions can be particularly challenging when the documentation is intended for external users. So while it's a valuable method, it's more about seizing the moment when the opportunity presents itself rather than relying on it as a consistent feedback mechanism.
User testing sessions
User testing sessions, where individuals interact with documentation in real-time, offer significant value. While traditional methods like group discussions are beneficial, injecting an element of fun or competition can elevate the quality of feedback even further.
Consider the innovative approach taken by Finboot to test their documentation: they organized a live "Escape Room" challenge. Here's how it unfolded:
The scenario: An evil adversary had set a challenging puzzle. The participants' mission? Solve it using only the company's documentation.
The gameplay: The entire company was divided into two teams. Their goal was to follow the documentation and send blockchain transactions in the correct sequence to solve the puzzle.
The result: This wasn't just a one-off novelty event but an actionable feedback session. Throughout the challenge, we gathered insights, spotted inconsistencies, and identified areas of ambiguity. Afterwards, the commitment to refining the documentation, fueled by the shared experience, was stronger than ever.
User surveys
We use surveys as a way to capture the general sentiment towards documentation. While some users might not have recently interacted with the documentation, they can provide an overview of their overall perception.
For this reason, our surveys prioritize general questions such as:
- How helpful do you find our documentation (1-10)?
- Did you notice any missing topics or details?
- How can we improve our docs?
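For the 1-10 helpfulness question, a small helper can summarize the responses into a mean and a score distribution. This is a minimal sketch assuming you already have the ratings as a plain list of integers; real survey exports will need parsing first:

```python
# Minimal sketch: summarize 1-10 helpfulness ratings from a survey.
# Assumes ratings arrive as a plain list of ints (export parsing omitted).
from collections import Counter

def summarize_ratings(ratings: list[int]) -> dict:
    """Return the mean score and a per-score distribution."""
    if not ratings:
        return {"mean": None, "distribution": {}}
    return {
        "mean": round(sum(ratings) / len(ratings), 2),
        "distribution": dict(Counter(ratings)),
    }
```

Tracking the mean across survey rounds gives a rough signal of whether overall sentiment toward the docs is improving.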
Monitoring communication channels
Feedback isn't always direct. We regularly monitor communication channels, issue trackers, and social media. You'd be surprised how often users discuss documentation or run into problems that better docs could fix.
Sometimes, indirect feedback is even more telling than direct feedback. It's raw, unfiltered, and frequently captures users' immediate reactions to their pain points.
Whenever we encounter issues where the documentation didn't meet expectations, we create a separate issue. This ensures we circle back and consider potential refinements to enhance our docs.
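A first pass over a channel export can be as simple as flagging messages that mention doc-related terms. The sketch below uses naive keyword matching; the keyword list is illustrative and in practice you'd tune it per product:

```python
# Rough sketch: flag channel messages that likely concern the docs.
# Keyword matching is naive and the keywords are illustrative only.

DOC_KEYWORDS = ("docs", "documentation", "tutorial", "guide", "readme")

def flag_doc_mentions(messages: list[str]) -> list[str]:
    """Return messages that mention documentation-related terms."""
    return [
        m for m in messages
        if any(keyword in m.lower() for keyword in DOC_KEYWORDS)
    ]
```

Flagged messages still need a human read, but this narrows a busy channel down to the handful of threads worth turning into docs issues.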
Final thoughts
Technical documentation is always a work in progress: a continuous cycle of feedback, refinement, and adaptation. A holistic strategy that combines direct and indirect feedback ensures that our documentation stays evolving, relevant, and user-focused.