How to Build a User Testing Strategy for Content Development

Dec 04, 2018

UX (user experience) testing has become standard practice in many industries as a way to ensure businesses are truly meeting the needs of their customers. So why should publishing be any different? User testing is nothing new, but publishers are only just beginning to understand how much they stand to gain from it.

Why Publishers Should Test Content

It’s widely accepted that websites should be tested for usability. After all, if a site doesn’t serve its intended function, it’s useless. Though we may not associate content with functionality, content has a job to do: it imparts valuable information, and if your readers can’t digest it, that value is lost.

This is why it’s essential for publishers to test content for usability. It’s important to know definitively that the content being produced is appropriate for the intended audience, clear, and easy to read. If it’s not, the audience will quickly be lost. User testing lets publishers pinpoint which aspects of their content need improvement, avoiding bigger problems and rewrites down the line.

Use Readability Software with a Grain of Salt

Readability software is a popular way to estimate how easy content is to comprehend. Unfortunately, it’s just that: an estimate. Readability formulas can measure the length of words and sentences, but they can’t check many of the aspects of readability that matter most, such as whether the terminology and examples actually suit the audience. That’s why publishers should never rely exclusively on any program to gauge readability. Instead, make a habit of user testing content to help develop audience-appropriate material.
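To make that limitation concrete, here’s a minimal Python sketch of the classic Flesch Reading Ease formula, one of the most common readability formulas. The `count_syllables` heuristic is a rough approximation, and the function names are illustrative, not taken from any particular library:

```python
import re

def count_syllables(word):
    # Rough heuristic: count groups of consecutive vowels.
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_reading_ease(text):
    """Flesch Reading Ease: higher scores mean easier text.
    206.835 - 1.015 * (words/sentences) - 84.6 * (syllables/words)
    """
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return (206.835
            - 1.015 * (len(words) / len(sentences))
            - 84.6 * (syllables / len(words)))
```

Notice what the formula rewards: short words and short sentences. A passage of short but confusing sentences can still score well, which is exactly why human comprehension testing is needed on top of it.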

Asking the Right Questions

Many publishers attempt to assess content by asking participants whether they liked and understood it. This strategy makes sense on the surface, but it does a poor job of uncovering subtler shortcomings. “Liking” the content is positive, but it doesn’t always mean the content is actually appropriate for your audience.

Asking users to assess their own comprehension has even more obvious drawbacks. It’s very possible to read every word in a sentence and still miss the meaning behind them. The only way of answering the comprehension question is by assessing it directly.

Moderated vs Unmoderated Usability Tests: Which to Choose?

There are distinct pros and cons of different testing methods. The best testing method for your business all depends on your needs, goals, and budget.

Moderated Usability Tests

A moderated usability test offers the most in-depth look at the user’s experience. The goal is to identify the purpose of the content and create tasks that reveal how users interact with it. Ideally, test participants should be reminded that the content is being tested, not them. To test comprehension, have participants read your content and then apply their newly acquired knowledge.

Imagine you’re selling digital learning packages for college-level biology. One includes a semester-long subscription, the next unlocks exclusive quizzes, study guides, and video lessons, and the third adds lifetime access.

You could simply ask participants to select a subscription, but that wouldn’t explore how well the content describes the available options. Instead, the task could be rephrased as “select the least expensive subscription that meets your needs.” This slight adjustment requires participants to engage with the content far more than the first question does. To verify that they selected the right option, also ask participants to specify what their needs actually are.

Having participants think out loud is another good way to spot weak points in the content. If a participant must read a single sentence multiple times to understand it, it’s due for a rewrite.

Unmoderated Usability Tests

While moderated user tests offer a substantial amount of insight, an unmoderated user test is a useful tool as well. This is especially true for publishers with small budgets, time constraints, or the need for a large sample size. There are many testing tools available, but the tool you use is less important than the questions you ask.

Multiple-choice tests take more time to create than open-ended questions, but they’re far less time-consuming to score. To start, these tips can help you create a user-friendly multiple-choice quiz:

  • Don’t try to confuse participants; include only one distinctly correct answer
  • State questions in a positive form
  • Make incorrect answers plausible to anyone who misunderstood the content
  • Weed out answers that give clues to other questions
  • Skip “all of the above” and “none of the above” answers
  • Include “I don’t know” as an option, since guesses won’t help you test your content
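As a sketch of how these guidelines might be checked in practice, here is a small, hypothetical Python helper. The class and method names are illustrative assumptions, not part of any real testing tool:

```python
from dataclasses import dataclass

@dataclass
class QuizQuestion:
    prompt: str
    options: list   # answer choices shown to the participant
    correct: str    # the single intended correct option

    def validate(self):
        """Return a list of guideline violations (empty means the question passes)."""
        problems = []
        # One distinctly correct answer, and it must appear exactly once.
        if self.options.count(self.correct) != 1:
            problems.append("exactly one correct answer must appear in the options")
        # Guesses don't tell you anything about the content.
        if "I don't know" not in self.options:
            problems.append("include an 'I don't know' option")
        # Skip catch-all answers.
        banned = {"all of the above", "none of the above"}
        if any(o.lower() in banned for o in self.options):
            problems.append("avoid 'all of the above' / 'none of the above'")
        return problems
```

A question drawn from the subscription example above would pass this check only if it offers one correct option, an “I don’t know” escape hatch, and no catch-all answers.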

A Third Option: Cloze Tests

Whether or not you’ve heard of them, Cloze tests will likely feel familiar. A Cloze test omits selected words from a content sample and asks participants to fill in the blanks. Based on the Gestalt principle of closure, it prompts the brain to complete each sentence logically. Although Cloze tests aren’t a typical user testing strategy, they’re highly useful for assessing comprehension of digital learning content.

To get started, use a small sample of text (under 250 words). Replace every fifth word with a blank and ask test takers to fill in each blank with the word they believe the author used. Score the test by dividing the number of correct responses by the total number of blanks.

A score below 40% indicates content that’s inappropriate for the audience and needs a rewrite. A score between 40% and 60% suggests somewhat challenging content, while 60% or higher indicates the content is working well.
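The whole procedure is simple enough to sketch in a few lines of Python. This is an illustrative implementation, not a standard tool: it blanks every fifth word, scores exact (case-insensitive) matches, and maps the result onto the thresholds above:

```python
def make_cloze(text, nth=5):
    """Blank out every nth word; return the gapped text and an answer key."""
    words = text.split()
    answers = {}
    for i in range(nth - 1, len(words), nth):
        answers[i] = words[i]
        words[i] = "_____"
    return " ".join(words), answers

def score_cloze(answers, responses):
    """Correct responses divided by total blanks (case-insensitive exact match)."""
    correct = sum(1 for i, word in answers.items()
                  if responses.get(i, "").strip().lower() == word.lower())
    return correct / len(answers)

def interpret(score):
    """Map a Cloze score onto the thresholds described above."""
    if score < 0.4:
        return "rewrite"        # inappropriate for the audience
    if score < 0.6:
        return "challenging"    # somewhat challenging content
    return "effective"          # content is working well
```

In a real test you might also accept close synonyms as correct; exact matching is the strictest, simplest baseline.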

What and When to Test

Content can be tested at any point in the development process. Running a readability formula is a good first step, followed by more in-depth assessments as the content is refined. Always let the purpose of the content inform the focus of your questions: skip questions about irrelevant details and concentrate on tasks that are essential to the user experience.

When done well, UX testing detects what users really comprehend, not what they claim to. This insight should always be part of a publisher’s content development process. Whether your content hits the mark on the first go or requires some reworking, you’ll walk away knowing how to give your audience what they want. And if your authoring tool is modular and allows you to update easily, as our platform MyEcontentFactory does, you can adjust in real time! Want to give it a try?

Contact Us

 

 
