
How to Make Your Content More Readable


Your team has created, socialized, and measured content in just about every possible way … except for one of the most important metrics: how your audience is engaging with the words.

You want visitors to find information, stay engaged, or complete a task. But once readers’ eyes hit the words on the page, if comprehension takes too much effort, engagement falls off and you have churn. You know this anecdotally, yet most teams don’t measure it scientifically.

Which of the thousands if not millions of words on a website are helping or hurting? What content is too dense or confusing?

Content teams work hard to create compelling content, but they have a natural blind spot. They’re too close to their creations – the blogs, thought leadership, and marketing pieces – to see them through the audience’s eyes.


AI tools can benchmark content understanding

Now, with advances in natural language processing and artificial intelligence, a new breed of technology can test content for readability and clarity, which go to the heart of user experience and engagement. It can move organizations from a subjective approach, often fraught with editorial friction, to an objective, metric-based approach.

In this article I look at how to test for readability across your organization. For CMOs and chief content officers who want more engaging content, you now have ways to measure and benchmark clarity across the organization. And these tools can also help individual writers and creators produce better quality content.

Let’s define readability and clarity

Content clarity is the user experience of how difficult or easy it is to read text. Why is that important? We know from neuroscience that processing words places a far greater cognitive load on the brain than images. Plus, attention spans are shorter, meaning visitors have lower tolerance for confusion.


Fortunately, there are several widely used measures. The Flesch Reading Ease Index, created by Rudolf Flesch in the 1940s, calculates the average number of syllables per word and the average sentence length and assigns a readability score – the higher the number, the easier the content is to read.

The Flesch-Kincaid measure, derived from the index, scores readability at grade level, approximating the number of years of education required to easily understand content (though with this tool, a lower score means higher readability). The U.S. Navy developed the Flesch-Kincaid measure in the 1970s to ensure that sailors under stress in the field could easily understand written instructions in their manuals. While today’s audiences most likely aren’t in combat, they do face the stresses of information overload and too little time.
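To make these two measures concrete, here is a minimal Python sketch of both formulas. The coefficients are the published Flesch and Flesch-Kincaid constants; the syllable counter is a crude vowel-group heuristic I've added for illustration (production tools use pronunciation dictionaries, so expect some drift from their scores):

```python
import re

def count_syllables(word: str) -> int:
    """Crude heuristic: count vowel groups, subtract a silent trailing 'e'."""
    word = word.lower()
    count = len(re.findall(r"[aeiouy]+", word))
    if word.endswith("e") and count > 1:
        count -= 1
    return max(count, 1)

def flesch_scores(text: str) -> tuple[float, float]:
    """Return (Flesch Reading Ease, Flesch-Kincaid grade level)."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    wps = len(words) / len(sentences)    # average words per sentence
    spw = syllables / len(words)         # average syllables per word
    reading_ease = 206.835 - 1.015 * wps - 84.6 * spw
    grade_level = 0.39 * wps + 11.8 * spw - 15.59
    return round(reading_ease, 1), round(grade_level, 1)
```

Note how the two scales move in opposite directions: short words and short sentences push reading ease up and the grade level down.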

Your readers are checking out and churning

Research shows that the average U.S. citizen reads at a seventh- to eighth-grade level – and it’s not simply a problem of low academic achievement. Some studies show even highly educated people disengage rather than spend the mental energy to unpack dense, complicated prose. Josh Bernoff, author and a contributor to the Harvard Business Review, surveyed 550 businesspeople in 2016. As many as 81% said poorly written content wasted too much of their time.


That’s one reason Reader’s Digest and Time magazine are successful. Reader’s Digest has a readability index of about 65 and Time scores about 52. Give people the choice between reading something at that level and run-on sentences of 40 or 50 words with several competing ideas, and it’s clear which they’ll prefer.

What is good readability?

There’s no one-size-fits-all answer, but the gold standard of readability for business communication is grade eight or lower.

You should limit long sentences and passive voice to 5% or less of the total content. Active voice, rather than passive, makes it clearer who needs to do what. By shortening sentences and using active voice, you can make even technical subject matter more readable and engaging without any “dumbing down.” The best writing makes complex topics easy to understand.
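Those 5% thresholds can be checked mechanically. The sketch below flags sentences over a word limit and uses a deliberately crude passive-voice heuristic (a be-verb followed by a word ending in “-ed” or “-en”), which both over- and under-matches; dedicated grammar tools do much better:

```python
import re

BE_FORMS = {"is", "are", "was", "were", "be", "been", "being"}

def clarity_flags(text: str, max_words: int = 25) -> dict:
    """Return the percentage of sentences that are overly long or
    (by a rough heuristic) contain a passive construction."""
    sentences = [s.strip() for s in re.split(r"[.!?]+", text) if s.strip()]
    long_count = passive_count = 0
    for s in sentences:
        words = s.split()
        if len(words) > max_words:
            long_count += 1
        lowered = [w.lower().strip(",;:") for w in words]
        # be-verb + past-participle-looking word => likely passive
        for a, b in zip(lowered, lowered[1:]):
            if a in BE_FORMS and (b.endswith("ed") or b.endswith("en")):
                passive_count += 1
                break
    n = len(sentences)
    return {
        "long_pct": 100 * long_count / n,
        "passive_pct": 100 * passive_count / n,
    }
```

Writers can run a draft through a check like this before submitting and rework any flagged sentences until both percentages fall under the 5% target.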


The clarity problem isn’t new, but organizations are now producing vast quantities of content for their audiences. And with that proliferation comes a quality-control issue. In 2010, President Barack Obama signed the Plain Writing Act to make U.S. government-created content less dense and confusing. Federal agencies must now use “plain language” to communicate more clearly with citizens.

There’s no equivalent plain-language law in the commercial sector, but, given how fiercely brands compete for audience attention, they should be addressing it voluntarily.

One financial organization my team worked with set an internal goal that all consumer-facing content must be written at a grade-eight to grade-10 readability level – an ambitious goal. Our organization also worked with government agencies that have similar objectives. For example, the Australian federal government aims for a grade-five readability level as part of its digital service standard. All companies that create content for customers and other stakeholders should set similar goals for published content.

Behavior change is hard, but doable

Getting people to change their writing habits is far from trivial. Large organizations that use numerous agencies, freelancers, and internal contributors face tremendous challenges with consistency and brand coherence. Simply getting people to test their content can be a big challenge.

Leadership should adopt a “people, process, tools” approach. The people element must focus on two factors: awareness and education. Many commercial and government organizations send employees to writing-for-the-web-style training courses to raise awareness of readability. These courses cover not only webpages but also other formats, including letters and product brochures.

The process element focuses on mapping the current publication process and identifying which steps can be automated. Unfortunately, many organizations rush to roll out tools without an adequate understanding of what the current or future state of their process will look like. This rarely succeeds.

How to execute content evaluation

With the people and process elements sorted, now you’re ready to consider tools.

Choose your tools wisely

One of the biggest benefits of tools is that you can measure readability with objective metrics. You move from subjective opinions to quantitative measurement. These measures ease the editorial friction between content creators and editors/publishers. Companies should consider a two-pronged solution:

  • Lightweight clarity tools in the hands of writers can help them measure, edit, and revise work to an agreed-upon readability metric before submitting.
  • Audit tools for the content team allow it to control, review, and score content across the enterprise. These tools allow editors and content leads to benchmark quality over time, track the readability (and engagement) of writers and their published work, and even test and vet new writers for clarity.

While web analytics can show page views, dwell times, and usage paths, they won’t reveal issues with the content itself. Rather than obsessing over poor content performance on the back end, consider an investment in technology that can improve content on the front end, tailoring words to how your audience’s brains want to process them.


Selecting readability tools and technology

As you consider which tools to adopt, ask these questions:

  • Will it support regular business users as well as subject matter experts? Determine who inside your organization will use the technology and whether it supports their needs.
  • Will it force too much friction or behavior change? For example, some solutions expect a business user to switch from Microsoft Word to edit content directly in a content management system; this approach rarely works.
  • Can it measure improvement of content across possibly hundreds of content creators? Many low-end tools are great for single users but have no dashboards for organization-wide metrics. These are critical to show ROI.
  • Is it suitable for work practices in a larger commercial or government organization? Less expensive and free tools are great for copy-and-paste-style text analysis by individuals but don’t natively support MS Word, PDFs, or direct URL analysis. If you map your process, you’ll quickly see which tools will work and which won’t.
  • How will you know if people are using the solution? The tool should give you usage dashboards across all content contributors.
  • Will it allow you to audit sections of websites for readability or hundreds of documents? A solution that forces the user to analyze individual URLs or individual documents one at a time will not cut it for whole or partial site and content audits.
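To illustrate that last point, the kind of batch audit a whole-site review requires can be approximated with a short script rather than one-document-at-a-time checks. This sketch (the folder layout and the .txt-only scope are assumptions for the example) scores every text file under a directory against a grade-eight target, using the Flesch-Kincaid formula with a rough vowel-group syllable count:

```python
import re
from pathlib import Path

def grade_level(text: str) -> float:
    """Flesch-Kincaid grade level with a crude syllable heuristic."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(
        max(len(re.findall(r"[aeiouy]+", w.lower())), 1) for w in words
    )
    return (0.39 * len(words) / len(sentences)
            + 11.8 * syllables / len(words) - 15.59)

def audit(folder: str, target: float = 8.0) -> list[tuple[str, float, bool]]:
    """Score every .txt file under `folder`; report (name, grade, passes)."""
    results = []
    for path in sorted(Path(folder).rglob("*.txt")):
        g = round(grade_level(path.read_text(encoding="utf-8")), 1)
        results.append((path.name, g, g <= target))
    return results
```

An editor could run this across an exported content repository and get a pass/fail dashboard in seconds; commercial audit tools add crawling, PDF and Word support, and trend tracking on top of the same idea.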

Companies are spending huge amounts on content management systems and looking for ways to improve the user journey. But to what end if they continue to overlook half the content problem? Automation now lets us measure and find the problems in millions of words. Humans can then focus on editing and fixing those words to make them clearer and easier to read and engage with. Shouldn’t we?

A version of this article originally appeared in the February issue of Chief Content Officer. Sign up to receive your free subscription to our bimonthly, print magazine.

Get tips and insight on how to create a better content experience for your audience at the upcoming Intelligent Content Conference March 20-22 in Las Vegas. Register today using code BLOG100 to save $100. 

Cover image by Joseph Kalinowski/Content Marketing Institute