09 Nov 2017

There’s Measuring Content Quality And There’s Being Stupid

The best thing about creating content in the digital space is that absolutely every aspect of it is measurable. And the worst thing? Absolutely every aspect of it is measurable. The problem with measuring content quality is that the results are only as good as the tool. 

There are metrics for all manner of things, such as readability, the catchiness of a headline, SEO and, I assume, whether you sound like Hunter S Thompson after the mescaline kicked in. But for all the talk of machine learning and data analytics, most of what we rely on to measure content (and plenty else) is rudimentary. 

For instance, over the past two decades the movie review aggregation site Rotten Tomatoes has been a dominant force in the entertainment industry. But its methodology is clumsy. It boils each critic’s take on a movie down to a binary “fresh” or “rotten” rating, then scores the film on the percentage of critics who called it fresh. Stripped of any nuance, that means a movie where the overall sentiment is “ok, I guess” can end up with a “fresher” score than an ambitious film adored as a masterpiece by most, but not all, critics. 
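
To see how that plays out, here’s a minimal sketch of binary aggregation in Python. The 6/10 “fresh” threshold and the review scores are invented for the illustration; they’re not Rotten Tomatoes’ actual data or code.

    # Toy illustration of binary "fresh/rotten" aggregation.
    # The 6/10 threshold and the scores below are made up for the example.
    def tomatometer(scores, fresh_threshold=6.0):
        """Percentage of reviews at or above the 'fresh' threshold."""
        fresh = sum(1 for s in scores if s >= fresh_threshold)
        return 100 * fresh / len(scores)

    lukewarm_film = [6, 6, 6, 6, 6, 6, 6, 6, 6, 6]                 # every critic: "ok, I guess"
    divisive_masterpiece = [10, 10, 10, 10, 10, 10, 10, 10, 3, 3]  # adored by most, not all

    print(tomatometer(lukewarm_film))         # 100.0 -- the "fresher" film
    print(tomatometer(divisive_masterpiece))  # 80.0  -- despite the raves

The average scores tell the opposite story (6.0 versus 8.6), which is exactly the nuance the binary rating throws away.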

Is the synthesiser better than the paper of record? 

This sort of broad stroke dominates measuring the quality of your content, too. From Yoast to Expresso to the Hemingway Editor app, there’s a wealth of apps on the web designed to help you edit your copy to improve readability. [Warning: highly unscientific proof point ahead]. Out of curiosity, I put the main news stories from The Australian, Sydney’s Daily Telegraph, The New York Times and The Sydney Morning Herald websites into the Hemingway Editor app. It ranked the readability of the Tele and the Herald as OK, and The Oz and the Times as poor.  

On the other hand, with a far better grade than any of the news stories, the lyrics to Starship’s We Built This City rated as “good”. Now, say what you will about the content of the Tele, but an inability to understand what it is saying is not one of its problems. And if I’m being honest, no matter how many times I’ve heard that earworm of a song, I, at best, still only vaguely know what it’s on about. (“Marconi played the mamba”, I guess?). 

So, you could get a poor rating because the piece is full of over-technical jargon and meandering, badly constructed sentences. Or because it explains complex ideas for an educated audience with a decent vocabulary. And maybe your good score is because you just wrote the equivalent of what GQ described as “the worst song of all time”. 
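
If it seems odd that a readability score can be so indifferent to whether the words mean anything, here’s a rough sketch using the classic Automated Readability Index, which reduces “readability” to word and sentence length. It’s a stand-in for illustration, not the Hemingway app’s actual algorithm.

    import re

    def automated_readability_index(text):
        # Classic ARI: estimates a reading grade level purely from
        # character, word and sentence counts. Used here as a stand-in,
        # not as the Hemingway app's actual algorithm.
        words = re.findall(r"[A-Za-z']+", text)
        sentences = max(1, len(re.findall(r"[.!?]+", text)))
        characters = sum(len(w) for w in words)
        return 4.71 * (characters / len(words)) + 0.5 * (len(words) / sentences) - 21.43

    print(automated_readability_index(
        "We built this city. We built this city on rock and roll."))  # very low grade level: "readable"
    print(automated_readability_index(
        "The aggregation methodology conflates unanimity of lukewarm "
        "sentiment with genuine critical enthusiasm."))               # far higher: "hard to read"

Neither number says anything about whether the sentence is insightful, accurate or gibberish about Marconi and mambas; it only counts how long the words and sentences are.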

Loved by machines, hated by humans 

Five minutes into building a website, you’re assaulted with a series of “rules” that allegedly guaran-damn-tee to improve your Google ranking. “Google values long content, so every page should be the length of the entire Harry Potter series.” “Use the keyword X number of times, including at least once in the intro and a subhead, and then cross-stitch it onto a cushion for good measure.” If you follow them all, your content sounds like one of those people at a party who just waffles on and on, but also jarringly uses your name in every other sentence so you keep paying attention.  
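
That keyword “rule” is crude enough to fit in a few lines. Below is a deliberately naive sketch; the density “target” in the comment is a commonly repeated rule of thumb, not anything Google has published, and real SEO tools dress the same idea up in far more elaborate scoring.

    import re

    # Deliberately naive keyword-density check. The 1-2% "target" is a
    # commonly repeated rule of thumb, not an actual Google criterion.
    def keyword_density(text, keyword):
        words = re.findall(r"[a-z']+", text.lower())
        return 100 * words.count(keyword.lower()) / len(words)

    copy = ("Content marketing matters. Content marketing helps you market "
            "content. With content marketing, your content gets marketed.")
    print(f"{keyword_density(copy, 'content'):.1f}% keyword density")
    # Roughly 31% -- comfortably past any sane threshold, and it reads like
    # that person at the party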

The ranking is important, but there’s little value in achieving it if people click away as fast as they click in because it sounds like a robot wrote it. Besides, Google’s algorithms change all the time, so gaming the system only works for a while. Better to create content that’s genuinely engaging. 

What happens next will shock you 

I quite like headline analysers such as CoSchedule’s for creating clickable content. But I’ve learned, and don’t ask me how, that adding a swear word not completely dissimilar to the phrase “monster trucker” immediately improves the rating. By a few percentage points, too. However, unless you are writing a tribute to the actor Samuel L Jackson, using this trick to improve a headline’s SEO is not a good idea.  

Similarly, you can create a headline that scores more highly than the one that more accurately reflects what your content is about. You want headlines that reach the right people, not just more people. 

Should you still measure? 

Does that mean you should throw measurement-based editing tools out the window and just write with your heart? Not quite.  

It just means that you can’t hand all the thinking off to a machine, any more than you’d let your GPS navigation guide you off a cliff. Here are some reasons to measure content quality anyway. 

  • You still want the odds to be ever in your favour

    If you can make a few tweaks here and there that increase the odds of ranking better on search or social, or of engaging your audience, why not take advantage of that? 

  • They can be red flags

    These tools work best when you use them as a heuristic rather than an iron-clad rule. If something gets pinged, at least consider whether there’s a better way to express it. Simply stopping to ask whether a sentence or headline is crap, or just written a bit weird, is bound to improve your writing.  

  • It forces you to think about the audience

    Ok, you got a certain score for readability. Are you Starship or The New York Times? Does the score reflect that? I still use a site like Rotten Tomatoes because, even though it’s flawed, I generally click through to the reviewers or publications with similar tastes to mine. There are also one or two reviewers whose dislike of a film almost guarantees I will like it, so even that information is useful in guiding my decision to see a film. 

  • It helps you to learn the rules

    Unless you’re a seasoned writing professional and SEO guru, you may, in fact, have NFI what you are doing when creating content. You should follow the rules until you get good enough to know how to break them properly. Then you can tell whether you’re ignoring grammar rules because you’re being sloppy or because you’re doing it for dramatic effect.  

  • Humans are also terrible at measurement

    Thanks to any number of logical fallacies, from anecdotes to confirmation bias, we’re not that good at knowing whether something works, either. It’s why we have homoeopathy. You might think what you wrote makes perfect sense or nails SEO. The machines might think otherwise. 

  • It gets refined

    More sophisticated machine learning and analytics will improve the quality of these results to the point where people with mullet perms singing about being “knee deep in hoopla” won’t rank higher than one of the most respected newspapers in the world.  

Many of the tools for measuring content quality are blunt instruments at best. Still, a blunt tool is better than biting through a tree branch with your teeth. Provided you use your head to determine what’s valuable and what’s not, they serve a purpose. And speaking of heads, I’m sorry for putting We Built This City into yours.  

 
