Note: I almost never keep my resolutions for writing more. But I’m really trying this time, with a new resolution. For every post I write, there will be a Tom Lehrer reference. If you find it, tweet it to me at @mjgoldsmith, and I’ll know you’re part of the cool-kids club.
Why can’t it all be eCommerce?
I’m super-jealous of analytics pros who focus primarily on eCommerce and lead generation sites. Well-defined goals. Dollars and cents. Easy peasy.
I work on sites with content. Governments communicating how they do their government things. Nonprofits trying to help people. This content informs and influences. It may give directions that lead to an offline process. It may be a small-niche complex scientific paper targeted at a small but influential audience.
Amidst all this variety, I’m constantly asked:
How can we tell if our content is engaging?
Engagement. It’s everywhere. An expression, full of words and metrics…and signifying nothing. Google Analytics 4 introduced the metric “Engagement Rate.” A session counts as “engaged” if it lasts at least 10 seconds, includes at least two page views, or includes a conversion event. But does meeting any of those thresholds truly mean the reader was engaged?
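To make that definition concrete, here's a small Python sketch of the rule. The session fields and records are made up for illustration; this is just the logic of GA4's definition, not its API:

```python
# Hypothetical session records; field names are illustrative, not GA4's.
def is_engaged(session):
    """A session is 'engaged' if ANY one of the three criteria holds."""
    return (
        session["duration_seconds"] >= 10      # stayed at least 10 seconds
        or session["page_views"] >= 2          # viewed at least two pages
        or session["conversion_events"] >= 1   # completed a conversion event
    )

sessions = [
    {"duration_seconds": 4,  "page_views": 1, "conversion_events": 0},  # not engaged
    {"duration_seconds": 45, "page_views": 1, "conversion_events": 0},  # engaged
    {"duration_seconds": 3,  "page_views": 3, "conversion_events": 0},  # engaged
]

engagement_rate = sum(map(is_engaged, sessions)) / len(sessions)
print(f"Engagement rate: {engagement_rate:.0%}")  # prints "Engagement rate: 67%"
```

Notice how little the rule asks of the reader: skimming for eleven seconds counts exactly the same as reading every word.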
How your analytics tool defines engagement is likely not the way you define it. It’s unlikely that it answers the questions you’re asking. Because the full question isn’t “is our content engaging?” It’s actually “Is our content effective at __ ?” You need to think to fill in that blank. Your analytics tool can’t do it for you. And once you fill it in, you need to figure out which metrics in the tool will answer it. And the answer may not even be in the tool.
Content effectiveness varies wildly not just between organizations, but within them. There is no one global metric for every website that can tell us that our content is effective. GA4’s engagement rate may be an interesting metric, but it is not a measurement of your content effectiveness.
Imagine tracking the engagement of a Contact Us form. Submissions are steadily decreasing. It could be that the form has a bad user experience. Or it could be that your website is so well laid out, organized, and easy to understand that everyone gets the information they need without it. Engagement with the form is not indicative of its effectiveness.
Imagine tracking the effectiveness of a 20,000-word scientific paper on river pollution. What makes that content effective? Maybe we check if users get to the bottom. Maybe we track all the citations that lead back to it. The number of times the user clicked the “print” or share button. No matter how detailed we get, these are all proxies for intent. Maybe the person who reads to the bottom is an important person who will give the author an important job. Or maybe I just printed the piece to show a colleague how much of it was bullshit.
The fact is, we are still very far from knowing whether a user is reading our content, amazed and starry-eyed.
The measurement of content effectiveness must be a multidimensional, multidisciplinary endeavor. Google Analytics by itself won’t tell you if your content is effective. Salesforce by itself won’t tell you if your content is effective.
This series will be a meandering stream of the various ways we can pick and choose how we measure the effectiveness of our content.