Metrics For Content Effectiveness: Part 3 of probably 4: The pageview is dead. Long live the pageview.
Remember how, last time, I said pageviews were a crappy metric for measuring content effectiveness? Well, today I’m here to tell you about an exception.

I had a client who was changing up some content on a particular topic. If we looked at engagement rates (dwell time, scroll depth, clicks on the page, etc.), the content was doing pretty well. It’s niche content that you’re unlikely to just stumble upon: most people who made it to the page had a very specific thing to do, and they did it, regardless of how difficult it was. But the client needed more people to visit the page in the first place. They wanted to update their content to be good and digestible, but also easy to find.

This client was, unquestionably, the absolute authority on this topic. But you wouldn’t know it if you looked at organic search results. They sometimes came in at the bottom, even when the query included their organization name. Organic search was instead dominated by “middlemen businesses” that would invariably give you the same information, but not before trying to sell you something.

The client made changes to the content, informed by a lot of usability testing, surveys, and conversations with customers. They didn’t change or adjust their content specifically for SEO purposes, but they kept SEO in mind.

So how could we measure the success of this change? The users who already made it to the page almost always completed their task, so our content engagement rate was already good. It was unlikely to improve much. Instead, we looked to see if we could get an increase in pageviews from organic search, without a marked dip in engagement. We also looked at a time-to-task metric: how long it took for users to get to the page and click on a few specific links. We wanted that to go down relative to what we saw in the usability testing.

We didn’t just look at one metric’s change over time. We looked at three within the context of each other, and that let us know whether our content updates were successful. If just one metric got better or worse, that didn’t mean we’d succeeded or failed. But putting all three together, we could see where our changes were making an impact.
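If you wanted to encode that three-metrics-together logic, it might look something like the sketch below. Everything here is hypothetical: the function name, the thresholds, and the numbers are made up for illustration, and real analysis would pull these values from your analytics tooling rather than hardcoding them.

```python
def evaluate_update(before, after,
                    min_pageview_lift=0.20,    # hypothetical: want pageviews up ≥20%
                    max_engagement_dip=0.05):  # hypothetical: tolerate ≤5% engagement drop
    """Judge a content update across all three metrics together,
    not on any single metric's change in isolation."""
    pageview_lift = (after["pageviews"] - before["pageviews"]) / before["pageviews"]
    engagement_change = (after["engagement_rate"] - before["engagement_rate"]) / before["engagement_rate"]
    time_improved = after["time_to_task"] < before["time_to_task"]

    return (
        pageview_lift >= min_pageview_lift            # more organic traffic
        and engagement_change >= -max_engagement_dip  # no marked dip in engagement
        and time_improved                             # users finish their task faster
    )

# Made-up numbers: traffic up, engagement roughly flat, time-to-task down.
before = {"pageviews": 10_000, "engagement_rate": 0.82, "time_to_task": 95.0}
after = {"pageviews": 14_500, "engagement_rate": 0.80, "time_to_task": 70.0}
print(evaluate_update(before, after))  # True: all three metrics pass together
```

The point of the `and` chain is the whole argument of this post: a pageview spike alone, or an engagement bump alone, returns `False` unless the other two metrics hold up too.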