how to measure content performance on my website

Pageviews are not content performance metrics.
I know they feel like it. The number goes up, things seem good. But most teams we see are tracking the same two or three metrics across every single page on their site.
Blog posts, case studies, pricing pages. All measured the same way.
That’s like grading your sales team and your support team on the same KPI. A blog post attracting organic traffic has a completely different job than a product comparison page sitting in the middle of an active buying cycle.
Measure both with pageviews and bounce rate, and you’ll draw the wrong conclusion about both.
The real problem starts before anyone opens an analytics tool. If you haven’t defined what each piece of content is supposed to do, no metric you pick will tell you anything useful.
You end up with dashboards full of numbers that don’t inform a single decision.
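To make that concrete, here’s a rough sketch of what “different metrics for different jobs” can look like. The page types and metric picks below are illustrative, not a fixed taxonomy:

```ts
// A minimal sketch of metrics matched to a page's job.
// Page types and metric choices are illustrative only.
const metricsByPageJob: Record<string, string[]> = {
  blog_post:    ['organic entrances', 'scroll depth', 'clicks to related content'],
  case_study:   ['views from active opportunities', 'time on page', 'CTA clicks'],
  pricing_page: ['demo requests', 'return visits', 'exit rate'],
};

// A different scoreboard per job, instead of pageviews for everything.
console.log(metricsByPageJob.blog_post);
```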
We wrote about how to think about content measurement when different pages serve fundamentally different purposes.
Honest question: does your team currently measure blog content and conversion pages with different metrics, or is everything getting the same treatment?
what is attribution modelling and why does it matter

100% of the credit.
That’s what most analytics tools give to the last touchpoint before a conversion. Every other channel that influenced the decision gets nothing.
A decision-maker sees your LinkedIn ad on Tuesday. Doesn’t click. Searches your brand on Thursday, reads a case study. Next week they open an email, click through, fill in a form.

Your report says email drove that conversion. LinkedIn contributed zero. The case study contributed zero.
Now multiply that across every conversion for a year.
Most mid-market companies we see are making five- and six-figure annual budget decisions based on exactly this data. They’ve been optimising toward last-click for years without realising it.
The pattern is predictable. A channel looks weak in last-click reports, so it gets cut. Six months later the pipeline thins out. The channel that got cut was doing the awareness work that fed everything downstream.
By the time anyone connects those dots, the budget has already shifted and the damage is baked in.
Attribution modelling exists to prevent this. But choosing the right model isn’t straightforward, and each one tells a materially different story about where your revenue actually comes from.
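To make the gap visible, here’s a minimal sketch of how three common models would split credit for the journey above. The channel names and weightings are illustrative, not any particular tool’s output:

```ts
// The journey from the example above, oldest touchpoint first.
const journey = ['linkedin_ad', 'organic_search', 'email'];

// Last-click: 100% of the credit to the final touchpoint.
const lastClick = Object.fromEntries(
  journey.map((ch, i) => [ch, i === journey.length - 1 ? 1 : 0]),
);

// Linear: equal credit to every touchpoint.
const linear = Object.fromEntries(
  journey.map((ch) => [ch, 1 / journey.length]),
);

// Position-based (40/20/40): first and last weighted, middle shared.
const positionBased = Object.fromEntries(
  journey.map((ch, i) => {
    if (i === 0 || i === journey.length - 1) return [ch, 0.4];
    return [ch, 0.2 / (journey.length - 2)];
  }),
);

console.log({ lastClick, linear, positionBased });
// Same conversion, three very different budget conversations.
```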
If your team is about to make budget allocation decisions for next quarter, this is worth reading first.
how to build a website reporting dashboard that gets used

40 widgets. Real-time visitor counts. Colour-coded everything.
Six weeks later, nobody opens it.
The weekly meeting goes back to someone pulling numbers into a spreadsheet. Sound familiar?
This pattern repeats constantly with reporting dashboards. And the instinct is always to blame the tool. Switch platforms. Try better visualisations. Add more data.
None of that fixes it.
The real failure happens before anyone touches a tool. Most dashboards are built around metrics that happen to be available. Not around decisions that someone actually needs to make.
Those two starting points look similar. They produce wildly different dashboards.
One becomes a wall of charts people glance at once. The other becomes something a marketing lead opens every Monday because it answers “should I shift paid budget to the new landing pages or not?”
The specificity of the decision shapes everything. What metrics appear. How they’re grouped. What gets cut.
Most teams skip that conversation entirely. They jump straight to layout and data sources. Then wonder why adoption falls off a cliff.
If your team built a dashboard that quietly got abandoned, we wrote about where that process usually goes wrong.
how to track where my website leads come from

There’s a number that shows up in almost every analytics report we review.
It’s usually the largest source of leads. Teams point to it in meetings. Sometimes they build strategy around it.
And in most cases, it’s fiction.
The number is “direct” traffic. When Google Analytics can’t figure out where a visitor came from, it labels the visit as direct. Missing UTM parameters, stripped referrer data, app-based clicks that don’t pass source information. All of it gets dumped into the same bucket.
On many mid-market sites, this inflated bucket becomes the biggest reported lead source. Which means decisions about what’s working and what to cut are being made on data that was never real.
It gets worse. Without proper tagging, paid clicks show up as organic visits. SEO looks like it’s outperforming. Paid looks like it’s underperforming. Teams cut campaigns that were actually generating revenue because the credit went somewhere else.
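For illustration, here’s a minimal sketch of the fix on the sending side: tagging a campaign link with the standard UTM parameters. The campaign itself is hypothetical:

```ts
// A hypothetical email campaign link. Untagged, this click would be
// reported as "direct" (or misread as organic).
const url = new URL('https://www.example.com/pricing');
url.searchParams.set('utm_source', 'newsletter');     // where the click came from
url.searchParams.set('utm_medium', 'email');          // the channel type
url.searchParams.set('utm_campaign', 'autumn-promo'); // which campaign

console.log(url.toString());
// https://www.example.com/pricing?utm_source=newsletter&utm_medium=email&utm_campaign=autumn-promo
```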
In the companies we’ve looked at, 30-50% of lead sources are either misattributed or missing entirely. Not edge cases. The norm.
Most tracking conversations start with “which tool should we use?” The actual problem runs deeper than tooling.
We published our thinking on where this breaks down.
what is the difference between vanity metrics and actionable metrics

We’ve looked at a lot of mid-market dashboards. Most of them have the same problem.
Not too little data. Too much confidence built on the wrong numbers.
Here’s what’s counterintuitive: a team with zero analytics often makes better decisions than a team tracking vanity metrics. Because at least the first team knows it’s guessing.
The second team feels informed. They walk into board meetings with charts that only go up. Total views, cumulative users, raw traffic. Numbers that by their very nature can never decline.
Green arrows everywhere. No signal anywhere.
The real test for any metric is simple. When it moves, does your team change what it’s doing? Or does everyone just nod and move to the next slide?
If the honest answer is “that’s nice” rather than “we should change X,” the metric is decoration. And a full dashboard of decoration is genuinely dangerous, because it replaces curiosity with false certainty.
Most reporting setups we encounter track plenty of numbers but answer very few actual business questions. The gap between those two things is where bad decisions get made.
We wrote about what separates vanity metrics from actionable ones, and why the same number can be either depending on context.
How To Set Up Google Analytics On My Website

Most teams treat Google Analytics setup as a one-step task: install the tracking code, confirm page views appear in the dashboard, move on.
This fails for a specific structural reason. GA4 out of the box tracks almost nothing that matters to a business. It records page views and basic session data, but it has no idea what a “conversion” means on your site, which form submissions matter, where users abandon key journeys, or how different traffic sources contribute to actual outcomes. Without configuring events, conversions, and filters after installation, you end up with months of data that looks active but answers none of the questions your team will eventually ask.
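For context, here’s roughly all the “install the tracking code” step amounts to, sketched with a placeholder measurement ID:

```ts
// A minimal sketch of the stock GA4 install. "G-XXXXXXXXXX" is a
// placeholder measurement ID. The async loader tag goes in <head>:
//   <script async src="https://www.googletagmanager.com/gtag/js?id=G-XXXXXXXXXX"></script>
export {}; // keeps this file a module so the global declaration below is legal

declare global {
  interface Window { dataLayer: unknown[]; }
}

window.dataLayer = window.dataLayer || [];
function gtag(..._args: unknown[]) {
  // gtag.js expects the Arguments object itself, not a copied array
  window.dataLayer.push(arguments);
}

gtag('js', new Date());
gtag('config', 'G-XXXXXXXXXX'); // on its own, this only reports page views and sessions
```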
The result is a familiar one: leadership wants to know which pages drive enquiries, marketing wants attribution clarity, and the analytics setup that’s been “live” for six months can’t support any of it. The rebuild is always more expensive than doing it properly the first time.
We published a guide that walks through the full process, from account creation through to the post-installation configuration that separates useful analytics from expensive noise.
How To Track Conversions In GA4

What most teams do:
Migrate to GA4, toggle a handful of events as key events, and assume conversions are being tracked. Maybe they follow a quick tutorial. They see numbers appearing in reports and move on.
What actually matters:
The mechanics of marking events as key events are simple. The real work is upstream. Which events genuinely represent meaningful business outcomes vs. routine user behaviour? Are your events structured cleanly enough to produce trustworthy data? Have you actually verified they fire correctly, or are you just assuming the numbers in your reports reflect reality?
GA4’s event-based model is more flexible than Universal Analytics, but that flexibility is a trap for teams that haven’t deliberately designed their measurement strategy. Everything is an event now. Page views, clicks, scrolls, form submissions. Without a clear framework for what matters, you end up drowning in data that tells you very little.
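To ground that, here’s a minimal sketch of one deliberately chosen conversion event, assuming a hypothetical contact form. The event name is one of GA4’s recommended names; the parameter naming is our own:

```ts
// Provided globally by gtag.js once the base snippet is installed.
declare function gtag(...args: unknown[]): void;

// Hypothetical form id. 'generate_lead' is a GA4 recommended event name;
// flagging it as a key event is a separate step in the GA4 admin interface.
document.querySelector('#contact-form')?.addEventListener('submit', () => {
  gtag('event', 'generate_lead', {
    form_id: 'contact-form', // custom parameter, our own naming
  });
});
```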
We put together a detailed guide covering the full process, from understanding GA4’s event model to setting up both simple and complex conversions, debugging, and the specific mistakes we see constantly on mid-market sites.
New on our blog: How To Track Conversions In GA4.
How To Measure ROI On A Website Redesign

3 things I wish every stakeholder understood about measuring redesign ROI:
1. If you didn’t capture baseline metrics before design work started, you cannot credibly calculate ROI after launch, no matter how good your analytics tools are.
2. The ROI formula is simple maths (there’s a sketch of it after this list), but the hard part is defining “gain from redesign” honestly when a redesign changes dozens of variables at once.
3. Vague objectives like “the site looks dated” or “competitors refreshed theirs” are valid reasons to redesign, but they produce zero measurable outcomes, which means your ROI calculation will always be a guess.
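A minimal sketch of that calculation, with illustrative numbers standing in for a real baseline:

```ts
// Illustrative figures only: the point is the shape of the calculation.
const baselineMonthlyRevenue = 40_000;   // captured BEFORE design work started
const postLaunchMonthlyRevenue = 46_000; // same definition, measured after launch
const redesignCost = 30_000;

const annualGain = (postLaunchMonthlyRevenue - baselineMonthlyRevenue) * 12;
const roi = (annualGain - redesignCost) / redesignCost;

console.log(`ROI: ${(roi * 100).toFixed(0)}%`); // ROI: 140%
```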
We define what success looks like in numerical terms during discovery, before anyone opens a design tool. That single practice is the difference between a credible ROI number and a story you tell yourself afterwards.
What Should I Track On My Website

How to build a website tracking setup that actually informs decisions without drowning in meaningless dashboards:
1. Define your website’s specific job before you open any analytics tool, because a B2B services company with a long sales cycle needs fundamentally different metrics than an ecommerce store.
2. Identify the 3 to 5 actions that represent a “successful visit” for your business, whether that’s a form submission, a phone call, or a specific content engagement pattern (see the sketch after this list).
3. Map the visitor journeys that lead to those actions so you understand what pages and sequences actually drive conversions.
4. Cut everything else until those fundamentals are reliably measured and reviewed, because pageview counts without context tell you almost nothing useful.
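A minimal sketch of what step 2 can produce, with hypothetical actions and event names:

```ts
// Hypothetical success actions mapped to the events that will measure them.
const successActions = [
  { action: 'Contact form submitted', event: 'generate_lead' },
  { action: 'Phone number clicked',   event: 'phone_click' },
  { action: 'Case study read to end', event: 'case_study_complete' },
];

// Until these fire reliably, everything else on the dashboard is noise.
console.log(successActions.map((a) => a.event));
```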
The biggest tracking mistake we see in mid-market companies isn’t too little data or too much data. It’s that nobody asked “what decisions will this data actually inform?” before setting anything up.
We detailed the full process in this article.
Analytics Isn’t Reports: The Tracking System That Makes Websites Measurable

You can’t optimise what you can’t measure. Most websites ship with “analytics installed” but no measurement system, so teams guess, argue, and default to opinions. The pattern repeats across mid-market companies: a website launches, someone installs Google Analytics, a few conversion goals get set up, and the team assumes measurement is handled. Six months later, […]