Start With a Baseline, Not a Hope
You measure ROI on a website redesign by comparing specific business metrics before and after launch, using a baseline you captured before any design work started. Without that baseline, you are guessing. Most teams get this wrong not because they lack tools, but because they start measuring too late, often weeks after the new site goes live, when the pre-redesign data has already been muddled by the transition.
The formula itself is straightforward: (Gain from redesign – Cost of redesign) / Cost of redesign × 100. The difficulty lies entirely in defining “gain from redesign” with honesty and precision. A redesign touches dozens of variables simultaneously, so isolating its impact requires discipline, forethought, and a measurement system that was in place before the first wireframe was drawn.
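As a sanity check on the arithmetic, the formula can be sketched in a few lines of Python. The figures here are illustrative, matching the worked example used later in this piece:

```python
def redesign_roi(gain: float, cost: float) -> float:
    """ROI as a percentage: (gain - cost) / cost * 100."""
    if cost <= 0:
        raise ValueError("cost must be positive")
    return (gain - cost) / cost * 100

# Hypothetical figures: £540,000 gain on a £120,000 total redesign cost.
print(redesign_roi(540_000, 120_000))  # → 350.0
```

The hard part is never the division; it is producing honest inputs for `gain` and `cost`, which the rest of this piece is about.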
In our projects at NexusBond, we define what “success” looks like for the redesign during the discovery phase, not after launch. That single practice changes everything about whether a team can eventually calculate a credible ROI number.
Why Most Redesign ROI Calculations Fail
The most common failure mode is vague objectives. Teams approve a redesign budget based on feelings: the site looks dated, competitors have refreshed theirs, the CEO saw something they liked at a conference. None of those reasons are wrong, but none of them produce a measurable outcome either. When you cannot articulate what the redesign should improve in numbers, you cannot later prove it did.
The second failure is measuring the wrong things. Traffic and pageviews are the default metrics teams reach for, but a redesign rarely exists to increase raw traffic. That is what SEO campaigns and paid media do. A redesign typically aims to improve what happens once visitors arrive: conversion rates, engagement depth, lead quality, time to action, or user self-service rates. If your post-launch report focuses on sessions and bounce rate, you are reporting on weather, not outcomes.
The third failure is ignoring the cost side of the equation. Teams frequently calculate ROI against the agency invoice alone, forgetting the internal time spent on content migration, stakeholder reviews, QA testing, and the productivity dip during the transition period. An honest ROI calculation includes every hour and every pound your organisation spent, not just the external fees.
Define Your Metrics Before You Brief the Designer
The metrics you track should connect directly to the business case that justified the redesign. If the redesign was approved because your sales team reported that prospects found the old site confusing, the metric is sales-qualified lead volume or demo request conversion rate, not homepage impressions. If the redesign was driven by customer support costs, the metric is support ticket volume per active user or self-service completion rate.
Here is a practical framework we use at NexusBond to tie redesign goals to measurable outcomes:
- Revenue-facing goals: Lead conversion rate, average deal size influenced by the website, pipeline velocity from website-sourced leads, e-commerce revenue per session.
- Efficiency goals: Support ticket deflection rate, time to find information (measured via session duration on key pages), content update cycle time for internal teams.
- Engagement goals: Scroll depth on key pages, video play rate, return visit frequency, multi-page session percentage.
- Brand perception goals: Survey-based Net Promoter Score for the website experience, qualitative feedback from sales calls about prospect confidence.
Pick two to four primary metrics and track them rigorously. If you try to measure everything, you will report on nothing with conviction. The metrics you choose should be ones your leadership team already cares about. This is not the time to introduce a novel KPI that requires a 10-minute explanation in every board meeting.

Capturing a Reliable Baseline
Your baseline period should cover at least 90 days of data from the existing site, ideally longer if your business has seasonal patterns. A 30-day baseline is fragile because one unusual week (a viral social post, a holiday, a major outage) can skew the numbers enough to invalidate comparisons.
What we typically find on mid-market sites is that the existing analytics setup has gaps. Forms are not tracked as conversions. Key CTAs fire no events. The CRM integration is broken or incomplete. This means your baseline data is less reliable than you think. If you discover these problems before the redesign starts, you have time to fix tracking on the old site, run it for 90 days, and then begin the redesign with confidence that your “before” numbers are solid.
If your current tracking is unreliable, that itself is a finding worth documenting. In that case, set up proper tracking on the old site as early as possible, even if it means delaying the redesign kick-off by a month. A redesign with a clean baseline will prove its value. A redesign without one never will. For a deeper look at how to build tracking into site projects from the beginning, see our measurement systems guide.
What to record in your baseline document
Create a simple spreadsheet or shared document that captures the following for your baseline period:
- Primary conversion metrics (e.g., form submissions, demo requests, purchases) with weekly and monthly totals.
- Conversion rates by traffic source, so you can later distinguish whether changes came from the redesign or from a simultaneous shift in your traffic mix.
- Page-level performance for your most important pages: product pages, pricing, contact, and any high-traffic blog posts that drive leads.
- Technical performance metrics: Core Web Vitals scores, average page load time, mobile vs. desktop split.
- Qualitative notes: known UX complaints from sales or support, recent survey results, heatmap observations.
This document becomes your reference point for every post-launch conversation. Without it, debates about whether the redesign “worked” devolve into opinion.
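If your baseline data lives in a spreadsheet export rather than a dashboard, the source-level conversion rates in that document can be computed with a small script. The rows below are hypothetical; the point is the aggregation, not the numbers:

```python
from collections import defaultdict

# Hypothetical weekly baseline rows: (iso_week, traffic_source, sessions, conversions)
baseline_rows = [
    ("2024-W01", "organic", 4200, 88),
    ("2024-W01", "paid", 1500, 45),
    ("2024-W02", "organic", 4050, 81),
    ("2024-W02", "paid", 1620, 52),
]

def conversion_rate_by_source(rows):
    """Sum sessions and conversions per source, then compute the rate."""
    totals = defaultdict(lambda: [0, 0])  # source -> [sessions, conversions]
    for _week, source, sessions, conversions in rows:
        totals[source][0] += sessions
        totals[source][1] += conversions
    return {src: conv / sess for src, (sess, conv) in totals.items()}

rates = conversion_rate_by_source(baseline_rows)
# organic: 169 / 8,250 ≈ 2.05%; paid: 97 / 3,120 ≈ 3.11%
```

Recording rates per source, rather than a single blended figure, is what lets you later distinguish a redesign effect from a traffic-mix shift.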
Calculating the True Cost of a Redesign
The cost side of the ROI equation deserves as much rigour as the gain side. Most teams undercount by 20-40% because they exclude internal labour and opportunity costs.
External costs are the easy part: agency or freelancer fees, stock photography, new tools or platform licences, hosting changes, and any third-party integrations purchased for the new site.
Internal costs are where the accounting gets honest. Every hour your marketing manager spent reviewing wireframes, every afternoon your product team spent rewriting copy, every sprint your developers spent on CMS customisation counts. Estimate conservatively using a blended internal hourly rate. For a mid-market B2B company with 50-150 employees, we commonly see internal time add 25-50% to the external project cost.
Transition costs are the sneakiest. These include the temporary dip in organic search rankings that often follows a URL structure change, the lost leads during the week of launch when forms were misconfigured, the retraining time for staff who manage the CMS, and the three months of bug fixes and tweaks that no one budgeted for but everyone expected. Document these as they happen. They are real costs, and ignoring them inflates your ROI number dishonestly.
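Pulling the three cost categories together, a minimal sketch of the honest total looks like this. All figures are hypothetical placeholders, not benchmarks:

```python
def true_redesign_cost(external_fees, internal_hours, blended_rate, transition_costs):
    """Total cost: external fees + internal labour + itemised transition costs."""
    internal_labour = internal_hours * blended_rate
    return external_fees + internal_labour + sum(transition_costs)

cost = true_redesign_cost(
    external_fees=80_000,      # agency fees, licences, stock assets (£)
    internal_hours=600,        # wireframe reviews, content migration, QA
    blended_rate=45,           # blended internal hourly rate (£)
    transition_costs=[6_000, 4_000, 3_000],  # SEO dip, lost launch-week leads, retraining
)
# 80,000 + 27,000 + 13,000 = £120,000
```

Note that internal labour adds roughly a third to the external fee here, consistent with the 25-50% range noted above for mid-market teams.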
Measuring Gains: Short-Term and Long-Term
A common mistake is evaluating the redesign too early. The first two weeks after launch are noisy. Returning visitors behave differently because the interface is unfamiliar. Google needs time to recrawl and re-index. Your team is still fixing edge-case bugs. Drawing ROI conclusions from week-one data is like judging a restaurant by the soft opening.
The 30/90/180 review cadence
We recommend a structured review schedule that separates signal from noise:
At 30 days post-launch, you are looking for red flags, not wins. Has anything broken badly? Are conversion rates significantly worse than baseline? Is there a traffic segment that has dropped off a cliff? This review is about triage, not celebration. If core metrics are roughly flat or slightly improved, you are on track.
At 90 days, you have enough data to make meaningful comparisons to baseline. Seasonal variations are partially controlled for if your baseline was the same quarter the prior year. This is when you calculate your first credible ROI estimate. Compare your primary metrics against the baseline, normalise for any traffic volume changes, and identify which pages or funnels improved most.
At 180 days, the redesign has had time to compound. SEO improvements have propagated. Your team has optimised based on early data. Content gaps identified post-launch have been filled. The 180-day review is your definitive ROI assessment. It is also the moment to ask: what should we optimise next based on what we have learned?
Normalising for external variables
Your website does not exist in a vacuum. Between your baseline period and your post-launch assessment, other things changed: you launched a new ad campaign, your main competitor raised their prices, your sales team grew by two people, or a pandemic reshaped buying behaviour in your industry.
Perfect attribution is impossible, but you can apply reasonable controls. The most practical method is to segment by traffic source. If organic traffic converted at 2.1% before and 3.4% after, while your organic traffic volume stayed roughly constant, that improvement is almost certainly attributable to the redesign. If paid traffic conversion also improved, but you simultaneously doubled your ad spend and refined your targeting, the redesign deserves less credit for that channel’s gains.
Another useful technique is page-level comparison. If your pricing page was fundamentally redesigned and its conversion rate jumped from 5% to 9%, while a blog post that was migrated without changes stayed flat, that is strong evidence the design change drove the improvement. This granularity is far more convincing to stakeholders than site-wide averages.
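Both techniques reduce to the same per-segment comparison. A sketch, using the organic figures from above as hypothetical inputs:

```python
def conversion_lift(before, after):
    """Relative conversion-rate change per segment (traffic source or page)."""
    return {
        segment: (after[segment] - before[segment]) / before[segment]
        for segment in before
        if segment in after
    }

before = {"organic": 0.021, "paid": 0.034}   # baseline conversion rates
after = {"organic": 0.034, "paid": 0.041}    # post-launch conversion rates
lift = conversion_lift(before, after)
# organic improved ~62% on roughly constant volume: strong redesign signal.
# paid improved too, but if spend and targeting also changed, credit it cautiously.
```

The same function works at the page level: feed it per-page conversion rates and compare redesigned pages against pages migrated unchanged.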
Putting a Pound Value on Improvements
To calculate ROI as a percentage, you need to express gains in the same units as costs: money. For e-commerce businesses, this is straightforward because revenue is tracked directly. For B2B businesses, it requires a few more steps.
If you sell through a sales team, the chain looks like this: website visit → lead → sales-qualified lead → opportunity → closed deal. You need to know your average deal value and your conversion rates at each stage of this funnel. If the redesign increased your monthly form submissions from 80 to 110, and your historical data shows that roughly 1 in 8 form submissions becomes a closed deal worth £12,000, then the redesign is generating approximately 3.75 additional deals per month, or £45,000 in incremental monthly revenue.
Over 12 months, that is £540,000 in additional revenue attributable to the redesign. If the total cost was £120,000 (external plus internal), your ROI is (£540,000 – £120,000) / £120,000 × 100 = 350%.
This kind of calculation requires you to be honest about your assumptions. If your close rate has also improved because of new sales training, attributing 100% of the funnel improvement to the website is not credible. A conservative approach is to apply a discount factor. If the website was one of three significant changes during the period, attribute a third of the incremental revenue to the redesign. A lower but honest ROI number is far more useful than an inflated one that nobody trusts.
Valuing non-revenue improvements
Some redesign benefits do not show up on the revenue line. Reduced support ticket volume, faster content publishing for your marketing team, and improved accessibility compliance all have value, but expressing them in pounds requires a different approach.
For support cost reduction, calculate the average cost per support ticket (staff time × hourly rate, plus tool costs divided by ticket volume) and multiply by the reduction in tickets. If you were handling 400 website-related support enquiries per month at £8 per ticket, and the redesign’s improved self-service features cut that to 280, you are saving £960 per month.
For internal efficiency gains, estimate the time your content team saves per publishing task with the new CMS and multiply by their hourly rate across the number of publishing tasks per month. These numbers are smaller individually but compound over the multi-year life of the website.
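Both of these efficiency calculations are simple multiplications; the discipline is in sourcing honest inputs. A sketch using the ticket figures from above (the publishing figures are hypothetical):

```python
def monthly_support_savings(tickets_before, tickets_after, cost_per_ticket):
    """Savings from ticket deflection: fewer tickets times cost per ticket."""
    return (tickets_before - tickets_after) * cost_per_ticket

def monthly_publishing_savings(tasks_per_month, hours_saved_per_task, hourly_rate):
    """Savings from faster content publishing on the new CMS."""
    return tasks_per_month * hours_saved_per_task * hourly_rate

support = monthly_support_savings(400, 280, 8)        # £960/month, as above
publishing = monthly_publishing_savings(20, 0.5, 40)  # hypothetical: £400/month
```

Multiply these monthly figures across the expected multi-year life of the site before adding them to the gain side of the ROI equation.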

Common Pitfalls That Distort Your Numbers
Even with good intentions, several traps can make your ROI calculation misleading:
Survivorship bias in page comparisons. Teams love to showcase the pages that improved dramatically and quietly ignore the ones that got worse. An honest assessment includes both. If your new services page converts 60% better but your new contact page converts 15% worse, report both and investigate why.
Ignoring the counterfactual. The relevant comparison is not just “old site vs. new site” but “new site vs. what the old site would have done during the same period.” If your industry grew 20% during the comparison period, your old site might have seen a 15% revenue increase even without a redesign. Your redesign’s net contribution is only the gain above that natural growth.
Confusing a platform migration with a redesign. If you moved from a slow, unreliable hosting setup to a fast, modern one as part of the redesign, a significant chunk of your performance improvement may come from the infrastructure change, not the design. When possible, separate these variables. When not possible, acknowledge the ambiguity.
Cherry-picking the comparison window. If your baseline was a slow quarter and your post-launch window is your busiest season, the comparison is meaningless without seasonal adjustment. Always use year-over-year comparisons when seasonal variation is significant.
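The counterfactual adjustment described above is worth making explicit, because it is the pitfall teams most often skip. A sketch with hypothetical figures:

```python
def net_redesign_gain(observed_revenue, baseline_revenue, market_growth_rate):
    """Gain above what the old site would likely have produced given
    the background market growth over the same period."""
    counterfactual = baseline_revenue * (1 + market_growth_rate)
    return observed_revenue - counterfactual

# Hypothetical: revenue rose from £1.0m to £1.3m while the market grew 20%.
gain = net_redesign_gain(1_300_000, 1_000_000, 0.20)
# Counterfactual is £1.2m, so the net contribution is roughly £100,000,
# not the headline £300,000.
```

The growth rate itself is an estimate; state its source (industry reports, your own non-website channels) alongside the number.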
Building ROI Measurement Into the Project Plan
Measurement should not be a post-launch afterthought bolted on when someone asks “so, did it work?” At NexusBond, we build measurement requirements into the project from the discovery phase onward. This means defining the KPIs during stakeholder interviews, configuring event tracking during development (not after launch), and validating that data flows correctly during QA, not three weeks later when someone notices the form submission event is not firing.
Practically, this involves a few specific steps that most project plans skip:
Include a measurement specification in your requirements document. This is a simple table listing every user action you want to track, the event name and parameters, where the data will be sent (Google Analytics 4, your CRM, a data warehouse), and which KPI it feeds. This document prevents the classic post-launch scramble of discovering that nobody set up conversion tracking.
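The specification does not need tooling; a structured document is enough. As a sketch, one row of that table might look like the following. The event names, parameters, and destinations here are hypothetical examples, not a standard schema:

```python
# Hypothetical measurement specification: one entry per tracked user action.
measurement_spec = [
    {
        "action": "Demo request form submitted",
        "event_name": "demo_request_submit",
        "parameters": ["form_id", "traffic_source"],
        "destination": "Google Analytics 4 + CRM",
        "kpi": "Demo request conversion rate",
    },
    {
        "action": "Pricing page CTA clicked",
        "event_name": "pricing_cta_click",
        "parameters": ["cta_position"],
        "destination": "Google Analytics 4",
        "kpi": "Pricing page engagement",
    },
]

# Every KPI in the business case should appear in at least one entry.
tracked_kpis = {row["kpi"] for row in measurement_spec}
```

Reviewing the spec against the KPI list during discovery, rather than after launch, is what prevents the conversion-tracking scramble.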
Run parallel tracking for at least two weeks before launch. If your new site is on a staging environment, instrument it with the same tracking as production. Compare the data outputs to confirm they match expectations. This catches configuration errors before they corrupt your post-launch data.
Assign ownership of the post-launch review. Somebody specific, not “the marketing team” in general, needs to own the 30/90/180 day reviews. That person should have access to both the baseline document and the live analytics, and they should have the authority to request development time for tracking fixes.
Reporting ROI to Stakeholders
How you present the ROI number matters almost as much as the number itself. A dense 40-page analytics export will not land with your board or executive team. What works is a one-page summary that shows three things: what you set out to achieve (with the specific metrics defined pre-launch), what actually happened (with the comparison to baseline), and what it means in financial terms.
Lead with the money. If the redesign generated an estimated £200,000 in incremental revenue against a £75,000 investment, say that in the first sentence. Then support it with the metric-level detail for those who want to scrutinise the methodology. Include a brief section on what did not improve and what you plan to optimise next. This kind of transparency builds trust and makes it easier to secure budget for ongoing optimisation.
Avoid the temptation to report only vanity metrics that look impressive but mean little. A 40% increase in homepage sessions is not valuable if those sessions do not convert. A 12% improvement in your demo request conversion rate from organic traffic is a much more powerful story, even though the percentage sounds smaller.
What To Do Next
If you are planning a redesign, set up your baseline tracking now, before any design work begins. Define two to four primary metrics that connect to the business case. Document the full cost, including internal time and transition disruption. Then commit to the 30/90/180 review cadence and assign a named person to own it. If you are mid-project and realise you never captured a baseline, start tracking immediately on the current site. Even a 30-day baseline captured late is vastly better than none.

The teams that can prove their redesign ROI with confidence are not the ones with the fanciest dashboards. They are the ones who decided what to measure before they decided what to build.


