Why Every Website Needs a CRO Audit (And What It Includes)
What a CRO audit is really for
A conversion rate optimisation audit, done well, is a diagnosis. You have a website that gets traffic, and the traffic isn't converting at the rate you'd expect. The job of the audit is to find the specific reasons, in priority order, with evidence.
That's a different exercise from what most "CRO audits" deliver. The cheap ones are checklists of UX best practices ("add testimonials above the fold", "make your CTA buttons larger"). The expensive ones are the same checklist with a logo on it. Neither tells you why your particular site is leaking conversions, only what other sites in general have done. That's not a diagnosis.
The audit that's worth paying for produces a short list of specific friction points on your specific funnel, ranked by likely impact, with the evidence that makes the case. You shouldn't need a designer to act on it. You should need an afternoon and a deploy.
The five pillars
A useful audit covers five areas. Each one feeds the next, and skipping any of them produces partial answers.
1. Analytics, looking at the funnel as a real funnel
Where do users enter, and what's the path from entry to conversion? GA4 or Plausible or whatever you use will give you the structural answer. The interesting questions are the second-order ones. Which entry pages convert at half the site average, and why? Where does the dropoff cliff sit in the funnel? Is it consistent across devices, or does mobile fall off a step earlier than desktop? Funnels are usually more interesting at the steps you didn't expect to be the problem.
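Finding the dropoff cliff is arithmetic once you have per-step counts out of your analytics tool. A minimal sketch of that calculation; the step names and counts below are hypothetical, not from any real funnel:

```python
# Find the biggest dropoff cliff in a funnel from per-step event counts.
# Step names and counts are hypothetical; substitute your own export.

def dropoff_report(funnel):
    """Return (transition, fraction lost) pairs, worst cliff first."""
    steps = list(funnel.items())
    report = []
    for (prev_name, prev_n), (name, n) in zip(steps, steps[1:]):
        lost = 1 - n / prev_n  # fraction of users lost between the two steps
        report.append((f"{prev_name} -> {name}", round(lost, 2)))
    return sorted(report, key=lambda pair: pair[1], reverse=True)

funnel = {
    "landing": 9000,
    "pricing": 2700,
    "form_started": 540,
    "form_submitted": 54,
}

for transition, lost in dropoff_report(funnel):
    print(f"{transition}: {lost:.0%} lost")
```

With these invented numbers the worst cliff is form start to form submit, which is exactly the kind of unexpected step the audit should then take to the session recordings.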
2. Heatmaps and session recording, used surgically
Heatmaps are not a deliverable. They're a diagnostic tool, and the way you use them is to start with a hypothesis from the analytics ("the pricing page converts at half the homepage rate") and then look at recordings of that specific page. You're looking for: where users hesitate, where they rage-click, what they scroll past, what they try to click that isn't clickable. Twenty recordings of users on the suspect page will tell you more than five hundred recordings of users in general.
3. UX and design review, against the actual user
Hold the page up against what the user came to do. Is the headline answering their question, or is it answering yours? Is the CTA visible without scrolling on a mid-range Android in a Manchester train carriage? Are the form fields the right ones, in the right order, with the right input modes? The tightest reviews come from doing the user's task, on the user's device, in the user's network conditions. Most teams haven't done this in a year, and the changes that fall out of it are often the ones with the biggest impact.
4. Copy and message, against intent
Copy reviews mostly don't fail on tone. They fail on intent match. The visitor came with a question, and the page answered a different one. A common version of this: a "Get a quote" CTA on a page where the user is still trying to work out whether you do the kind of work they need. The page is asking for a commitment the user isn't ready to make, and the conversion rate suffers. The fix is matching the CTA to where in the consideration cycle the visitor actually is, not adding a louder button.
5. Technical friction
The friction users complain about least, because they don't know to. Forms that fail on a specific Android browser. Payment flows that hang on 3D Secure. JavaScript errors that break the submit button only when ad-blockers are active. Test the conversion path on three devices, two browsers each, with and without an ad-blocker. You'll find at least one bug. We always do.
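That test matrix (three devices, two browsers each, with and without an ad-blocker) is small enough to enumerate and work through by hand. A sketch of it as a checklist generator; the device and browser names are placeholders for whatever your analytics says your users actually run:

```python
# Enumerate the conversion-path test matrix: devices x browsers x ad-blocker.
from itertools import product

devices = ["mid-range Android", "recent iPhone", "desktop"]  # placeholders
browsers = ["Chrome", "Firefox"]                             # placeholders
adblock = ["with ad-blocker", "without ad-blocker"]

matrix = list(product(devices, browsers, adblock))

for device, browser, blocker in matrix:
    print(f"[ ] {device} / {browser} / {blocker}: submit the form end to end")

print(f"{len(matrix)} runs total")
```

Twelve runs is an afternoon's work, and per the claim above, at least one of them usually fails.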
Diagnosis, not bait
The mental model that helps most: CRO is mostly about removing friction, not about adding more conversion bait. Adding a sticky bar, a popup, an extra testimonial strip, a second CTA: these are conversion bait, and they sometimes work in the short term, and they almost always make the page worse to use. The audit's job is to find what's already in the way and remove it.
A useful question to ask of any proposed change: would I want this on a site I was visiting as a customer? If the honest answer is "no, but it converts", you're trading short-term lift for long-term brand. Sometimes that trade is worth it. Often it isn't, and the cumulative effect of those trades is a site that converts well today and that fewer people return to next year.
A worked example, anonymised
A B2B services site. Around 9,000 monthly visitors, mostly from organic search. The overall conversion rate to a "book a call" form was around 0.6 percent, which the team felt was low for their category.
The diagnosis took about a week. The funnel showed something specific: the case-study pages converted at 0.4 percent against a site average of 0.9 percent on landing pages. Those pages got more traffic, by a wide margin, than anything else. Recordings on the case-study pages showed users scrolling all the way through and then leaving. There was a "Book a call" CTA at the bottom, and a button-style link in the navigation, and that was it.
The hypothesis: a visitor reading a case study has just been told that the team did good work for someone else, but the page never converts the moment of interest into an action. The CTA at the bottom is the wrong CTA at the wrong moment.
The change: a single contextual block at the end of each case study, headed "Talking about a project like this?" with a one-line description of what the case study had just shown, and a "Have a similar problem? Tell us about it" form, two fields, no calendar. The original "Book a call" stayed, lower down.
The result, measured over the following six weeks against a stable traffic baseline: case-study conversion went from 0.4 percent to 1.1 percent. The form replaced about half the calendar bookings (which was fine; the team handled the lead either way), and the total volume of qualified leads from those pages roughly tripled.
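The arithmetic behind "roughly tripled" is worth making explicit, because any audit finding should survive this kind of sanity check. A quick verification using only the figures reported above:

```python
# Sanity-check the reported lift: 0.4% -> 1.1% conversion on case-study pages.
before, after = 0.004, 0.011

lift = after / before
print(f"relative lift: {lift:.2f}x")  # 2.75x, i.e. qualified leads roughly tripled

# Per 1,000 case-study visitors, the change is worth this many extra leads:
extra_per_1000 = (after - before) * 1000
print(f"extra leads per 1,000 visitors: {extra_per_1000:.0f}")
```

A 2.75x lift on the highest-traffic page type is a large absolute gain from a single contextual block, which is the shape of result a narrow hypothesis makes possible.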
That's a single change, on a single page type, drawn from one specific finding in the audit. The point of the example isn't that "case study pages need contextual CTAs". The point is that audits produce narrow, specific hypotheses, and narrow specific hypotheses are what you can actually deploy and measure.
What the audit doesn't deliver
It doesn't deliver a redesign. It doesn't deliver a brand strategy. It doesn't deliver a content plan. If your audit comes back with "you need a new website", you didn't get an audit, you got a sales pitch for something else. The findings should be actionable on the site you have.
It also doesn't, on its own, raise your conversion rate. Implementation does. The audit is a map. The work is walking it.
The summary
A CRO audit is a diagnosis. The five pillars (analytics, recordings, UX, copy, technical) are the structure of the diagnosis, not a checklist to tick. The output is a small number of specific friction points with evidence and a recommendation each. Most of the gains come from removing things, not adding them. If your site has traffic that isn't converting, an audit done well will tell you why; the implementation, ideally on the site you already have, is where the lift comes from.