I've been building websites for clients for over 20 years. In that time I've seen a lot of things go wrong after launch — but nothing surprised me quite like watching three separate clients get pulled into ADA accessibility lawsuits over issues I could have fixed in an afternoon.
We're talking about things like low-contrast text and undersized fonts on footer links. Genuinely small stuff — the kind of thing that slips through on a busy project, gets missed in QA, and then sits quietly on a live site for months or years. In each case, the client was swept up in what amounted to a class action targeting dozens of businesses at once. Frivolous? Probably. Stressful and expensive regardless? Absolutely.
That experience stuck with me. Not because the issues were hard to fix, but because they were so easy to miss — and so easy to find, if anyone had thought to look.
After the third time, I decided to build something that looked.
Here's the thing about launching a client site: the moment you push it live and close the project, the clock starts ticking. The client wants to tweak something. Their SEO consultant promised that changing it would help. Quick change, no QA, no accessibility check, push it live.
Sites don't stay healthy on their own. Content editors upload images without alt text. A new page template introduces duplicate title tags across 40 URLs. A third-party script gets added and quietly tanks performance. A plugin update changes some CSS and now your footer links don't meet contrast ratios anymore.
Nobody's watching. The client assumes everything is fine because you built it. You've moved on to the next project. And the site just drifts — slowly accumulating the kind of technical debt that doesn't show up in a browser but absolutely shows up in search rankings, load times, and increasingly, in legal correspondence.
I've seen this pattern repeat itself across hundreds of client sites over the years. It's not negligence. It's just the nature of how web projects work. You ship, you move on, and nobody ever goes back to check.
I want to show you what the web actually looks like under the hood, because I think most developers underestimate how bad it is — including on sites they built themselves. Buckle up, here come the stats.
Performance
Only 57.1% of websites pass Google's Core Web Vitals assessment on desktop. On mobile it drops to 49.7%. Think about that — roughly half of all websites are actively failing Google's baseline performance benchmark right now. And the business impact is real: 53% of users abandon a site that takes more than three seconds to load, and there's an 8–10% lift in conversions for every 0.1-second improvement in load time.
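Those pass/fail numbers aren't fuzzy, either — Google publishes concrete "good" thresholds for each metric (LCP ≤ 2.5 s, INP ≤ 200 ms, CLS ≤ 0.1, assessed at the 75th percentile of real-user data). As a rough sketch, here's what the pass/fail check amounts to; the `metrics` dict shape is mine for illustration, not any real API:

```python
# Google's published "good" thresholds for Core Web Vitals,
# assessed at the 75th percentile of real-user field data.
CWV_THRESHOLDS = {
    "lcp_seconds": 2.5,   # Largest Contentful Paint
    "inp_ms": 200,        # Interaction to Next Paint
    "cls": 0.1,           # Cumulative Layout Shift
}

def passes_core_web_vitals(metrics: dict) -> bool:
    """Return True only if every metric meets its 'good' threshold."""
    return all(metrics[name] <= limit for name, limit in CWV_THRESHOLDS.items())

print(passes_core_web_vitals({"lcp_seconds": 2.1, "inp_ms": 150, "cls": 0.05}))  # True
print(passes_core_web_vitals({"lcp_seconds": 3.8, "inp_ms": 150, "cls": 0.05}))  # False
```

The all-or-nothing shape is the point: one slow metric and the whole assessment fails, which is exactly how roughly half the web ends up on the wrong side of the line.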
SEO
Around 94% of all webpages receive zero organic traffic from Google. Some of that is content and competition — but a lot of it is fixable technical issues. According to SE Ranking, 54% of websites have duplicate title tags and 50% have duplicate meta descriptions. Only 26% of websites use alt text on images consistently. These aren't hard problems. They're just unexamined ones.
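These issues are also the easiest ones to detect programmatically — duplicate titles and missing alt text fall out of a simple parse. Here's a stdlib-only Python sketch of the idea; the `AuditParser` and `find_duplicate_titles` names are mine, and a real audit would crawl and fetch the pages first:

```python
from html.parser import HTMLParser

class AuditParser(HTMLParser):
    """Collect the <title> text and count <img> tags with missing/empty alt."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self._in_title = False
        self.imgs_missing_alt = 0

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self._in_title = True
        elif tag == "img":
            alt = dict(attrs).get("alt")
            if not alt or not alt.strip():
                self.imgs_missing_alt += 1

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

def find_duplicate_titles(pages: dict) -> dict:
    """Map each duplicated <title> to the list of URLs sharing it."""
    titles = {}
    for url, html in pages.items():
        parser = AuditParser()
        parser.feed(html)
        titles.setdefault(parser.title.strip(), []).append(url)
    return {t: urls for t, urls in titles.items() if len(urls) > 1}

pages = {
    "/a": "<title>Spring Sale</title><img src='hero.jpg'>",
    "/b": "<title>Spring Sale</title>",
}
print(find_duplicate_titles(pages))  # {'Spring Sale': ['/a', '/b']}
```

Twenty lines of parsing is obviously not a production crawler, but it makes the point: "54% of websites have duplicate title tags" is not a hard problem to surface. It's just one nobody runs.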
Accessibility
This is the one that keeps me up at night. The WebAIM Million report — an annual audit of the top one million websites — found that 94.8% of homepages have at least one detectable WCAG failure. The average homepage has 51 accessibility errors. The most common issues are low contrast text, missing alt text, empty links, and poor heading structure. Sound familiar? These are exactly the kinds of things I watched clients get sued over.
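Low-contrast text — the single most common failure in that report — is also completely mechanical to check, because WCAG 2.x defines the contrast ratio mathematically: compute the relative luminance of each colour, then take (L1 + 0.05) / (L2 + 0.05), with 4.5:1 as the AA minimum for body text. A small sketch of that formula:

```python
def relative_luminance(hex_color: str) -> float:
    """Relative luminance of an sRGB colour per the WCAG 2.x definition."""
    def channel(c: int) -> float:
        c = c / 255
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (int(hex_color.lstrip("#")[i:i + 2], 16) for i in (0, 2, 4))
    return 0.2126 * channel(r) + 0.7152 * channel(g) + 0.0722 * channel(b)

def contrast_ratio(fg: str, bg: str) -> float:
    """WCAG contrast ratio: 1:1 (identical colours) up to 21:1 (black on white)."""
    lighter, darker = sorted((relative_luminance(fg), relative_luminance(bg)), reverse=True)
    return (lighter + 0.05) / (darker + 0.05)

ratio = contrast_ratio("#767676", "#ffffff")
print(f"{ratio:.2f}:1 -> {'passes' if ratio >= 4.5 else 'fails'} WCAG AA for body text")
```

That's the entire check behind the lawsuits I watched unfold. If a formula this small can decide whether a footer link is a legal liability, there's no excuse for never running it.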
I've had this conversation more times than I can count. Client calls, mentions they're not getting much traffic. You ask if they've had an audit done. They say no, but the site was professionally built so it should be fine.
This is the dangerous assumption — and it's almost universal. There's no dashboard telling them their Core Web Vitals are failing. No notification when an image gets uploaded without alt text. No alert when a plugin update quietly breaks their heading structure. The site just sits there, looking perfectly fine in Chrome on a fast connection, while quietly underperforming everywhere else.
The uncomfortable reality is that most sites ship reasonably clean and then degrade. The site your client has today often isn't the site you delivered 18 months ago.
Back to those three clients I mentioned. What made each situation so frustrating was how preventable it was. We're not talking about deep structural accessibility failures — we're talking about the kind of surface-level issues that a 5-minute automated scan would have caught.
The broader trend makes it worse. More than 5,000 ADA digital accessibility lawsuits targeted websites in 2025 — a 37% increase over 2024. Settlements typically run $5,000 to $75,000, not counting legal fees and remediation costs. Nearly half of all companies sued in 2025 had already been sued before — meaning getting hit once and not fixing the underlying code just puts you back in the crosshairs.
The instinct is to slap on an accessibility widget and move on. I'd strongly advise against it. In the first half of 2025, 22.64% of all ADA lawsuits were filed against websites that already had an accessibility widget installed. Widgets don't fix the code — they layer over it. Courts and plaintiff firms know this, and they've gotten good at identifying sites where the widget is there but the underlying violations aren't.
The only thing that actually works is code-level remediation. Which means someone needs to run a real audit first. And with the DOJ's Title II rule taking effect April 2026 — requiring government entities to conform to WCAG 2.1 Level AA — the pressure is only going to keep building.
Here's the part that should really give you pause: in the context of WCAG conformance, a single failure of a success criterion means the specific page — or the entire website, depending on the scope of the test — fails to conform to that level of WCAG. One footer link with insufficient colour contrast. One image missing alt text. That's all it takes to go from "compliant" to "non-conformant." And non-conformant is exactly what plaintiffs are looking for.
I know what you're thinking: just run it through Lighthouse and call it done. But a complete audit actually requires several tools working together: a crawler to map the site, an SEO analysis pass for titles, meta descriptions and alt text, a performance benchmark like Lighthouse, and an automated accessibility scanner.
That last one matters more than most developers realise. Automated accessibility scanners can only detect around 30–50% of potential WCAG issues. They're excellent at catching technical failures — missing alt text, colour contrast ratios, empty links. But they can't interpret context. They can't evaluate whether alt text is actually meaningful, whether a form flow makes sense to someone using a screen reader, or whether the overall experience is usable for someone with a cognitive disability. That requires a human. Ideally one who uses assistive technology. Automated scanning is the floor, not the ceiling.
That's three to four tools plus human review, hours of work, and enough technical context to prioritize what actually matters. Most developers — myself included, for most of my career — don't have the bandwidth to do this for every client site. And most clients would never think to ask for it. So it doesn't happen.
Even when it does, the output is raw data. CSVs, JSON dumps, colour-coded spreadsheets. Turning those findings into something a non-technical client can understand and act on is a whole separate job. That's the part that makes audits feel like too much work for what they are.
This is the problem I set out to solve when I built SEOgent.
An agentic audit collapses that entire workflow. Instead of running four separate tools, exporting data, cross-referencing results and writing up findings manually, you run one process — crawl, SEO analysis, performance benchmarks, accessibility checks — and get back a prioritized, narrative report. One run. One output. Already written.
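To make the shape of that workflow concrete, here's a deliberately simplified sketch of a single-pass audit pipeline. Every function here is a stub standing in for a real tool — the names, data shapes, and findings are illustrative only, not SEOgent's actual API:

```python
# Illustrative only: one orchestrator runs each audit pillar over a
# shared crawl, then a synthesis step orders findings by severity.
# All function names and record shapes are hypothetical stand-ins.

def crawl(domain: str) -> list[dict]:
    # Stand-in for a real crawler; returns one record per discovered page.
    return [{"url": f"{domain}/", "title": "Home", "images_missing_alt": 3}]

def seo_checks(pages):
    return [{"issue": "missing alt text", "severity": 2, "url": p["url"]}
            for p in pages if p["images_missing_alt"]]

def performance_checks(pages):
    return [{"issue": "LCP over 2.5s", "severity": 3, "url": pages[0]["url"]}]

def accessibility_checks(pages):
    return [{"issue": "low-contrast footer links", "severity": 3, "url": pages[0]["url"]}]

def agentic_audit(domain: str) -> list[dict]:
    pages = crawl(domain)
    findings = (seo_checks(pages)
                + performance_checks(pages)
                + accessibility_checks(pages))
    # Synthesis layer: highest-severity findings first, ready to narrate.
    return sorted(findings, key=lambda f: -f["severity"])

for f in agentic_audit("https://example.com"):
    print(f"[sev {f['severity']}] {f['issue']} at {f['url']}")
```

The stubs are trivial, but the structure is the argument: one crawl feeding all three pillars, one prioritized output, instead of four tools and a manual cross-referencing pass.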
The difference isn't just speed, though it's dramatically faster. It's the synthesis layer. A good agentic audit doesn't just surface what's broken — it tells you what to fix first, why it matters, and what the likely impact is. That's what turns raw data into something you can hand to a client.
For developers, this changes the math on audits entirely. What used to be a half-day engagement becomes something you can run in the background, review for accuracy, and deliver with confidence. The grunt work is handled. You add the expertise.
Here's how I think about it now after 20+ years in this industry: every site I've ever launched for a client is a potential audit engagement.
Not a retainer. Not a long-term contract. Just a one-time audit, a clear report, and a remediation proposal if they want to act on it. It's an easy conversation — "I ran your site through a comprehensive audit and here's what I found" — and it delivers immediate, tangible value.
Done well, it's also a retainer opener. Clients who see concrete problems with dollar figures attached are far more likely to engage you for ongoing work than clients who are just vaguely aware their site could be better.
And honestly, after watching three clients deal with accessibility lawsuits that could have been avoided with a basic audit, I feel a bit of a professional obligation to have this conversation. The issues are fixable. The risk is real. And developers are the only people in the room who can actually do something about it.
The sites are already out there. Most of them are underperforming. The question is whether anyone bothers to look before the lawyers get there.
SEOgent was built to do exactly this. It's an API-first SEO crawler with an AI synthesis layer that generates narrative audit reports across SEO, performance, and accessibility. Point it at a domain, it crawls the site, runs analysis across all three pillars, and outputs a structured report with prioritized recommendations — the kind of thing you can hand directly to a client or use as the foundation for a remediation proposal.
Run a one-time audit starting at $9 — no subscription, you run it when you need it.