Data Study

We Audited 43 Product Hunt Launches for Accessibility. 1 Passed.

Quick answer: We ran automated WCAG 2.1 AA audits on 43 Product Hunt launches from April 2026. Average accessibility score: 5.6 out of 100. Only 1 out of 43 sites passed (2.3% pass rate). 1,877 total violations were detected, with color contrast failures appearing on 40% of sites.

We audited 43 products from Product Hunt's April 2026 launches using axe-core for WCAG 2.1 AA compliance, design token coverage analysis, and component drift detection. The results: an average score of 5.6 out of 100, a median of 0, and a WCAG AA pass rate of 2.3%.

Score Distribution: 93% Scored Critical

| Score Range | Rating | Sites | % of Total |
|---|---|---|---|
| 90-100 | Excellent | 0 | 0% |
| 70-89 | Good | 1 | 2.3% |
| 50-69 | Needs Work | 2 | 4.7% |
| 30-49 | Poor | 0 | 0% |
| 0-29 | Critical | 40 | 93.0% |

One site scored Good: Offsite (87/100, 2 violations). Two sites scored Needs Work: ProdShort (52/100) and Stanley For X (50/100). The remaining forty all fell into the Critical range (0-29), dragging the median score down to 0.

Top Violations: What Breaks Most Often

1,877 total violations: 272 critical, 1,448 high, 157 medium.

| Violation | Sites Affected | Severity |
|---|---|---|
| Color contrast below 4.5:1 | 17 (40%) | High |
| Buttons without discernible text | 9 (21%) | Critical |
| Images missing alt text | 5 (12%) | Critical |
| Invalid ARIA attributes | 3 (7%) | Critical |
| No visible focus indicator | 2 (5%) | High |
| Form elements without labels | 1 (2%) | Critical |

Worst and Best Performers

Worst Offenders

SpeakON (104 violations), Clawdi (91), NovaVoice (44), Dune (28), Cai (27).

Best Performers

Offsite (87/100, 2 violations, WCAG AA pass), ProdShort (52/100), Stanley For X (50/100), CapyPlan (24/100), stagewise (14/100).

Design System Findings

60% of scanned sites use Tailwind CSS. Average design token coverage: 37.7%. Every single site had measurable component drift, averaging 5.3 drifting component types per site. Links were the most inconsistent component.
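The report doesn't spell out how token coverage is computed, but one plausible reading is the share of color values declared on a page that resolve to a token in the site's palette. A minimal sketch under that assumption (the `palette` and `scraped` values below are illustrative, not taken from the audited sites):

```python
# Sketch: estimate design token coverage as the fraction of declared
# color values that match the site's token palette. Illustrative only;
# the study's actual coverage metric may differ.

def token_coverage(declared_colors: list[str], token_palette: set[str]) -> float:
    """Fraction of declared colors that match a design token (0.0-1.0)."""
    if not declared_colors:
        return 0.0
    matched = sum(1 for c in declared_colors if c.lower() in token_palette)
    return matched / len(declared_colors)

# Hypothetical tokenized palette and colors scraped from a page:
palette = {"#1a1a2e", "#e94560", "#0f3460"}
scraped = ["#1A1A2E", "#e94560", "#ff0000", "#333333"]

print(f"{token_coverage(scraped, palette):.1%}")  # 2 of 4 match -> 50.0%
```

Component drift would then be the inverse signal: the same component type (e.g. links) rendered with values outside the palette across different pages.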

Why This Matters Beyond UX

Legal Exposure

Under the DOJ's 2024 final rule, ADA Title II compliance requirements now explicitly extend to state and local government web content. More than 4,000 digital accessibility lawsuits were filed in U.S. courts in 2024.

AI Code Tools Amplify the Problem

Many Product Hunt launches are built with AI code tools. Our vibe coding QA research found AI-generated apps average approximately 160 issues per app.

User Exclusion at Scale

The WHO estimates 16% of the global population lives with a significant disability. A 2.3% pass rate means 97.7% of these launches are partially or fully inaccessible to that population.

Four Fixes That Cover the Majority of Violations

  1. Check every text/background pair against the 4.5:1 WCAG AA minimum. If anything fails, darken the text or lighten the background.
  2. Label every icon button with aria-label.
  3. Add alt text to every image. Decorative images get alt="".
  4. Tab through your entire site. If you lose track of focus, add :focus-visible styles.
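For fix #1, the contrast math is small enough to automate yourself. This sketch implements the relative luminance and contrast ratio formulas from the WCAG 2.1 spec (it is the spec's formula, not the study's tooling):

```python
# WCAG 2.1 contrast-ratio check. AA requires at least 4.5:1 for
# normal-size text (3:1 for large text).

def relative_luminance(hex_color: str) -> float:
    """Relative luminance per WCAG 2.1, from a #rrggbb hex color."""
    h = hex_color.lstrip("#")
    channels = []
    for i in (0, 2, 4):
        c = int(h[i:i + 2], 16) / 255
        # Linearize the sRGB channel value:
        channels.append(c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4)
    r, g, b = channels
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg: str, bg: str) -> float:
    """(L_lighter + 0.05) / (L_darker + 0.05); ranges from 1:1 to 21:1."""
    l1, l2 = sorted((relative_luminance(fg), relative_luminance(bg)), reverse=True)
    return (l1 + 0.05) / (l2 + 0.05)

print(round(contrast_ratio("#000000", "#ffffff"), 2))  # 21.0 (max contrast)
print(contrast_ratio("#777777", "#ffffff") >= 4.5)     # False: ~4.48:1, just under AA
```

Mid-gray on white failing by a hair is exactly the kind of contrast violation that showed up on 40% of the audited sites.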

Methodology

43 products selected from Product Hunt April 2026 launches. Each audited with axe-core WCAG 2.1 AA via headless Chromium (Playwright), plus design token coverage and component drift analysis. All 43 reports are publicly available.
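The scan pipeline can be sketched roughly as follows. This is a hedged reconstruction, not the study's actual harness: it assumes a local copy of axe-core (`axe.min.js`, e.g. from the `axe-core` npm package) plus the Playwright Python bindings, and `summarize` groups violations by axe's `impact` field:

```python
from collections import Counter

def summarize(violations: list[dict]) -> Counter:
    """Count axe-core violations by impact level (critical, serious, ...)."""
    return Counter(v.get("impact", "unknown") for v in violations)

def scan(url: str, axe_path: str = "axe.min.js") -> list[dict]:
    """Run axe-core against a URL in headless Chromium via Playwright.

    Requires `pip install playwright`, `playwright install chromium`,
    and a local copy of axe.min.js at axe_path. Sketch only.
    """
    from playwright.sync_api import sync_playwright  # imported lazily

    with sync_playwright() as p:
        browser = p.chromium.launch(headless=True)
        page = browser.new_page()
        page.goto(url, wait_until="networkidle")
        page.add_script_tag(path=axe_path)  # inject axe-core into the page
        # page.evaluate awaits the promise returned by axe.run()
        results = page.evaluate(
            "axe.run(document, {runOnly: ['wcag2a', 'wcag2aa']})"
        )
        browser.close()
    return results["violations"]

# Usage (requires a browser and axe.min.js on disk):
#   counts = summarize(scan("https://example.com"))
#   print(dict(counts))
```

Running the WCAG 2.0 A/AA tag sets mirrors the study's WCAG 2.1 AA target; the design-token and drift analyses would be separate passes over the rendered styles.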

Frequently Asked Questions

How were the accessibility scores calculated?

Each site was scanned with axe-core via headless Chromium. Scores are normalized to 0-100 based on violation count and severity.
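The exact normalization isn't published here, so the following is only a plausible sketch of "violation count weighted by severity": deduct severity-weighted points per violation from 100, floored at 0. The weights are illustrative assumptions, not the study's:

```python
# Hypothetical 0-100 normalization: 100 minus severity-weighted
# deductions, floored at 0. Weights are assumed, not the study's.
WEIGHTS = {"critical": 10, "high": 5, "medium": 2, "low": 1}

def score(violation_counts: dict[str, int]) -> int:
    """Map per-severity violation counts to a 0-100 score."""
    deduction = sum(WEIGHTS.get(sev, 1) * n for sev, n in violation_counts.items())
    return max(0, 100 - deduction)

print(score({"critical": 0, "high": 2, "medium": 1}))  # 100 - 12 = 88
print(score({"critical": 5, "high": 20}))              # floors at 0
```

A floor at 0 would also explain the median of 0: past a modest violation count, every additional failure is invisible in the score.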

What counts as a WCAG AA pass?

Zero critical violations and a score above 70. Only Offsite (87/100) met this threshold. See our WCAG compliance checker guide.
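The pass criterion is simple enough to state as a predicate (function name and input shape are illustrative, not from the study's code):

```python
def passes_wcag_aa(score: int, critical_violations: int) -> bool:
    """Pass = zero critical violations AND a normalized score above 70."""
    return critical_violations == 0 and score > 70

print(passes_wcag_aa(87, 0))  # Offsite's numbers -> True
print(passes_wcag_aa(52, 0))  # score too low -> False
```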

Is this representative of all Product Hunt launches?

This covers 43 products from April 2026. The 97.7% failure rate is consistent with the WebAIM Million study (95.9% failure rate across 1 million home pages).

How does this compare to the web overall?

The WebAIM Million consistently finds 95-96% of home pages have WCAG failures. Our 97.7% is slightly worse, consistent with launch-speed products deprioritizing accessibility.

Related Resources