
For two months, foxl.ai was invisible to Google. You could type "Foxl AI agent" into the search bar and get nothing. Our landing page, blog, pricing - none of it was indexed. The fix took one afternoon and the result was immediate: full HTML pages that crawlers can read without executing JavaScript.
The problem: CSR is invisible to crawlers
Our landing page was a standard React SPA built with Vite. The HTML that Cloudflare served looked like this:
<!DOCTYPE html>
<html>
  <body>
    <div id="root"></div>
    <script src="/assets/index-abc123.js"></script>
  </body>
</html>
Googlebot does execute JavaScript, but it deprioritizes JS-rendered content. Pages that require client-side rendering get crawled less frequently, indexed later, and rank lower. For a new domain with no authority, this means effectively zero visibility.
We also had a Cloudflare Worker (worker.js) that intercepted every route and served index.html with rewritten meta tags via HTMLRewriter. This gave crawlers correct <title> and <meta description> tags, but the actual page content - hero text, feature descriptions, blog posts - was still empty until JavaScript executed.
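That rewriting step looked roughly like this - a sketch, not the real worker.js: the route table and function names are illustrative, and HTMLRewriter is a Cloudflare Workers global, not available in Node:

```typescript
// Illustrative sketch of the pre-migration worker.js approach. The route->meta
// table and function names are assumptions, not the actual code.
declare const HTMLRewriter: any; // Cloudflare Workers global (not in Node)

type Meta = { title: string; description: string };

const ROUTE_META: Record<string, Meta> = {
  '/': { title: 'Foxl - AI agent', description: 'Foxl landing page' },
  '/pricing': { title: 'Pricing - Foxl', description: 'Plans and pricing' },
};

export function metaFor(pathname: string): Meta {
  return ROUTE_META[pathname] ?? ROUTE_META['/'];
}

// Runs inside the Worker's fetch handler: crawlers get correct <title> and
// <meta description>, but <div id="root"> stays empty until client JS runs.
export function rewriteMeta(indexHtml: Response, pathname: string): Response {
  const meta = metaFor(pathname);
  return new HTMLRewriter()
    .on('title', { element: (el: any) => el.setInnerContent(meta.title) })
    .on('meta[name="description"]', {
      element: (el: any) => el.setAttribute('content', meta.description),
    })
    .transform(indexHtml);
}
```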
Why static export, not SSR
Next.js offers three rendering strategies: SSR (server-rendered per request), ISR (incremental static regeneration), and full static export. We chose static export (output: 'export') for several reasons:
- Cloudflare Workers limitations. Our site runs on a Cloudflare Worker with static assets. Workers have a 10ms CPU time limit on the free plan. SSR on every request would require either a paid Workers plan or migrating to Pages with @cloudflare/next-on-pages. Static export produces plain HTML files that the existing Worker serves directly.
- Content doesn't change between deploys. Blog posts, pricing, and feature descriptions update only when we deploy. There's no user-generated content or real-time data on the marketing site. SSR would re-render identical HTML on every request for no benefit.
- Deploy speed. next build generates all 25 pages in 5 seconds. wrangler deploy uploads the delta in 2 seconds. Total deploy time is under 20 seconds.
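The export mode itself is a one-line config switch. A minimal sketch (only output: 'export' is essential; the trailingSlash and images settings are our assumptions about what a static host typically needs, not taken from the real config):

```typescript
// next.config.ts - minimal static-export config (a sketch)
import type { NextConfig } from 'next';

const nextConfig: NextConfig = {
  output: 'export',              // next build writes plain HTML files to out/
  trailingSlash: true,           // /blog/ maps to out/blog/index.html
  images: { unoptimized: true }, // next/image optimization needs a server
};

export default nextConfig;
```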
The tradeoff: we lose middleware (redirects handled by the Worker) and route handlers (llms.txt proxied by the Worker). Both were already in the Worker, so nothing changed operationally.
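After the migration the Worker shrinks to roughly this shape - a sketch using the module-Worker static assets binding; the redirect rule and proxy target are illustrative, not the production routes:

```typescript
// Post-migration Worker sketch (assumed routes; ASSETS is the static-assets
// binding configured in wrangler.toml)
type Env = { ASSETS: { fetch(req: Request): Promise<Response> } };

const worker = {
  async fetch(request: Request, env: Env): Promise<Response> {
    const url = new URL(request.url);

    // Redirects that would otherwise be Next.js middleware stay here.
    if (url.pathname === '/docs') {
      return Response.redirect('https://docs.foxl.ai/', 301);
    }

    // llms.txt is proxied rather than served from the static export.
    if (url.pathname === '/llms.txt') {
      return fetch('https://docs.foxl.ai/llms.txt');
    }

    // Everything else is a pre-rendered HTML file from landing/out.
    return env.ASSETS.fetch(request);
  },
};

export default worker;
```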
The migration: what actually changed
The existing React components were already well-structured. The migration was mostly mechanical:
- Created a new Next.js 15 App Router project alongside the existing Vite app
- Copied all components, added 'use client' directives to interactive ones (Hero animations, FAQ accordions, demo state machines)
- Created App Router pages for each route with exported metadata objects
- Blog posts use generateStaticParams for full SSG - all 17 posts pre-rendered at build time
- Updated wrangler.toml to point at landing/out instead of landing/dist
- Removed the SPA routing logic from worker.js (the biggest win - 200 lines deleted)
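The build-time pieces of the blog route boil down to two exports. A sketch with the post list inlined - in the real app the posts come from files, and every name here is illustrative:

```typescript
// app/blog/[slug]/page.tsx - build-time exports only (a sketch; the page
// component itself is omitted, and the post list is inlined for illustration)
type Post = { slug: string; title: string };

const posts: Post[] = [
  { slug: 'hello-foxl', title: 'Hello, Foxl' },
  { slug: 'going-static', title: 'Going Static' },
];

// next build calls this once; each returned slug becomes a pre-rendered HTML file.
export function generateStaticParams(): { slug: string }[] {
  return posts.map((p) => ({ slug: p.slug }));
}

// Per-post <title>/<meta description>, baked into the static HTML.
// In Next.js 15, params arrives as a Promise, hence the await.
export async function generateMetadata({ params }: { params: Promise<{ slug: string }> }) {
  const { slug } = await params;
  const post = posts.find((p) => p.slug === slug);
  return { title: post?.title, description: post ? `${post.title} on the Foxl blog` : undefined };
}
```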
JSON-LD for sitelinks
Google's search results can show "sitelinks" - the indented sub-pages beneath your main result. To trigger these, you need structured data that tells Google about your site's hierarchy. We added a JSON-LD graph to the root layout:
- Organization - company info + social links (GitHub, Discord)
- WebSite - with a SearchAction pointing to docs.foxl.ai search
- SoftwareApplication - app metadata (free, macOS/Windows, download URL)
- SiteNavigationElement - the 5 main nav items (Blog, Pricing, Security, Changelog, Docs)
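Abbreviated, the graph looks like this - a sketch where the URLs, social links, and field values are illustrative placeholders, not the production data:

```typescript
// JSON-LD @graph for the root layout (abbreviated sketch; links and values
// are placeholders, and only a subset of each node's fields is shown)
const jsonLd = {
  '@context': 'https://schema.org',
  '@graph': [
    {
      '@type': 'Organization',
      name: 'Foxl',
      url: 'https://foxl.ai',
      sameAs: ['https://github.com/example', 'https://discord.gg/example'], // placeholders
    },
    {
      '@type': 'WebSite',
      url: 'https://foxl.ai',
      potentialAction: {
        '@type': 'SearchAction',
        target: { '@type': 'EntryPoint', urlTemplate: 'https://docs.foxl.ai/search?q={search_term_string}' },
        'query-input': 'required name=search_term_string',
      },
    },
    {
      '@type': 'SoftwareApplication',
      name: 'Foxl',
      operatingSystem: 'macOS, Windows',
      offers: { '@type': 'Offer', price: '0', priceCurrency: 'USD' },
    },
    {
      '@type': 'SiteNavigationElement',
      name: ['Blog', 'Pricing', 'Security', 'Changelog', 'Docs'],
    },
  ],
};

// Rendered in the layout as:
// <script type="application/ld+json"
//   dangerouslySetInnerHTML={{ __html: JSON.stringify(jsonLd) }} />
export const serialized = JSON.stringify(jsonLd);
```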
Google doesn't guarantee sitelinks, but providing this structure makes it significantly more likely. Combined with a proper sitemap.xml and robots.txt that allows AI crawlers (GPTBot, ClaudeBot, PerplexityBot), the site is now fully optimized for both traditional and AI-powered search.
Results
Before: curl https://foxl.ai/blog | grep "Introducing" returned nothing (content was JS-only).
After: every page serves complete HTML with full text content, meta tags, and structured data. Google Search Console should start showing indexed pages within days.
The lesson: if you're building a marketing site for a developer tool, start with SSG from day one. The "we'll add SSR later" approach means months of zero search visibility while you're trying to grow.