AI Agents Maintain Next.js on Cloudflare Runtime
Cloudflare's V-Next uses AI bots to build, review PRs, triage issues, and track Next.js changes, turning an intern prototype into a sustainable open-source experiment.
From Intern Prototype to AI-Driven Experiment
Cloudflare's V-Next started as an intern's three-month project to implement the Next.js pages router on Cloudflare's Workers runtime. The intern made solid progress on the basics, proving the Next.js API surface could map to Cloudflare's edge-deployed architecture. Steve Faulkner, Director of Engineering, revived it months later using AI agents, motivated by customer demand for easier Next.js deployments on Cloudflare. Dane Knecht, CTO, emphasized that it's customer-driven: "for almost 5 years now... one of the biggest requests is how do you make Next easier to deploy on Cloudflare."
The project optimizes for Cloudflare's constraints, like global deployment without traditional server builds, by analyzing traffic to pre-render only high-hit assets (e.g., the ~10% of pages that serve 99% of traffic), cutting builds that previously took 45 minutes. This isn't a full fork but a reimplementation of the official Next.js API surface on Vite and Turbopack, avoiding divergence unless customer needs demand it.
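The traffic-driven pre-rendering idea reduces to a greedy selection: sort routes by hit count and take the smallest prefix that meets a coverage target. This is an illustrative reconstruction under assumed names and data shapes, not Cloudflare's actual implementation:

```typescript
// Hypothetical sketch: pick the smallest set of routes whose combined
// traffic share meets a coverage target (e.g. 0.99), so only those
// routes are pre-rendered at build time.
type RouteHits = { route: string; hits: number };

function selectRoutesToPrerender(stats: RouteHits[], coverage = 0.99): string[] {
  const total = stats.reduce((sum, r) => sum + r.hits, 0);
  if (total === 0) return [];

  // Greedy: highest-traffic routes first, stop once coverage is reached.
  const sorted = [...stats].sort((a, b) => b.hits - a.hits);
  const selected: string[] = [];
  let covered = 0;
  for (const r of sorted) {
    if (covered / total >= coverage) break;
    selected.push(r.route);
    covered += r.hits;
  }
  return selected;
}
```

With a skewed distribution (one route at 9,000 hits, one at 900, two at 50), the function pre-renders just two of the four routes, matching the "small fraction covers almost all traffic" observation in the text.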
"Dane Knecht: The goal is pretty much everything we do, we do it for customers... for almost 5 years now, [it's] been one of the biggest requests: how do you make Next easier to deploy on Cloudflare."
AI Bots Enable Sustainable Open-Source Maintenance
V-Next demonstrates open source in the AI era: over 50 committers contribute plans for AI agents to implement, with bots handling triaging, PR reviews, security scans, and syncing relevant Next.js commits into V-Next issues. This scales maintenance without human bottlenecks, addressing maintainer burnout from AI-generated slop PRs elsewhere.
Dane highlighted the experiment's dual purpose: easing Next.js on Cloudflare while testing AI for OSS. "We have AI bots that are doing triaging. We have AI bots that are reviewing all the PRs. We have AI bots that are doing security reviews. We have now AI bots that track the next.js repo and then open up issues back into our repo."
Community reception validated the demand: the launch produced one of Cloudflare's biggest single-day spikes in new users. Forks like this have historically driven innovation (e.g., io.js from Node.js, Blink from WebKit), often reconverging stronger.
Compatibility Challenges and Hyrum's Law
Maintaining drop-in Next.js compatibility runs into Hyrum's Law: developers rely on undocumented internals. Friction arises from community packages that plug into Next.js internals (e.g., importing from 'next/dist'), which V-Next rejects to stay on the public API. Users report that apps work on Vercel but fail on Cloudflare due to subtle behaviors like navigation hijacks or getInitialProps (a legacy API superseded by getStaticProps/getServerSideProps, but still relied on by many).
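The internals-rejection policy can be sketched as a simple import check (a hypothetical helper, not V-Next tooling; in practice the same policy is commonly enforced with ESLint's built-in no-restricted-imports rule):

```typescript
// Flags import specifiers that reach into Next.js internals instead of
// the public API surface. "next/dist" is the internals path named above;
// the helpers themselves are illustrative.
function isInternalNextImport(specifier: string): boolean {
  return specifier === "next/dist" || specifier.startsWith("next/dist/");
}

// Example policy check: given a module's import list, return offenders.
function findInternalImports(specifiers: string[]): string[] {
  return specifiers.filter(isInternalNextImport);
}
```

Anything caught here is exactly the Hyrum's Law surface V-Next declines to support: it works on Vercel only because Next.js happens to ship those files, not because they are part of the contract.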
Steve holds the line: no internals support yet, though customer demand could sway it ("never say never"). Vocal requests include reinstating getInitialProps and behavioral tweaks framed as "Next should have always done it this way." V-Next rejects feature PRs outside the API surface, unlike a true fork such as Cloudflare's Mdash (WordPress-inspired).
"Steve Faulkner: That's where they usually end up in trouble. So... do you guys support importing from next/dist, or is that just something that you're like, no, we will not do internals? Right now, no, we have not done it yet. But again, never say never."
Mitigating AI Slop in Agentic Development
AI accelerates development but leaves messes: a giant 2,000-line template string mixing logic with markup, no linting or type-checking, unmaintainable even for agents. Steve manually de-slopped it over a weekend, splitting it into modules through a series of targeted PRs.
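The shape of that cleanup can be shown with a toy example (hypothetical code, not the actual V-Next string): logic clobbered into one template literal gets extracted into small, individually testable functions with identical output.

```typescript
// Before: rendering logic inlined into one template string, the pattern
// that becomes unmaintainable at 2,000 lines.
const pageBefore = (items: string[]) =>
  `<ul>${items.map((i) => `<li>${i.toUpperCase()}</li>`).join("")}</ul>`;

// After: the same behavior split into small units that linting,
// type-checking, and unit tests can each grip onto.
function renderItem(item: string): string {
  return `<li>${item.toUpperCase()}</li>`;
}

function renderList(items: string[]): string {
  return `<ul>${items.map(renderItem).join("")}</ul>`;
}
```

The refactor changes nothing observable, which is what makes it safe to do "bit by bit" across many small PRs.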
Strategies include:
- Porting Next.js tests (unit, E2E, smoke tests on production deployments) for regression confidence.
- Strict scoping: small, isolated tasks with human review of every AI-generated line.
- Tooling: linting, type-checking, and CI/CD to catch slop early.
- Human intervention on hotspots.
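A smoke test from the list above might assert, against HTML fetched from a production deployment, the markers most pages-router apps depend on. The marker strings (the "__next" root element and the __NEXT_DATA__ payload) are standard pages-router output, but the checks themselves are an illustrative sketch, not V-Next's actual suite:

```typescript
// Minimal smoke-check sketch (assumes pages-router HTML output).
// Returns a list of problems found in the HTML that a production
// deployment served for a route; an empty list means the page passed.
function checkNextHtml(html: string): string[] {
  const problems: string[] = [];
  if (!/<html[\s>]/i.test(html)) problems.push("not an HTML document");
  if (!html.includes('id="__next"')) problems.push("missing #__next root element");
  if (!html.includes("__NEXT_DATA__")) problems.push("missing __NEXT_DATA__ hydration payload");
  return problems;
}
```

In CI this would run post-deploy: fetch each smoke route, pass the response body to checkNextHtml, and fail the pipeline if any problems come back.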
Dillon Mulroy, streaming engineer, noted similar issues with Hono: AI emits HTML/JS as inline strings, then cycles into debug hell. V-Next's ported test suites filter out long-tail API noise and focus on the bulk of functionality: routing, hydration, and SSR.
"Steve Faulkner: There was a part that was about a 2,000-line template string in there... a lot of logic got clobbered into this thing... I'm not going to lie, it was pretty bad... I spent the weekend kicking off a bunch of PRs and just bit by bit got stuff out of there."
Path to Production and Reception
Post-experiment, V-Next nears stability: remaining work includes full pre-rendering and Vite/Turbopack mismatches (e.g., hard vs. soft navigations). The launch spiked new users, with positive sentiment despite the gaps. Cloudflare will weigh a production designation on parity, test coverage, and demand, though the project is already viable for most Next.js use cases.
The broader implication: AI lowers the cost of creating and maintaining forks, enabling rapid iteration. Reception mixes excitement (pent-up demand) with skepticism about completeness.
"Dane Knecht: The spike on new users that day was one of the biggest one-day spikes ever... you can see that there's pent-up demand, and that's why we do things here."
Key Takeaways
- Start AI projects with a human prototype (e.g., intern's pages router) to validate feasibility before scaling agents.
- Use AI bots for OSS drudgery: triage, PR review, security, upstream tracking—frees humans for strategy.
- Define strict scope (e.g., public API surface only) to avoid fork divergence; reject internals unless demand justifies.
- Combat slop with tests (port from upstream), linting/types, small tasks, and manual cleanups on hotspots.
- Monitor Hyrum's Law: expect undocumented reliance; prioritize community packages via tests/smoke runs.
- Measure success by user spikes and production viability—iterate on gaps like pre-rendering.
- For agentic dev, review every AI line; scope tightly to prevent unmaintainable blobs.
- Forks drive ecosystem innovation; embrace them when customer-driven, but reconverge when possible.