Migrating a Legacy Laravel App with AI: From PHP 7.4 to Laravel 12
One of our marketing sites for an internal product was built by a junior developer a few years ago. The original brief was to make every page CMS-driven — that effort started but never really finished. What we ended up with was PHP 7.4 and Laravel 9, mostly hardcoded Blade pages, a half-built page builder that nobody used, and a lightweight CMS that only managed news articles.
Since then, the site went through two redesigns. Both times, a frontend developer reskinned the templates but left the backend and CMS exactly as they were. The technical debt just accumulated under fresh paint.
Last week I got pulled in during the second redesign to help figure out why the news page was painfully slow to load. Once I opened the codebase, the slow page was the least of its problems.
What I inherited
N+1 queries everywhere. A 4-table categorisation module for what was essentially a pivot table. Abandoned packages blocking composer install. Cypress tests that hadn’t run in years. Laravel Mix. Dependencies pinned to ancient versions because newer ones required PHP 8.1+.
The kind of codebase where every composer update feels like defusing a bomb.
I’d been putting off the upgrade. Going from Laravel 9 to 12 isn’t a minor version bump — Laravel 11 completely restructured the application skeleton, and 12 took it further. No more app/Http/Kernel.php, middleware configured declaratively in bootstrap/app.php, streamlined directory layout. It’s practically a different framework shape.
So I decided to use Claude Code as my pair programmer and see how far AI could take a real migration.
The approach
I broke the work into focused sessions, each tackling one concern. Claude would generate a plan, I’d review it, approve it, and let it execute. Some sessions were five minutes. Others were an hour.
The key insight: don’t try to migrate everything at once. Treat each problem as its own task.
Stripping the dead weight
Before touching the framework version, I cleaned house.
The test suite hadn’t been maintained — Cypress configs, PHPUnit tests that didn’t reflect the current codebase, test tooling nobody used. I told Claude to strip all of it, and it executed cleanly.
The waavi/translation package had been abandoned and was blocking installs. Gone. Unused packages adding complexity for no benefit on a site with a handful of static pages. Gone.
The diff tells the story: 1.5 million lines deleted.
Most of that was vendor bloat and test fixtures, but it meant the codebase I was actually upgrading was dramatically smaller.
The framework jump
Rather than stepping through 9 → 10 → 11 → 12, I decided to go straight to Laravel 12 on PHP 8.4. I described the target structure to Claude and let it handle the mechanical changes — new bootstrap/app.php configuration, removing the kernel, updating service providers, modernising the config files.
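For readers who haven't seen the new skeleton: Laravel 11+/12 replaces the HTTP kernel with a fluent `bootstrap/app.php`. A minimal version looks roughly like this (the middleware alias is illustrative, not taken from this project):

```php
<?php

// bootstrap/app.php — the Laravel 11+/12 skeleton; replaces app/Http/Kernel.php
use Illuminate\Foundation\Application;
use Illuminate\Foundation\Configuration\Exceptions;
use Illuminate\Foundation\Configuration\Middleware;

return Application::configure(basePath: dirname(__DIR__))
    ->withRouting(
        web: __DIR__.'/../routes/web.php',
        commands: __DIR__.'/../routes/console.php',
        health: '/up',
    )
    ->withMiddleware(function (Middleware $middleware) {
        // Middleware is registered declaratively here instead of in a Kernel class.
        // This alias is a hypothetical example:
        $middleware->alias(['example' => \App\Http\Middleware\ExampleMiddleware::class]);
    })
    ->withExceptions(function (Exceptions $exceptions) {
        //
    })
    ->create();
```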
It wasn’t flawless. Claude accidentally removed email templates and some static asset directories during the upgrade. But that’s what version control is for — I caught it in the diff review and had Claude restore them.
This is the part people miss about AI-assisted development: you still review everything. The AI is fast, but it’s not infallible. Git diff is your safety net.
Fixing N+1 queries and over-engineered relationships
This was the most satisfying part. The original code was riddled with N+1 problems — the classic junior developer mistake.
The news listing page loaded every article, then for each article hit the database again to fetch its categories. On a page showing 30 articles, that’s 31 queries instead of 2:
```php
// Before: N+1 — each article triggers a separate query for categories
foreach ($articles as $article) {
    $article->categories; // 💥 query per article
}
```

```php
// After: eager-loaded in two queries total
$articles = Article::with('categories')->published()->get();
```
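The `published()` call above is a standard Eloquent local scope. A minimal sketch of what it might look like on the model (the `published_at` column is my assumption about the schema):

```php
use Illuminate\Database\Eloquent\Builder;
use Illuminate\Database\Eloquent\Model;

class Article extends Model
{
    // Hypothetical local scope backing the published() call above,
    // assuming a nullable published_at timestamp column
    public function scopePublished(Builder $query): Builder
    {
        return $query->whereNotNull('published_at')
            ->where('published_at', '<=', now());
    }
}
```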
But the problem went deeper than missing with() calls. The original developer had built a generic BasicCategorisation module with four database tables — types, categories, category translations, and a pivot — for what amounted to tagging articles with labels like “Press” and “Blog.”
My plan was straightforward: replace the entire module with a single article_category pivot table and a direct relationship on the model. I described the target schema to Claude, and it executed — one migration, one model update, one controller cleanup.
Four tables became one. The query count dropped. The code became readable.
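The replacement can be sketched as one migration plus a direct relationship (the pivot name mirrors the `article_category` table described above; the `Category` model name is an assumption):

```php
// Migration: one pivot table replaces the four-table categorisation module
Schema::create('article_category', function (Blueprint $table) {
    $table->foreignId('article_id')->constrained()->cascadeOnDelete();
    $table->foreignId('category_id')->constrained()->cascadeOnDelete();
    $table->primary(['article_id', 'category_id']);
});

// On the Article model: a plain many-to-many relationship
public function categories(): BelongsToMany
{
    return $this->belongsToMany(Category::class);
}
```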
Frontend modernisation
Laravel Mix to Vite — faster builds, hot module replacement, and one less legacy config file. I pointed Claude at the webpack configs and told it to convert to Vite.
The news listing page was loading all articles at once — this was the original performance issue I’d been asked to look at. I described what I wanted: a loadMore endpoint with paginated results and vanilla JS scroll detection. Claude wrote the implementation.
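A sketch of what such an endpoint might look like; the method name, partial view path, and page size are assumptions, not the actual implementation:

```php
// Hypothetical loadMore endpoint: returns one page of rendered articles
// plus a flag the vanilla JS scroll handler can use to stop fetching
public function loadMore(Request $request): JsonResponse
{
    $articles = Article::with('categories')
        ->latest('published_at')
        ->paginate(12); // page size is an assumption

    return response()->json([
        'html' => view('Website.partials.articles', compact('articles'))->render(),
        'hasMore' => $articles->hasMorePages(),
    ]);
}
```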
Making the site AI-friendly
This one felt meta — using AI to make the site readable by AI.
I installed spatie/laravel-markdown-response, which automatically serves clean markdown to AI bots (ClaudeBot, GPTBot) instead of raw HTML. Added it as middleware on all public-facing routes:
```php
Route::middleware(ProvideMarkdownResponse::class)
    ->group(function () {
        Route::get('/', [HomeController::class, 'home'])->name('home');
        Route::view('how-it-works', 'Website.how-it-works');
        Route::view('about-us', 'Website.about-us');
        // ...
    });
```
Built a custom preprocessor to strip navigation chrome from the markdown output, so bots get just the content. Claude's first attempt used a broken regex pattern (`</site>` where it meant `</div>`), so I had it rewrite the preprocessor with a DOM-based approach instead; regex can't reliably handle nested HTML anyway.
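The DOM-based stripping can be sketched with PHP's built-in `DOMDocument` and `DOMXPath`; the selectors below are assumptions about the site's markup, not the real implementation:

```php
// Hypothetical preprocessor: remove navigation chrome with a real HTML parser
// instead of regex, which cannot reliably match nested tags
function stripChrome(string $html): string
{
    $dom = new DOMDocument();
    // Suppress warnings from imperfect real-world HTML
    @$dom->loadHTML($html, LIBXML_HTML_NOIMPLIED | LIBXML_HTML_NODEFDTD);
    $xpath = new DOMXPath($dom);

    // Snapshot the node list before removing, so iteration stays stable.
    // These selectors are illustrative; the real site's chrome may differ.
    $chrome = iterator_to_array($xpath->query('//nav | //footer | //header'));
    foreach ($chrome as $node) {
        $node->parentNode->removeChild($node);
    }

    return $dom->saveHTML();
}
```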
Code quality with Rector and Pint
A framework upgrade is a good excuse to modernise the PHP itself. I used two tools that pair exceptionally well with AI:
Rector handled the automated refactoring — adding return type declarations, enforcing early returns, applying code quality rules, and privatising class members that had no business being public. I chose the rule sets and had Claude run them incrementally, fixing the edge cases that automated tooling can’t handle on its own.
```php
// Before: no types, no strict mode
function getArticles($categoryId) {
    $articles = Article::where('category_id', $categoryId)->get();
    if ($articles->count() > 0) {
        return $articles;
    }
    return collect();
}
```

```php
// After: typed, strict, early return
function getArticles(int $categoryId): Collection
{
    return Article::query()->where('category_id', $categoryId)->get();
}
```
Laravel Pint handled formatting — consistent code style across every file with zero manual effort. One command, entire codebase aligned.
The combination is powerful: Rector transforms the structure, Pint normalises the style, and Claude reviews the output to catch anything the tools missed. I also styled the error pages (404, 500, 503) to match the website design instead of showing Laravel’s default pages.
What I learned
AI is best at the tedious parts. Framework upgrades, file structure changes, boilerplate — these are tasks where Claude saved hours. It knows Laravel 12’s conventions and applies them correctly.
AI executes what you’ve been putting off. I knew the N+1 queries were there. I knew the categorisation module was over-engineered. But fixing them manually was never urgent enough to prioritise. With Claude, I described the fix and it was done in minutes.
Planning before executing is non-negotiable. For each major change, I wrote a clear plan describing the target state and gave it to Claude. Reviewing its output against my plan takes minutes. Fixing a bad implementation takes hours.
Always review the diff. Claude removed files it shouldn’t have. It wrote a broken regex. These aren’t failures of the tool — they’re reminders that AI is a pair programmer, not an autopilot.
The numbers
| Metric | Before | After |
|---|---|---|
| PHP | 7.4 | 8.4 |
| Laravel | 9 | 12 |
| Lines changed | — | 141K added, 1.5M deleted |
| Category tables | 4 | 1 pivot |
| Build tool | Mix | Vite |
| News loading | All at once | Infinite scroll |
| AI readability | Raw HTML | Markdown responses |
What would have been a multi-week migration project was completed in a single focused day. The key was breaking the work into discrete, reviewable chunks — each with a clear plan and a verifiable outcome.
AI won’t replace the developer who understands why the migration matters. But it handles the how far faster than I could alone.
Then I did it again the next day
The day after finishing this migration, I tackled a much larger project — a company website with a full page builder CMS, 3,800+ commits of history, and significantly more complexity. Same approach, same tool.
Rector again did the heavy lifting for PHP modernisation: adding type declarations, early returns, code quality rules, and privatisation across the entire codebase. I configured the rule sets and had Claude run them incrementally, fixing the edge cases Rector couldn't handle on its own (like $this usage in static methods and incorrect return types).
The same patterns applied: eager loading fixes, Pint for formatting, markdown responses for AI bots, upgrading stale packages. But with a bigger codebase, the time savings were even more dramatic. 33 commits in a single day, touching 416 files.
What I’d do differently next time
If I had to do this again on a similar project — a small site with mostly static pages and a simple CMS — I’d skip the in-place upgrade entirely. I’d scaffold a fresh Laravel 12 project and use AI to migrate the logic, views, and data across piece by piece.
In-place upgrades carry baggage. You’re constantly working around old directory structures, deprecated configs, and leftover files that shouldn’t be there. A fresh project gives you the correct skeleton from the start — no restoring accidentally deleted files, no orphaned middleware registrations, no ghost configs from three Laravel versions ago.
The migration itself would be the same structured sessions: move the routes, port the controllers, copy the views, run the migrations. But you’d start clean and only bring over what you actually need. With AI handling the boilerplate translation, the overhead of starting fresh is minimal — and the result is a codebase with zero legacy residue.