Add AI-powered fixes to your Laravel error pages
I've been wanting to play around with openai-php/laravel ever since it was first released. Then the other day I learned that you can inject text containing possible solutions into Laravel's error page, and I thought that's a great use of AI. So here goes my first attempt, both at playing with OpenAI and at playing with Solutions for Laravel Ignition.
A little update: I've since made a Composer package out of this.
Laravel Ignition and Solutions
The error pages in Laravel are powered by the spatie/laravel-ignition package. This package has the ability to show possible solutions in the green box at the top right, as you can see below:
What I did not know until quite recently is that you can provide your own solutions by registering a SolutionProvider in your ServiceProvider. This is how I'm going to provide AI-powered fixes on my error pages.
Prerequisites
First of all, you need an OpenAI API key. Keep in mind that while signing up for OpenAI is free, actually sending prompts to them is not: you'll be charged by the number of tokens ('characters') they process for you. However, when you sign up you get $18 worth of credit, without needing to provide a credit card. So far, I've sent about 100 exception messages their way and spent about $1 of that credit, which works out to roughly $0.01 per exception processed - so you shouldn't be getting any shock bills.
With that out of the way, sign up at openai.com/api/ and get your API key, then provide it in your .env file:
OPENAI_API_KEY={API-KEY}
Then install the openai-php/laravel package and publish the configuration file:
composer require openai-php/laravel
php artisan vendor:publish --provider="OpenAI\Laravel\ServiceProvider"
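For reference, the published configuration file simply reads the key from your environment. At the time of writing it looks roughly like this (check your own published copy, as the package may add options over time):

```php
<?php

// config/openai.php (published by openai-php/laravel) - roughly:
return [
    'api_key' => env('OPENAI_API_KEY'),
    'organization' => env('OPENAI_ORGANIZATION'),
];
```

This is also why the registration check later in this post can use config('openai.api_key') to detect whether a key has been provided.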
Scaffold your Solution
Now let's start writing some code. The OpenAI Solution will consist of two parts:
Firstly, the OpenAiSolution class will make the request to OpenAI and return a solution. This class must implement the Spatie\Ignition\Contracts\Solution interface. Here is the basic scaffold - we'll add functionality to it later:
namespace App\Exceptions\Solutions;

use Spatie\Ignition\Contracts\Solution;
use Throwable;

class OpenAiSolution implements Solution
{
    public function __construct(
        private readonly Throwable $throwable,
    ) {
    }

    public function getSolutionTitle(): string
    {
        // TODO: Implement getSolutionTitle() method.
    }

    public function getSolutionDescription(): string
    {
        // TODO: Implement getSolutionDescription() method.
    }

    public function getDocumentationLinks(): array
    {
        // TODO: Implement getDocumentationLinks() method.
    }
}
Secondly, you'll need a provider that will call the above OpenAiSolution class. This OpenAiSolutionProvider implements the Spatie\Ignition\Contracts\HasSolutionsForThrowable interface, and is really simple:
namespace App\Exceptions\Solutions;

use Spatie\Ignition\Contracts\HasSolutionsForThrowable;
use Throwable;

class OpenAiSolutionProvider implements HasSolutionsForThrowable
{
    public function canSolve(Throwable $throwable): bool
    {
        // We want to have an AI-powered solution for every exception.
        // If you want to limit yourself, here would be the place to
        // put the logic to do so.
        return true;
    }

    public function getSolutions(Throwable $throwable): array
    {
        return [new OpenAiSolution($throwable)];
    }
}
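If you do want to limit where AI suggestions are fetched, here is one possible sketch of a more defensive canSolve(). The exact restrictions (which environments, which exception types to skip) are entirely up to you - these are illustrative choices, not part of the original setup:

```php
use Illuminate\Validation\ValidationException;

public function canSolve(Throwable $throwable): bool
{
    // Only ask OpenAI during local development, and skip validation
    // exceptions, which usually aren't bugs that need a "fix".
    return app()->environment('local')
        && ! $throwable instanceof ValidationException;
}
```

Skipping noisy exception types also keeps your token spend down, since every distinct exception you solve is a paid API call.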
Finally, register your OpenAiSolutionProvider in the ServiceProvider's register method. We'll wrap it in an if condition, so that it won't be registered unless you have provided an OpenAI API key:
namespace App\Providers;

use App\Exceptions\Solutions\OpenAiSolutionProvider;
use Illuminate\Support\ServiceProvider;
use Spatie\Ignition\Contracts\SolutionProviderRepository;

class AppServiceProvider extends ServiceProvider
{
    public function register()
    {
        if (config('openai.api_key')) {
            app(SolutionProviderRepository::class)
                ->registerSolutionProvider(OpenAiSolutionProvider::class);
        }
    }
}
Getting fixes from OpenAI
OK, this was all fairly boring boilerplate stuff - though it is exciting that you can provide your own Solutions to the Laravel Ignition error pages so easily. Now, how do we actually get OpenAI to give us solutions?
Firstly, we'll need to write our prompt, which is what we'll send to OpenAI to get a response. What information should the prompt contain? Well, we need to tell OpenAI the error message, as well as a reasonable amount of information about the exception. This shouldn't be too much (because OpenAI charges us depending on the length of the prompt), but it should be enough to provide meaningful context. Here is what I settled on, after some trial and error:
You are a very good PHP/Laravel developer. Use the following exception message, together with the context provided, to find a possible fix.
Exception Message: {!! $message !!}
File: {!! $file !!}
Line: {!! $line !!}
Code snippet including line numbers:
{!! $snippet !!}
Possible Fix:
As you can see, the prompt ends with a place for OpenAI to provide the fix / solution.
We'll save this as a Blade file at resources/views/exceptions/openai/prompt.blade.php.
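To make the template concrete, here's roughly what a rendered prompt might look like for a hypothetical undefined-variable exception (file, line, and snippet are all made up for illustration):

```
You are a very good PHP/Laravel developer. Use the following exception message, together with the context provided, to find a possible fix.
Exception Message: Undefined variable $user
File: app/Http/Controllers/ProfileController.php
Line: 17
Code snippet including line numbers:
15     public function show()
16     {
17         return view('profile.show', ['user' => $user]);
18     }
Possible Fix:
```

The model then completes the text after "Possible Fix:", which is exactly the string we'll display on the error page.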
Now we need to fill in the blanks in the OpenAiSolution class:
namespace App\Exceptions\Solutions;

use Illuminate\Support\Facades\Cache;
use OpenAI\Laravel\Facades\OpenAI;
use Spatie\Backtrace\Backtrace;
use Spatie\Backtrace\Frame;
use Spatie\Ignition\Contracts\Solution;
use Throwable;

class OpenAiSolution implements Solution
{
    private string $prompt;

    public function __construct(
        private readonly Throwable $throwable,
    ) {
        $this->prompt = $this->preparePrompt();
    }

    /**
     * The solution title. This will be shown on the error page.
     */
    public function getSolutionTitle(): string
    {
        return 'OpenAI suggestion';
    }

    /**
     * Get the solution text. Use caching both because OpenAI is slow,
     * and because we want to save cost.
     */
    public function getSolutionDescription(): string
    {
        return Cache::remember(
            'open-ai-solution-'.md5($this->prompt),
            now()->addWeek(),
            fn () => OpenAI::completions()->create([
                'model' => 'text-davinci-003',
                'max_tokens' => 200,
                'temperature' => 0,
                'prompt' => $this->prompt,
            ])->choices[0]->text
        );
    }

    public function getDocumentationLinks(): array
    {
        return [];
    }

    /**
     * Get the prompt that we'll send to OpenAI.
     */
    private function preparePrompt(): string
    {
        $finalApplicationFrame = $this->finalApplicationFrame($this->throwable);

        return (string) view('exceptions.openai.prompt', [
            'message' => $this->throwable->getMessage(),
            'line' => $finalApplicationFrame->lineNumber,
            'file' => $finalApplicationFrame->file,
            'snippet' => collect($finalApplicationFrame->getSnippet(10))
                ->map(fn ($line, $number) => $number.' '.$line)
                ->join(PHP_EOL),
        ]);
    }

    /**
     * If possible, get the final application frame before the error was thrown.
     */
    private function finalApplicationFrame(Throwable $throwable): Frame
    {
        $backtrace = Backtrace::createForThrowable($throwable)->applicationPath(base_path());
        $frames = $backtrace->frames();

        return $frames[$backtrace->firstApplicationFrameIndex() ?? 0];
    }
}
There is quite a bit going on here, so let's break it down:
- getSolutionTitle(): This one is simple: it's just the title of the solution shown on the error page.
- getSolutionDescription(): This is what actually gets the suggestion text by sending our prompt to OpenAI. We use caching to reduce load and cost, and to increase speed. We provide the following arguments to the API:
  - model: OpenAI has multiple models to choose from. text-davinci-003 is the most advanced, so we go for that.
  - max_tokens: The maximum length of any response. I just picked a number out of my hat here - not too long, not too short.
  - temperature: Let me copy and paste this from the documentation: "What sampling temperature to use, between 0 and 2. Higher values like 0.8 will make the output more random, while lower values like 0.2 will make it more focused and deterministic." Random seems bad when it comes to fixing bugs, so I've chosen 0.
  - prompt: The actual prompt we are sending - see preparePrompt() below for how we put it together.
- getDocumentationLinks(): This is what powers the "Database: Getting Started docs" link in the screenshot above. We don't have any documentation to provide here, so we return an empty array.
- preparePrompt(): This returns the actual prompt string to send to OpenAI. We pass the exception message, the file name and line number, and a short code snippet to the Blade file shown above, and return the rendered view as a string.
- finalApplicationFrame(): OpenAI should consider our application code, rather than vendor code, when suggesting a fix, so we supply the final frame of application code as the code snippet.
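One consequence of the md5-of-prompt cache key is worth spelling out: two exceptions that render to the same prompt (same message, file, line, and snippet) share one cached suggestion, so refreshing the error page won't trigger another paid API call for a week. A minimal sketch of the derivation, with $prompt standing in for the rendered prompt string:

```php
// Identical prompts always map to the same cache entry,
// so repeated occurrences of the same error cost nothing extra.
$key = 'open-ai-solution-'.md5($prompt);
```

The flip side is that if you change the prompt template, every existing cache entry is effectively invalidated, since the rendered prompts (and therefore the keys) change.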
Testing our AI powered suggestions
OK, now that we've put this all together, it's time to put our AI-powered suggestions through their paces.
Here is the simplest test I could think of - forgetting the parentheses for a function call:
namespace App\Http\Controllers;

use App\Models\User;
use Illuminate\Support\Facades\Cache;

class UsersController extends Controller
{
    public function index()
    {
        return \App\Models\User::all;
    }
}
Amazing! This actually works! Let's try another really basic mistake that we've all made a billion times before - forgetting a semicolon:
namespace App\Http\Controllers;

use App\Models\User;
use Illuminate\Support\Facades\Cache;

class UsersController extends Controller
{
    public function index()
    {
        $users = \App\Models\User::all()
        return $users;
    }
}
Correct, but slightly less helpful: If I've missed the ; when typing the original code, I'm likely to miss it when reading this suggestion, too. But hopefully you'll just do what every good developer does, and simply copy and paste the solution wholesale, without thinking about it.
OK, both of these are simple syntax errors that a static analyser or a good IDE would've identified and provided a fix for, too. You don't really need much 'intelligence' for this.
Let's try something a bit more complex: I'm using the filesystem driver for caching, and don't have Redis installed on my machine. What happens when I try to use it?
namespace App\Http\Controllers;

use App\Models\User;
use Illuminate\Support\Facades\Cache;

class UsersController extends Controller
{
    public function index()
    {
        return Cache::store('redis')->remember(
            'all-users',
            now()->addMinutes(5),
            fn () => User::all(),
        );
    }
}
Hm. The error message is Class "Redis" not found, and on the face of it the suggestion to use the class seems kinda reasonable. Sadly, it's not correct though. Anyway, I'm a good developer, so I just copy and paste the solution wholesale, without thinking about it:
namespace App\Http\Controllers;

use App\Models\User;
use Illuminate\Support\Facades\Cache;
use Illuminate\Support\Facades\Redis;

class UsersController extends Controller
{
    public function index()
    {
        return Cache::store('redis')->remember(
            'all-users',
            now()->addMinutes(5),
            fn () => User::all(),
        );
    }
}
Boom. That's what I was hoping to see! Now it's suggesting I install the Redis PHP extension, which is the correct fix. (Though the way it tells me to install it is not applicable to my system, given I'm running this on macOS. But we didn't supply this info to OpenAI, so it couldn't have known.)
Summary
In this post we've seen how to use OpenAI to provide AI-powered fixes on Laravel Ignition error pages. This actually works pretty well, but there are a few caveats:
- This isn't exactly quick: It takes about 2-5 seconds per prompt, which isn't terrible, but I'm impatient. I don't think I could cope with this in real life.
- The other problem is that we are sending all our errors (including code snippets) to a third party. Now, this isn't a particularly unusual thing to do: If you use any log aggregation / error tracking software, you are already doing that (and sending far more info their way than this code sends to OpenAI). But OpenAI uses our prompts for training purposes, which kinda opens a whole new can of worms, and I don't think I'd be happy to use this for work. (Definitely don't do this without sign-off from management - and compliance, if relevant!)
- Most of all: Don't trust an AI to fix your code. I bet it would break a lot of things in the long run.
But it's fun to play with, surprisingly easy to do, and actually provides quite helpful suggestions / fixes.