Laravel AI SDK: A Practical Guide with Real Code Examples

• by Tobias Schäfer • 7 min read

When Taylor Otwell unveiled the official Laravel AI SDK at Laracon India 2026, my first thought was: “I could have used this two years ago.”

Everything the SDK now solves elegantly – agents, structured output, provider failover, embeddings – I had already built myself for mitKai. With considerably more pain.

In this article, I’ll walk you through the SDK with concrete examples. Not Hello World demos, but code you can use in real projects.


Installation and Setup

composer require laravel/ai
php artisan vendor:publish --provider="Laravel\Ai\AiServiceProvider"
php artisan migrate

In your .env:

ANTHROPIC_API_KEY=your-key-here
OPENAI_API_KEY=your-key-here

Configuration lives in config/ai.php. You can register as many providers as you like – OpenAI, Anthropic, Gemini, Groq, xAI, DeepSeek, Mistral, Ollama, and more.
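
The exact keys will differ by release, but conceptually config/ai.php maps provider names to credentials and defaults. A rough sketch of the shape – the key names here are assumptions, not the published config:

// config/ai.php – illustrative sketch only; key names are assumptions
return [
    'default' => env('AI_PROVIDER', 'anthropic'),

    'providers' => [
        'anthropic' => [
            'api_key' => env('ANTHROPIC_API_KEY'),
        ],
        'openai' => [
            'api_key' => env('OPENAI_API_KEY'),
        ],
        'ollama' => [
            'base_url' => env('OLLAMA_BASE_URL', 'http://localhost:11434'),
        ],
    ],
];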


Agents: AI with Personality

Agents are the heart of the SDK. Instead of sending loose prompts to an API, you define specialized classes with their own instructions, tools, and output formats.

Creating an Agent

php artisan make:agent ProductDescriptionWriter

This generates a class you can customize to your needs:

<?php

namespace App\Ai\Agents;

use Laravel\Ai\Contracts\Agent;
use Laravel\Ai\Contracts\HasStructuredOutput;
use Laravel\Ai\Contracts\HasTools;
use Laravel\Ai\Concerns\Promptable;
use Laravel\Ai\Attributes\Provider;
use Laravel\Ai\Attributes\Temperature;
use Laravel\Ai\Attributes\MaxTokens;
use Laravel\Ai\JsonSchema;

#[Provider('anthropic')]
#[Temperature(0.7)]
#[MaxTokens(4096)]
class ProductDescriptionWriter implements Agent, HasStructuredOutput, HasTools
{
    use Promptable;

    public function __construct(
        private string $shopName,
        private string $targetAudience,
        private string $tone = 'professional yet approachable',
    ) {}

    public function instructions(): string
    {
        return <<<PROMPT
            You are an experienced e-commerce copywriter for the shop "{$this->shopName}".
            Target audience: {$this->targetAudience}.
            Tone: {$this->tone}.

            Rules:
            - Write exclusively based on the provided product data.
            - Do NOT invent features that aren't in the data.
            - Avoid generic phrases like "high-quality" or "perfect companion".
            - Lead with the concrete benefit, not a cliché.
            - Weave relevant keywords naturally into the text.
        PROMPT;
    }

    public function schema(JsonSchema $schema): array
    {
        return [
            'title' => $schema->string()
                ->description('SEO-optimized product title')
                ->required(),
            'description' => $schema->string()
                ->description('Product description as HTML with semantic structure')
                ->required(),
            'meta_description' => $schema->string()
                ->description('Meta description, max 155 characters')
                ->required(),
            'keywords' => $schema->array()
                ->items($schema->string())
                ->description('Relevant SEO keywords')
                ->required(),
        ];
    }

    public function tools(): iterable
    {
        return [
            new \App\Ai\Tools\FetchCategoryContext,
        ];
    }
}

Using the Agent

use App\Ai\Agents\ProductDescriptionWriter;

$agent = new ProductDescriptionWriter(
    shopName: 'OutdoorPro',
    targetAudience: 'Experienced hikers, 30-55 years',
    tone: 'competent, direct, no fluff',
);

$response = $agent->prompt(
    "Write a product description for:
    Name: Alpine Trekking Backpack 45L
    Material: Nylon 210D, waterproof
    Weight: 1.8 kg
    Features: Back ventilation, adjustable hip belt
    Category: Multi-day tours"
);

// Structured access
$response['title'];            // "Alpine Trekking Backpack 45L – 1.8 kg for..."
$response['description'];      // "<h2>...</h2><p>...</p>"
$response['meta_description']; // "The Alpine 45L weighs just 1.8 kg and..."
$response['keywords'];         // ["trekking-backpack", "45l", "multi-day-tours"]

Why this matters: Without structured output, you get free text that you then have to parse. With the schema, you get exactly the format you need – every time.
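
And because the keys mirror the schema() definition, the result can flow straight into Eloquent – no json_decode gymnastics. A minimal sketch; the column names and the keywords array cast are assumptions about your own model:

use Illuminate\Support\Str;

$product->update([
    'seo_title'        => $response['title'],
    'description_html' => $response['description'],
    'meta_description' => Str::limit($response['meta_description'], 155, ''), // enforce the 155-char cap
    'keywords'         => $response['keywords'], // assumes a JSON column cast to array
]);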


Tools: When the Agent Needs More Context

Sometimes an agent needs information that isn’t in the prompt. That’s what tools are for – functions the agent can call on its own.

Creating a Tool

php artisan make:tool FetchCategoryContext

This scaffolds a tool class – here filled in for our category lookup:

<?php

namespace App\Ai\Tools;

use App\Models\Category;
use Illuminate\Contracts\Support\Stringable;
use Laravel\Ai\Contracts\Tool;
use Laravel\Ai\JsonSchema;
use Laravel\Ai\Tools\Request;

class FetchCategoryContext implements Tool
{
    public function description(): Stringable|string
    {
        return 'Fetches context about a product category: '
             . 'target audience, price segment, and related categories.';
    }

    public function schema(JsonSchema $schema): array
    {
        return [
            'category_name' => $schema->string()
                ->description('Name of the category')
                ->required(),
        ];
    }

    public function handle(Request $request): Stringable|string
    {
        $category = Category::where('name', 'like', "%{$request['category_name']}%")
            ->with('parent')
            ->first();

        if (! $category) {
            return 'Category not found.';
        }

        return json_encode([
            'name' => $category->name,
            'audience' => $category->target_audience,
            'price_segment' => $category->price_segment,
            'parent' => $category->parent?->name,
            'product_count' => $category->products()->count(),
        ]);
    }
}

The agent decides on its own whether and when to call the tool. If you give it a product name with a category, it will load the category context to write better copy.


Embeddings and Vector Search: Semantic Instead of Keyword-Based

For mitKai, deduplication was one of the biggest challenges: how do I ensure that 500 backpack descriptions don’t all sound the same? Embeddings solve this elegantly.

Setup with pgvector

// Migration
use Illuminate\Database\Schema\Blueprint;
use Illuminate\Support\Facades\Schema;

Schema::ensureVectorExtensionExists();

Schema::create('product_descriptions', function (Blueprint $table) {
    $table->id();
    $table->foreignId('product_id')->constrained();
    $table->text('description');
    $table->vector('embedding', dimensions: 1536)->index();
    $table->timestamps();
});

Generating and Storing Embeddings

use Illuminate\Support\Str;

$description = 'The Alpine 45L weighs just 1.8 kg and offers...';
$embedding = Str::of($description)->toEmbeddings();

ProductDescription::create([
    'product_id' => $product->id,
    'description' => $description,
    'embedding' => $embedding,
]);

Finding Similar Descriptions

// Before saving: is the new text too similar to existing ones?
$similar = ProductDescription::query()
    ->whereVectorSimilarTo('embedding', $newEmbedding, minSimilarity: 0.92)
    ->where('product_id', '!=', $product->id)
    ->limit(5)
    ->get();

if ($similar->isNotEmpty()) {
    // Text is too similar to existing descriptions
    // → Regenerate with a varied prompt
}

Practical tip: A similarity of 0.92+ means the texts are nearly identical. We use this as quality assurance: if a new text is too close to an existing one, it’s automatically regenerated with a varied prompt.
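
Putting the pieces together, the quality gate can look roughly like this. It only uses the calls shown above; the three-attempt limit and the "different angle" nudge are my own choices, and $basePrompt, $agent, and $product are assumed to come from the surrounding code:

use App\Models\ProductDescription;
use Illuminate\Support\Str;

$prompt = $basePrompt; // your original generation prompt for this product

for ($attempt = 0; $attempt < 3; $attempt++) {
    $response  = $agent->prompt($prompt);
    $embedding = Str::of($response['description'])->toEmbeddings();

    $tooSimilar = ProductDescription::query()
        ->whereVectorSimilarTo('embedding', $embedding, minSimilarity: 0.92)
        ->where('product_id', '!=', $product->id)
        ->exists();

    if (! $tooSimilar) {
        ProductDescription::create([
            'product_id'  => $product->id,
            'description' => $response['description'],
            'embedding'   => $embedding,
        ]);

        break;
    }

    // Too close to existing texts: nudge the prompt and regenerate.
    $prompt .= "\nUse a different angle and sentence structure than the existing descriptions.";
}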


Provider Failover: Resilience Built In

In production, you don’t want your shop to stall because OpenAI hit a rate limit. Failover is built in:

$response = $agent->prompt(
    'Write a description for...',
    provider: ['anthropic', 'openai', 'gemini'],
);

The SDK tries Anthropic first. If that fails – next provider. No custom retry handling needed.
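
If every provider in the list fails, the call still ends in an exception. For critical paths it's worth wrapping it defensively – a sketch; the exact exception class the SDK throws is an assumption, so I catch Throwable, and $product is assumed from the surrounding code:

use Illuminate\Support\Facades\Log;

try {
    $response = $agent->prompt(
        'Write a description for...',
        provider: ['anthropic', 'openai', 'gemini'],
    );
} catch (\Throwable $e) {
    // All providers failed – keep the existing description and flag the product for a later retry.
    Log::warning('AI generation failed on all providers', [
        'product_id' => $product->id,
        'error'      => $e->getMessage(),
    ]);
}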


Bulk Generation with Queues

For bulk operations (e.g., 5,000 product descriptions), you don’t want to work synchronously. The SDK integrates seamlessly with Laravel’s queue system:

use App\Ai\Agents\ProductDescriptionWriter;

// Queue a single product
$agent->queue("Write a description for: {$productData}")
    ->then(function ($response) use ($product) {
        $product->update([
            'description' => $response['description'],
            'meta_description' => $response['meta_description'],
        ]);
    })
    ->catch(function (Throwable $e) use ($product) {
        Log::error("Description failed for product {$product->id}", [
            'error' => $e->getMessage(),
        ]);
    });

For thousands of products, wrap it in a job:

// In a BatchJob
$products->each(function ($product) use ($agent) {
    $agent->queue($this->buildPrompt($product))
        ->then(fn ($response) => $this->saveDescription($product, $response));
});
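
Spelled out, such a job might look like this – a sketch with a hypothetical GenerateDescriptionsJob; buildPrompt() and saveDescription() are simplified stand-ins for the helpers referenced above:

use App\Ai\Agents\ProductDescriptionWriter;
use App\Models\Product;
use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;

class GenerateDescriptionsJob implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    /** @param int[] $productIds */
    public function __construct(private array $productIds) {}

    public function handle(): void
    {
        $agent = new ProductDescriptionWriter(
            shopName: 'OutdoorPro',
            targetAudience: 'Experienced hikers, 30-55 years',
        );

        Product::whereIn('id', $this->productIds)->each(function (Product $product) use ($agent) {
            $agent->queue($this->buildPrompt($product))
                ->then(fn ($response) => $this->saveDescription($product, $response));
        });
    }

    private function buildPrompt(Product $product): string
    {
        return "Write a product description for:\nName: {$product->name}"; // simplified
    }

    private function saveDescription(Product $product, $response): void
    {
        $product->update(['description' => $response['description']]); // simplified
    }
}

Dispatching in chunks keeps any single job from carrying thousands of IDs (the needs_description column is an assumption):

Product::where('needs_description', true)
    ->pluck('id')
    ->chunk(500)
    ->each(fn ($ids) => GenerateDescriptionsJob::dispatch($ids->all()));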

Streaming: Real-Time Output in the Frontend

When you need a chat interface or live preview:

// In a route file, e.g. routes/web.php
use Illuminate\Http\Request;

Route::get('/preview-description', function (Request $request) {
    $agent = new ProductDescriptionWriter(
        shopName: $request->shop_name,
        targetAudience: $request->audience,
    );

    return $agent->stream($request->prompt);
});

For the Vercel AI SDK (e.g., with a Vue or React frontend):

return $agent->stream($request->prompt)->usingVercelDataProtocol();

Testing: Making AI Features Testable

The best part of the SDK for me as a developer: you can finally test AI features properly.

use App\Ai\Agents\ProductDescriptionWriter;

public function test_generates_product_description(): void
{
    ProductDescriptionWriter::fake([
        json_encode([
            'title' => 'Alpine Trekking Backpack 45L',
            'description' => '<h2>Light and spacious</h2><p>...</p>',
            'meta_description' => 'The Alpine 45L for multi-day tours.',
            'keywords' => ['trekking-backpack', '45l'],
        ]),
    ]);

    $response = $this->post('/api/generate-description', [
        'product_id' => $this->product->id,
    ]);

    $response->assertOk();

    ProductDescriptionWriter::assertPrompted(
        fn ($prompt) => $prompt->contains('Alpine 45L')
    );
}

public function test_handles_provider_failure_gracefully(): void
{
    ProductDescriptionWriter::fake()
        ->preventStrayPrompts();

    // Your code can't accidentally make real API calls
}

Why this is a game changer: Before the SDK, we had to mock HTTP calls, manually build JSON responses, and hope the provider API didn’t change. Now: Agent::fake(), and done.


What I Wish I’d Had Sooner

If I were building mitKai from scratch today, I’d use the Laravel AI SDK as the foundation. Specifically, the following would have saved me weeks of development time:

Feature            | Before (built myself)                         | Now (AI SDK)
Provider switching | Custom abstraction with interfaces            | provider: ['anthropic', 'openai']
Structured output  | JSON parsing + validation + error handling    | schema() method with JsonSchema
Deduplication      | Custom embedding pipeline + cosine similarity | whereVectorSimilarTo()
Bulk generation    | Custom queue handling with retry logic        | $agent->queue() with then()/catch()
Tests              | HTTP mocks + fixture files                    | Agent::fake() + assertions

This doesn’t mean you no longer need mitKai – the business logic (shop integration, SEO optimization, HTML generation, contextual deduplication) remains complex. But the foundation underneath is now significantly more solid.


Conclusion

The Laravel AI SDK is not a toy. It’s a thoughtful, production-ready framework for AI integrations that fits seamlessly into the Laravel ecosystem – from queues to testing to Eloquent.

If you use Laravel and are planning AI features, there’s no reason to build your own abstractions anymore.

The three most important things to remember:

  1. Agents are classes, not prompts. Define your AI interactions like any other service.
  2. Structured output makes AI predictable. No more JSON parsing, no surprising formats.
  3. Agent::fake() alone is worth the switch. Testable AI features were long overdue.

In the companion article Laravel Is Becoming an AI Platform – and That Changes More Than You Think, I look at what Laravel’s AI offensive means strategically – for shop owners, agencies, and the PHP ecosystem.