Introducing the Laravel AI SDK

AI should make you faster, not slower. Wrestling with different API endpoints, managing scattered code, and reading through provider-specific documentation defeats the entire point. The Laravel AI SDK, currently in beta, solves this by giving you a unified, Laravel-native way to work with multiple AI providers.

Laravel’s AI software development kit brings OpenAI, Anthropic, Google Gemini, Groq, xAI, and other providers under one API. It handles all the provider-specific details so you can build Laravel apps quickly and easily, from generating text with Claude or ChatGPT to creating images with DALL-E or Gemini, and building complex agent workflows with tools and structured output.

As AI transforms how we build software, we want to ensure no developer is left behind and that everyone can enjoy clean, simple workflows that align with Laravel’s passion for a great developer experience.

To try the Laravel AI SDK, install it via Composer and add your API keys:
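Something like the following (the package name and environment variable names shown are typical conventions; check the documentation for the exact values):

```shell
composer require laravel/ai

# .env — add keys only for the providers you plan to use
OPENAI_API_KEY=sk-...
ANTHROPIC_API_KEY=sk-ant-...
```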

Laravel AI SDK Provider Support

The Laravel AI SDK currently supports these providers:

Text Generation: OpenAI, Anthropic, Gemini, Groq, xAI

Image Generation: OpenAI, Gemini, xAI

Text-to-Speech: OpenAI, ElevenLabs

Speech-to-Text: OpenAI, ElevenLabs

Embeddings: OpenAI, Gemini, Cohere, Jina

Reranking: Cohere, Jina

Files: OpenAI, Anthropic, Gemini

Support for additional providers is planned.

Building with Agents and the Laravel AI SDK

Agents are the core of the Laravel AI SDK. Agents organize your AI interactions into clean, testable classes with clear responsibilities. Generate an agent with an Artisan command:
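For example (the generator name follows Laravel's `make:*` convention and is assumed here):

```shell
php artisan make:agent SupportAgent
```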

This creates a structured class where you define instructions, conversation context, available tools, and output schemas:
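A minimal sketch of such a class. The base class and `instructions` method name are assumptions based on the description above; the `tools` method is referenced later in this post:

```php
<?php

namespace App\Agents;

use Laravel\Ai\Agent; // base class name assumed

class SupportAgent extends Agent
{
    // System instructions sent along with every prompt.
    public function instructions(): string
    {
        return 'You are a concise, friendly support assistant for our app.';
    }

    // Tools the model may call (see "Agent Tools" below).
    public function tools(): array
    {
        return [];
    }
}
```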

Once you've created an agent, you can prompt it using the prompt method:
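For example (the response accessor shown is an assumption):

```php
$response = (new SupportAgent)->prompt('How do I reset my password?');

echo $response->text;
```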

Need to switch providers? Just pass the provider name:
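Something like this, assuming the `prompt` method accepts a named `provider` argument:

```php
$response = (new SupportAgent)->prompt(
    'How do I reset my password?',
    provider: 'anthropic',
);
```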

The AI SDK defaults to sensible model choices (configured in config/ai.php), but you can override them per request. This makes it trivial to test different models and find what works best for your use case.
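A per-request override might look like this (argument name assumed):

```php
$response = (new SupportAgent)->prompt(
    'Summarize this support thread.',
    model: 'claude-haiku-4-5', // overrides the config/ai.php default
);
```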

Agent middleware lets you intercept and modify prompts:
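A hypothetical middleware, following the familiar Laravel `handle($input, $next)` shape — the SDK's actual middleware signature may differ:

```php
class RedactApiKeys
{
    // Scrub anything that looks like a secret before it reaches the provider.
    public function handle(string $prompt, Closure $next)
    {
        $prompt = preg_replace('/sk-[A-Za-z0-9-]+/', '[redacted]', $prompt);

        return $next($prompt);
    }
}
```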

Attributes can be used to configure additional agent behavior:
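For illustration only — the attribute names here are assumptions:

```php
#[Provider('openai')]
#[Model('gpt-5')]
class SupportAgent extends Agent
{
    // ...
}
```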

Structured Output

When you need more than free-form text, agents support structured output using JSON schemas. This is perfect for building features like lead qualification systems, data extraction pipelines, or any workflow that needs predictable, parseable responses.

To instruct your agent to return structured output, implement the HasStructuredOutput interface:
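A sketch, assuming the interface requires a `schema` method returning a JSON schema (the namespace and method name are assumptions):

```php
use Laravel\Ai\Contracts\HasStructuredOutput;

class LeadQualifier extends Agent implements HasStructuredOutput
{
    // JSON schema describing the shape of every response.
    public function schema(): array
    {
        return [
            'type' => 'object',
            'properties' => [
                'name'    => ['type' => 'string'],
                'quality' => ['type' => 'string', 'enum' => ['low', 'medium', 'high']],
            ],
            'required' => ['name', 'quality'],
        ];
    }
}
```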

Then, after prompting the agent, you can access the structured response like an array:
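For example (the agent class here is hypothetical):

```php
$response = (new LeadQualifier)->prompt($submission);

// Structured responses are array-accessible.
$name    = $response['name'];
$quality = $response['quality'];
```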

Failover Support

AI providers occasionally experience outages, rate limits, or service disruptions. Model failover automatically switches to a backup provider when your primary choice is unavailable, keeping your application running smoothly.

Pass an array of providers to automatically failover if one is unavailable:
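Assuming the same `provider` argument accepts an ordered array of fallbacks:

```php
$response = (new SupportAgent)->prompt(
    'Draft a reply to this ticket.',
    provider: ['openai', 'anthropic'], // tried in order
);
```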

If OpenAI is rate-limited or down, the request automatically fails over to Anthropic.

Agent Tools

You can also give agents access to additional functionality through tools. Create a tool with an Artisan command:
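For example (the generator name is assumed):

```shell
php artisan make:tool WeatherLookup
```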

Tools define a schema and handle method:
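A sketch of a tool class — the `schema` and `handle` methods come from the description above, while everything else (argument handling, the `Weather` helper) is illustrative:

```php
class WeatherLookup
{
    // JSON schema describing the tool's arguments to the model.
    public function schema(): array
    {
        return [
            'type' => 'object',
            'properties' => [
                'city' => ['type' => 'string'],
            ],
            'required' => ['city'],
        ];
    }

    // Invoked when the model decides to call the tool.
    public function handle(string $city): string
    {
        return Weather::for($city)->summary(); // your own lookup logic
    }
}
```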

Return tools from your agent's tools method, and the model will use them automatically when needed.

Semantic Search and RAG

The AI SDK includes a SimilaritySearch tool that searches vector embeddings in your database. This means you can build knowledge bases and let agents answer questions using your own data with retrieval-augmented generation (RAG). The SimilaritySearch tool uses brand new vector comparison methods offered by Laravel’s query builder:
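The query builder method name below is hypothetical — it illustrates the idea of ordering rows by vector distance, not the SDK's published API:

```php
$matches = Document::query()
    ->orderByVectorDistance('embedding', $queryEmbedding)
    ->limit(5)
    ->get();
```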

Provider tools like WebSearch, WebFetch, and FileSearch let agents access real-time information and stored documents. These tools are executed by the provider itself, enabling powerful capabilities like searching the web for current events or fetching content from URLs.

Images, Audio, and More

Need to generate marketing visuals, product mockups, or user avatars on the fly? Generate images with a clean API:
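Something like the following (the facade and method names are assumptions):

```php
use Laravel\Ai\Facades\Image;

$image = Image::generate('A minimalist line-art logo of a lighthouse');
```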

The SDK supports OpenAI, Gemini, and xAI for image generation. You can even pass reference images as attachments to transform existing photos, and pick their format (for example, square, landscape, or portrait).

Plus, you can create voiceovers, accessibility features, or conversational interfaces by generating audio from text:
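A sketch, with an assumed `Audio` facade:

```php
$speech = Audio::speak('Welcome back! Here is your daily summary.');
```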

Build automated meeting notes, call center analytics, or podcast transcription features by transcribing audio files with speaker diarization:
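For example (method, option, and accessor names assumed):

```php
$transcript = Audio::transcribe('recordings/standup.mp3', diarize: true);

foreach ($transcript->segments as $segment) {
    echo "{$segment->speaker}: {$segment->text}\n";
}
```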

Streaming

Stream responses in real-time using server-sent events:
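A sketch of a streaming route — the `stream` and `toEventStream` names are assumptions:

```php
Route::post('/chat', function (Request $request) {
    return (new SupportAgent)
        ->stream($request->input('prompt'))
        ->toEventStream(); // emits server-sent events as tokens arrive
});
```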

Or, broadcast streamed events directly to users using Laravel Reverb and Echo:
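Something like this (the broadcasting hook shown is an assumption):

```php
(new SupportAgent)
    ->stream($prompt)
    ->broadcastOn("chat.{$user->id}"); // delivered via Reverb + Echo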

Deep Laravel Integration

The Laravel AI SDK integrates deeply with the framework. For example, you can quickly store generated files with one line:
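For example (the `store` signature is assumed, mirroring Laravel's filesystem conventions):

```php
Image::generate('Launch announcement banner')->store('banners', disk: 's3');
```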

This works across all of Laravel's filesystem drivers: local, S3, DigitalOcean Spaces, and Laravel Cloud's Object Storage.

Or, leverage Laravel's powerful queue system to queue the generation of images, audio, transcriptions, and more:
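A sketch, assuming a `queue` method on the pending generation:

```php
Audio::transcribe('recordings/board-meeting.mp3')
    ->queue(); // runs on your configured queue connection
```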

Conversation Memory

The RemembersConversations trait automatically stores and retrieves conversation history. The Laravel AI SDK provides database migrations to create the tables necessary to store conversations and messages:
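For example (the publish tag name is assumed):

```shell
php artisan vendor:publish --tag=ai-migrations
php artisan migrate
```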

Start a conversation:
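Something like this — the `conversationId` accessor is an assumption:

```php
$agent = new SupportAgent; // uses the RemembersConversations trait

$response = $agent->prompt('Hi! My name is Rosa.');

$conversationId = $response->conversationId;
```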

Continue it later:
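A sketch, with an assumed `continue` method:

```php
$agent->continue($conversationId)
    ->prompt('What did I say my name was?'); // prior messages are replayed
```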

Embeddings and Vector Search

To power semantic search, recommendation engines, or similarity detection in your application, generate vector embeddings using the toEmbeddings method on Laravel's Stringable class:
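For example:

```php
use Illuminate\Support\Str;

$embedding = Str::of('How do I upgrade my plan?')->toEmbeddings();
```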

Embeddings can be cached to avoid redundant API calls. Enable caching globally in config/ai.php or per request:
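A per-request sketch (the parameter name is assumed):

```php
$embedding = Str::of($query)->toEmbeddings(cache: true);
```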

Improve search relevance by reranking results based on semantic similarity rather than keyword matching:
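The facade and signature below are assumptions for illustration:

```php
$ranked = Reranker::rerank(
    query: 'affordable enterprise MySQL hosting',
    documents: $results->pluck('summary')->all(),
);
```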

Collections have also received a convenient rerank macro:
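For example (the macro exists per the above; the `by` option is an assumption):

```php
$relevant = $leads->rerank('needs dedicated database hosting', by: 'summary');
```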

Files and Vector Stores

Store large documents, PDFs, or images with AI providers like OpenAI and Google Gemini to reference them across multiple conversations without re-uploading:
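Something like this (the facade name is assumed):

```php
$file = Files::upload('documents/employee-handbook.pdf');
```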

Once files are uploaded, you can also create vector stores for semantic file search:
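A sketch, with assumed names:

```php
$store = VectorStore::create('handbook')->add($file);
```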

Use the SDK’s built-in FileSearch provider tool to let agents search your vector stores:
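For example, inside an agent's `tools` method (the `make` options are assumptions):

```php
public function tools(): array
{
    return [
        FileSearch::make(stores: ['handbook']),
    ];
}
```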

You can easily filter searches by metadata to refine your results:
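Something like this (the filtering API is an assumption):

```php
FileSearch::make(stores: ['handbook'])
    ->where('department', 'engineering');
```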

Real-World Example: Lead Extraction

Here's a practical example of the Laravel AI SDK in action using structured output. Imagine you receive hundreds of contact form submissions and need to quickly identify quality leads, extract their needs, and route them appropriately.

Create a lead extractor agent that processes CSV data:
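A sketch of what that agent might look like, combining the `HasStructuredOutput` interface with a schema matching the fields described below (class and method names other than `tools`/`schema` are assumptions):

```php
class LeadExtractor extends Agent implements HasStructuredOutput
{
    public function instructions(): string
    {
        return 'Classify each submission, flag spam, and extract specific needs.';
    }

    public function schema(): array
    {
        return [
            'type' => 'object',
            'properties' => [
                'quality' => ['type' => 'string', 'enum' => ['spam', 'low', 'high']],
                'summary' => ['type' => 'string'],
                'needs'   => ['type' => 'array', 'items' => ['type' => 'string']],
            ],
            'required' => ['quality', 'summary', 'needs'],
        ];
    }
}
```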

Process a CSV of submissions and save qualified leads with embeddings:
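For example (the agent, its response shape, and the `Lead` model are illustrative):

```php
use Illuminate\Support\Str;

$agent = new LeadExtractor;

foreach (array_map('str_getcsv', file('submissions.csv')) as [$email, $message]) {
    $lead = $agent->prompt($message);

    if ($lead['quality'] === 'spam') {
        continue; // discard junk submissions
    }

    Lead::create([
        'email'     => $email,
        'summary'   => $lead['summary'],
        'needs'     => $lead['needs'],
        'embedding' => Str::of($lead['summary'])->toEmbeddings(),
    ]);
}
```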

The agent automatically filters out spam ("hello, hello, why isn't this sending?"), identifies high-value prospects ("series B startup with 2.3 million users"), and extracts specific needs (MySQL, Laravel Private Cloud, Enterprise support) without any manual classification rules.

Now build a conversational interface to search your leads semantically:

Create the search agent with the SimilaritySearch tool:
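A sketch of the search agent — the `SimilaritySearch::make` options are assumptions:

```php
class LeadSearchAgent extends Agent
{
    public function instructions(): string
    {
        return 'Answer questions about our sales leads using the search tool.';
    }

    public function tools(): array
    {
        return [
            SimilaritySearch::make(model: Lead::class, column: 'embedding'),
        ];
    }
}
```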

The agent uses the embeddings to find semantically similar leads, even when the search query doesn't match exact keywords.

The Laravel AI SDK gives you a clean, organized foundation for building AI-powered features. It allows you to really start leveraging AI’s speed in one place, without juggling provider-specific APIs or scattering AI logic throughout your codebase. Everything has a place, and it all works together seamlessly.

Install the Laravel AI SDK and start building:
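For example (package name assumed):

```shell
composer require laravel/ai
```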

Check out the full documentation for detailed guides on agents, tools, structured output, streaming, and more.

We can't wait to see what you build with the Laravel AI SDK.
