Larasonic comes with built-in AI capabilities powered by Prism, allowing you to easily integrate various LLMs (Large Language Models) into your application.
⚙️ Configuration
The Prism configuration is located in config/prism.php:
return [
    'default' => env('PRISM_PROVIDER', 'openai'),

    'providers' => [
        'openai' => [
            'api_key' => env('OPENAI_API_KEY'),
            'model' => env('OPENAI_MODEL', 'gpt-4-turbo-preview'),
        ],
        'anthropic' => [
            'api_key' => env('ANTHROPIC_API_KEY'),
            'model' => env('ANTHROPIC_MODEL', 'claude-3-opus-20240229'),
        ],
        'ollama' => [
            'host' => env('OLLAMA_HOST', 'http://localhost:11434'),
            'model' => env('OLLAMA_MODEL', 'llama2'),
        ],
    ],
];
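Each provider reads its credentials from your environment, so the matching .env entries (the values below are placeholders) look like:

PRISM_PROVIDER=openai
OPENAI_API_KEY=sk-your-key-here
OPENAI_MODEL=gpt-4-turbo-preview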
🚀 Basic Usage
Generate text using any configured provider:
use EchoLabs\Prism\Facades\Prism;

// Simple completion
$response = Prism::text()
    ->withPrompt('Summarize this article: ...')
    ->generate();

// With system prompt
$response = Prism::text()
    ->withSystemPrompt('You are a helpful assistant')
    ->withPrompt('What is Laravel?')
    ->generate();

// Switch providers on the fly
$response = Prism::text()
    ->using('anthropic', 'claude-3-5-sonnet-latest')
    ->withPrompt('Complex analysis needed...')
    ->generate();
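The same using() call also targets the local Ollama provider defined in config/prism.php. A minimal sketch, assuming Ollama is running on the configured host with the llama2 model pulled:

// Route the prompt to the local Ollama provider from config/prism.php
$response = Prism::text()
    ->using('ollama', 'llama2')
    ->withPrompt('Explain dependency injection in one paragraph.')
    ->generate();

echo $response->text;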
🏗️ Structured Output
Get typed responses using schemas:
use EchoLabs\Prism\Prism;
use EchoLabs\Prism\Schema\ObjectSchema;
use EchoLabs\Prism\Schema\StringSchema;
use EchoLabs\Prism\Schema\ArraySchema;
use EchoLabs\Prism\Schema\NumberSchema;
$schema = new ObjectSchema(
    name: 'user',
    description: 'a user object',
    properties: [
        new StringSchema('name', "the user's name"),
        new NumberSchema('age', "the user's age"),
        new ArraySchema(
            name: 'interests',
            description: "The user's interests",
            items: new StringSchema('interest', "a user's interests")
        ),
    ]
);
$profile = Prism::structured()
    ->withSchema($schema)
    ->withPrompt('Extract profile: John is 25 and loves coding, gaming')
    ->generate();

echo $profile['name']; // "John"
echo $profile['age'];  // 25
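The interests property comes back as an array of strings, matching the ArraySchema defined above:

// Iterate the extracted interests (an array of strings per the schema)
foreach ($profile['interests'] as $interest) {
    echo $interest; // "coding", "gaming"
}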
🛠️ Tools
Extend AI capabilities with custom tools:
use EchoLabs\Prism\Prism;
use EchoLabs\Prism\Facades\Tool;
$searchTool = Tool::as('search')
    ->for('Search for current information')
    ->withParameter('query', 'The search query')
    ->using(function (string $query): string {
        // Your search implementation
        return "Search results for: {$query}";
    });

$response = Prism::text()
    ->withTools([$searchTool])
    ->withPrompt('What will the weather be for the football match today?')
    ->generate();
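Multiple tools can be passed in the same withTools() array. As a hedged sketch that only reuses the Tool methods shown above, a hypothetical weather tool could sit alongside the search tool:

// Hypothetical weather tool, built with the same Tool methods shown above
$weatherTool = Tool::as('weather')
    ->for('Get the current weather for a city')
    ->withParameter('city', 'The city to check')
    ->using(function (string $city): string {
        // Placeholder implementation; call your weather API of choice here
        return "It is sunny and 21°C in {$city}.";
    });

$response = Prism::text()
    ->withTools([$searchTool, $weatherTool])
    ->withPrompt('What will the weather be for the football match today?')
    ->generate();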
🧪 Testing
Fake AI responses in your tests:
use EchoLabs\Prism\Facades\Prism;
use EchoLabs\Prism\ValueObjects\Usage;
use EchoLabs\Prism\Enums\FinishReason;
use EchoLabs\Prism\Providers\ProviderResponse;
it('can generate text', function () {
    // Create a fake provider response
    $fakeResponse = new ProviderResponse(
        text: 'Hello, I am Claude!',
        toolCalls: [],
        usage: new Usage(10, 20),
        finishReason: FinishReason::Stop,
        response: ['id' => 'fake-1', 'model' => 'fake-model']
    );

    // Set up the fake
    $fake = Prism::fake([$fakeResponse]);

    // Run your code
    $response = Prism::text()
        ->using('anthropic', 'claude-3-sonnet')
        ->withPrompt('Who are you?')
        ->generate();

    // Make assertions
    expect($response->text)->toBe('Hello, I am Claude!');
});
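Since Prism::fake() takes an array, a test that triggers more than one AI call can queue several fake responses. A sketch reusing the value objects above, assuming queued responses are consumed in order:

// Queue two fake responses for code that calls Prism twice
$first = new ProviderResponse(
    text: 'First reply',
    toolCalls: [],
    usage: new Usage(5, 10),
    finishReason: FinishReason::Stop,
    response: ['id' => 'fake-1', 'model' => 'fake-model']
);

$second = new ProviderResponse(
    text: 'Second reply',
    toolCalls: [],
    usage: new Usage(5, 10),
    finishReason: FinishReason::Stop,
    response: ['id' => 'fake-2', 'model' => 'fake-model']
);

Prism::fake([$first, $second]);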
🖼️ Multi-Modal Support
Work with images alongside text:
use EchoLabs\Prism\Facades\Prism;
use EchoLabs\Prism\Enums\Provider;
use EchoLabs\Prism\ValueObjects\Messages\UserMessage;
use EchoLabs\Prism\ValueObjects\Messages\Support\Image;

// From a URL
$message = new UserMessage(
    'Analyze this diagram:',
    [Image::fromUrl('https://example.com/diagram.png')]
);

$response = Prism::text()
    ->using(Provider::Anthropic, 'claude-3-sonnet')
    ->withMessages([$message])
    ->generate();
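Depending on your installed Prism version, the Image value object may also accept local files. A hedged sketch, assuming an Image::fromPath() constructor is available:

// Assumes Image::fromPath() exists in your installed Prism version
$message = new UserMessage(
    'Describe this screenshot:',
    [Image::fromPath(storage_path('app/screenshots/dashboard.png'))]
);

$response = Prism::text()
    ->using(Provider::Anthropic, 'claude-3-sonnet')
    ->withMessages([$message])
    ->generate();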