
Using OpenAI in Laravel. Artificial intelligence (AI): is it a buzzword, or is it something we need to be thinking about?

With the release of the OpenAI package, we can dive into AI-powered Laravel packages. So, what is OpenAI? What can we do with it? Mostly, this is about natural language processing.




We pass it some text, and in return, it can amaze us with many things - from complete text examples to code examples to almost anything you want. The main limitation is your imagination and its current capabilities. Nuno released the OpenAI PHP client a while ago - but what can we do with it? Often I find that the most challenging part of working with this technology is knowing what you can do with it. Looking around the OpenAI examples, I found plenty of samples that would likely work well for me and for general usage.

Let's walk through the example for "Ad from product description". I am picking this because we often capture text input from our users and then have to output it later for various reasons. Imagine you are running an online shop and want to capture product descriptions for your items - and want to add some killer "ad" style text to encourage people to click on the product. Let's jump in. I won't walk you through the installation process of setting up the OpenAI PHP Client.

As you are reading this, I assume I do not need to teach you that part. However, I will walk through how I would use this package, which will likely be different from other tutorials.

The first thing I would do is bind the OpenAI Client class to my container - so that I do not need to use the facade or instantiate a client manually.


use Illuminate\Support\ServiceProvider;
use OpenAI;
use OpenAI\Client;

final class AppServiceProvider extends ServiceProvider
{
    public function register(): void
    {
        // Bind the client as a singleton so the container hands back a
        // pre-configured instance wherever it is injected.
        $this->app->singleton(
            abstract: Client::class,
            concrete: fn () => OpenAI::client(
                apiToken: strval(config('openai.api_key')),
            ),
        );
    }
}
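
The binding above reads config('openai.api_key'). I'm assuming a small hand-rolled config file here - if you installed a package that publishes its own config, use that instead - but something as simple as this would do:

// config/openai.php (assumed - adjust to however you store your key)
return [
    'api_key' => env('OPENAI_API_KEY'),
];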



Now, every time I try to inject the OpenAI Client class into a constructor or anywhere else, it will come pre-configured for me.
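
To illustrate what that buys us, a hypothetical class can now simply type-hint the client and let the container do the work - no facade, no manual construction:

use OpenAI\Client;

// A hypothetical consumer - the container injects the pre-configured client.
final class ProductAdGenerator
{
    public function __construct(
        private readonly Client $client,
    ) {}
}

// Or resolve it ad hoc when you need it:
$client = app(Client::class);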

Our use case is when we save a product with a description: we want to auto-generate advertisement text for the product. In fact, the best place for this to happen would be in a Job - one we can dispatch as part of the creation process of the model itself.

I won't go through the creation of the model itself, as I want to make sure I cover all of the points of this tutorial.

When we dispatch a background job, we pass whatever we want serialized into the constructor - and then, inside the handle method, we can resolve instances from the DI container.
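
So, dispatching the job we are about to write could look something like this - a rough sketch only, as where you trigger it (a controller, an observer, an action class) is entirely up to you:

// Hypothetical call site, somewhere in your product creation flow.
$product = Product::query()->create($validated);

GenerateAdFromProduct::dispatch(
    text: $product->description, // or a prompt built around the description
    product: $product->id,
);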

Let's take a look at how this would look if we were to approach it simply.


use App\Models\Product;
use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;
use Illuminate\Support\Facades\DB;
use OpenAI\Client;

final class GenerateAdFromProduct implements ShouldQueue
{
    use Dispatchable;
    use InteractsWithQueue;
    use Queueable;
    use SerializesModels;

    public function __construct(
        public readonly string $text,
        public readonly int $product,
    ) {}

    public function handle(Client $client): void
    {
        // Ask OpenAI for a completion using the text captured when the job was dispatched.
        $response = $client->completions()->create([
            'model' => 'text-davinci-003',
            'prompt' => $this->text,
            'temperature' => 0.5,
            'max_tokens' => 100,
            'top_p' => 1.0,
            'frequency_penalty' => 0.0,
            'presence_penalty' => 0.0,
        ]);

        // Persist the generated ad copy against the product inside a transaction.
        DB::transaction(fn () => Product::query()->find(
            id: $this->product,
        )->update(['ai_description' => $response['choices'][0]['text']]));
    }
}



This uses the settings defined on the example page without looking too deeply into them. Of course, if you do this in production, I highly recommend looking at those settings more carefully.

Can we improve this? Of course we can - I am Steve, and I am opinionated.... Let's take it a step further.

What are we doing here? We use a predefined model to generate and return a response from OpenAI. The settings we are using are predefined in the examples. Of course, they can be tweaked, but we can use this to take things a step further.

Our first step is the model itself. There are only so many available that OpenAI currently supports - this may change eventually, but for now, it won't. Let's create an Enum for the model part of the payload:


enum Model: string
{
    case ADA = 'text-ada-001';
    case BABBAGE = 'text-babbage-001';
    case CURIE = 'text-curie-001';
    case DAVINCI = 'text-davinci-003';
}

Step one is complete. Let's have a look at the OpenAI client request now.

 

$client->completions()->create([
    'model' => Model::DAVINCI->value,
    'prompt' => $this->text,
    'temperature' => 0.5,
    'max_tokens' => 100,
    'top_p' => 1.0,
    'frequency_penalty' => 0.0,
    'presence_penalty' => 0.0,
]);


Pretty good. The rest of the settings are specific to the model and the result I am trying to achieve, from what I can tell from the documentation. So this is something that is purposefully set up so that I can get advertising text from another body of text. To me, this is an Advertisement Transformer: it transforms whatever prompt you give it into an advertisement. So, with that in mind - let's create a dedicated class for this.


final class AdvertisementTransformer
{
    public static function transform(string $prompt): array
    {
        return [
            'model' => Model::DAVINCI->value,
            'prompt' => $prompt,
            'temperature' => 0.5,
            'max_tokens' => 100,
            'top_p' => 1.0,
            'frequency_penalty' => 0.0,
            'presence_penalty' => 0.0,
        ];
    }
}


We are extracting the logic for building a completion request into a dedicated class, which will allow us to reuse it easily. Let's look back at the OpenAI client request now:

$client->completions()->create(
    parameters: AdvertisementTransformer::transform(
        prompt: $this->text,
    ),
);


This, to me at least, is clean and understandable. Looking at it, I am passing the text from the job into a transformer that will turn it into the required parameters for an advertisement text generator.

 

The output for this could be the following:

{
  "object": "text_completion",
  "created": 1672769063,
  "model": "text-davinci-003",
  "choices": [
    {
      "text": "Are you a #Laravel developer looking to stay up to date with the latest news and updates? Look no further than Laravel News! With over 10K users daily, you'll be able to stay informed and learn from the official news outlet for the Laravel ecosystem.",
      "index": 0,
      "logprobs": null,
      "finish_reason": "stop"
    }
  ],
  "usage": {
    "prompt_tokens": 82,
    "completion_tokens": 72,
    "total_tokens": 154
  }
}



As you can see, we have an array of choices, each with a text key. To retrieve it, we just need to access it like we normally would in PHP:



$response = $client->completions()->create(
    parameters: AdvertisementTransformer::transform(
        prompt: $this->text,
    ),
);

// Store the generated ad copy on the product inside a transaction.
DB::transaction(fn () => Product::query()->find(
    id: $this->product,
)->update(['ai_description' => $response['choices'][0]['text']]));



All you now need to do is create the standard transformers that you might use in your application, tweak the parameters to the point where you know they will work for you, and you're free to carry on.
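
For example, if you also wanted a short summary of each product description, a hypothetical SummaryTransformer could follow exactly the same shape - the model and parameters below are placeholders to tweak, not recommendations:

final class SummaryTransformer
{
    public static function transform(string $prompt): array
    {
        return [
            'model' => Model::CURIE->value,
            'prompt' => 'Summarise the following product description: ' . $prompt,
            'temperature' => 0.3,
            'max_tokens' => 60,
        ];
    }
}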
