
Automation grows up

If you’re a coder and you’ve been playing with ChatGPT – like practically everyone else with an internet connection – then you’re probably getting the queasy feeling that you’re missing an opportunity. Is doing nothing now the new failing to learn how to write an iPhone app? I can’t answer that, but I have good news for you. It’s incredibly easy to stitch GPT into your PHP projects.

The bad news is that the engines available via OpenAI’s API are not currently up to ChatGPT’s standard. They tend to be less intuitive in interpreting queries and they don’t remember a conversation from query to query. Of course, with GPT-4 waiting in the wings, that’s likely to change. Now’s the time to get on board.

NOTE OpenAI have recently made ChatGPT available in the API. I cover that in a second article.

I’ll just ask ChatGPT about this introduction. Anything worth adding?

ChatGPT: Hello! Thank you for your question. I think your introduction is well written and accurately summarizes the current state of using GPT in PHP projects. However, I do have a few suggestions that you may consider adding.

  • Highlight the benefits of using GPT in PHP projects, such as improved user experience, reduced workload for developers, and the ability to automate repetitive tasks.
  • Mention the different types of applications that can be built with GPT, such as chatbots, virtual assistants, and recommendation systems.
  • Emphasize that the ease of integration with PHP makes it a great choice for developers looking to implement AI in their projects.

I hope these suggestions are helpful! Let me know if there’s anything else I can assist with.

This is not a think piece, but it’s worth noting, alongside the sunny picture that ChatGPT presents here, that there are some worries about AI within artist, coder, and writer communities. These concerns include the possible threat to jobs that might in future be performed by AIs rather than by human workers, and the devaluing of creative work. This 2015 piece by John Lanchester from the London Review of Books is particularly good on the former concern.

Before I move on to the practical stuff, I’ll just let ChatGPT reply briefly:

ChatGPT: Yes, those are valid concerns and it’s important to address them. The rise of AI technology has indeed led to discussions about its impact on employment and the creative industries. Some experts predict that AI could automate certain tasks, potentially leading to job losses in certain industries. However, others argue that AI could also create new job opportunities, particularly in fields such as data analysis and machine learning.

As for the creative industries, some worry that AI could eventually produce works of art, music, and writing that are indistinguishable from those created by humans. This could lead to a decline in the value of human-created works and potentially impact the livelihoods of artists, musicians, and writers.

It’s important to acknowledge these concerns and engage in discussions about how society can address them proactively. At the same time, it’s also important to recognize the potential benefits that AI can bring, such as increased efficiency and improved quality of life.

In any case, it’s a complex issue and one that requires ongoing dialogue and thought.

Let’s return now to our relentless pursuit of the shiny thing.

Starting our project

This is the first of two articles. In this piece we will set up an environment and establish contact with an OpenAI engine. In the next article we’ll write a small utility.

As with most API projects, the first thing you are going to need is a developer account and an API key. Your first port of call, therefore, should be the OpenAI API site. Hit the sign up button and follow the trail to set up your account. If this is your first time, you’ll be offered a free trial limited by time and volume. You’ll likely run out of time before you run out of capacity unless you’re rushing a publicly available tool into production.

When you generate your API key be sure to take a copy – you will not be able to view it in full later. You can always generate a new key, though, if you lose yours (or if, like me, you invalidate it by accidentally checking it in to a public Git repo).

Now that you have a key, put it somewhere where it can be used. Here, I create a file named conf/aibasic.json and set a secretKey element.

{
    "secretKey": "sk-xxx"
}

NOTE It’s a good idea to add configuration files that contain secrets to your .gitignore file so that you don’t accidentally commit them to your version control repository. I generally create a file named something like myconf.json.sample which contains dummy data to act as a template and which can be committed.
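
To make that concrete for this project, the ignore entry might simply be the following (assuming the conf/ path used above):

# .gitignore – keep the real config, and the API key inside it, out of the repo
conf/aibasic.json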

There are a couple of good options for libraries that you can use to talk to the API. I use orhanerday/open-ai, which I have found particularly user-friendly.

$ composer require orhanerday/open-ai

So, along with PHPUnit and my PSR-4 autoload setup, my composer.json file now looks like this.

{
    "autoload": {
        "psr-4": {
            "getinstance\\utils\\aibasic\\": ["src/", "test"]
        }
    },
    "require-dev": {
        "phpunit/phpunit": "^10.0"
    },
    "require": {
        "orhanerday/open-ai": "^3.5"
    }
}

With that, I think, I’m ready to begin coding.

Any notes, ChatGPT?

ChatGPT: No, that’s a great start. You have set up the basic structure and included all the required dependencies for building a GPT chatbot in PHP. Good job!

The query code

Here’s where it all turns out to be both surprisingly simple and inevitably involved. We’ll talk in some more detail about the available options but here is a complete basic text completion query class that will Just Work with the given settings:

namespace getinstance\utils\aibasic\ai;

use Orhanerday\OpenAi\OpenAi;

class Comms
{
    private string $secretKey;

    public function __construct(string $secretKey)
    {
        $this->secretKey = $secretKey;
    }

    public function sendQuery(string $prompt): string
    {
        $open_ai = new OpenAi($this->secretKey);
        $completion = $open_ai->complete([
            'engine' => 'text-davinci-003',
            'prompt' => $prompt,
            'temperature' => 0.5,
            'max_tokens' => 1000,
            'frequency_penalty' => 0,
            'presence_penalty' => 0.6,
        ]);
        $ret = json_decode($completion, true);
        if (isset($ret['error'])) {
            throw new \Exception($ret['error']['message']);
        }
        if (! isset($ret['choices'][0]['text'])) {
            throw new \Exception("Unknown error: " . $completion);
        }
        return $ret['choices'][0]['text'];
    }
}


As you can see, the constructor requires the API key which it stores in a property.

The sendQuery() method accepts a query and passes it on to the AI engine along with the key via OpenAi::complete(). Behind the scenes, this method calls the API’s POST /completions endpoint.

The package does not support acquiring the API’s HTTP response codes, which is an unfortunate omission. We can, however, detect errors by testing the returned JSON for an error element and then throw an exception with the provided message.

We test the returned data structure for an expected text field. If the returned data takes an unexpected form (which should not happen, but let’s be careful) we give up and throw an exception. Otherwise, we simply extract and return the text response.
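
For a sense of what the library is doing on our behalf, here is a rough, hand-rolled equivalent of that request using PHP’s curl extension. This is an illustrative sketch of a raw /completions call, not the orhanerday package’s actual internals; note that the raw endpoint identifies the engine via a model field.

// Illustrative only: roughly the HTTP request the library makes for us.
// Assumes $secretKey holds your API key.
$payload = json_encode([
    'model' => 'text-davinci-003',
    'prompt' => 'Say hello to the readers',
    'max_tokens' => 50,
]);

$ch = curl_init('https://api.openai.com/v1/completions');
curl_setopt_array($ch, [
    CURLOPT_RETURNTRANSFER => true,
    CURLOPT_POST => true,
    CURLOPT_HTTPHEADER => [
        'Content-Type: application/json',
        'Authorization: Bearer ' . $secretKey,
    ],
    CURLOPT_POSTFIELDS => $payload,
]);
$response = curl_exec($ch);
curl_close($ch);
// $response contains the same JSON structure we decode in sendQuery()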

Now for the inevitably involved part. Let’s take a look at those arguments to the complete() method. First of all, engine. We are using the GPT-3 model, an AI system designed to understand and generate natural language, which supports multiple engines of varying sophistication, speed, and cost. An incomplete list (partially generated by ChatGPT itself) includes:

  • text-davinci-003 – The most advanced and capable of the text completion models. It can generate high-quality and creative text outputs on a variety of topics and styles, and suits use cases such as creative writing, content creation, language translation, and generating text from structured data.
  • text-curie-001 – Slightly less capable than Davinci-003 but still able to analyse and generate text. Tasks it performs well include sentiment classification and summarization.
  • text-babbage-001 – Capable of performing more basic tasks, but very fast.
  • text-ada-001 – The most basic and fastest engine.

You can learn more about available models and engines in the OpenAI API documentation.

The temperature argument is a measure of randomness. I like to think of it as the amount of licensed chaos you would prefer in your answers. ChatGPT says:

ChatGPT: [temperature] controls the randomness of the output text. A value of 0 results in a deterministic response, while higher values lead to more diverse, less predictable outputs. A value of 0.5 is a good starting point for most use cases.

max_tokens caps the length of the response in tokens. frequency_penalty discourages verbatim repetition in a response – a value of 0 applies no penalty, so repetition is unrestricted. presence_penalty nudges the model towards new topics rather than those already present in the text so far – again, 0 applies no penalty, while higher values push the response to diverge more. In practice you will usually work with values between 0 and 1 for temperature, frequency_penalty, and presence_penalty, though the API accepts a wider range.

A more advanced version of the Comms class would likely allow these arguments to be tweaked and may even provide presets for various types of task. Let’s leave that for another day, though.
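
For what it’s worth, here is one rough shape such a version might take – a sketch only, not part of this article’s code – allowing callers to override the defaults per query. It is a drop-in replacement for the sendQuery() method above.

    public function sendQuery(string $prompt, array $options = []): string
    {
        $open_ai = new OpenAi($this->secretKey);
        // Start from the defaults used above; callers may override any of them
        $args = array_merge([
            'engine' => 'text-davinci-003',
            'temperature' => 0.5,
            'max_tokens' => 1000,
            'frequency_penalty' => 0,
            'presence_penalty' => 0.6,
        ], $options);
        $args['prompt'] = $prompt;
        $completion = $open_ai->complete($args);
        $ret = json_decode($completion, true);
        if (isset($ret['error'])) {
            throw new \Exception($ret['error']['message']);
        }
        if (! isset($ret['choices'][0]['text'])) {
            throw new \Exception("Unknown error: " . $completion);
        }
        return $ret['choices'][0]['text'];
    }

A caller wanting terse, predictable output could then pass something like ['temperature' => 0, 'max_tokens' => 64] as the second argument.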

Calling the Comms class

Something in your system has to take control. In this case, it is a class named Runner which handles the housekeeping tasks of acquiring the API key from configuration, instantiating the Comms class, and calling sendQuery().

namespace getinstance\utils\aibasic\control;

use getinstance\utils\aibasic\ai\Comms;

class Runner
{
    private object $conf;
    private string $datadir;
    private Comms $comms;

    public function __construct()
    {
        $conffile = __DIR__ . "/../../conf/aibasic.json";
        $this->conf = json_decode(file_get_contents($conffile));
        $this->datadir = $this->conf->datadir ?? __DIR__ . "/../../data";
        $this->comms = new Comms($this->conf->secretKey);
    }

    public function askAi(string $query): string
    {
        $resp = $this->comms->sendQuery($query);
        return $resp;
    }
}


In a more sophisticated example, Runner might comprise a suite of controller methods or classes which would be invoked according to Web or command line invocations. We’ll see an example of this logic in the next article. For now, though, Runner simply sets things up in its constructor and passes a query on to the Comms object in its askAi() method.
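
As a taste of the command line side of that idea, a hypothetical wrapper script might route a query to Runner like this. The file name ask.php and the $argv handling are illustrative, not part of this article’s code.

require_once(__DIR__ . "/../vendor/autoload.php");

use getinstance\utils\aibasic\control\Runner;

// Hypothetical CLI entry point: take the query from the first argument
if (! isset($argv[1])) {
    fwrite(STDERR, "usage: php ask.php \"your question\"\n");
    exit(1);
}

$runner = new Runner();
print $runner->askAi($argv[1]) . "\n";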

So, let’s ask the AI.

Finally, “Hello, Hal”

All we need in order to wrap this article up is a response to a query. Here is a simple script that invokes Runner::askAi(). I’ve included a single hard-coded question.

require_once(__DIR__ . "/../vendor/autoload.php");
use getinstance\utils\aibasic\control\Runner;

$runner = new Runner();

$str  = "I am writing an article about GPT. I use the heading 'Hello, Hal' several times.";
$str .= " What do you understand by this heading?";

print $runner->askAi($str);


Because this script is the entry point to our system, I require the composer-generated autoload.php library. Then, it’s just a matter of passing our query along to Runner.

The AI’s answer?

text-davinci-003: The heading ‘Hello, Hal’ is a reference to the classic science fiction movie 2001: A Space Odyssey. In the movie, Hal 9000 is an artificial intelligence that controls the spaceship Discovery One. The phrase ‘Hello, Hal’ is often used as a way of referring to the advances in artificial intelligence that GPT (Generative Pre-trained Transformer) represents.

Nice answer! Though spotting that Hello, Hal is also a nod to the phrase Hello, World would have garnered even more points.

Conclusion

Whether it’s “Hello, World” or “Hello, Hal”, it can be a bit of a struggle to get your system set up. Once you’re there, however, you can begin to create something original. In the next article I’ll build out an example app.

You can find source code for all my recent articles on GitHub. The code for this article is in the 010-gpt-php-intro directory.

Photo by Yonathan Ticoalu on Unsplash