
Building SyliusGPT with the IONOS AI Model Hub

6 min read
Stephan Hochdörfer
Head of IT Business Operations

When our partner IONOS Cloud launched their new AI Model Hub offering, I tried to find a good use case to experiment with it. Why not build SyliusGPT? An AI solution that knows Sylius inside out - from a tech perspective.

To interact with the IONOS API, you first need to obtain an access token through the following request. For authentication, include the Base64-encoded string of your IONOS username and password in your curl command:

curl --location \
--request GET 'https://api.ionos.com/auth/v1/tokens/generate' \
--header 'Authorization: Basic WVhKemFHRmtMbUoxYzJoeVlVQnBiMjV2Y3k1amIyMDZJMUJoYTJsemRHRnVYekU9' \
--data ''
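If you prefer to stay in PHP, the same request can be issued with Guzzle. Here is a minimal sketch: the credential is simply your IONOS username and password joined by a colon and Base64-encoded, and I assume the endpoint returns the token in a top-level "token" field - check the response you actually get:

<?php

require __DIR__ . '/../vendor/autoload.php';

// your IONOS credentials - do not commit them to version control
$username = '...';
$password = '...';

$client = new GuzzleHttp\Client();

$response = $client->get(
    'https://api.ionos.com/auth/v1/tokens/generate',
    [
        'headers' => [
            // "username:password", Base64-encoded, as in the curl example above
            'Authorization' => 'Basic ' . base64_encode($username . ':' . $password),
        ],
    ]
);

$jsonResponse = json_decode($response->getBody()->getContents(), true, 512, JSON_THROW_ON_ERROR);
// assumption: the token is returned in a top-level "token" field
echo ($jsonResponse['token'] ?? '') . "\n";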

Since the AI Model Hub is currently only able to run open-source LLMs, we need to push the Sylius (and Symfony) knowledge into a vector database so we can provide context to the LLM (by first extracting the "right" content) when querying it.

First, let's download the Sylius docs, the Sylius Stack docs, and the Symfony documentation. This should give our AI solution a fairly good understanding of how things work in Sylius and Symfony.
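One way to get the documentation locally is to clone the respective repositories into a docs directory. A minimal sketch of that step; the repository list is an assumption (only symfony/symfony-docs is spelled out here), so adjust it to wherever the Sylius and Sylius Stack docs are maintained:

<?php

// repositories to pull documentation from - the list is an assumption,
// add the Sylius and Sylius Stack documentation sources you want to index
$repositories = [
    'https://github.com/symfony/symfony-docs.git',
];

$targetDir = __DIR__ . '/../docs';

foreach ($repositories as $repository) {
    $name = basename($repository, '.git');
    passthru(sprintf(
        'git clone --depth 1 %s %s',
        escapeshellarg($repository),
        escapeshellarg($targetDir . '/' . $name)
    ));
}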

Preparing the IONOS AI Model Hub

Before we can push the documents into the IONOS AI Model Hub vector database, we need to create a document collection:

<?php

require __DIR__ . '/../vendor/autoload.php';

// set your IONOS API access token here...
const BEARER_TOKEN = '...';

$client = new GuzzleHttp\Client();

$response = $client->post(
    'https://inference.de-txl.ionos.com/collections',
    [
        'headers' => [
            'Content-Type' => 'application/json',
            'Authorization' => 'Bearer ' . BEARER_TOKEN,
        ],
        'json' => [
            'properties' => [
                'name' => 'Sylius documentation',
                'description' => 'Collection of documents about Sylius and Symfony',
            ]
        ],
        'config' => [
            'curl' => [
                \CURLOPT_SSLVERSION => 3,
            ],
        ],
    ]
);

$jsonResponse = json_decode($response->getBody()->getContents(), true, 512, JSON_THROW_ON_ERROR);
echo var_export($jsonResponse, true) . "\n";

From the JSON response, we note the "id" of the document collection, as we will need it in the next step when pushing the documents into the collection.
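To avoid copying the id by hand, you can also pick it out of the decoded response directly. A minimal sketch, assuming the id is returned as a top-level "id" field - adjust the key if the response nests it differently:

// assumption: the new collection's id is exposed as a top-level "id" field
$collectionId = $jsonResponse['id'] ?? null;
echo 'Document collection id: ' . $collectionId . "\n";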

The whole documentation I downloaded resides in a local docs directory. Let's recursively iterate over all the files and attach them to the document collection we created earlier. For now, we send all files as-is, without any modifications:

<?php

require __DIR__ . '/../vendor/autoload.php';

// set your IONOS API access token here...
const BEARER_TOKEN = '...';
// set the id of the created document collection here...
const DOC_COLLECTION_ID = '';

$client = new GuzzleHttp\Client();

$it = new RecursiveDirectoryIterator(__DIR__ . '/../docs');
foreach (new RecursiveIteratorIterator($it) as $file) {
    if (!in_array($file->getExtension(), ['md', 'rst'])) {
        continue;
    }

    echo sprintf('Processing file "%s"', $file) . "\n";

    $response = $client->put(
        sprintf('https://inference.de-txl.ionos.com/collections/%s/documents', DOC_COLLECTION_ID),
        [
            'headers' => [
                'Content-Type' => 'application/json',
                'Authorization' => 'Bearer ' . BEARER_TOKEN,
            ],
            'json' => [
                'items' => [
                    [
                        'properties' => [
                            'name' => $file->getBasename(),
                            'content' => base64_encode(file_get_contents($file)),
                            'contentType' => 'text/plain',
                        ]
                    ]
                ]
            ],
            'config' => [
                'curl' => [
                    \CURLOPT_SSLVERSION => 3,
                ],
            ],
        ]
    );

    $jsonResponse = json_decode($response->getBody()->getContents(), true, 512, JSON_THROW_ON_ERROR);
    echo var_export($jsonResponse, true) . "\n";
}

Querying the IONOS AI Model Hub

Querying the IONOS AI Model Hub is a two-step process. First, we try to find the best matching documents in the vector database to give the LLM the best possible context for answering our questions.

But before that, let's choose an LLM. To view the list of available models, run the following command:

curl -H "Authorization: Bearer ${API_TOKEN}" \
--get https://inference.de-txl.ionos.com/models

I decided to use the meta-llama/Meta-Llama-3.1-405B-Instruct-FP8 model for my tests and noted its model ID for later use.
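The same lookup works from PHP as well. A minimal sketch, assuming the endpoint returns the models in an "items" array and that each entry carries an "id" plus a human-readable name under "properties" - verify the field names against the actual response:

<?php

require __DIR__ . '/../vendor/autoload.php';

// set your IONOS API access token here...
const BEARER_TOKEN = '...';

$client = new GuzzleHttp\Client();

$response = $client->get(
    'https://inference.de-txl.ionos.com/models',
    [
        'headers' => [
            'Authorization' => 'Bearer ' . BEARER_TOKEN,
        ],
    ]
);

$jsonResponse = json_decode($response->getBody()->getContents(), true, 512, JSON_THROW_ON_ERROR);
// assumption: models are listed under "items" with an "id" and a name in "properties"
foreach ($jsonResponse['items'] ?? [] as $model) {
    echo sprintf('%s - %s', $model['id'] ?? '?', $model['properties']['name'] ?? '?') . "\n";
}

With the model ID noted, let's query the document collection first: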

<?php

// set your IONOS API access token here...
const BEARER_TOKEN = '...';
// set the id of the created document collection here...
const DOC_COLLECTION_ID = '';
// define how many documents should be retrieved from the vector database
const NUM_OF_DOCUMENTS = 3;
// define the LLM model to use
const MODEL_ID = '0b6c4a15-bb8d-4092-82b0-f357b77c59fd'; // meta-llama/Meta-Llama-3.1-405B-Instruct-FP8

require __DIR__ . '/../vendor/autoload.php';

// the query we want to run
$query = 'Which field types does the Sylius Grid bundle support?';

$client = new GuzzleHttp\Client();

$documents = [];
$response = $client->post(
    sprintf('https://inference.de-txl.ionos.com/collections/%s/query', DOC_COLLECTION_ID),
    [
        'headers' => [
            'Content-Type' => 'application/json',
            'Authorization' => 'Bearer ' . BEARER_TOKEN,
        ],
        'json' => [
            'query' => $query,
            'limit' => NUM_OF_DOCUMENTS,
        ]
    ]
);

$jsonResponse = json_decode($response->getBody()->getContents(), true, 512, JSON_THROW_ON_ERROR);
if (isset($jsonResponse['properties']['matches']) && is_array($jsonResponse['properties']['matches'])) {
    foreach ($jsonResponse['properties']['matches'] as $match) {
        if (isset($match['document']['properties']['content']) && is_string($match['document']['properties']['content'])) {
            $documents[] = base64_decode($match['document']['properties']['content']);
        }
    }
}

Now that we have found all matching documents, let's pass them and our query to the LLM, including some instructions on how to handle the request:

$documents = implode("\n\n", $documents);

$prompt = <<<END
<|begin_of_text|><|start_header_id|>system<|end_header_id|>
Please use the information specified as context to answer the question.
Formulate your answer in one sentence and be an honest AI.<|eot_id|>
<|begin_of_text|><|start_header_id|>context<|end_header_id|>
$documents<|eot_id|>
<|start_header_id|>user<|end_header_id|>
$query<|eot_id|>
<|start_header_id|>assistant<|end_header_id|>
END;

$response = $client->post(
    sprintf('https://inference.de-txl.ionos.com/models/%s/predictions', MODEL_ID),
    [
        'headers' => [
            'Content-Type' => 'application/json',
            'Authorization' => 'Bearer ' . BEARER_TOKEN,
        ],
        'json' => [
            'properties' => [
                'input' => $prompt,
            ]
        ]
    ]
);

$jsonResponse = json_decode($response->getBody()->getContents(), true, 512, JSON_THROW_ON_ERROR);
if (isset($jsonResponse['properties']['output']) && is_string($jsonResponse['properties']['output'])) {
    echo $jsonResponse['properties']['output'] . "\n";
}

How good are the results? I first tried a simple question, "Which field types does the Sylius Grid bundle support?" and this is what the LLM responded:

The Sylius Grid bundle supports the following field types: twig, string, and custom, 
with the ability to create additional custom field types by implementing the
FieldTypeInterface and registering them as services.

The answer looks good to me. But what does "registering them as services" mean? Let's ask the LLM to clarify: "How to register a Sylius Grid field type as a service? Can you show me the XML code to do that?" I specifically picked XML as the output format because the Sylius Stack documentation only features YAML and PHP configuration examples. This is the output I got:

To register a Sylius Grid field type as a service using XML, you can use the 
following code:

<!-- config/services.xml -->
<?xml version="1.0" encoding="UTF-8" ?>
<container xmlns="http://symfony.com/schema/dic/services"
           xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
           xsi:schemaLocation="http://symfony.com/schema/dic/services
                               https://symfony.com/schema/dic/services/services-1.0.xsd">

    <services>
        <service id="app.grid_field.custom" class="App\Grid\FieldType\CustomType">
            <tag name="sylius.grid_field" type="custom" />
        </service>
    </services>
</container>

This code defines a service named `app.grid_field.custom` with the class
`App\Grid\FieldType\CustomType` and tags it as a Sylius Grid field type
with the type `custom`.

As far as I can tell, the answer is correct. The XML output looks fine to me.

Building your own AI agent is relatively straightforward, thanks to the easy-to-use IONOS AI Model Hub APIs. Furthermore, your data is securely hosted in a data center in Germany, and IONOS ensures that your data will not be used to train models.