I have a Laravel application that uses the OpenAI API through the `openai-php/laravel` Composer package.
It used to work fine but suddenly started failing to connect to OpenAI, even though I made no changes to the codebase.
The code that makes the request to OpenAI is as follows. It calls the OpenAI API in stream mode and uses the Guzzle HTTP client under the hood.
```php
use OpenAI\Laravel\Facades\OpenAI;

$stream = OpenAI::chat()->createStreamed([
    'model' => 'gpt-4-turbo',
    'messages' => [
        ['role' => 'user', 'content' => $prompt],
    ],
    'temperature' => 0,
]);
```
The error message I get is as follows.
```json
{
    "message": "Connection refused for URI https://api.openai.com/v1/chat/completions",
    "context": {
        "userId": "9bc22218-7648-4baf-85df-ddaa75879d38",
        "exception": {
            "class": "OpenAI\\Exceptions\\TransporterException",
            "message": "Connection refused for URI https://api.openai.com/v1/chat/completions",
            "code": 0,
            "file": "/var/www/html/vendor/openai-php/client/src/Transporters/HttpTransporter.php:108",
            "previous": {
                "class": "GuzzleHttp\\Exception\\ConnectException",
                "message": "Connection refused for URI https://api.openai.com/v1/chat/completions",
                "code": 0,
                "file": "/var/www/html/vendor/guzzlehttp/guzzle/src/Handler/StreamHandler.php:72",
                "previous": {
                    "class": "GuzzleHttp\\Exception\\ConnectException",
                    "message": "Connection refused for URI https://api.openai.com/v1/chat/completions",
                    "code": 0,
                    "file": "/var/www/html/vendor/guzzlehttp/guzzle/src/Handler/StreamHandler.php:329"
                }
            }
        }
    },
    "level": 400,
    "level_name": "ERROR",
    "channel": "production",
    "datetime": "2024-06-16T00:26:21.750301+00:00",
    "extra": {}
}
```
The original exception is thrown from the `StreamHandler.php` file in the `guzzlehttp/guzzle` package:

```php
$resource = @\fopen((string) $uri, 'r', false, $contextResource);
$this->lastHeaders = $http_response_header ?? [];

if (false === $resource) {
    throw new ConnectException(sprintf('Connection refused for URI %s', $uri), $request, null, $context);
}
```
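Since the failing call is ultimately a plain `fopen()` on the API URL, one thing I considered trying is a minimal repro inside a web route that bypasses Guzzle and the OpenAI client entirely. This is only a sketch: the URL, timeout, and context options below are my own choices, not what Guzzle actually passes.

```php
<?php

// Mirror StreamHandler's fopen() call to check whether raw PHP stream
// connections to the OpenAI host fail in the web-request context.
// (Context options here are my own assumptions for the repro.)
$ctx = stream_context_create(['http' => ['method' => 'GET', 'timeout' => 5]]);
$resource = @fopen('https://api.openai.com/v1/models', 'r', false, $ctx);

if ($resource === false) {
    // error_get_last() usually carries a more specific reason than
    // Guzzle's generic "Connection refused" message.
    $err = error_get_last();
    error_log('fopen failed: ' . ($err['message'] ?? 'unknown error'));
} else {
    error_log('fopen succeeded');
    fclose($resource);
}
```

If this fails in a route but succeeds in Tinker, that would confirm the problem is with PHP stream connections in the web context rather than with the OpenAI client itself.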
It works fine in the following scenarios:

- Local environment with streaming enabled (`OpenAI::chat()->createStreamed()`)
- Production environment (Fly.io) with streaming disabled (`OpenAI::chat()->create()` instead of `OpenAI::chat()->createStreamed()`)
- Production environment (Fly.io) with streaming enabled, but with the code executed in a Tinker session

It only fails when the app calls OpenAI with streaming enabled (`OpenAI::chat()->createStreamed()`) in the context of a web request in production (Fly.io).
Because the same code works in a Tinker session, I suspect the issue is related to the web server's ability to make outgoing requests, but I'm not sure how to debug this further.
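To pin down what differs between the two contexts, I thought about dumping the same set of settings both from Tinker (CLI) and from a web route, then diffing the output. The list of settings below is my own guess at what commonly differs between CLI and web SAPIs; these are all standard PHP functions.

```php
<?php

// Run this both in Tinker and in a web route, then compare the output.
// Stream-wrapper connections depend on these, while curl-based requests
// (the non-streaming path) may not, which could explain the asymmetry.
var_export([
    'sapi'                   => PHP_SAPI,
    'allow_url_fopen'        => ini_get('allow_url_fopen'),
    'open_basedir'           => ini_get('open_basedir'),
    'default_socket_timeout' => ini_get('default_socket_timeout'),
    'resolved_ip'            => gethostbyname('api.openai.com'),
    'https_proxy'            => getenv('HTTPS_PROXY'),
    'http_proxy'             => getenv('HTTP_PROXY'),
]);
```

If, say, `allow_url_fopen` is off or DNS resolution behaves differently only in the web context, that would explain why `fopen()` (and therefore Guzzle's `StreamHandler`) fails there while everything else works.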