WWW::OpenAI Raku package
In brief
This Raku package provides access to the machine learning service OpenAI, [OAI1].
For more details on OpenAI's API usage see the documentation, [OAI2].
Remark: To use the OpenAI API one has to register and obtain an authorization key.
Remark: This Raku package is much "less ambitious" than the official Python package, [OAIp1], developed by OpenAI's team.
Over time I expect to add features to this Raku package that correspond to features of [OAIp1].
The design and implementation of "WWW::OpenAI" are very similar to those of
"Lingua::Translation::DeepL", [AAp1].
Installation
Package installations from both sources use the zef installer
(which should be bundled with the "standard" Rakudo installation file).
To install the package from the Zef ecosystem use the shell command:
zef install WWW::OpenAI
To install the package from the GitHub repository use the shell command:
zef install https://github.com/antononcube/Raku-WWW-OpenAI.git
Usage examples
Remark: When the authorization key, auth-key, is specified to be Whatever, then the functions openai-* attempt to use the env variable OPENAI_API_KEY.
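For example, here is a hypothetical sketch of supplying the key explicitly rather than relying on the environment variable (the key value below is just a placeholder):
use WWW::OpenAI;
# A placeholder key value; substitute an actual OpenAI authorization key.
my $auth-key = 'YOUR-OPENAI-API-KEY';
openai-playground('Where is Roger Rabbit?', max-tokens => 64, :$auth-key);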
Universal "front-end"
The package has a universal "front-end" function openai-playground for the different functionalities provided by OpenAI.
Here is a simple call for a "chat completion":
use WWW::OpenAI;
openai-playground('Where is Roger Rabbit?', max-tokens => 64);
# [{finish_reason => stop, index => 0, message => {content =>
#
# As an AI language model, I do not have the capability to determine the current whereabouts of fictional characters. However, Roger Rabbit is a character created for the 1988 film "Who Framed Roger Rabbit" and is still popular among fans of the movie., role => assistant}}]
Another one, using Bulgarian (the prompt asks "How many groups can be found in this point cloud?"):
openai-playground('Колко групи могат да се намерят в този облак от точки.', max-tokens => 64);
# [{finish_reason => length, index => 0, message => {content =>
#
# Като AI модел, не мога да видя облак от точки, който споменавате. Моля, посочете повече информация или предоставете изображение, за да мога да отгов, role => assistant}}]
Remark: The function openai-completion can be used instead in the examples above. See the section "Create chat completion" of [OAI2] for more details.
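For example, the first chat completion above can be done with openai-completion directly (a minimal sketch; the named options are assumed to mirror those of openai-playground):
openai-completion('Where is Roger Rabbit?', max-tokens => 64);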
Image generation
Remark: See the files "Image-generation*" for more details.
Images can be generated with the function openai-create-image -- see the section "Images" of [OAI2].
Here is an example:
my $imgB64 = openai-create-image(
"racoon with a sliced onion in the style of Raphael",
response-format => 'b64_json',
n => 1,
size => 'small',
format => 'values',
method => 'cro');
Here are the option descriptions:
- response-format takes the values "url" and "b64_json"
- n takes a positive integer, for the number of images to be generated
- size takes the values '1024x1024', '512x512', '256x256', 'large', 'medium', 'small'
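If the Base64 payload itself is needed as an image file, one way to save it is with the ecosystem module MIME::Base64 (a sketch, assuming $imgB64 from above holds the raw Base64 string and using an arbitrary file name):
use MIME::Base64;
# Decode the Base64 string into bytes and write them out as a PNG file.
spurt 'racoon.png', MIME::Base64.decode($imgB64);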
Here we generate an image, get its URL, and place (embed) a link to it via the output of the code cell:
my @imgRes = |openai-create-image(
"racoon and onion in the style of Roy Lichtenstein",
response-format => 'url',
n => 1,
size => 'small',
method => 'cro');
'';
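One way to do that embedding is to compose a Markdown image link from the returned URL (a sketch; it assumes each element of @imgRes is a hash with a url field, which matches response-format => 'url'):
# Build a Markdown image link from the first returned URL.
'![](' ~ @imgRes.head<url> ~ ')';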
Moderation
Here is an example of using OpenAI's moderation:
my @modRes = |openai-moderation(
"I want to kill them!",
format => "values",
method => 'curl');
for @modRes -> $m { .say for $m.pairs.sort(*.value).reverse; }
# violence => 0.9640626311302185
# hate => 0.27332669496536255
# hate/threatening => 0.00637523178011179
# sexual => 8.585161026530841e-07
# violence/graphic => 2.8522084249971158e-08
# self-harm => 1.678687522321809e-09
# sexual/minors => 1.3898265871503668e-09
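As a further illustration, here is a sketch that keeps only the categories whose scores exceed a (hypothetical) threshold of 0.5:
# Show only the high-scoring categories, in decreasing order; the 0.5 threshold is arbitrary.
for @modRes -> $m { .say for $m.pairs.grep(*.value > 0.5).sort(-*.value); }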
Command Line Interface
The package provides a Command Line Interface (CLI) script:
openai-playground --help
# Usage:
# openai-playground <text> [--path=<Str>] [-n[=UInt]] [--max-tokens[=UInt]] [-m|--model=<Str>] [-r|--role=<Str>] [-t|--temperature[=Real]] [--response-format=<Str>] [-a|--auth-key=<Str>] [--timeout[=UInt]] [--format=<Str>] [--method=<Str>] -- Text processing using the OpenAI API.
# openai-playground [<words> ...] [-m|--model=<Str>] [--path=<Str>] [-n[=UInt]] [--max-tokens[=UInt]] [-r|--role=<Str>] [-t|--temperature[=Real]] [--response-format=<Str>] [-a|--auth-key=<Str>] [--timeout[=UInt]] [--format=<Str>] [--method=<Str>] -- Command given as a sequence of words.
#
# <text> Text to be processed.
# --path=<Str> Path, one of 'images/generations' or 'chat/completions'. [default: 'chat/completions']
# -n[=UInt] Number of completions or generations. [default: 1]
# --max-tokens[=UInt] The maximum number of tokens to generate in the completion. [default: 16]
# -m|--model=<Str> Model. [default: 'Whatever']
# -r|--role=<Str> Role. [default: 'user']
# -t|--temperature[=Real] Temperature. [default: 0.7]
# --response-format=<Str> The format in which the generated images are returned; one of 'url' or 'b64_json'. [default: 'url']
# -a|--auth-key=<Str> Authorization key (to use OpenAI API.) [default: 'Whatever']
# --timeout[=UInt] Timeout. [default: 10]
# --format=<Str> Format of the result; one of "json" or "hash". [default: 'json']
# --method=<Str> Method for the HTTP POST query; one of "cro" or "curl". [default: 'cro']
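For example, here is a hypothetical CLI invocation that mirrors the first chat completion example above (output omitted):
openai-playground 'Where is Roger Rabbit?' --max-tokens=64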
Remark: When the authorization key argument "auth-key" is set to "Whatever" then openai-playground attempts to use the env variable OPENAI_API_KEY.
Mermaid diagram
The following flowchart corresponds to the steps in the package function openai-playground:
graph TD
UI[/Some natural language text/]
TO[/"OpenAI<br/>Processed output"/]
WR[[Web request]]
OpenAI{{https://platform.openai.com}}
PJ[Parse JSON]
Q{Return<br>hash?}
MSTC[Compose query]
MURL[[Make URL]]
TTC[Process]
QAK{Auth key<br>supplied?}
EAK[["Try to find<br>OPENAI_API_KEY<br>in %*ENV"]]
QEAF{Auth key<br>found?}
NAK[/Cannot find auth key/]
UI --> QAK
QAK --> |yes|MSTC
QAK --> |no|EAK
EAK --> QEAF
MSTC --> TTC
QEAF --> |no|NAK
QEAF --> |yes|TTC
TTC -.-> MURL -.-> WR -.-> TTC
WR -.-> |URL|OpenAI
OpenAI -.-> |JSON|WR
TTC --> Q
Q --> |yes|PJ
Q --> |no|TO
PJ --> TO
Potential problems
Remark: Currently this package is tested on macOS only.
SSL certificate problems
On macOS I get the error:
Cannot locate symbol 'SSL_get1_peer_certificate' in native library
See longer discussions about this problem here and here.
Interestingly:
- I did not get these messages while implementing the changes of ver<1.1> of this package
- I do not get these messages when using Raku in Markdown or Mathematica notebooks, [AA1], via the package "Text::CodeProcessing", [AAp2]
Because of those SSL problems I implemented the method option that takes the values 'cro' and 'curl'.
The method "curl":
- Requires
curl
to be installed - Invokes the procedure
shell
- Again, this is tested on macOS only.
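For example, one can force the "curl" method in a function call (a minimal sketch):
use WWW::OpenAI;
# Use curl for the HTTP POST instead of Cro, e.g. to sidestep the SSL issue above.
openai-playground('Where is Roger Rabbit?', max-tokens => 64, method => 'curl');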
References
Articles
[AA1] Anton Antonov,
"Connecting Mathematica and Raku",
(2021),
RakuForPrediction at WordPress.
Packages
[AAp1] Anton Antonov,
Lingua::Translation::DeepL Raku package,
(2022),
GitHub/antononcube.
[AAp2] Anton Antonov,
Text::CodeProcessing,
(2021),
GitHub/antononcube.
[OAI1] OpenAI Platform, OpenAI platform.
[OAI2] OpenAI Platform, OpenAI documentation.
[OAIp1] OpenAI,
OpenAI Python Library,
(2020),
GitHub/openai.