OpenAI API Access
You need to acquire an OpenAI API key to use these procedures. Using them will incur costs on your OpenAI account. You can set the API key globally by defining the apoc.openai.key configuration in apoc.conf.
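For example, a minimal apoc.conf entry (the key value here is a placeholder):

apoc.openai.key=<YOUR_OPENAI_API_KEY>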
All the following procedures can use the following APOC config keys, set either in apoc.conf or via a Docker environment variable.
APOC configuration

key | description | default |
---|---|---|
apoc.ml.openai.type | "AZURE" or "OPENAI", indicates whether the API is Azure or not | "OPENAI" |
apoc.ml.openai.url | the OpenAI endpoint base URL | https://api.openai.com/v1 (or empty string if apoc.ml.openai.type is "AZURE") |
apoc.ml.azure.api.version | the API version to use when apoc.ml.openai.type is "AZURE" | "" |
Moreover, the procedures accept the following configuration keys in the configuration map passed as the last parameter. If present, they take precedence over the analogous APOC configs.

key | description |
---|---|
apiType | analogous to apoc.ml.openai.type |
endpoint | analogous to apoc.ml.openai.url |
apiVersion | analogous to apoc.ml.azure.api.version |
failOnError | If true (default), the procedure fails in case of empty, blank or null input |
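For example, a per-call configuration map using the failOnError key from the table above might look like this sketch:

CALL apoc.ml.openai.embedding(['Some Text'], $apiKey, {failOnError: false})
YIELD index, text, embedding
RETURN index, text, embedding;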
Therefore, we can use the following procedures with the OpenAI Services provided by Azure, pointing to the correct endpoints as explained in the documentation.
For example, if we want to call an endpoint like https://my-resource.openai.azure.com/openai/deployments/my-deployment-id/embeddings?api-version=my-api-version, we pass the following configuration parameter:
{endpoint: "https://my-resource.openai.azure.com/openai/deployments/my-deployment-id",
 apiVersion: 'my-api-version',
 apiType: 'AZURE'
}
The /embeddings portion will be added under the hood.
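Putting this together, an embedding call against an Azure deployment might look as follows (resource name, deployment id and API version are placeholders):

CALL apoc.ml.openai.embedding(['Some Text'], $apiKey, {
  endpoint: "https://my-resource.openai.azure.com/openai/deployments/my-deployment-id",
  apiVersion: 'my-api-version',
  apiType: 'AZURE'
}) YIELD index, text, embedding;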
Similarly, when using apoc.ml.openai.completion to call an endpoint like https://my-resource.openai.azure.com/openai/deployments/my-deployment-id/completions?api-version=my-api-version, we can pass the same configuration parameter as above, and the /completions portion will be added.
Likewise, when using apoc.ml.openai.chat with the same configuration, the URL portion /chat/completions will be added; see the sketch below.
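For instance, a chat call with the same Azure configuration could be sketched as (placeholder values again):

CALL apoc.ml.openai.chat([
  {role:"user", content:"What planet do humans live on?"}
], $apiKey, {
  endpoint: "https://my-resource.openai.azure.com/openai/deployments/my-deployment-id",
  apiVersion: 'my-api-version',
  apiType: 'AZURE'
}) YIELD value;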
Alternatively, we can set these values in apoc.conf:
apoc.ml.openai.url=https://my-resource.openai.azure.com/openai/deployments/my-deployment-id
apoc.ml.azure.api.version=my-api-version
apoc.ml.openai.type=AZURE
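With these settings in apoc.conf, the procedures can then be called without repeating the Azure-related keys per call, for example:

CALL apoc.ml.openai.embedding(['Some Text'], $apiKey, {}) YIELD index, text, embedding;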
Generate Embeddings API
This procedure apoc.ml.openai.embedding can take a list of text strings, and will return one row per string, with the embedding data as a 1536 element vector. It uses the /embeddings/create API which is documented here. Additional configuration is passed to the API; the default model used is text-embedding-ada-002.
CALL apoc.ml.openai.embedding(['Some Text'], $apiKey, {}) yield index, text, embedding;
index | text | embedding |
---|---|---|
0 | "Some Text" | [-0.0065358975, -7.9563365E-4, …. -0.010693862, -0.005087272] |
name | description |
---|---|
texts | List of text strings |
apiKey | OpenAI API key |
configuration | optional map for entries like model and other request parameters; the apiType, endpoint and apiVersion keys described above can also be passed here |
name | description |
---|---|
index | index entry in original list |
text | line of text from original list |
embedding | 1536 element floating point embedding vector for ada-002 model |
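As a usage sketch, the returned embeddings can be stored back on nodes. The example below assumes a hypothetical Movie label with a title property; adapt it to your own data model:

MATCH (m:Movie)
WITH collect(m) AS movies
CALL apoc.ml.openai.embedding([m IN movies | m.title], $apiKey, {})
YIELD index, text, embedding
// the index column maps each embedding back to its position in the input list
WITH movies[index] AS movie, embedding
SET movie.embedding = embedding;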
Text Completion API
This procedure apoc.ml.openai.completion can continue/complete a given text. It uses the /completions/create API which is documented here. Additional configuration is passed to the API; the default model used is text-davinci-003.
CALL apoc.ml.openai.completion('What color is the sky? Answer in one word: ', $apiKey, {config}) yield value;
{ created=1684248202, model="text-davinci-003", id="cmpl-7GqBWwX49yMJljdmnLkWxYettZoOy", usage={completion_tokens=2, prompt_tokens=12, total_tokens=14}, choices=[{finish_reason="stop", index=0, text="Blue", logprobs=null}], object="text_completion"}
name | description |
---|---|
prompt | Text to complete |
apiKey | OpenAI API key |
configuration | optional map for entries like model, temperature, and other request parameters |
name | description |
---|---|
value | result entry from OpenAI (containing created, id, model, object, usage(tokens), choices(text, index, finish_reason, logprobs)) |
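Since the result is returned as a map, the generated text can be extracted directly from the choices list, as in this sketch (temperature shown as an example request parameter):

CALL apoc.ml.openai.completion('What color is the sky? Answer in one word: ', $apiKey, {temperature: 0})
YIELD value
RETURN value.choices[0].text AS answer;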
Chat Completion API
This procedure apoc.ml.openai.chat takes a list of maps of chat exchanges between assistant and user (with an optional system message), and will return the next message in the flow. It uses the /chat/create API which is documented here. Additional configuration is passed to the API; the default model used is gpt-4o.
CALL apoc.ml.openai.chat([
{role:"system", content:"Only answer with a single word"},
{role:"user", content:"What planet do humans live on?"}
], $apiKey) yield value
{created=1684248203, id="chatcmpl-7GqBXZr94avd4fluYDi2fWEz7DIHL", object="chat.completion", model="gpt-4o-0301", usage={completion_tokens=2, prompt_tokens=26, total_tokens=28}, choices=[{finish_reason="stop", index=0, message={role="assistant", content="Earth."}}]}
CALL apoc.ml.openai.chat([
{role:"user", content:"Which athletes won the gold medal in mixed doubles's curling at the 2022 Winter Olympics?"}
], $apiKey, { model: "gpt-3.5-turbo" }) yield value
{ "created" : 1721902606, "usage" : { "total_tokens" : 59, "completion_tokens" : 32, "prompt_tokens" : 27 }, "model" : "gpt-3.5-turbo-2024-05-13", "id" : "chatcmpl-9opocM1gj9AMXIh7oSWWfoumJOTRC", "choices" : [ { "index" : 0, "finish_reason" : "stop", "message" : { "content" : "The gold medal in mixed doubles curling at the 2022 Winter Olympics was won by the Italian team, consisting of Stefania Constantini and Amos Mosaner.", "role" : "assistant" } } ], "system_fingerprint" : "fp_400f27fa1f", "object" : "chat.completion" }
name | description |
---|---|
messages | List of maps of instructions with `{role:"assistant\|user\|system", content:"text"}` |
apiKey | OpenAI API key |
configuration | optional map for entries like model, temperature, and other request parameters |
name | description |
---|---|
value | result entry from OpenAI (containing created, id, model, object, usage(tokens), choices(message, index, finish_reason)) |
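As with the completion procedure, the assistant reply can be extracted from the returned map, for example:

CALL apoc.ml.openai.chat([
  {role:"system", content:"Only answer with a single word"},
  {role:"user", content:"What planet do humans live on?"}
], $apiKey)
YIELD value
RETURN value.choices[0].message.content AS answer;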