Chat Completion API

Add ChatGPT sprinkles to your app

PreReqs

To get started with the ChatGPT API, you'll need to:

  1. Create an OpenAI account
  2. Add some payment info
  3. Generate an API key
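
If the key isn't set, every request below will fail with a 401, so it can help to fail fast at startup. A minimal sketch (the checkApiKey.ts filename and helper are illustrative, not part of the demo); it assumes the key is stored in the OPENAI_API_KEY environment variable, which is what every snippet below reads from process.env:

checkApiKey.ts

// Fail fast if the OpenAI API key is missing from the environment.
export function getOpenAIKey(): string {
  const key = process.env.OPENAI_API_KEY;
  if (!key) {
    throw new Error("Missing OPENAI_API_KEY environment variable");
  }
  return key;
}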

ChatGPT API?

Technically, ChatGPT is a web app that uses the OpenAI Chat Completion API. But for a lightning talk, "ChatGPT API" rolls off the tongue faster.

Links

  • Basic Demo
  • Useful use cases

API Request

While OpenAI has an SDK for Node.js you can add, you can also make a raw fetch request directly to https://api.openai.com/v1/chat/completions
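
For comparison, the SDK route looks roughly like this. This is a sketch using the official openai npm package's v4-style client, not code from the demo:

sdkChatCompletion.ts

import OpenAI from "openai";

// v4-style client from the official openai npm package.
const openai = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });

async function ask(prompt: string) {
  const completion = await openai.chat.completions.create({
    model: "gpt-3.5-turbo",
    messages: [{ role: "user", content: prompt }],
  });
  return completion.choices[0].message.content;
}

The rest of the talk builds the raw fetch version up step by step: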

  1. Make a POST request to the OpenAI chat/completions endpoint.
  2. Authorize the request with an API key.
fetchChatCompletion.ts

fetch("https://api.openai.com/v1/chat/completions", {
method: "POST",
headers: {
"Content-Type": "application/json",
Authorization: `Bearer ${process.env.OPENAI_API_KEY}`,
},
});

3a. The model field lets us choose between models like gpt-3.5-turbo and gpt-4.

fetchChatCompletion.ts

fetch("https://api.openai.com/v1/chat/completions", {
method: "POST",
headers: {
"Content-Type": "application/json",
Authorization: `Bearer ${process.env.OPENAI_API_KEY}`,
},
body: JSON.stringify({
model: "gpt-3.5-turbo",
}),
});

3b. The messages array is how we pass in the chat history. The ChatGPT API itself keeps no memory or session state between requests (see the multi-turn sketch after the snippet below).

fetchChatCompletion.ts

fetch("https://api.openai.com/v1/chat/completions", {
method: "POST",
headers: {
"Content-Type": "application/json",
Authorization: `Bearer ${process.env.OPENAI_API_KEY}`,
},
body: JSON.stringify({
model: "gpt-3.5-turbo",
messages: [
{
role: "user",
content: message,
},
],
}),
});
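
Because the API is stateless, a multi-turn conversation is just a longer messages array that replays the earlier turns on every request. A sketch (the conversation below is made up for illustration, not part of the demo):

multiTurnMessages.ts

// Every request resends the full history; the API only sees what's in messages.
const messages = [
  { role: "system", content: "You are a helpful assistant." },
  { role: "user", content: "What's the capital of France?" },
  { role: "assistant", content: "Paris." },
  { role: "user", content: "And roughly how many people live there?" },
];

fetch("https://api.openai.com/v1/chat/completions", {
  method: "POST",
  headers: {
    "Content-Type": "application/json",
    Authorization: `Bearer ${process.env.OPENAI_API_KEY}`,
  },
  body: JSON.stringify({ model: "gpt-3.5-turbo", messages }),
});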

chat.completion API Response

The answer will be in choices[0].message.content.

response.json

{
  "id": "chatcmpl-123",
  "object": "chat.completion",
  "created": 1677652288,
  "model": "gpt-3.5-turbo-0301",
  "choices": [
    {
      "message": {
        "role": "assistant",
        "content": "Hello there, how may I assist you today?"
      }
    }
  ],
  "usage": {
    "prompt_tokens": 9,
    "completion_tokens": 12,
    "total_tokens": 21
  }
}
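
Reading that field in TypeScript might look like the sketch below. The ChatCompletionResponse interface is hand-written here to cover only the fields we use; it is not an official type from OpenAI:

parseChatCompletion.ts

// Minimal hand-rolled shape for the fields this demo reads.
interface ChatCompletionResponse {
  choices: Array<{ message: { role: string; content: string } }>;
}

async function askOnce(prompt: string): Promise<string> {
  const res = await fetch("https://api.openai.com/v1/chat/completions", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${process.env.OPENAI_API_KEY}`,
    },
    body: JSON.stringify({
      model: "gpt-3.5-turbo",
      messages: [{ role: "user", content: prompt }],
    }),
  });
  const data = (await res.json()) as ChatCompletionResponse;
  return data.choices[0].message.content;
}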

Use it in Remix

Make the OpenAI fetch request inside a Remix action

  • Keeps the API key on the server
  • The fetch Response can be returned directly as the action response
routes/demo.tsx

export const action = async ({ request }) => {
  let formData = await request.formData();
  let prompt = formData.get("prompt");
  return fetch("https://api.openai.com/v1/chat/completions", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${process.env.OPENAI_API_KEY}`,
    },
    body: JSON.stringify({
      model: "gpt-3.5-turbo",
      messages: [
        {
          role: "user",
          content: prompt,
        },
      ],
    }),
  });
};

Submit to the action with a basic HTML <form/>.

routes/demo.tsx

export default function Demo() {
  return (
    <form method="post">
      <label>
        Prompt
        <textarea required name="prompt" />
      </label>
      <div>
        <button type="submit">Send</button>
      </div>
    </form>
  );
}

export const action = async ({ request }) => {
  let formData = await request.formData();
  let prompt = formData.get("prompt");
  return fetch("https://api.openai.com/v1/chat/completions", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${process.env.OPENAI_API_KEY}`,
    },
    body: JSON.stringify({
      model: "gpt-3.5-turbo",
      messages: [
        {
          role: "user",
          content: prompt,
        },
      ],
    }),
  });
};

  • We don't want to navigate after submission; we want to show the response inline.
  • The trick is to use fetcher.Form from Remix's useFetcher hook.
routes/demo.tsx

import { useFetcher } from "@remix-run/react";

export default function Demo() {
  let fetcher = useFetcher();
  return (
    <fetcher.Form method="post">
      <label>
        Prompt
        <textarea required name="prompt" />
      </label>
      <div>
        <button type="submit">Send</button>
      </div>
    </fetcher.Form>
  );
}

export const action = async ({ request }) => {
  let formData = await request.formData();
  let prompt = formData.get("prompt");
  return fetch("https://api.openai.com/v1/chat/completions", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${process.env.OPENAI_API_KEY}`,
    },
    body: JSON.stringify({
      model: "gpt-3.5-turbo",
      messages: [
        {
          role: "user",
          content: prompt,
        },
      ],
    }),
  });
};

Display the API response stored on fetcher.data

routes/demo.tsx

import { useFetcher } from "@remix-run/react";

export default function Demo() {
  let fetcher = useFetcher();
  return (
    <fetcher.Form method="post">
      <label>
        Prompt
        <textarea required name="prompt" />
      </label>
      <div>
        <button type="submit">Send</button>
      </div>
      {fetcher.data && (
        <pre>{fetcher.data?.choices?.[0]?.message?.content}</pre>
      )}
    </fetcher.Form>
  );
}

export const action = async ({ request }) => {
  let formData = await request.formData();
  let prompt = formData.get("prompt");
  return fetch("https://api.openai.com/v1/chat/completions", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${process.env.OPENAI_API_KEY}`,
    },
    body: JSON.stringify({
      model: "gpt-3.5-turbo",
      messages: [
        {
          role: "user",
          content: prompt,
        },
      ],
    }),
  });
};
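
While the request is in flight, fetcher.state (one of "idle", "submitting", or "loading") can drive a pending indicator. This is a sketch of an optional variant of the component; it assumes the same action as above and is not part of the final file below:

routes/demo.tsx (pending-state variant)

import { useFetcher } from "@remix-run/react";

export default function Demo() {
  let fetcher = useFetcher();
  // Anything other than "idle" means the submission is still in flight.
  let isPending = fetcher.state !== "idle";
  return (
    <fetcher.Form method="post">
      <label>
        Prompt
        <textarea required name="prompt" />
      </label>
      <div>
        <button type="submit" disabled={isPending}>
          {isPending ? "Thinking…" : "Send"}
        </button>
      </div>
      {fetcher.data && (
        <pre>{fetcher.data?.choices?.[0]?.message?.content}</pre>
      )}
    </fetcher.Form>
  );
}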

The final Remix route file looks like this:

routes/demo.tsx

import { useFetcher } from "@remix-run/react";

export default function Demo() {
  let fetcher = useFetcher();
  return (
    <fetcher.Form method="post">
      <label>
        Prompt
        <textarea required name="prompt" />
      </label>
      <div>
        <button type="submit">Send</button>
      </div>
      {fetcher.data && (
        <pre>{fetcher.data?.choices?.[0]?.message?.content}</pre>
      )}
    </fetcher.Form>
  );
}

export const action = async ({ request }) => {
  let formData = await request.formData();
  let prompt = formData.get("prompt");
  // Proxy the prompt to OpenAI; the Response is returned directly to the fetcher.
  return fetch("https://api.openai.com/v1/chat/completions", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${process.env.OPENAI_API_KEY}`,
    },
    body: JSON.stringify({
      model: "gpt-3.5-turbo",
      messages: [
        {
          role: "user",
          content: prompt,
        },
      ],
    }),
  });
};
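
The route above assumes the OpenAI request succeeds. One way to surface failures is to check res.ok inside the action and return an error payload instead; the sketch below uses the json helper from @remix-run/node and is an optional variant, not part of the talk's demo:

routes/demo.tsx (error-handling variant of the action)

import { json } from "@remix-run/node";

export const action = async ({ request }) => {
  let formData = await request.formData();
  let prompt = formData.get("prompt");
  let res = await fetch("https://api.openai.com/v1/chat/completions", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${process.env.OPENAI_API_KEY}`,
    },
    body: JSON.stringify({
      model: "gpt-3.5-turbo",
      messages: [{ role: "user", content: prompt }],
    }),
  });
  if (!res.ok) {
    // Surface a readable error instead of forwarding the failed response.
    return json(
      { error: `OpenAI request failed with status ${res.status}` },
      { status: 502 }
    );
  }
  return res;
};

The component could then check fetcher.data?.error before reading choices.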