
ChatGPT ChatML Wrapper

What if there were a converter to help you move existing prompts over to ChatGPT's API?

With the announcement of ChatGPT’s API, many people will be looking to move their old prompts to the new model. I’ve written two libraries that ease the transition for anyone who wants to use gpt-3.5-turbo but still wants to send a single prompt along as usual.

Python: GitHub - bramses/gpt-to-chatgpt-py: Convert a regular GPT call into a ChatGPT call

Example:

import openai
import os
from dotenv import load_dotenv
from gpt_to_chatgpt import toChatML, get_message

load_dotenv()

openai.api_key = os.getenv("OPENAI_API_KEY")

res = openai.ChatCompletion.create(
  model="gpt-3.5-turbo",
  messages=toChatML("this is a test"),  # wrap the plain prompt in ChatGPT's messages format
)

print(get_message(res))  # print just the assistant's reply text

# As an AI language model, I don't really take tests, but I'm always ready to respond to your prompts and queries. How can I assist you today?
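
Under the hood, the conversion is tiny. Here is a rough sketch of what the wrapper is presumably doing: wrapping the prompt as a single user message, then pulling the reply text out of the first choice. The helper names below are hypothetical stand-ins for illustration, not the library's actual source:

# Hypothetical stand-ins, not the library's actual source
def to_chatml(prompt):
    # ChatGPT's endpoint expects a list of role-tagged messages
    # instead of a single prompt string
    return [{"role": "user", "content": prompt}]

def get_reply(response):
    # the assistant's reply text lives in the first choice's message
    return response["choices"][0]["message"]["content"]

If you'd rather not add a dependency, something along those lines covers the single-prompt case; the libraries just save you from rewriting it in every project.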


TypeScript/JS: GitHub - bramses/gpt-to-chatgpt-ts: Convert a GPT Completion call to a ChatGPT call

Example:

const { Configuration, OpenAIApi } = require("openai");
const { toChatML, get_message } = require("gpt-to-chatgpt")
require("dotenv").config();

const configuration = new Configuration({
  apiKey: process.env.OPENAI_API_KEY,
});
const openai = new OpenAIApi(configuration);

openai.createChatCompletion({
  model: "gpt-3.5-turbo",
  messages: toChatML('this is a test') // wrap the plain prompt in ChatGPT's messages format
}).then((data) => {
  console.log(get_message(data.data)); // log just the assistant's reply text
});

// As an AI language model, I cannot provide a specific answer to the prompt, as it is too broad. However, I confirm that this is a test.

Enjoy!
