API

Create a new secret key, install the library, and set the environment variable (API secret key) permanently.
 
pip install openai
    # platform.openai.com/account/api-keys

sudo gedit ~/.bashrc
    # export OPENAI_API_KEY=xxx

echo $OPENAI_API_KEY
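
To confirm the key is visible to Python before making any requests, a minimal check (no API call involved) can look like this:

import os

# Read the key from the environment (exported in ~/.bashrc above)
api_key = os.environ.get("OPENAI_API_KEY")

if api_key:
    print("OPENAI_API_KEY is set, length:", len(api_key))
else:
    print("OPENAI_API_KEY is missing, check ~/.bashrc")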

Request

Extract the content of the assistant's message from the JSON response.
 
import openai
import os

# Setup OpenAI API key
openai.api_key = os.environ.get("OPENAI_API_KEY")

question = "What is flask python?"

response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": question}],
    max_tokens=256,
    n=1,
    stop=None,
    temperature=0.7
)

answer = response['choices'][0]['message']['content']
print(answer)

"""
Flask is a lightweight web framework written in Python. 
It is designed to be simple, easy to use, and flexible ...
"""

Streaming Completion

Streaming with the OpenAI API lets you receive partial results and process them as they arrive, which makes the application feel more responsive.
 
import openai
import os

# Setup OpenAI API key
openai.api_key = os.environ.get("OPENAI_API_KEY")

# User question
question = "What is flask python?"

response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": question}],
    max_tokens=256,
    n=1,
    stop=None,
    temperature=0.7,
    stream=True
)

for chunk in response:
    content = chunk["choices"][0]["delta"].get("content", "")
    print(content, end="", flush=True)
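
The loop above only prints the chunks; if you also need the full text afterwards, you can accumulate it while streaming. A minimal variation of the same loop:

# Collect the streamed pieces into one string while printing them
answer = ""
for chunk in response:
    content = chunk["choices"][0]["delta"].get("content", "")
    answer += content
    print(content, end="", flush=True)

print("\n\nCharacters received:", len(answer))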

Conversation History

The API itself is stateless, so to create a conversation where the model remembers the context you need to include the conversation history in each subsequent request.
 
import openai
import os

# Setup OpenAI API key
openai.api_key = os.environ.get("OPENAI_API_KEY")

# Initialize conversation history
conversation_history = []

while True:
    # User question
    print("\nPlease enter your question (or 'exit' to end):")
    question = input()

    if question.lower() == 'exit':
        break

    # Add user question to conversation history
    conversation_history.append({"role": "user", "content": question})

    response = openai.ChatCompletion.create(
        model="gpt-3.5-turbo",
        messages=conversation_history,
        max_tokens=256,
        n=1,
        stop=None,
        temperature=0.7,
        stream=True
    )

    # Stream and accumulate the assistant's reply
    answer = ""
    for chunk in response:
        content = chunk["choices"][0]["delta"].get("content", "")
        answer += content
        print(content, end="", flush=True)

    # Add assistant reply to conversation history
    conversation_history.append({"role": "assistant", "content": answer})

print("Conversation ended.")

Context

You can add a context message with the "system" role at the start of the conversation to instruct the model how to answer.
 
import openai
import os

# Setup OpenAI API key
openai.api_key = os.environ.get("OPENAI_API_KEY")

# Initialize conversation history
conversation_history = []

# Context for keeping answers short
context_message = {
    "role": "system",
    "content": "System: Please keep the answers short"
}
conversation_history.append(context_message)

# Question stream
questions = [
    "What is Flask? Keep the answers short.",
    "What's the current version?"
]

for question in questions:
    print("\nQuestion:", question)

    # Add user question to conversation history
    conversation_history.append({"role": "user", "content": question})

    response = openai.ChatCompletion.create(
        model="gpt-3.5-turbo",
        messages=conversation_history,
        max_tokens=64,
        n=1,
        stop=None,
        temperature=0.7,
        stream=True
    )

    # Stream and accumulate the assistant's reply
    answer = ""
    for chunk in response:
        content = chunk["choices"][0]["delta"].get("content", "")
        answer += content
        print(content, end="", flush=True)

    # Add assistant reply to conversation history
    conversation_history.append({"role": "assistant", "content": answer})
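
If you combine the context message with the trimming idea above, keep the system message at index 0 and trim only the remaining turns. A small sketch:

# Preserve the system context message, trim only the user/assistant turns
MAX_TURNS = 10
system_message = conversation_history[0]
recent_turns = conversation_history[1:][-MAX_TURNS:]
conversation_history = [system_message] + recent_turns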


