Making API Calls to LLMs
Most generative AI applications use API calls to cloud models (OpenAI, Cohere, Anthropic, etc.). This chapter shows how to make your first API call with Python.
Prerequisites
- Python installed.
- An API key from OpenAI (or another provider).
- The openai Python library, installed with:
pip install openai
Your First API Call (OpenAI)
import os
from openai import OpenAI

# The client reads OPENAI_API_KEY from the environment by default;
# passing it explicitly here makes that dependency visible.
client = OpenAI(api_key=os.environ["OPENAI_API_KEY"])

response = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Explain what an API is in one sentence."},
    ],
)

print(response.choices[0].message.content)
Understanding the Response
The response object contains the generated text in response.choices[0].message.content.
Other Providers
Cohere, Anthropic (Claude), and Hugging Face also offer hosted model APIs. Each has its own Python client and request format, but the pattern is the same: authenticate with a key, send a list of messages, and read back the generated text.
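As one example of how the pattern transfers, here is a sketch of a call to Anthropic's Messages API. It assumes the anthropic package (pip install anthropic) and an ANTHROPIC_API_KEY environment variable; the model name shown is illustrative, not a recommendation.

```python
import os

# Anthropic separates the system prompt from the messages list,
# but the payload is otherwise close to OpenAI's chat format.
payload = {
    "model": "claude-3-haiku-20240307",  # illustrative model name
    "max_tokens": 100,
    "system": "You are a helpful assistant.",
    "messages": [
        {"role": "user", "content": "Explain what an API is in one sentence."},
    ],
}

if os.environ.get("ANTHROPIC_API_KEY"):  # only call out if a key is configured
    import anthropic

    client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY automatically
    message = client.messages.create(**payload)
    print(message.content[0].text)
```

Note the structural differences from OpenAI: the system prompt is a separate parameter, and max_tokens is required rather than optional.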
Handling Errors
Always wrap API calls in try-except blocks: requests can fail with rate-limit errors (openai.RateLimitError), connection problems (openai.APIConnectionError), or other API errors, and retrying with exponential backoff is the standard response.
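A minimal sketch of that retry pattern. It uses a stand-in TransientError so the example runs without an API key; in real code you would catch the SDK's own exceptions, such as openai.RateLimitError.

```python
import time


class TransientError(Exception):
    """Stand-in for a rate-limit or transient API error."""


def call_with_retries(fn, max_retries=3, base_delay=1.0):
    """Call fn(), retrying with exponential backoff on TransientError."""
    for attempt in range(max_retries):
        try:
            return fn()
        except TransientError:
            if attempt == max_retries - 1:
                raise  # out of retries: let the caller handle it
            time.sleep(base_delay * (2 ** attempt))  # 1s, 2s, 4s, ...


# Demo: a call that fails twice, then succeeds on the third attempt.
attempts = []

def flaky():
    attempts.append(1)
    if len(attempts) < 3:
        raise TransientError("rate limited")
    return "ok"

print(call_with_retries(flaky, base_delay=0.01))  # prints "ok"
```

In production you would pass a lambda wrapping client.chat.completions.create(...) as fn and catch the OpenAI exception types instead of TransientError.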
Two Minute Drill
- Use the chat completions endpoint (client.chat.completions.create() in the current SDK) for chat models.
- Messages include system, user, and assistant roles.
- Handle exceptions to avoid crashes.
- Store API keys in environment variables.
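A short sketch of that last point: read the key from the environment rather than hardcoding it. OPENAI_API_KEY is the variable name the OpenAI SDK checks by default.

```python
import os

# Export the key in your shell (export OPENAI_API_KEY=sk-...) so it
# never appears in source code or version control.
api_key = os.environ.get("OPENAI_API_KEY", "")
if not api_key:
    print("OPENAI_API_KEY is not set; export it before making API calls.")
```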
Need more clarification?
Drop us an email at career@quipoinfotech.com
