Master both OpenAI SDKs with hands-on examples!
This project helps developers understand and compare the OpenAI Agent SDK and the traditional OpenAI Python library through interactive web interfaces. Perfect for learning, prototyping, and building production-ready AI applications.
| 🧠 Agent SDK (/agent) | 💬 Chat SDK (/chat) |
|---|---|
| Advanced AI Agents | Simple Chat Completions |
| Tool calling, memory, streaming | Direct API calls, streaming |
| Persistent conversation history | Session-based chat history |
| Complex multi-step reasoning | Straightforward Q&A |
| Production-ready agent framework | Traditional chat interface |
- ✅ Compare both OpenAI SDKs in action
- ✅ Understand when to use each approach
- ✅ Build your own AI agents and chat applications
- ✅ Deploy production-ready solutions
- ✅ Customize agent configurations easily
```bash
git clone <your-repo>
cd OpenAI-Agent-Template

# Using uv (recommended)
uv sync
```
Create a .env file:
```env
OPENAI_API_KEY=your_openai_api_key_here
FIREWORKS_API_KEY=your_fireworks_api_key_here  # Optional
```
Run the app:
```bash
python -m src.app.main
# or
uv run python -m src.app.main
```
Then open:
- 🤖 Agent SDK Demo: http://localhost:5000/agent
- 💬 Chat SDK Demo: http://localhost:5000/chat
Experience the full power of OpenAI's Agent SDK:
- Persistent Memory: Conversations remember context across sessions
- Tool Integration: Weather tools, web search, and more
- Streaming Responses: Real-time AI responses
- Session Management: Each browser session maintains separate memory
- Advanced Features: Multi-turn reasoning, complex workflows
Try asking:
- "What's the weather like today?"
- "Remember that I like coffee. What should I drink tomorrow?"
- "Can you help me plan a trip to Japan?"
Learn traditional OpenAI API usage:
- Direct API Calls: Simple request-response pattern
- Chat History: Save and load conversations
- Streaming: Real-time response generation
- Session Storage: Browser-based conversation management
- Clean & Simple: Perfect for learning API basics
Try asking:
- "Explain quantum computing in simple terms"
- "Write a Python function to sort a list"
- "What are the benefits of renewable energy?"
Want to customize your AI agent? Edit src/agent/registry.py:
```python
# Add your custom agent configuration
AGENT_CONFIGS = {
    "openai": {
        "name": "Agent (OpenAI)",
        "model_factory": openai_model_factory,
        "model_settings": chat_model_settings,
        "instructions": INSTRUCTIONS,          # ← Edit your agent's personality
        "tools": [fetch_weather],              # ← Add your custom tools
    },
    "your_custom_agent": {
        "name": "My Custom Agent",
        "model_factory": openai_model_factory,
        "model_settings": your_custom_settings,
        "instructions": "You are a helpful coding assistant...",
        "tools": [your_custom_tools],
    },
}
```
```bash
# Use environment variable to switch agents
AGENT_TYPE=openai              # Default OpenAI
AGENT_TYPE=fireworks           # Fireworks AI
AGENT_TYPE=your_custom_agent   # Your custom agent
```
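Under the hood, switching by environment variable only needs a dictionary lookup into AGENT_CONFIGS. The snippet below is a hypothetical sketch of that pattern, not the actual code in src/agent/registry.py; the build_agent helper and the assumption that model_factory() returns a model instance are illustrative only.

```python
# Hypothetical sketch: pick an agent config based on AGENT_TYPE.
import os
from agents import Agent

def build_agent() -> Agent:
    agent_type = os.getenv("AGENT_TYPE", "openai")  # falls back to the default OpenAI agent
    config = AGENT_CONFIGS[agent_type]              # AGENT_CONFIGS is the dict shown above
    return Agent(
        name=config["name"],
        instructions=config["instructions"],
        tools=config["tools"],
        model=config["model_factory"](),            # assumes the factory returns a model instance
        model_settings=config["model_settings"],
    )
```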
Create new tools in src/agent/tools/:
```python
from agents import function_tool

@function_tool
def your_custom_tool():
    """Your custom tool implementation"""
    pass

# Add to registry.py
"tools": [fetch_weather, your_custom_tool]
```
- ✅ FastAPI Backend: High-performance async web server
- ✅ Database Integration: SQLite (dev) / PostgreSQL (prod)
- ✅ Session Management: Persistent conversation memory
- ✅ Error Handling: Graceful fallbacks and user feedback
- ✅ Logging: Structured logging for debugging
- ✅ Docker Support: Easy deployment anywhere
- ✅ Modern Design: Clean, responsive Gradio interface
- ✅ Dark Theme: Easy on the eyes
- ✅ Real-time Streaming: Watch AI responses generate live
- ✅ Chat History: Save and manage conversations
- ✅ Mobile Friendly: Works great on all devices
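The live-streaming behaviour in Gradio comes from yielding partial responses. The project's actual interface code is more involved; this is just a generic sketch of the pattern:

```python
# Generic Gradio streaming chat sketch (not the project's actual UI code).
import gradio as gr

def respond(message, history):
    reply = ""
    # Replace this loop with a real streaming call to your agent or chat backend.
    for token in ["Streaming", " one", " token", " at", " a", " time", "..."]:
        reply += token
        yield reply  # yielding partial text makes Gradio render the response live

demo = gr.ChatInterface(respond, title="Streaming demo")

if __name__ == "__main__":
    demo.launch()
```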
- ✅ Persistent Sessions: Conversations survive page refreshes
- ✅ Memory Limits: Optimized for performance (10 recent messages)
- ✅ Database Storage: All conversations safely stored
- ✅ Session Isolation: Each user gets private memory
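The "10 recent messages" limit can be implemented as a simple trim before each model call. A hypothetical sketch (the real logic may differ):

```python
# Hypothetical sketch of the "10 recent messages" memory limit described above.
MAX_HISTORY = 10

def trim_history(messages: list[dict]) -> list[dict]:
    """Keep the system prompt (if any) plus the most recent MAX_HISTORY messages."""
    system = [m for m in messages if m["role"] == "system"]
    rest = [m for m in messages if m["role"] != "system"]
    return system + rest[-MAX_HISTORY:]
```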
Full environment configuration:
```env
# Required
OPENAI_API_KEY=your_key_here

# Optional
FIREWORKS_API_KEY=your_fireworks_key
AGENT_TYPE=openai              # or fireworks
DB_URL=your_database_url       # required only for remote hosting
```
Build and run with Docker:
```bash
docker build -t openai-agent-platform .
docker run -p 5000:5000 -e OPENAI_API_KEY=your_key openai-agent-platform
```
Or deploy to a platform like Railway or Heroku:
- Fork this repository
- Connect to Railway/Heroku
- Set environment variables
- Deploy! 🚀
- Understand basic OpenAI API calls
- Learn about streaming responses
- Explore chat history management
- Practice with different prompts
- Experience persistent memory
- See tool calling in action
- Understand session management
- Compare with traditional approach
- Modify agent configurations
- Add custom tools and capabilities
- Implement your own business logic
- Deploy to production
Q: Which SDK should I use for my project?
A: Use the Agent SDK for complex, multi-step workflows with memory. Use the Chat SDK for simple, stateless interactions.
Q: Can I add my own AI models?
A: Yes! Edit src/agent/registry.py to add new model configurations.
Q: How do I add custom tools?
A: Create functions in src/agent/tools/ and add them to your agent configuration.
Q: Is this production ready?
A: Yes! It includes proper error handling, logging, database integration, and deployment configurations.
This project is licensed under the Apache License 2.0 - see the LICENSE file for details.
- ✅ Commercial Use: Use this project in commercial applications
- ✅ Modification: Modify and distribute your changes
- ✅ Distribution: Share this project with others
- ✅ Private Use: Use privately without restrictions
- ⚠️ Attribution: Include copyright notice and license
- ⚠️ State Changes: Document significant changes you make
🎉 Happy Learning! Start exploring both interfaces and discover which OpenAI SDK fits your needs best!