OpenAI SDK Integration Tutorial¶
This tutorial demonstrates how to use the OpenAI SDK with Unique's secure API proxy for chat completions, assistants, and responses.
Overview¶
The Unique API proxy provides a secure, OpenAI-compatible gateway that allows you to access any language model through the Unique platform.
Learn how to:
- Configure OpenAI SDK to work with Unique's secure API proxy
- Use chat completions through Unique's proxy
- Create and manage assistants
- Use the responses API
Prerequisites¶
- Unique SDK installed and configured
- OpenAI Python SDK installed (`pip install openai`)
- Valid API credentials (`API_KEY`, `APP_ID`, `COMPANY_ID`, `USER_ID`)
- Environment variables configured (see the sketch after this list)
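As a quick sanity check, here is a minimal sketch that verifies the required variables are present before running any of the examples. The variable names mirror the prerequisites above; your setup may load them differently (for example via `python-dotenv`).

```python
import os

# Placeholder variable names mirroring the prerequisites above; adapt them
# to your own configuration (e.g. load a .env file via python-dotenv first).
REQUIRED_VARS = ["API_KEY", "APP_ID", "COMPANY_ID", "USER_ID"]

missing = [name for name in REQUIRED_VARS if not os.environ.get(name)]
if missing:
    raise RuntimeError(f"Missing required environment variables: {', '.join(missing)}")
```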
Why Use Unique API Proxy?¶
The Unique API proxy provides enterprise-grade security and management for your LLM access:
- Secure Authentication: All requests are authenticated through Unique's secure infrastructure, protecting your API keys
- Usage Tracking: Optional recording of model usage for monitoring, cost management, and compliance (when enabled)
- Unified Interface: Access multiple LLM providers (OpenAI, Azure, Anthropic, etc.) through a single endpoint
- OpenAI Compatibility: Use the exact same APIs as OpenAI - no code changes needed beyond configuration
The API interface is identical to OpenAI's, so you can use any OpenAI SDK features and reference the official OpenAI documentation for API details.
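As an illustration of "no code changes needed beyond configuration", the sketch below points the OpenAI Python SDK at the proxy. The header values and model identifier are examples taken from the Important Notes further down; `UNIQUE_BASE_URL` is a hypothetical placeholder for your environment's base URL.

```python
import os

from openai import OpenAI

# All environment variable names are placeholders from the prerequisites;
# UNIQUE_BASE_URL is a hypothetical name for your Unique API base URL.
client = OpenAI(
    api_key=os.environ["API_KEY"],  # Unique API key, sent as the Bearer token
    base_url=f"{os.environ['UNIQUE_BASE_URL']}/openai-proxy/",
    default_headers={
        "x-user-id": os.environ["USER_ID"],
        "x-company-id": os.environ["COMPANY_ID"],
        "x-api-version": "2023-12-06",
        "x-app-id": os.environ["APP_ID"],
        "x-model": "AZURE_o3_2025_0416",  # example model identifier
    },
)
```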
Complete Example¶
Full OpenAI Integration Example¶
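The sketch below is a condensed, non-authoritative outline of such an integration, covering the three workflows from the overview: chat completions, assistants, and responses. It assumes the proxy-configured `client` from the configuration sketch above is defined in the same script; the model identifier, prompts, function names, and command names are illustrative placeholders.

```python
import sys

# Condensed sketch only; assumes `client` is the proxy-configured OpenAI
# client constructed as shown earlier in the same script. All prompts and
# names below are illustrative.
MODEL = "AZURE_o3_2025_0416"  # example model identifier; see the LLM Models API


def run_chat_completion() -> None:
    """Send a simple chat completion through the Unique proxy."""
    completion = client.chat.completions.create(
        model=MODEL,
        messages=[
            {"role": "system", "content": "You are a helpful assistant."},
            {"role": "user", "content": "What does the Unique API proxy do?"},
        ],
    )
    print(completion.choices[0].message.content)


def run_assistant() -> None:
    """Create an assistant, start a thread, and run it to completion."""
    assistant = client.beta.assistants.create(
        name="Docs Assistant",
        instructions="Answer questions about the Unique platform.",
        model=MODEL,
    )
    thread = client.beta.threads.create()
    client.beta.threads.messages.create(
        thread_id=thread.id,
        role="user",
        content="Summarize what an API proxy does.",
    )
    run = client.beta.threads.runs.create_and_poll(
        thread_id=thread.id,
        assistant_id=assistant.id,
    )
    print(f"Run finished with status: {run.status}")


def run_responses() -> None:
    """Use the responses API through the same proxy client."""
    response = client.responses.create(
        model=MODEL,
        input="Explain the benefits of routing LLM traffic through a proxy.",
    )
    print(response.output_text)


if __name__ == "__main__":
    # Illustrative command dispatch, e.g. `python openai_example.py chat`.
    commands = {
        "chat": run_chat_completion,
        "assistant": run_assistant,
        "responses": run_responses,
    }
    commands[sys.argv[1]]()
```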
Usage¶
Run the script with a command selecting the workflow to exercise: chat completions, assistants, or responses.
Important Notes¶
- API Base URL: The API base URL must point to your Unique API proxy endpoint: `{your_base_url}/openai-proxy/`
- Custom Headers: All requests require custom headers for authentication and routing:
  - `x-user-id` - User ID for request context
  - `x-company-id` - Company ID for multi-tenant isolation
  - `x-api-version` - API version (typically `"2023-12-06"`)
  - `x-app-id` - Application ID for usage tracking
  - `x-model` - Model identifier (e.g., `"AZURE_o3_2025_0416"`)
  - `Authorization` - Bearer token with your Unique API key
- Model Selection: Use models available in your Unique environment. Check available models using the LLM Models API.
- OpenAI API Compatibility: The Unique API proxy is fully compatible with OpenAI's API. You can use any OpenAI SDK features and reference the official OpenAI documentation for detailed API specifications, parameters, and response formats. A streaming sketch follows below.
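To illustrate that standard SDK features pass through unchanged, here is a brief sketch of streaming a chat completion over the proxy. It assumes the proxy-configured `client` from the earlier sketches and reuses the example model identifier from the notes above.

```python
# Streaming works the same as against OpenAI directly; `client` is the
# proxy-configured OpenAI client from the sketches above.
stream = client.chat.completions.create(
    model="AZURE_o3_2025_0416",  # example model identifier
    messages=[{"role": "user", "content": "Summarize the benefits of an API proxy."}],
    stream=True,
)
for chunk in stream:
    # Each chunk carries an incremental delta of the assistant's reply.
    if chunk.choices and chunk.choices[0].delta.content:
        print(chunk.choices[0].delta.content, end="", flush=True)
print()
```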
Related Resources¶
- Chat Completion API - Native Unique chat completion API
- LLM Models API - View available models
- Configuration Guide - Set up your environment