litellm:openai-gpt-5-chat

Basic Information

  • Name: litellm:openai-gpt-5-chat
  • Provider: LITELLM
  • Version: gpt-5-chat
  • Encoder: o200k_base
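
The o200k_base encoding is also available in tiktoken under the same name, so prompt sizes can be estimated locally before a request is sent. A minimal sketch, assuming tiktoken is installed; the prompt string is purely illustrative:

    import tiktoken

    # Load the same encoding the model uses to tokenize text.
    encoding = tiktoken.get_encoding("o200k_base")

    prompt = "Summarize the quarterly report in three bullet points."
    token_ids = encoding.encode(prompt)

    # Number of tokens this prompt consumes from the input budget.
    print(len(token_ids))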

Token Limits

  • Input Limit: 128,000 tokens
  • Output Limit: 16,384 tokens
  • Total Limit: 144,384 tokens
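
The total limit is the input limit plus the output limit (128,000 + 16,384 = 144,384). A minimal sketch of a pre-flight check against the input limit, assuming tiktoken for counting; the helper name and example prompt are illustrative:

    import tiktoken

    INPUT_LIMIT = 128_000   # maximum prompt tokens
    OUTPUT_LIMIT = 16_384   # maximum completion tokens

    encoding = tiktoken.get_encoding("o200k_base")

    def fits_input_budget(prompt: str) -> bool:
        """Return True if the prompt fits within the model's input limit."""
        return len(encoding.encode(prompt)) <= INPUT_LIMIT

    print(fits_input_budget("Hello, world!"))  # True for a short prompt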

Capabilities

  • responses_api
  • streaming
  • vision
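
The streaming capability can be exercised through litellm's completion API. A minimal sketch, assuming litellm is installed and credentials are configured; the exact model string ("openai/gpt-5-chat") is an assumption and should match your own litellm setup:

    import litellm

    # Request a streamed completion: chunks arrive as they are generated.
    response = litellm.completion(
        model="openai/gpt-5-chat",  # assumed identifier; adjust to your config
        messages=[{"role": "user", "content": "Name three uses of streaming."}],
        stream=True,
    )

    # Print partial content as each chunk arrives.
    for chunk in response:
        content = chunk.choices[0].delta.content
        if content:
            print(content, end="", flush=True)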

Temperature Bounds

  • Not specified