
Clarifai

Anthropic, OpenAI, Qwen, xAI, Gemini, and most open-source LLMs are supported on Clarifai.

| Property | Details |
|----------|---------|
| Description | Clarifai is a powerful AI platform that provides access to a wide range of LLMs through a unified API. LiteLLM enables seamless integration with Clarifai's models using an OpenAI-compatible interface. |
| Provider Doc | Clarifai ↗ |
| OpenAI-compatible Endpoint | https://api.clarifai.com/v2/ext/openai/v1 |
| Supported Endpoints | /chat/completions |

Pre-Requisites

pip install litellm

Required Environment Variables

To obtain your Clarifai Personal Access Token (PAT), follow this link.

os.environ["CLARIFAI_PAT"] = "YOUR_CLARIFAI_PAT"  # your Clarifai Personal Access Token

Usage

import os
from litellm import completion

os.environ["CLARIFAI_API_KEY"] = "YOUR_CLARIFAI_PAT"  # your Clarifai Personal Access Token

response = completion(
    model="clarifai/openai.chat-completion.gpt-oss-20b",
    messages=[{"role": "user", "content": "Tell me a joke about physics?"}],
)
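The returned object follows the OpenAI chat-completion schema, so the reply text lives at `response.choices[0].message.content`. A minimal offline sketch of that access pattern, using a plain dict shaped like the response payload (the sample values are made up, not a real Clarifai response):

```python
# Offline sketch: the JSON shape of an OpenAI-style chat completion.
payload = {
    "choices": [
        {
            "index": 0,
            "message": {
                "role": "assistant",
                "content": "Why can't you trust an atom? They make up everything.",
            },
            "finish_reason": "stop",
        }
    ]
}

# The reply text sits on the first choice's message.
reply = payload["choices"][0]["message"]["content"]
print(reply)
```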

Streaming Support

LiteLLM supports streaming responses with Clarifai models:

import litellm

for chunk in litellm.completion(
    model="clarifai/openai.chat-completion.gpt-oss-20b",
    api_key="YOUR_CLARIFAI_PAT",  # your Clarifai Personal Access Token
    messages=[
        {"role": "user", "content": "Tell me a fun fact about space."}
    ],
    stream=True,
):
    print(chunk.choices[0].delta)
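Each streamed chunk's `delta` carries only the newly generated fragment (and the final chunk often carries none), so clients typically concatenate the fragments. A self-contained sketch of that accumulation pattern, using hand-written delta dicts in place of live chunks:

```python
# Simulated stream: each delta holds at most a fragment of the reply.
deltas = [
    {"role": "assistant", "content": "A day on "},
    {"content": "Venus is longer "},
    {"content": "than its year."},
    {"content": None},  # terminal chunk with no content
]

# Concatenate the non-empty fragments into the full reply.
full_text = "".join(d["content"] for d in deltas if d.get("content"))
print(full_text)
```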

Tool Calling (Function Calling)

Clarifai models accessed via LiteLLM support function calling:

import litellm

tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get current temperature for a given location.",
        "parameters": {
            "type": "object",
            "properties": {
                "location": {
                    "type": "string",
                    "description": "City and country e.g. Tokyo, Japan"
                }
            },
            "required": ["location"],
            "additionalProperties": False
        }
    }
}]

response = litellm.completion(
    model="clarifai/openai.chat-completion.gpt-oss-20b",
    api_key="YOUR_CLARIFAI_PAT",  # your Clarifai Personal Access Token
    messages=[{"role": "user", "content": "What is the weather in Paris today?"}],
    tools=tools,
)

print(response.choices[0].message.tool_calls)
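Each entry in `tool_calls` carries the function name and a JSON string of arguments; the client parses that string and invokes its local implementation. A sketch of the dispatch step with a stubbed tool call (the `get_weather` body and the sample values are invented for illustration):

```python
import json

def get_weather(location: str) -> str:
    # Stub implementation; a real tool would query a weather service.
    return f"18°C and clear in {location}"

# Shaped like one entry of response.choices[0].message.tool_calls.
tool_call = {
    "id": "call_0",
    "type": "function",
    "function": {
        "name": "get_weather",
        "arguments": '{"location": "Paris, France"}',
    },
}

# Look up the local function by name and call it with the parsed arguments.
available_tools = {"get_weather": get_weather}
fn = available_tools[tool_call["function"]["name"]]
args = json.loads(tool_call["function"]["arguments"])
result = fn(**args)
print(result)
```

The `result` string would then be sent back to the model in a `role: "tool"` message so it can compose the final answer.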

Clarifai Models

LiteLLM supports all models in the Clarifai Community.

Supported model families include:

  • 🧠 OpenAI Models
  • 🤖 Anthropic Models
  • 🪄 xAI Models
  • 🔷 Google Gemini Models
  • 🧩 Qwen Models
  • 💡 MiniCPM (OpenBMB) Models
  • 🧬 Microsoft Phi Models
  • 🦙 Meta Llama Models
  • 🔍 DeepSeek Models

Usage with LiteLLM Proxy

Here's how to call Clarifai with the LiteLLM Proxy Server.

1. Save key in your environment

export CLARIFAI_PAT="YOUR_CLARIFAI_PAT"

2. Start the proxy

model_list:
  - model_name: clarifai-model
    litellm_params:
      model: clarifai/openai.chat-completion.gpt-oss-20b
      api_key: os.environ/CLARIFAI_PAT

litellm --config /path/to/config.yaml

# Server running on http://0.0.0.0:4000

3. Test it

curl --location 'http://0.0.0.0:4000/chat/completions' \
--header 'Content-Type: application/json' \
--data '{
    "model": "clarifai-model",
    "messages": [
        {
            "role": "user",
            "content": "what llm are you"
        }
    ]
}'

Important Notes

  • Always prefix Clarifai model IDs with clarifai/ when specifying the model name
  • Use your Clarifai Personal Access Token (PAT) as the API key
  • Usage is tracked and billed through Clarifai
  • API rate limits are subject to your Clarifai account settings
  • Most OpenAI parameters are supported, but some advanced features may vary by model
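The `clarifai/` prefix rule above can be enforced with a tiny helper (a hypothetical convenience function, not part of LiteLLM):

```python
def clarifai_model(model_id: str) -> str:
    """Ensure a Clarifai model ID carries the clarifai/ prefix LiteLLM expects."""
    if model_id.startswith("clarifai/"):
        return model_id  # already prefixed; leave unchanged
    return f"clarifai/{model_id}"

# Both spellings normalize to the same LiteLLM model name.
print(clarifai_model("openai.chat-completion.gpt-oss-20b"))
print(clarifai_model("clarifai/openai.chat-completion.gpt-oss-20b"))
```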

FAQs

| Question | Answer |
|----------|--------|
| Can I use all Clarifai models with LiteLLM? | Most chat-completion models are supported. Use the Clarifai model URL as the `model`. |
| Do I need a separate Clarifai PAT? | Yes, you must use a valid Clarifai Personal Access Token. |
| Is tool calling supported? | Yes, provided the underlying Clarifai model supports function/tool calling. |
| How is billing handled? | Clarifai usage is billed independently via Clarifai. |

Additional Resources