LiteLLM: An Open-Source Tool for Unified Access to Large Language Models

More and more vendors now offer large language models, for example OpenAI, Anthropic, Google, Deepseek, and Qwen.

There are also many open-source tools for running models, such as Ollama, vLLM, and llama.cpp.

This post explores LiteLLM, an open-source tool that unifies and integrates access to all of these LLMs.

The examples are adapted and modified from material found online.

1 LiteLLM Overview

1.1 About LiteLLM

LiteLLM is an open-source tool that works like a "universal remote control" for large language models.

It exposes a standard API so that a single code path can call models from over a hundred different providers, which greatly simplifies integration and management.

Its core value is that it encapsulates the complex provider-specific differences and gives developers a unified, powerful management layer.

LiteLLM routes requests by a provider prefix in the model name, for example:

model="ollama/gemma3n:e2b" calls the model through the native Ollama interface

model="openai/gemma3n:e2b" calls the model through an OpenAI-compatible interface
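The convention is simply "&lt;provider&gt;/&lt;model name&gt;". A trivial helper (hypothetical, for illustration only; not part of LiteLLM) makes the pattern explicit:

```python
def litellm_model_id(provider: str, name: str) -> str:
    """Build a LiteLLM model string of the form '<provider>/<model name>'."""
    return f"{provider}/{name}"
```

For example, litellm_model_id("ollama", "gemma3n:e2b") yields "ollama/gemma3n:e2b".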

1.2 Installing LiteLLM

LiteLLM can be installed quickly with pip:

pip install litellm

2 Synchronous Access Examples

The examples in this section call models synchronously through LiteLLM.

2.1 Ollama Interface Example

This example calls a model through the native Ollama interface.

For the model gemma3n:e2b, set model="ollama/gemma3n:e2b". Sample code:

from litellm import completion

# Assumes a local Ollama server listening on the default port 11434.
response = completion(
    model="ollama/gemma3n:e2b",
    messages=[{"content": "Hello, how are you?", "role": "user"}],
    api_base="http://localhost:11434",
)
print(response)

Sample output:

ModelResponse(id='chatcmpl-cadfb4cf-c6f1-4105-8fbf-094f3d44b579', created=1766390099, model='ollama/gemma3n:e2b', object='chat.completion', system_fingerprint=None, choices=[Choices(finish_reason='stop', index=0, message=Message(content="Hello! I'm doing well, thank you for asking. As a large language model, I don't experience emotions like humans do, but I'm functioning optimally and ready to help. How are *you* doing today? 😊 \n\nIs there anything I can assist you with?\n\n\n\n", role='assistant', tool_calls=None, function_call=None, provider_specific_fields=None, reasoning_content=None))], usage=Usage(completion_tokens=62, prompt_tokens=21, total_tokens=83, completion_tokens_details=None, prompt_tokens_details=None))
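The printed ModelResponse is verbose; in practice you usually only need the generated text. The field path below matches the output above (this accessor is a convenience sketch, not a LiteLLM API):

```python
def response_text(response) -> str:
    """Return the assistant message text from an OpenAI-style response object."""
    return response.choices[0].message.content
```

With the response above, print(response_text(response)) prints only the reply text.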

2.2 OpenAI-Compatible Interface Example

This example calls the same model through an OpenAI-compatible interface.

For gemma3n:e2b, set model="openai/gemma3n:e2b". Note the /v1 suffix on api_base and that Ollama accepts any placeholder api_key. Sample code:

from litellm import completion

# Ollama's OpenAI-compatible endpoint lives under /v1; the api_key is a placeholder.
response = completion(
    model="openai/gemma3n:e2b",
    messages=[{"content": "Hello, how are you?", "role": "user"}],
    api_base="http://localhost:11434/v1",
    api_key="ollama",
)
print(response)

Output:

ModelResponse(id='chatcmpl-712', created=1766390462, model='gemma3n:e2b', object='chat.completion', system_fingerprint='fp_ollama', choices=[Choices(finish_reason='stop', index=0, message=Message(content="Hello! I'm doing well, thank you for asking. As a large language model, I don't experience emotions or feelings like humans do, but I'm functioning optimally and ready to assist you. \n\nHow are *you* doing today? Is there anything I can help you with? 😊\n\n\n\n", role='assistant', tool_calls=None, function_call=None, provider_specific_fields={'refusal': None}), provider_specific_fields={})], usage=Usage(completion_tokens=65, prompt_tokens=15, total_tokens=80, completion_tokens_details=None, prompt_tokens_details=None), service_tier=None)
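The usage field in the output above reports token counts. A small accessor (a sketch, assuming the OpenAI-style usage layout shown above) can pull them out, e.g. for cost tracking:

```python
def token_usage(response):
    """Return (prompt, completion, total) token counts from a response's usage field."""
    usage = response.usage
    return usage.prompt_tokens, usage.completion_tokens, usage.total_tokens
```

For the response above this would return (15, 65, 80).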

3 Streaming Access Example

This section demonstrates accessing a model through LiteLLM in streaming mode.

The model and other parameters are configured exactly as in the synchronous examples; to receive the response incrementally, set stream=True.
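Note that stream=True gives streaming, not asynchrony: the call still blocks between chunks. For truly asynchronous calls, LiteLLM also provides the awaitable acompletion, which takes the same parameters as completion. A minimal sketch, assuming the same local Ollama endpoint as in section 2.2:

```python
import asyncio


async def ask(prompt: str) -> str:
    """Awaitable LLM call via LiteLLM's acompletion."""
    # Imported lazily so the sketch can be read without litellm installed.
    from litellm import acompletion

    response = await acompletion(
        model="openai/gemma3n:e2b",
        messages=[{"content": prompt, "role": "user"}],
        api_base="http://localhost:11434/v1",
        api_key="ollama",
    )
    return response.choices[0].message.content


if __name__ == "__main__":
    print(asyncio.run(ask("Hello, how are you?")))
```

Because ask is a coroutine, several such calls can be awaited concurrently with asyncio.gather.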

Below is streaming sample code against an OpenAI-compatible interface.

from litellm import completion

# stream=True returns an iterator of chunks instead of a single response.
response = completion(
    model="openai/gemma3n:e2b",
    messages=[{"content": "Hello, how are you?", "role": "user"}],
    api_base="http://localhost:11434/v1",
    api_key="ollama",
    stream=True,
)

for item in response:
    print(item)

Output (one chunk is printed per streamed token):

ModelResponseStream(id='chatcmpl-570', created=1766391486, model='gemma3n:e2b', object='chat.completion.chunk', system_fingerprint='fp_ollama', choices=[StreamingChoices(finish_reason=None, index=0, delta=Delta(provider_specific_fields=None, refusal=None, content='I', role='assistant', function_call=None, tool_calls=None, audio=None), logprobs=None)], provider_specific_fields=None, citations=None, service_tier=None)

ModelResponseStream(id='chatcmpl-570', created=1766391486, model='gemma3n:e2b', object='chat.completion.chunk', system_fingerprint='fp_ollama', choices=[StreamingChoices(finish_reason=None, index=0, delta=Delta(provider_specific_fields=None, refusal=None, content=' am', role=None, function_call=None, tool_calls=None, audio=None), logprobs=None)], provider_specific_fields=None, citations=None, service_tier=None)

... (the remaining chunks, each carrying one token in delta.content, are omitted for brevity) ...

ModelResponseStream(id='chatcmpl-570', created=1766391486, model='gemma3n:e2b', object='chat.completion.chunk', system_fingerprint='fp_ollama', choices=[StreamingChoices(finish_reason='stop', index=0, delta=Delta(provider_specific_fields=None, content=None, role=None, function_call=None, tool_calls=None, audio=None), logprobs=None)], provider_specific_fields=None)
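Each chunk carries at most one token in delta.content, and the final chunk (finish_reason='stop') has content=None. A small helper (not a LiteLLM API, just a sketch) can reassemble the full reply from any iterable of such chunks:

```python
def collect_stream_text(chunks) -> str:
    """Concatenate the delta.content fragments of streamed chunks into one string."""
    parts = []
    for chunk in chunks:
        delta = chunk.choices[0].delta.content
        if delta:  # the final chunk has content=None
            parts.append(delta)
    return "".join(parts)
```

With the streaming response above, print(collect_stream_text(response)) prints the complete reply text.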

References


LiteLLM - Getting Started

https://docs.litellm.ai/docs/
