Unified Access to Large Language Models with an Open-Source Tool - LiteLLM

More and more vendors now offer large language models, such as OpenAI, Anthropic, Google, DeepSeek, and Qwen.

There are also many open-source tools for serving models locally, such as Ollama, vLLM, and llama.cpp.

This post explores LiteLLM, an open-source tool that provides unified, integrated access to all of these LLMs.

The examples are adapted from material found online.

1 LiteLLM Overview

1.1 What is LiteLLM

LiteLLM is an open-source tool that works like a "universal remote" for large language models.

LiteLLM exposes a standard API through which a single codebase can call models from over a hundred different providers, greatly simplifying integration and management.

Its core value lies in encapsulating the low-level differences between providers and giving developers a unified, capable management layer.

LiteLLM distinguishes providers by a prefix in the model name, for example:

model="ollama/gemma3n:e2b" calls a model served through the native Ollama API

model="openai/gemma3n:e2b" calls a model behind an OpenAI-compatible API
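The naming convention can be illustrated with a tiny sketch in plain Python (no LiteLLM required). Splitting on the first "/" mirrors how the provider prefix is read; this is only an illustration of the convention, not LiteLLM's actual routing code:

```python
def split_model_name(model: str) -> tuple[str, str]:
    """Split a LiteLLM-style model string into (provider, model).

    A name with no "/" is treated as having no explicit provider.
    Illustration only -- not LiteLLM's real routing logic.
    """
    provider, sep, name = model.partition("/")
    if not sep:  # no prefix, e.g. "gpt-4o"
        return "", model
    return provider, name

print(split_model_name("ollama/gemma3n:e2b"))   # ('ollama', 'gemma3n:e2b')
print(split_model_name("openai/gemma3n:e2b"))   # ('openai', 'gemma3n:e2b')
```

Note that the part after the prefix (here gemma3n:e2b) is passed through to the backend unchanged, which is why the same local model name appears under both prefixes.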

1.2 Installing LiteLLM

LiteLLM can be installed quickly with pip:

pip install litellm

2 Synchronous Access Examples

The examples in this section call models through LiteLLM synchronously.

2.1 Ollama Interface Example

This tests calling a model through the native Ollama interface.

Assuming the model is gemma3n:e2b, set model="ollama/gemma3n:e2b". Example code:

from litellm import completion

response = completion(
    model="ollama/gemma3n:e2b",
    messages=[{"content": "Hello, how are you?", "role": "user"}],
    api_base="http://localhost:11434",
)
print(response)

Sample output:

ModelResponse(id='chatcmpl-cadfb4cf-c6f1-4105-8fbf-094f3d44b579', created=1766390099, model='ollama/gemma3n:e2b', object='chat.completion', system_fingerprint=None, choices=[Choices(finish_reason='stop', index=0, message=Message(content="Hello! I'm doing well, thank you for asking. As a large language model, I don't experience emotions like humans do, but I'm functioning optimally and ready to help. How are *you* doing today? 😊 \n\nIs there anything I can assist you with?\n\n\n\n", role='assistant', tool_calls=None, function_call=None, provider_specific_fields=None, reasoning_content=None))], usage=Usage(completion_tokens=62, prompt_tokens=21, total_tokens=83, completion_tokens_details=None, prompt_tokens_details=None))
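In practice you usually want only the reply text (and perhaps the token usage) rather than the whole object. The ModelResponse above follows the OpenAI chat-completion shape, so the relevant fields can be pulled out as sketched below. To keep the snippet self-contained, a plain dict stands in for the real response; a real litellm response supports the same key lookups, as well as attribute access like response.choices[0].message.content:

```python
# Minimal stand-in for the ModelResponse shown above (values abridged).
response = {
    "choices": [{"message": {"content": "Hello! I'm doing well."}}],
    "usage": {"completion_tokens": 62, "prompt_tokens": 21, "total_tokens": 83},
}

# The first choice holds the assistant's reply; usage holds token counts.
reply = response["choices"][0]["message"]["content"]
total_tokens = response["usage"]["total_tokens"]
print(reply)         # Hello! I'm doing well.
print(total_tokens)  # 83
```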

2.2 OpenAI Interface Example

This tests calling a model through an OpenAI-compatible interface.

Assuming the model is gemma3n:e2b, set model="openai/gemma3n:e2b". Example code:

from litellm import completion

response = completion(
    model="openai/gemma3n:e2b",
    messages=[{"content": "Hello, how are you?", "role": "user"}],
    api_base="http://localhost:11434/v1",  # Ollama's OpenAI-compatible endpoint
    api_key="ollama",  # any non-empty value works; Ollama does not check it
)
print(response)

Sample output:

ModelResponse(id='chatcmpl-712', created=1766390462, model='gemma3n:e2b', object='chat.completion', system_fingerprint='fp_ollama', choices=[Choices(finish_reason='stop', index=0, message=Message(content="Hello! I'm doing well, thank you for asking. As a large language model, I don't experience emotions or feelings like humans do, but I'm functioning optimally and ready to assist you. \n\nHow are *you* doing today? Is there anything I can help you with? 😊\n\n\n\n", role='assistant', tool_calls=None, function_call=None, provider_specific_fields={'refusal': None}), provider_specific_fields={})], usage=Usage(completion_tokens=65, prompt_tokens=15, total_tokens=80, completion_tokens_details=None, prompt_tokens_details=None), service_tier=None)

3 Streaming Access Example

This section shows streaming access to a model through LiteLLM.

The model and other parameters match the synchronous examples; to receive the response incrementally as it is generated, set stream=True. (Streaming is distinct from asynchronous access: for true async calls, LiteLLM also provides the acompletion coroutine, which takes the same parameters as completion.)

The following example streams from an OpenAI-compatible model:

from litellm import completion

response = completion(
    model="openai/gemma3n:e2b",
    messages=[{"content": "Hello, how are you?", "role": "user"}],
    api_base="http://localhost:11434/v1",
    api_key="ollama",
    stream=True,  # yields incremental chunks instead of one full response
)

for item in response:
    print(item)

The output is a sequence of chunks, abridged below:

ModelResponseStream(id='chatcmpl-570', created=1766391486, model='gemma3n:e2b', object='chat.completion.chunk', system_fingerprint='fp_ollama', choices=[StreamingChoices(finish_reason=None, index=0, delta=Delta(provider_specific_fields=None, refusal=None, content='I', role='assistant', function_call=None, tool_calls=None, audio=None), logprobs=None)], provider_specific_fields=None, citations=None, service_tier=None)

ModelResponseStream(id='chatcmpl-570', created=1766391486, model='gemma3n:e2b', object='chat.completion.chunk', system_fingerprint='fp_ollama', choices=[StreamingChoices(finish_reason=None, index=0, delta=Delta(provider_specific_fields=None, refusal=None, content=' am', role=None, function_call=None, tool_calls=None, audio=None), logprobs=None)], provider_specific_fields=None, citations=None, service_tier=None)

ModelResponseStream(id='chatcmpl-570', created=1766391486, model='gemma3n:e2b', object='chat.completion.chunk', system_fingerprint='fp_ollama', choices=[StreamingChoices(finish_reason=None, index=0, delta=Delta(provider_specific_fields=None, refusal=None, content=' doing', role=None, function_call=None, tool_calls=None, audio=None), logprobs=None)], provider_specific_fields=None, citations=None, service_tier=None)

...(intermediate chunks omitted; each carries the next token of the reply in delta.content)...

ModelResponseStream(id='chatcmpl-570', created=1766391486, model='gemma3n:e2b', object='chat.completion.chunk', system_fingerprint='fp_ollama', choices=[StreamingChoices(finish_reason=None, index=0, delta=Delta(provider_specific_fields=None, refusal=None, content='\n\n\n\n', role=None, function_call=None, tool_calls=None, audio=None), logprobs=None)], provider_specific_fields=None, citations=None, service_tier=None)

ModelResponseStream(id='chatcmpl-570', created=1766391486, model='gemma3n:e2b', object='chat.completion.chunk', system_fingerprint='fp_ollama', choices=[StreamingChoices(finish_reason='stop', index=0, delta=Delta(provider_specific_fields=None, content=None, role=None, function_call=None, tool_calls=None, audio=None), logprobs=None)], provider_specific_fields=None)
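A common pattern is to reassemble the streamed chunks into the full reply as they arrive. Each chunk exposes its new text in choices[0].delta.content, which is None on the final "stop" chunk. The sketch below uses plain dicts as stand-ins for the ModelResponseStream objects so it runs without a live model; real litellm chunks allow the same key lookups:

```python
def accumulate(chunks) -> str:
    """Join the incremental delta.content pieces of a chat stream.

    Works on dict-shaped chunks here; real litellm chunks support the
    same ["choices"][0]["delta"]["content"] access pattern.
    """
    parts = []
    for chunk in chunks:
        piece = chunk["choices"][0]["delta"].get("content")
        if piece:  # the final chunk (finish_reason="stop") carries content=None
            parts.append(piece)
    return "".join(parts)

# Mock chunks mimicking the streamed output shown above.
mock_stream = [
    {"choices": [{"delta": {"content": "I"}}]},
    {"choices": [{"delta": {"content": " am"}}]},
    {"choices": [{"delta": {"content": " doing"}}]},
    {"choices": [{"delta": {"content": " well"}}]},
    {"choices": [{"delta": {"content": None}}]},  # finish_reason="stop"
]
print(accumulate(mock_stream))  # I am doing well
```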

References


LiteLLM - Getting Started

https://docs.litellm.ai/docs/
