Unified Access to LLMs with the Open-Source Tool LiteLLM

These days, more and more vendors offer large language models, such as OpenAI, Anthropic, Google, DeepSeek, and Qwen.

There are also many open-source LLM serving tools, such as Ollama, vLLM, and llama.cpp.

This post explores LiteLLM, an open-source tool that unifies access to all of these LLMs.

The examples are adapted from material found online.

1 LiteLLM Overview

1.1 What is LiteLLM

LiteLLM is an open-source tool that acts like a "universal remote" for large language models.

LiteLLM exposes a standard API, so a single codebase can call models from over a hundred different providers, which greatly simplifies integration and management.

LiteLLM's core value is that it hides the messy provider-level differences behind a unified, capable management layer.

LiteLLM tells providers apart by a prefix in the model name, for example:

model="ollama/gemma3n:e2b" targets a model served through Ollama's native API

model="openai/gemma3n:e2b" targets a model behind an OpenAI-compatible API

1.2 Installing LiteLLM

LiteLLM can be installed quickly with pip:

pip install litellm

2 Synchronous Access Examples

These examples use LiteLLM to call a model synchronously.

2.1 Ollama Interface Example

This example calls a model through the Ollama-native interface.

Assuming the model is gemma3n:e2b, set model="ollama/gemma3n:e2b". Sample code:

from litellm import completion

response = completion(
    model="ollama/gemma3n:e2b",
    messages=[{"role": "user", "content": "Hello, how are you?"}],
    api_base="http://localhost:11434",
)
print(response)

Sample output:

ModelResponse(id='chatcmpl-cadfb4cf-c6f1-4105-8fbf-094f3d44b579', created=1766390099, model='ollama/gemma3n:e2b', object='chat.completion', system_fingerprint=None, choices=[Choices(finish_reason='stop', index=0, message=Message(content="Hello! I'm doing well, thank you for asking. As a large language model, I don't experience emotions like humans do, but I'm functioning optimally and ready to help. How are *you* doing today? 😊 \n\nIs there anything I can assist you with?\n\n\n\n", role='assistant', tool_calls=None, function_call=None, provider_specific_fields=None, reasoning_content=None))], usage=Usage(completion_tokens=62, prompt_tokens=21, total_tokens=83, completion_tokens_details=None, prompt_tokens_details=None))
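The full ModelResponse is verbose; in practice you usually only want the reply text and the token counts. A minimal helper sketch, reading the same fields (choices[0].message.content and usage.total_tokens) that appear in the dump above:

```python
def summarize(response):
    """Pull the assistant reply and total token count out of a
    LiteLLM ModelResponse (or any object with the same shape)."""
    reply = response.choices[0].message.content
    total_tokens = response.usage.total_tokens
    return reply, total_tokens
```

With the response above, summarize(response) returns the greeting text together with total_tokens (83 in this run).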

2.2 OpenAI Interface Example

This example calls a model through an OpenAI-compatible interface.

Assuming the model is gemma3n:e2b, set model="openai/gemma3n:e2b". Sample code:

from litellm import completion

response = completion(
    model="openai/gemma3n:e2b",
    messages=[{"role": "user", "content": "Hello, how are you?"}],
    api_base="http://localhost:11434/v1",
    api_key="ollama",
)
print(response)

Output:

ModelResponse(id='chatcmpl-712', created=1766390462, model='gemma3n:e2b', object='chat.completion', system_fingerprint='fp_ollama', choices=[Choices(finish_reason='stop', index=0, message=Message(content="Hello! I'm doing well, thank you for asking. As a large language model, I don't experience emotions or feelings like humans do, but I'm functioning optimally and ready to assist you. \n\nHow are *you* doing today? Is there anything I can help you with? 😊\n\n\n\n", role='assistant', tool_calls=None, function_call=None, provider_specific_fields={'refusal': None}), provider_specific_fields={})], usage=Usage(completion_tokens=65, prompt_tokens=15, total_tokens=80, completion_tokens_details=None, prompt_tokens_details=None), service_tier=None)
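Both sections target the same local Ollama server; only the model prefix and endpoint differ. A small hypothetical helper (the use_openai_compat flag and its defaults are illustrative, not part of LiteLLM) can build the completion() arguments for either route:

```python
def local_ollama_args(model_name, host="http://localhost:11434",
                      use_openai_compat=False):
    """Build keyword arguments for litellm.completion() that target a
    local Ollama model via either of the two routes shown above."""
    if use_openai_compat:
        # OpenAI-compatible route: needs the /v1 base and a dummy
        # api_key (Ollama ignores it, but the client requires one).
        return {
            "model": f"openai/{model_name}",
            "api_base": f"{host}/v1",
            "api_key": "ollama",
        }
    # Native ollama provider: just the model prefix plus the base URL.
    return {"model": f"ollama/{model_name}", "api_base": host}
```

Then completion(messages=..., **local_ollama_args("gemma3n:e2b")) reproduces section 2.1, and passing use_openai_compat=True reproduces section 2.2.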

3 Streaming Access Example

This example uses LiteLLM to stream a response from a model.

The model and other parameters are the same as in the synchronous examples; to enable streaming, set stream=True.

Below is sample code for streaming from an OpenAI-compatible model.

from litellm import completion


response = completion(
    model="openai/gemma3n:e2b",
    messages=[{"role": "user", "content": "Hello, how are you?"}],
    api_base="http://localhost:11434/v1",
    api_key="ollama",
    stream=True,
)

for item in response:
    print(item)

Output:

ModelResponseStream(id='chatcmpl-570', created=1766391486, model='gemma3n:e2b', object='chat.completion.chunk', system_fingerprint='fp_ollama', choices=[StreamingChoices(finish_reason=None, index=0, delta=Delta(provider_specific_fields=None, refusal=None, content='I', role='assistant', function_call=None, tool_calls=None, audio=None), logprobs=None)], provider_specific_fields=None, citations=None, service_tier=None)

ModelResponseStream(id='chatcmpl-570', created=1766391486, model='gemma3n:e2b', object='chat.completion.chunk', system_fingerprint='fp_ollama', choices=[StreamingChoices(finish_reason=None, index=0, delta=Delta(provider_specific_fields=None, refusal=None, content=' am', role=None, function_call=None, tool_calls=None, audio=None), logprobs=None)], provider_specific_fields=None, citations=None, service_tier=None)

... (intermediate chunks omitted; each subsequent ModelResponseStream delivers the next text fragment in choices[0].delta.content) ...

ModelResponseStream(id='chatcmpl-570', created=1766391486, model='gemma3n:e2b', object='chat.completion.chunk', system_fingerprint='fp_ollama', choices=[StreamingChoices(finish_reason='stop', index=0, delta=Delta(provider_specific_fields=None, content=None, role=None, function_call=None, tool_calls=None, audio=None), logprobs=None)], provider_specific_fields=None)
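Printing raw chunks is useful for inspection, but an application normally reassembles the reply from choices[0].delta.content of each chunk. Note that in the final "stop" chunk content is None, so it must be skipped. A minimal sketch:

```python
def collect_stream(chunks):
    """Concatenate the text fragments from a LiteLLM streaming response."""
    parts = []
    for chunk in chunks:
        delta = chunk.choices[0].delta
        if delta.content:  # the final 'stop' chunk has content=None
            parts.append(delta.content)
    return "".join(parts)
```

Passing the streaming response object to collect_stream yields the same full reply text as the non-streaming call.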

References


LiteLLM - Getting Started

https://docs.litellm.ai/docs/
