# Ollama Local Model Deployment
- [Download and Install](#download-and-install)
- [Deployment and Usage](#deployment-and-usage)
  - [Verify the installation in a terminal](#verify-the-installation-in-a-terminal)
  - [View the Ollama command reference](#view-the-ollama-command-reference)
  - [List locally available models](#list-locally-available-models)
  - [Start an interactive chat](#start-an-interactive-chat)
- [By default, Ollama also serves an HTTP API](#by-default-ollama-also-serves-an-http-api)
- [External API access](#external-api-access)
- [That's it: local deployment and API access are complete. Enjoy!](#thats-it-local-deployment-and-api-access-are-complete-enjoy)
## Download and Install

Ollama supports Windows, Linux, and macOS; this article uses the Windows installer. Download `OllamaSetup.exe` from [ollama.com/download](https://ollama.com/download) and follow the prompts to complete the installation.
## Deployment and Usage

### Verify the installation in a terminal

```bash
ollama -v
ollama version is 0.3.9
```
### View the Ollama command reference

```bash
ollama help
Large language model runner

Usage:
  ollama [flags]
  ollama [command]

Available Commands:
  serve       Start ollama
  create      Create a model from a Modelfile
  show        Show information for a model
  run         Run a model
  pull        Pull a model from a registry
  push        Push a model to a registry
  list        List models
  ps          List running models
  cp          Copy a model
  rm          Remove a model
  help        Help about any command

Flags:
  -h, --help      help for ollama
  -v, --version   Show version information

Use "ollama [command] --help" for more information about a command.
```
### List locally available models

`ollama list` shows the models that have already been pulled to this machine (use `ollama pull <model>` to fetch a new one):

```bash
ollama list
NAME             ID              SIZE    MODIFIED
llama3.1:latest  f66fc8dc39ea    4.7 GB  4 days ago
qwen2:latest     e0d4e1163c58    4.4 GB  2 months ago
llama3:latest    365c0bd3c000    4.7 GB  2 months ago
```
### Start an interactive chat

```bash
ollama run llama3.1
>>> who are you
I'm an artificial intelligence model known as Llama. Llama stands for "Large Language Model Meta AI."

>>> /help
Available Commands:
  /set            Set session variables
  /show           Show model information
  /load <model>   Load a session or model
  /save <model>   Save your current session
  /clear          Clear session context
  /bye            Exit
  /?, /help       Help for a command
  /? shortcuts    Help for keyboard shortcuts

Use """ to begin a multi-line message.

>>> /show
Available Commands:
  /show info         Show details for this model
  /show license      Show model license
  /show modelfile    Show Modelfile for this model
  /show parameters   Show parameters for this model
  /show system       Show system message
  /show template     Show prompt template

>>> /show info
  Model
    arch                llama
    parameters          8.0B
    quantization        Q4_0
    context length      131072
    embedding length    4096

  Parameters
    stop    "<|start_header_id|>"
    stop    "<|end_header_id|>"
    stop    "<|eot_id|>"

  License
    LLAMA 3.1 COMMUNITY LICENSE AGREEMENT
    Llama 3.1 Version Release Date: July 23, 2024

>>> /show template
{{- if or .System .Tools }}<|start_header_id|>system<|end_header_id|>
{{- if .System }}
{{ .System }}
{{- end }}
{{- if .Tools }}
Cutting Knowledge Date: December 2023
When you receive a tool call response, use the output to format an answer to the orginal user question.
You are a helpful assistant with tool calling capabilities.
{{- end }}
{{- end }}<|eot_id|>
{{- range $i, $_ := .Messages }}
{{- $last := eq (len (slice $.Messages $i)) 1 }}
{{- if eq .Role "user" }}<|start_header_id|>user<|end_header_id|>
{{- if and $.Tools $last }}
Given the following functions, please respond with a JSON for a function call with its proper arguments that best answers the given prompt.
Respond in the format {"name": function name, "parameters": dictionary of argument name and its value}. Do not use variables.
{{ range $.Tools }}
{{- . }}
{{ end }}
{{- end }}
{{ .Content }}<|eot_id|>{{ if $last }}<|start_header_id|>assistant<|end_header_id|>
{{ end }}
{{- else if eq .Role "assistant" }}<|start_header_id|>assistant<|end_header_id|>
{{- if .ToolCalls }}
{{- range .ToolCalls }}{"name": "{{ .Function.Name }}", "parameters": {{ .Function.Arguments }}}{{ end }}
{{- else }}
{{ .Content }}{{ if not $last }}<|eot_id|>{{ end }}
{{- end }}
{{- else if eq .Role "tool" }}<|start_header_id|>ipython<|end_header_id|>
{{ .Content }}<|eot_id|>{{ if $last }}<|start_header_id|>assistant<|end_header_id|>
{{ end }}
{{- end }}
{{- end }}

>>> what yuo want for me
I don't have personal desires or expectations from you. I exist to assist and provide information to the best of
my abilities.

However, if you'd like to engage in conversation or ask questions on a particular topic, I'm here to help!

Here are some ways we can interact:

1. **Ask me anything**: You can ask me any question on any topic.
2. **Learn something new**: I can provide information on various subjects, from science and history to
   entertainment and culture.
3. **Chat about a hobby**: If you have a particular interest or hobby, feel free to discuss it with me.
4. **Get help with a problem**: If you're stuck with a problem or need advice, I'll do my best to assist you.

What sounds interesting to you?

>>> Send a message (/? for help)
```
## By default, Ollama also serves an HTTP API

Alongside the interactive mode, the Ollama server exposes a REST API, listening by default on `http://localhost:11434`.
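As a quick check that the server is up, here is a minimal Python sketch (standard library only) that queries the `/api/tags` endpoint, which returns the same model list as `ollama list`. It assumes the default port and returns `None` gracefully when the server is not reachable:

```python
import json
import urllib.error
import urllib.request

# Default address of the local Ollama server (assumption: default port, no auth).
OLLAMA_BASE = "http://localhost:11434"

def list_local_models(base=OLLAMA_BASE):
    """Return the names of locally available models via GET /api/tags,
    or None if the server is not reachable."""
    try:
        with urllib.request.urlopen(base + "/api/tags", timeout=5) as resp:
            data = json.load(resp)
        return [m["name"] for m in data.get("models", [])]
    except (urllib.error.URLError, OSError):
        return None

if __name__ == "__main__":
    models = list_local_models()
    if models is None:
        print("Ollama server not reachable; is it running?")
    else:
        print("\n".join(models))
```

If the server is running, this prints one model name per line (e.g. `llama3.1:latest`).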
## External API access

To call the API from outside the terminal, you can use the Postman web version.
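The request you would configure in Postman can also be sketched directly in Python. This is a hedged example of a non-streaming call to the `/api/generate` endpoint; the model name and prompt are examples, and it assumes the server is running on the default port:

```python
import json
import urllib.request

# POST /api/generate on the default local port (assumption: server is running).
URL = "http://localhost:11434/api/generate"

def build_payload(model, prompt):
    """Request body for a single, non-streaming completion."""
    return {"model": model, "prompt": prompt, "stream": False}

def generate(model, prompt, url=URL):
    """Send the prompt to the local Ollama server and return the response text."""
    data = json.dumps(build_payload(model, prompt)).encode("utf-8")
    req = urllib.request.Request(
        url, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req, timeout=120) as resp:
        return json.load(resp)["response"]

if __name__ == "__main__":
    # Print the JSON body you would paste into Postman's request editor.
    print(json.dumps(build_payload("llama3.1", "who are you")))
```

With `"stream": False` the server returns one JSON object whose `response` field holds the full completion; omit it to receive a stream of partial chunks instead.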
Note that the Postman web client cannot reach a `localhost` server directly; you need to download and run the Postman desktop agent yourself.

## That's it: local deployment and API access are complete. Enjoy!