AI Learning 2: Building a Local Knowledge Base with AnythingLLM
In Part 1 I already set up the ollama + deepseek environment. The core tool needed in this part is AnythingLLM.
About AnythingLLM
Here is the description from the official site:
What is AnythingLLM?
AnythingLLM is the easiest to use, all-in-one AI application that can do RAG, AI Agents, and much more with no code or infrastructure headaches.
Why use AnythingLLM?
You want a zero-setup, private, and all-in-one AI application for local LLMs, RAG, and AI Agents all in one place without painful developer-required set up.
Learn more about AnythingLLM Desktop →
or
You need a fully-customizable, private, and all-in-one AI app for your business or organization that is basically a full ChatGPT with permissioning but with any LLM, embedding model, or vector database.
Learn more about AnythingLLM for Docker →
Installing AnythingLLM
The official docs point out that the Docker version and the Desktop version differ. The main differences are roughly as follows:
| Feature | Available on Desktop | Available on Docker |
|---|---|---|
| Multi-user support | ❌ | ✅ |
| Embeddable chat widgets | ❌ | ✅ |
| One-click install | ✅ | ❌ |
| Private documents | ✅ | ✅ |
| Connect to any vector database | ✅ | ✅ |
| Use any LLM | ✅ | ✅ |
| Built-in embedding provider | ✅ | ✅ |
| Built-in LLM provider | ✅ | ❌ |
| White-labeling | ❌ | ✅ |
| Chat logs | ✅ | ✅ |
| Agent support | ✅ | ✅ |
| Agent skills | ✅ | ✅ |
| Third-party data connectors | ✅ | ✅ |
| Password protection | ❌ | ✅ |
| Invite new users to instance | ❌ | ✅ |
| Text splitting configuration | ✅ | ✅ |
| Whisper model support | ✅ | ✅ |
| Full developer API | ✅ | ✅ |
| User management | ❌ | ✅ |
| Workspace access management | ❌ | ✅ |
| Website scraping | ✅ | ✅ |
Installing with Docker
```sh
# Assuming you want to store app data in a folder at /var/lib/anythingllm

# Pull the latest image
docker pull mintplexlabs/anythingllm:master

export STORAGE_LOCATION="/var/lib/anythingllm" && \
mkdir -p $STORAGE_LOCATION && \
touch "$STORAGE_LOCATION/.env" && \
docker run -d -p 3001:3001 \
  --cap-add SYS_ADMIN \
  -v ${STORAGE_LOCATION}:/app/server/storage \
  -v ${STORAGE_LOCATION}/.env:/app/server/.env \
  -e STORAGE_DIR="/app/server/storage" \
  mintplexlabs/anythingllm:master
# 3001 is the exposed port (any free host port works)

# Visit http://localhost:3001 to use AnythingLLM!
```
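A quick sanity check after the container starts, assuming the default port mapping above. Note that if Ollama runs on the host rather than inside Docker, the container usually needs to reach it via `host.docker.internal` (or the host's IP) instead of `localhost`.

```sh
# Confirm the container is up and the web UI responds
docker ps --filter "ancestor=mintplexlabs/anythingllm:master"
curl -I http://localhost:3001
```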
Installing on Windows
Choose the download that matches your system:
- Windows 10+ (Home, Professional - x86 64-bit)
- Windows 10+ (Home, Professional - ARM 64-bit)
Once the download finishes, just click through the installer. I chose this Windows installation route here.
Configuration
Here we need to configure the connection details for the Ollama instance set up in Part 1.

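The screenshot aside, what AnythingLLM asks for in the LLM provider settings is essentially the Ollama base URL and a model name. A quick way to confirm the values to enter, assuming Ollama is on its default port 11434 with the deepseek model pulled in Part 1:

```sh
# List the models the local Ollama instance can serve; the base URL
# (http://localhost:11434) and one of the returned model names are the
# values to enter in AnythingLLM's Ollama settings
curl http://localhost:11434/api/tags
```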
Usage
Create a workspace

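Besides the UI, workspaces can also be created through the developer API. A minimal sketch, assuming an API key generated under the instance's API Keys settings and the default port 3001; the exact endpoint path should be checked against the instance's built-in API docs, and `my-knowledge-base` is just a placeholder name:

```sh
# Create a workspace named "my-knowledge-base" via the developer API
# (MY_API_KEY is a placeholder for a key generated in the settings page)
curl -X POST http://localhost:3001/api/v1/workspace/new \
  -H "Authorization: Bearer MY_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"name": "my-knowledge-base"}'
```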
Upload documents


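Uploading can also be scripted. A rough sketch against the same developer API as above; `my-notes.pdf` is a placeholder file, and the uploaded document still has to be embedded into the workspace before it can be queried:

```sh
# Upload a local document through the developer API
# (my-notes.pdf is a placeholder; any supported file type works)
curl -X POST http://localhost:3001/api/v1/document/upload \
  -H "Authorization: Bearer MY_API_KEY" \
  -F "file=@./my-notes.pdf"
```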
Once the documents above have been uploaded successfully, you can start asking questions against them.

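Questions can likewise be sent programmatically. A hedged sketch against the same developer API; the workspace slug and the question text are placeholders, and `query` mode (as opposed to `chat`) keeps answers grounded in the embedded documents:

```sh
# Ask the workspace a question via the developer API
# (slug and question text are placeholders)
curl -X POST http://localhost:3001/api/v1/workspace/my-knowledge-base/chat \
  -H "Authorization: Bearer MY_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"message": "Summarize the uploaded document", "mode": "query"}'
```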
With that, the local knowledge base is up and running.