Notes on a failed attempt to use Ollama in place of the OpenAI API for GraphRAG

pip install ollama

pip install langchain_ollama
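
The post never shows how llm_transformer and split_documents were built. A minimal setup sketch follows, assuming a locally pulled Ollama model, a plain-text source file, and a recursive character splitter; the model name, file path, and chunk sizes are placeholders, not values from the original run.

# Sketch of the setup assumed by the failing call below.
# Model name, file path and chunk sizes are assumptions, not from the post.
from langchain_community.document_loaders import TextLoader
from langchain_text_splitters import RecursiveCharacterTextSplitter
from langchain_ollama import ChatOllama
from langchain_experimental.graph_transformers import LLMGraphTransformer

llm = ChatOllama(model="qwen2.5", temperature=0)  # any model already pulled in Ollama

documents = TextLoader("data.txt", encoding="utf-8").load()
splitter = RecursiveCharacterTextSplitter(chunk_size=500, chunk_overlap=50)
split_documents = splitter.split_documents(documents)

llm_transformer = LLMGraphTransformer(llm=llm)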

graph_documents = llm_transformer.convert_to_graph_documents(split_documents)

print(graph_documents)
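
When a run does succeed, each returned GraphDocument carries the extracted nodes and relationships, which is easier to read than printing the whole list; a small inspection sketch using the Node/Relationship attributes from langchain_experimental:

# Only meaningful on a successful run: show what was actually extracted.
for gd in graph_documents:
    print("nodes:", [(n.id, n.type) for n in gd.nodes])
    print("relationships:", [(r.source.id, r.type, r.target.id) for r in gd.relationships])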

It works occasionally, but most runs fail:

The error trace is below; I haven't found a good fix yet:

---------------------------------------------------------------------------
ValidationError                           Traceback (most recent call last)
Cell In[64], line 2
      1 # Transform documents to graph documents
----> 2 graph_documents = llm_transformer.convert_to_graph_documents(split_documents)
      3 print(graph_documents)

File D:\anaconda3\envs\graphRAG\lib\site-packages\langchain_experimental\graph_transformers\llm.py:762, in LLMGraphTransformer.convert_to_graph_documents(self, documents)
    750 def convert_to_graph_documents(
    751     self, documents: Sequence[Document]
    752 ) -> List[GraphDocument]:
    753     """Convert a sequence of documents into graph documents.
    754 
    755     Args:
   (...)
    760         Sequence[GraphDocument]: The transformed documents as graphs.
    761     """
--> 762     return [self.process_response(document) for document in documents]

File D:\anaconda3\envs\graphRAG\lib\site-packages\langchain_experimental\graph_transformers\llm.py:762, in <listcomp>(.0)
    750 def convert_to_graph_documents(
    751     self, documents: Sequence[Document]
    752 ) -> List[GraphDocument]:
    753     """Convert a sequence of documents into graph documents.
    754 
    755     Args:
   (...)
    760         Sequence[GraphDocument]: The transformed documents as graphs.
    761     """
--> 762     return [self.process_response(document) for document in documents]

File D:\anaconda3\envs\graphRAG\lib\site-packages\langchain_experimental\graph_transformers\llm.py:718, in LLMGraphTransformer.process_response(self, document)
    715     nodes_set.add((rel["tail"], rel["tail_type"]))
    717     source_node = Node(id=rel["head"], type=rel["head_type"])
--> 718     target_node = Node(id=rel["tail"], type=rel["tail_type"])
    719     relationships.append(
    720         Relationship(
    721             source=source_node, target=target_node, type=rel["relation"]
    722         )
    723     )
    724 # Create nodes list

File D:\anaconda3\envs\graphRAG\lib\site-packages\pydantic\v1\main.py:341, in BaseModel.__init__(__pydantic_self__, **data)
    339 values, fields_set, validation_error = validate_model(__pydantic_self__.__class__, data)
    340 if validation_error:
--> 341     raise validation_error
    342 try:
    343     object_setattr(__pydantic_self__, '__dict__', values)

ValidationError: 2 validation errors for Node
id
  none is not an allowed value (type=type_error.none.not_allowed)
type
  none is not an allowed value (type=type_error.none.not_allowed)
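
Reading the traceback, the failure is in process_response: the model returned at least one relation whose tail or tail_type came back as None, and the Node model refuses None for both id and type. In other words, the local Ollama model does not reliably emit the extraction schema the transformer expects, which matches the "occasionally works, mostly fails" behaviour above. Until there is a real fix, one way to keep a batch moving is to convert the chunks one at a time and skip the ones that fail validation; a minimal sketch (it works around, rather than cures, the model's output quality):

# Convert chunk by chunk so one malformed extraction doesn't kill the whole batch.
graph_documents = []
failed_chunks = []
for doc in split_documents:
    try:
        graph_documents.extend(llm_transformer.convert_to_graph_documents([doc]))
    except Exception as exc:  # the pydantic ValidationError shown above
        failed_chunks.append((doc, exc))

print(f"converted {len(graph_documents)} chunks, skipped {len(failed_chunks)}")

Other things worth trying are a larger or better instruction-tuned local model, or forcing JSON output via ChatOllama's format="json" option if your langchain_ollama version supports it; neither is verified here.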