Failure notes: using ollama in place of the openai API for graphRAG

pip install ollama

pip install langchain_ollama
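The setup between these installs and the conversion call below is not shown; roughly, it looks like the sketch that follows. Note that LLMGraphTransformer comes from langchain_experimental, which also has to be installed. The model name "qwen2.5" and the sample document are placeholders for whatever model you actually pulled in Ollama and whatever splitter output you use, not what the original run used.

# Minimal sketch of the wiring assumed by convert_to_graph_documents;
# the model name and the sample document are placeholders.
from langchain_core.documents import Document
from langchain_ollama import ChatOllama
from langchain_experimental.graph_transformers import LLMGraphTransformer

llm = ChatOllama(model="qwen2.5", temperature=0)  # any locally pulled model
llm_transformer = LLMGraphTransformer(llm=llm)

# split_documents normally comes from a text splitter; one hand-made
# Document is enough to exercise the call.
split_documents = [Document(page_content="Marie Curie won two Nobel Prizes.")]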

graph_documents = llm_transformer.convert_to_graph_documents(split_documents)

print(graph_documents)

It occasionally succeeds, but most of the time it fails:

The error is logged below; I have not found a good workaround yet:

---------------------------------------------------------------------------
ValidationError                           Traceback (most recent call last)
Cell In[64], line 2
      1 # Transform documents to graph documents
----> 2 graph_documents = llm_transformer.convert_to_graph_documents(split_documents)
      3 print(graph_documents)

File D:\anaconda3\envs\graphRAG\lib\site-packages\langchain_experimental\graph_transformers\llm.py:762, in LLMGraphTransformer.convert_to_graph_documents(self, documents)
    750 def convert_to_graph_documents(
    751     self, documents: Sequence[Document]
    752 ) -> List[GraphDocument]:
    753     """Convert a sequence of documents into graph documents.
    754 
    755     Args:
   (...)
    760         Sequence[GraphDocument]: The transformed documents as graphs.
    761     """
--> 762     return [self.process_response(document) for document in documents]

File D:\anaconda3\envs\graphRAG\lib\site-packages\langchain_experimental\graph_transformers\llm.py:762, in <listcomp>(.0)
    750 def convert_to_graph_documents(
    751     self, documents: Sequence[Document]
    752 ) -> List[GraphDocument]:
    753     """Convert a sequence of documents into graph documents.
    754 
    755     Args:
   (...)
    760         Sequence[GraphDocument]: The transformed documents as graphs.
    761     """
--> 762     return [self.process_response(document) for document in documents]

File D:\anaconda3\envs\graphRAG\lib\site-packages\langchain_experimental\graph_transformers\llm.py:718, in LLMGraphTransformer.process_response(self, document)
    715     nodes_set.add((rel["tail"], rel["tail_type"]))
    717     source_node = Node(id=rel["head"], type=rel["head_type"])
--> 718     target_node = Node(id=rel["tail"], type=rel["tail_type"])
    719     relationships.append(
    720         Relationship(
    721             source=source_node, target=target_node, type=rel["relation"]
    722         )
    723     )
    724 # Create nodes list

File D:\anaconda3\envs\graphRAG\lib\site-packages\pydantic\v1\main.py:341, in BaseModel.__init__(__pydantic_self__, **data)
    339 values, fields_set, validation_error = validate_model(__pydantic_self__.__class__, data)
    340 if validation_error:
--> 341     raise validation_error
    342 try:
    343     object_setattr(__pydantic_self__, '__dict__', values)

ValidationError: 2 validation errors for Node
id
  none is not an allowed value (type=type_error.none.not_allowed)
type
  none is not an allowed value (type=type_error.none.not_allowed)