Notes on a failed attempt to use Ollama in place of the OpenAI API for GraphRAG

pip install ollama

pip install langchain_ollama
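
The traceback further down goes through langchain_experimental's LLMGraphTransformer, so that package has to be installed as well:

pip install langchain_experimental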

graph_documents = llm_transformer.convert_to_graph_documents(split_documents)

print(graph_documents)
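
For reference, llm_transformer and split_documents were wired up along these lines. The snippet below is only a minimal sketch: the model name, sample document, and chunk sizes are assumptions I've filled in, since the original setup code isn't shown in the post.

```python
# Minimal sketch of the setup (model name, sample text, and chunk sizes are assumed).
from langchain_core.documents import Document
from langchain_experimental.graph_transformers import LLMGraphTransformer
from langchain_ollama import ChatOllama
from langchain_text_splitters import RecursiveCharacterTextSplitter

# Local model served by Ollama instead of the OpenAI API
llm = ChatOllama(model="llama3.1", temperature=0)

# Split the source documents into chunks before extraction
text_splitter = RecursiveCharacterTextSplitter(chunk_size=500, chunk_overlap=50)
split_documents = text_splitter.split_documents(
    [Document(page_content="Marie Curie won the Nobel Prize in Physics in 1903.")]
)

# LLM-driven entity/relationship extraction
llm_transformer = LLMGraphTransformer(llm=llm)
```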

This occasionally succeeds, but most of the time it fails.

The error log is reproduced below; I have not found a good solution yet:

---------------------------------------------------------------------------
ValidationError                           Traceback (most recent call last)
Cell In[64], line 2
      1 # Transform documents to graph documents
----> 2 graph_documents = llm_transformer.convert_to_graph_documents(split_documents)
      3 print(graph_documents)

File D:\anaconda3\envs\graphRAG\lib\site-packages\langchain_experimental\graph_transformers\llm.py:762, in LLMGraphTransformer.convert_to_graph_documents(self, documents)
    750 def convert_to_graph_documents(
    751     self, documents: Sequence[Document]
    752 ) -> List[GraphDocument]:
    753     """Convert a sequence of documents into graph documents.
    754 
    755     Args:
   (...)
    760         Sequence[GraphDocument]: The transformed documents as graphs.
    761     """
--> 762     return [self.process_response(document) for document in documents]

File D:\anaconda3\envs\graphRAG\lib\site-packages\langchain_experimental\graph_transformers\llm.py:762, in <listcomp>(.0)
    750 def convert_to_graph_documents(
    751     self, documents: Sequence[Document]
    752 ) -> List[GraphDocument]:
    753     """Convert a sequence of documents into graph documents.
    754 
    755     Args:
   (...)
    760         Sequence[GraphDocument]: The transformed documents as graphs.
    761     """
--> 762     return [self.process_response(document) for document in documents]

File D:\anaconda3\envs\graphRAG\lib\site-packages\langchain_experimental\graph_transformers\llm.py:718, in LLMGraphTransformer.process_response(self, document)
    715     nodes_set.add((rel["tail"], rel["tail_type"]))
    717     source_node = Node(id=rel["head"], type=rel["head_type"])
--> 718     target_node = Node(id=rel["tail"], type=rel["tail_type"])
    719     relationships.append(
    720         Relationship(
    721             source=source_node, target=target_node, type=rel["relation"]
    722         )
    723     )
    724 # Create nodes list

File D:\anaconda3\envs\graphRAG\lib\site-packages\pydantic\v1\main.py:341, in BaseModel.__init__(__pydantic_self__, **data)
    339 values, fields_set, validation_error = validate_model(__pydantic_self__.__class__, data)
    340 if validation_error:
--> 341     raise validation_error
    342 try:
    343     object_setattr(__pydantic_self__, '__dict__', values)

ValidationError: 2 validation errors for Node
id
  none is not an allowed value (type=type_error.none.not_allowed)
type
  none is not an allowed value (type=type_error.none.not_allowed)
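
The failure happens in process_response: the local model sometimes returns relation triples whose tail or tail_type is missing, so Node(id=None, type=None) fails pydantic validation. I don't have a proper fix yet; as a stopgap, a sketch like the one below (my own workaround idea, not from any documented API behavior) converts the chunks one at a time and drops the ones that raise the validation error.

```python
# Stopgap sketch (a workaround assumption, not a real fix): skip chunks whose
# extracted triples contain None fields and therefore fail Node validation.
from pydantic.v1 import ValidationError  # matches the pydantic\v1 path in the traceback

graph_documents = []
for doc in split_documents:
    try:
        graph_documents.extend(llm_transformer.convert_to_graph_documents([doc]))
    except ValidationError:
        # The model returned an incomplete relation (e.g. tail/tail_type is None); drop this chunk.
        print(f"Skipped a chunk that failed validation: {doc.page_content[:60]!r}")

print(graph_documents)
```

This loses the chunks the model handles badly, so it only papers over the problem rather than making the local model's extraction reliable.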