A failure log: using ollama in place of the openai API for graphRAG

pip install ollama

pip install langchain_ollama
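
For context, the pipeline was wired up roughly as in the sketch below. The model name llama3.1, the temperature, and the splitter parameters are placeholders for illustration, not necessarily the exact values from the failing run:

from langchain_core.documents import Document
from langchain_experimental.graph_transformers import LLMGraphTransformer
from langchain_ollama import ChatOllama
from langchain_text_splitters import RecursiveCharacterTextSplitter

# Placeholder model name: substitute whatever model has been pulled into the local Ollama instance.
llm = ChatOllama(model="llama3.1", temperature=0)

# LLMGraphTransformer prompts the LLM to extract nodes and relationships from each chunk.
llm_transformer = LLMGraphTransformer(llm=llm)

# split_documents comes from chunking the source text; chunk sizes here are placeholders.
text_splitter = RecursiveCharacterTextSplitter(chunk_size=500, chunk_overlap=50)
split_documents = text_splitter.split_documents([Document(page_content="...")])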

graph_documents = llm_transformer.convert_to_graph_documents(split_documents)

print(graph_documents)

It occasionally succeeds, but most of the time it fails:

The error output is below; I have not found a good solution yet:

---------------------------------------------------------------------------
ValidationError                           Traceback (most recent call last)
Cell In[64], line 2
      1 # Transform documents to graph documents
----> 2 graph_documents = llm_transformer.convert_to_graph_documents(split_documents)
      3 print(graph_documents)

File D:\anaconda3\envs\graphRAG\lib\site-packages\langchain_experimental\graph_transformers\llm.py:762, in LLMGraphTransformer.convert_to_graph_documents(self, documents)
    750 def convert_to_graph_documents(
    751     self, documents: Sequence[Document]
    752 ) -> List[GraphDocument]:
    753     """Convert a sequence of documents into graph documents.
    754 
    755     Args:
   (...)
    760         Sequence[GraphDocument]: The transformed documents as graphs.
    761     """
--> 762     return [self.process_response(document) for document in documents]

File D:\anaconda3\envs\graphRAG\lib\site-packages\langchain_experimental\graph_transformers\llm.py:762, in <listcomp>(.0)
    750 def convert_to_graph_documents(
    751     self, documents: Sequence[Document]
    752 ) -> List[GraphDocument]:
    753     """Convert a sequence of documents into graph documents.
    754 
    755     Args:
   (...)
    760         Sequence[GraphDocument]: The transformed documents as graphs.
    761     """
--> 762     return [self.process_response(document) for document in documents]

File D:\anaconda3\envs\graphRAG\lib\site-packages\langchain_experimental\graph_transformers\llm.py:718, in LLMGraphTransformer.process_response(self, document)
    715     nodes_set.add((rel["tail"], rel["tail_type"]))
    717     source_node = Node(id=rel["head"], type=rel["head_type"])
--> 718     target_node = Node(id=rel["tail"], type=rel["tail_type"])
    719     relationships.append(
    720         Relationship(
    721             source=source_node, target=target_node, type=rel["relation"]
    722         )
    723     )
    724 # Create nodes list

File D:\anaconda3\envs\graphRAG\lib\site-packages\pydantic\v1\main.py:341, in BaseModel.__init__(__pydantic_self__, **data)
    339 values, fields_set, validation_error = validate_model(__pydantic_self__.__class__, data)
    340 if validation_error:
--> 341     raise validation_error
    342 try:
    343     object_setattr(__pydantic_self__, '__dict__', values)

ValidationError: 2 validation errors for Node
id
  none is not an allowed value (type=type_error.none.not_allowed)
type
  none is not an allowed value (type=type_error.none.not_allowed)
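
Reading the traceback: process_response builds a Node from the relation dict the model returned, and pydantic rejects it because rel["tail"] and rel["tail_type"] are None, i.e. the local model does not always fill in the structured extraction completely, which presumably explains why it only succeeds occasionally. One possible stopgap (only a sketch, not a verified fix) is to convert one chunk at a time and skip the chunks whose output fails validation:

# The traceback goes through pydantic's v1 compatibility layer, so catch that ValidationError.
from pydantic.v1 import ValidationError

graph_documents = []
for doc in split_documents:
    try:
        # convert_to_graph_documents takes a sequence, so pass a one-element list per chunk.
        graph_documents.extend(llm_transformer.convert_to_graph_documents([doc]))
    except ValidationError:
        # The model returned a relation whose head/tail or type is None; drop this chunk.
        print(f"Skipping a chunk that failed validation: {doc.page_content[:50]!r}")

print(graph_documents)

This only works around the symptom: the offending chunks are dropped rather than repaired, so part of the graph is still lost.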