Integrating the Baichuan LLM into ChuanhuChatGPT

Setup steps:

  1. Copy the model locally: place the downloaded Baichuan2-7B-Chat folder under the models directory, as sketched below.
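     A minimal sketch of this copy step, assuming the weights were downloaded to ~/Baichuan2-7B-Chat (adjust src to the real download location). The directory name under models/ must match the model name the client is later constructed with.

     ```python
     import shutil
     from pathlib import Path

     src = Path.home() / "Baichuan2-7B-Chat"      # assumed download location
     dst = Path("models") / "Baichuan2-7B-Chat"   # Baichuan_Client looks for models/<model_name>

     dst.parent.mkdir(exist_ok=True)
     shutil.copytree(src, dst, dirs_exist_ok=True)
     print(f"copied {src} -> {dst}")
     ```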

  2. Modify modules\models\base_model.py: add Baichuan to class ModelType.

     ```python
     # (a) In class ModelType, add a new enum member:
     Baichuan = 16

     # (b) In the model-name-to-type mapping, add a branch:
     elif "baichuan" in model_name_lower:
         model_type = ModelType.Baichuan
     ```
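     For orientation, the sketch below shows roughly where those two fragments land in modules\models\base_model.py. The surrounding enum members, values, and the get_type classmethod are paraphrased assumptions about the existing code, not verbatim copies; only the two Baichuan lines come from this article.

     ```python
     from enum import Enum

     class ModelType(Enum):
         Unknown = -1
         OpenAI = 0
         # ... existing members ...
         Baichuan = 16  # new member for the Baichuan integration

         @classmethod
         def get_type(cls, model_name: str):
             model_type = None
             model_name_lower = model_name.lower()
             if "gpt" in model_name_lower:
                 model_type = ModelType.OpenAI
             # ... existing branches ...
             elif "baichuan" in model_name_lower:  # new branch: route Baichuan* model names
                 model_type = ModelType.Baichuan
             else:
                 model_type = ModelType.Unknown
             return model_type
     ```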

  3. Modify modules\models\models.py: add a ModelType.Baichuan branch to the get_model method.

     ```python
     elif model_type == ModelType.Baichuan:
         from .Baichuan import Baichuan_Client
         model = Baichuan_Client(model_name, user_name=user_name)
     ```
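     As a quick, assumption-labelled sanity check that the name now routes to the new branch (it relies on ModelType exposing a get_type classmethod containing the model_name_lower comparison from step 2):

     ```python
     from modules.models.base_model import ModelType

     # "Baichuan2-7B-Chat".lower() contains "baichuan", so it should map to the new member
     assert ModelType.get_type("Baichuan2-7B-Chat") == ModelType.Baichuan
     ```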

  4. Add a new file modules\models\Baichuan.py:

     ```python
     from modelscope import snapshot_download  # weights download; model/tokenizer classes come from transformers
     from transformers import AutoModelForCausalLM, AutoTokenizer
     from transformers.generation import GenerationConfig
     import logging
     import colorama
     import os
     from datetime import datetime

     from ..index_func import *
     from ..presets import *
     from ..utils import *
     from ..presets import MODEL_METADATA
     from .base_model import BaseLLMModel

     # Module-level cache so the weights are only loaded once per process
     CHATGLM_TOKENIZER = None
     CHATGLM_MODEL = None


     class Baichuan_Client(BaseLLMModel):
         def __init__(self, model_name, user_name="") -> None:
             super().__init__(model_name=model_name, user=user_name)
             import torch

             global CHATGLM_TOKENIZER, CHATGLM_MODEL
             print("__init__ Baichuan_Client")
             if CHATGLM_TOKENIZER is None or CHATGLM_MODEL is None:
                 # Prefer a local copy under models/; otherwise pull from ModelScope
                 model_path = None
                 if os.path.exists("models"):
                     model_dirs = os.listdir("models")
                     if model_name in model_dirs:
                         model_path = f"models/{model_name}"
                 if model_path is not None:
                     model_source = model_path
                 else:
                     model_source = snapshot_download(f"baichuan-inc/{model_name}", revision="v1.0.4")

                 CHATGLM_TOKENIZER = AutoTokenizer.from_pretrained(
                     model_source, device_map="auto", trust_remote_code=True, torch_dtype=torch.float16
                 )
                 quantified = False
                 if "int4" in model_name:
                     quantified = True  # flag is recorded but not used further below
                 model = AutoModelForCausalLM.from_pretrained(
                     model_source, device_map="auto", trust_remote_code=True, torch_dtype=torch.float16
                 )
                 model.generation_config = GenerationConfig.from_pretrained(model_source)
                 model = model.eval()
                 CHATGLM_MODEL = model

         def _get_glm_style_input(self):
             print("_get_glm_style_input")
             print(f"the history is: {self.history}")
             # Use the latest user message as the query; earlier turns stay in `history`
             history = [x["content"] for x in self.history]
             query = history.pop()
             print(f"the message is: {query}")
             return history, query

         def get_answer_at_once(self):
             print("get_answer_at_once")
             history, query = self._get_glm_style_input()
             messages = [{"role": "user", "content": query}]
             now = datetime.now()
             print("get_answer_at_once start" + "++++++++" + now.strftime("%Y-%m-%d %H:%M:%S"))
             response = CHATGLM_MODEL.chat(CHATGLM_TOKENIZER, messages)
             now = datetime.now()
             print("get_answer_at_once end" + "++++++++" + now.strftime("%Y-%m-%d %H:%M:%S"))
             print(f"the response is: {response}")
             return response, len(response)

         def get_answer_stream_iter(self):
             history, query = self._get_glm_style_input()
             messages = [{"role": "user", "content": query}]
             result = ""
             now = datetime.now()
             print("get_answer_stream_iter start" + "++++++++" + now.strftime("%Y-%m-%d %H:%M:%S"))
             # Iterate the chat output and yield the progressively accumulated text
             for response in CHATGLM_MODEL.chat(CHATGLM_TOKENIZER, messages):
                 print(f"the response is: {response}")
                 result += response
                 yield result
             now = datetime.now()
             print("get_answer_stream_iter end" + "++++++++" + now.strftime("%Y-%m-%d %H:%M:%S"))
     ```
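     Before wiring the UI, a standalone smoke test that mirrors what Baichuan_Client.__init__ and get_answer_at_once do can confirm the local weights load and answer. Run it from the repository root; the prompt is an arbitrary example.

     ```python
     import torch
     from transformers import AutoModelForCausalLM, AutoTokenizer
     from transformers.generation import GenerationConfig

     model_source = "models/Baichuan2-7B-Chat"  # the local copy from step 1

     tokenizer = AutoTokenizer.from_pretrained(
         model_source, device_map="auto", trust_remote_code=True, torch_dtype=torch.float16
     )
     model = AutoModelForCausalLM.from_pretrained(
         model_source, device_map="auto", trust_remote_code=True, torch_dtype=torch.float16
     )
     model.generation_config = GenerationConfig.from_pretrained(model_source)
     model = model.eval()

     # Baichuan2's custom chat() (available via trust_remote_code) takes the tokenizer and a message list
     messages = [{"role": "user", "content": "你好"}]
     print(model.chat(tokenizer, messages))
     ```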

  5. The answer callback (streaming) switch controls which method is called: get_answer_at_once for a single full reply, or get_answer_stream_iter for streaming output; see the sketch below.
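     The sketch below illustrates what that switch amounts to. It is not ChuanhuChatGPT's actual predict() code, just the shape of the decision, using the two method signatures defined in Baichuan.py.

     ```python
     def answer(client, stream: bool) -> str:
         """Return the model's reply, streaming partial text when `stream` is True."""
         if stream:
             result = ""
             # get_answer_stream_iter yields the accumulated reply after each chunk
             for result in client.get_answer_stream_iter():
                 print(result, end="\r")
             return result
         # get_answer_at_once returns (reply, length) in one shot
         response, total = client.get_answer_at_once()
         return response
     ```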

  6. Run the app and verify the result.
