Integrating the Baichuan Large Model into ChuanhuChatGPT

Setup steps:

  1. Copy the model locally: place the downloaded Baichuan2-7B-Chat folder under the models directory (a quick sanity check is sketched below).
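
To confirm the copy landed where the loader will look for it, a small check like this can help. It assumes the folder name under models exactly matches the model name selected in the UI, since the loader in step 4 resolves local models as models/<model_name>:

```python
# Optional sanity check (illustrative): the loader in step 4 resolves local
# models as models/<model_name>, so the folder name must match exactly.
import os

model_name = "Baichuan2-7B-Chat"
model_path = os.path.join("models", model_name)
if os.path.isdir(model_path):
    print("found local model:", sorted(os.listdir(model_path))[:5], "...")
else:
    print("model folder missing:", model_path)
```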

  2. Edit modules\models\base_model.py: add Baichuan to class ModelType, plus the matching branch in the model-name detection.

```python
# In class ModelType, register the new member:
    Baichuan = 16

# In the model-name detection logic of the same file, add the branch:
elif "baichuan" in model_name_lower:
    model_type = ModelType.Baichuan
```
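
For orientation, both fragments live in base_model.py: the member goes into the ModelType enum, and the elif into the classmethod that maps a model name to a ModelType (get_type in recent ChuanhuChatGPT versions). Below is a rough, runnable sketch of that structure; pre-existing members and branches are elided and their values are placeholders, with only the Baichuan lines coming from this guide:

```python
# Rough sketch only: pre-existing members/branches are elided; values other
# than the new Baichuan = 16 entry are placeholders, not the upstream code.
from enum import Enum


class ModelType(Enum):
    Unknown = -1          # placeholder for the existing fallback member
    Baichuan = 16         # new member added in this step

    @classmethod
    def get_type(cls, model_name: str):
        model_name_lower = model_name.lower()
        # The real method checks many model families; only the new branch is shown.
        if "baichuan" in model_name_lower:
            return ModelType.Baichuan
        return ModelType.Unknown


print(ModelType.get_type("Baichuan2-7B-Chat"))  # -> ModelType.Baichuan
```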

  3. Edit modules\models\models.py: add a ModelType.Baichuan branch to the get_model method.

```python
elif model_type == ModelType.Baichuan:
    from .Baichuan import Baichuan_Client
    model = Baichuan_Client(model_name, user_name=user_name)
```
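
Depending on the ChuanhuChatGPT version, the model name may also need to be listed in modules/presets.py before it appears in the model dropdown. The snippet below is purely hypothetical; the variable name and format vary between versions (MODELS, LOCAL_MODELS, MODEL_METADATA, ...), so check your local presets.py rather than copying it verbatim:

```python
# Hypothetical illustration only -- the actual list name/format depends on the
# ChuanhuChatGPT version; see modules/presets.py in your checkout.
LOCAL_MODELS = [
    # ... existing local model names ...
    "Baichuan2-7B-Chat",
]
```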

  4. Add a new file modules\models\Baichuan.py:

```python
from modelscope import snapshot_download
from transformers import AutoModelForCausalLM, AutoTokenizer
from transformers.generation import GenerationConfig
import logging
import colorama
import os
import torch
from datetime import datetime

from ..index_func import *
from ..presets import *
from ..utils import *
from ..presets import MODEL_METADATA
from .base_model import BaseLLMModel

# Shared across instances so the weights are only loaded once
# (variable names kept from the ChatGLM client this was adapted from).
CHATGLM_TOKENIZER = None
CHATGLM_MODEL = None


class Baichuan_Client(BaseLLMModel):
    def __init__(self, model_name, user_name="") -> None:
        super().__init__(model_name=model_name, user=user_name)
        global CHATGLM_TOKENIZER, CHATGLM_MODEL
        print("__init__ Baichuan_Client")
        if CHATGLM_TOKENIZER is None or CHATGLM_MODEL is None:
            # Prefer a local copy under models/<model_name>; otherwise pull
            # the weights from ModelScope.
            model_path = None
            if os.path.exists("models"):
                model_dirs = os.listdir("models")
                if model_name in model_dirs:
                    model_path = f"models/{model_name}"
            if model_path is not None:
                model_source = model_path
            else:
                model_source = snapshot_download(
                    f"baichuan-inc/{model_name}", revision="v1.0.4")
            CHATGLM_TOKENIZER = AutoTokenizer.from_pretrained(
                model_source, device_map="auto", trust_remote_code=True,
                torch_dtype=torch.float16)
            quantified = "int4" in model_name  # noted but not used below
            model = AutoModelForCausalLM.from_pretrained(
                model_source, device_map="auto", trust_remote_code=True,
                torch_dtype=torch.float16)
            model.generation_config = GenerationConfig.from_pretrained(model_source)
            model = model.eval()
            CHATGLM_MODEL = model

    def _get_glm_style_input(self):
        # Flatten the chat history to message contents and split off the
        # latest user query.
        print("_get_glm_style_input")
        print(f"the history is: {self.history}")
        history = [x["content"] for x in self.history]
        query = history.pop()
        print(f"the message is: {query}")
        return history, query

    def get_answer_at_once(self):
        # One-shot answer: return the full response and its length.
        print("get_answer_at_once")
        history, query = self._get_glm_style_input()
        messages = [{"role": "user", "content": query}]
        now = datetime.now()
        print("get_answer_at_once start" + "++++++++" + now.strftime("%Y-%m-%d %H:%M:%S"))
        response = CHATGLM_MODEL.chat(CHATGLM_TOKENIZER, messages)
        now = datetime.now()
        print("get_answer_at_once end" + "++++++++" + now.strftime("%Y-%m-%d %H:%M:%S"))
        print(f"the response is: {response}")
        return response, len(response)

    def get_answer_stream_iter(self):
        # "Streaming" answer: chat() is called without stream=True, so the full
        # reply is generated first and then replayed character by character to
        # drive the UI's typewriter effect.
        history, query = self._get_glm_style_input()
        messages = [{"role": "user", "content": query}]
        result = ""
        now = datetime.now()
        print("get_answer_stream_iter start" + "++++++++" + now.strftime("%Y-%m-%d %H:%M:%S"))
        for response in CHATGLM_MODEL.chat(CHATGLM_TOKENIZER, messages):
            print(f"the response is: {response}")
            result += response
            yield result
        now = datetime.now()
        print("get_answer_stream_iter end" + "++++++++" + now.strftime("%Y-%m-%d %H:%M:%S"))
```
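
Before wiring the client into the UI, it can help to smoke-test model loading and the chat call on their own. This is a minimal sketch, assuming a local copy at models/Baichuan2-7B-Chat and a CUDA GPU with enough memory for fp16; it mirrors the loading and chat path of Baichuan_Client without the ChuanhuChatGPT plumbing:

```python
# Standalone smoke test (assumes models/Baichuan2-7B-Chat exists locally and a
# CUDA GPU is available); mirrors the loading/chat path of Baichuan_Client.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from transformers.generation import GenerationConfig

model_source = "models/Baichuan2-7B-Chat"
tokenizer = AutoTokenizer.from_pretrained(model_source, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_source, device_map="auto", trust_remote_code=True,
    torch_dtype=torch.float16).eval()
model.generation_config = GenerationConfig.from_pretrained(model_source)

messages = [{"role": "user", "content": "你好,介绍一下你自己"}]
print(model.chat(tokenizer, messages))
```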

  5. The answer-callback streaming switch decides which method is invoked: get_answer_at_once for one-shot replies or get_answer_stream_iter for streamed replies (a simplified sketch follows).
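
A simplified illustration of that dispatch is shown below; the flag and function names here are illustrative, not the actual ChuanhuChatGPT callback code:

```python
# Illustrative only: how a streaming flag selects between the two entry points.
def answer(client, stream: bool):
    if stream:
        # Each yielded value is the partial answer accumulated so far.
        for partial in client.get_answer_stream_iter():
            print(partial, end="\r")
    else:
        # A single call returns the full answer plus its length.
        response, length = client.get_answer_at_once()
        print(response)
```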

  6. Execution result: run the app and verify the Baichuan model responds.
