Integrating the Baichuan LLM into ChuanhuChatGPT

Setup steps:

  1. Copy the model locally: place the downloaded Baichuan2-7B-Chat folder under the models directory.
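
A quick optional check like the sketch below can confirm the files landed where the loader in step 4 looks for them (the file names are assumptions based on a typical Baichuan2-7B-Chat download):

```python
# Optional sanity check -- the expected file names are assumptions based on a
# typical Baichuan2-7B-Chat download, not a requirement of ChuanhuChatGPT.
import os

model_dir = os.path.join("models", "Baichuan2-7B-Chat")
print(model_dir, "exists:", os.path.isdir(model_dir))
for name in ("config.json", "tokenizer_config.json", "generation_config.json"):
    print(name, "found:", os.path.isfile(os.path.join(model_dir, name)))
```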

  2. Edit modules\models\base_model.py: add a Baichuan member to class ModelType, plus the matching branch in the model-name lookup.

```python
# modules\models\base_model.py
# (a) inside class ModelType, add the new enum member:
Baichuan = 16

# (b) in the model-name matching chain, add the corresponding branch:
elif "baichuan" in model_name_lower:
    model_type = ModelType.Baichuan
```
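
For orientation, here is a minimal sketch of where those two fragments sit in base_model.py; the surrounding enum members, the classmethod name, and the other branches are illustrative placeholders rather than the file's exact contents:

```python
# Illustrative sketch only -- neighbouring members and branches are placeholders.
from enum import Enum

class ModelType(Enum):
    Unknown = -1
    OpenAI = 0
    # ... existing members ...
    Baichuan = 16                              # new member from step 2

    @classmethod
    def get_type(cls, model_name: str):
        model_type = None
        model_name_lower = model_name.lower()
        if "gpt" in model_name_lower:
            model_type = ModelType.OpenAI
        # ... existing branches ...
        elif "baichuan" in model_name_lower:   # new branch from step 2
            model_type = ModelType.Baichuan
        else:
            model_type = ModelType.Unknown
        return model_type
```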

  3. Edit modules\models\models.py: add a ModelType.Baichuan branch to the get_model method.

```python
# modules\models\models.py, inside get_model(): add the new branch
elif model_type == ModelType.Baichuan:
    from .Baichuan import Baichuan_Client
    model = Baichuan_Client(model_name, user_name=user_name)
```
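
Purely for context, a stripped-down sketch of the dispatch this branch lives in (the signature and neighbouring branches are illustrative, not the full get_model from models.py):

```python
# Illustrative dispatch sketch -- not the real get_model() signature.
def get_model(model_name, user_name=""):
    model_type = ModelType.get_type(model_name)
    model = None
    if model_type == ModelType.OpenAI:
        ...                                        # existing branches
    elif model_type == ModelType.Baichuan:         # new branch from step 3
        from .Baichuan import Baichuan_Client
        model = Baichuan_Client(model_name, user_name=user_name)
    return model
```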

  4. Add a new file modules\models\Baichuan.py:

```python
# modules\models\Baichuan.py
import os
import logging
from datetime import datetime

import colorama
import torch
from modelscope import snapshot_download
from transformers import AutoModelForCausalLM, AutoTokenizer
from transformers.generation import GenerationConfig

from ..index_func import *
from ..presets import *
from ..utils import *
from ..presets import MODEL_METADATA
from .base_model import BaseLLMModel

# Module-level cache so the weights are loaded only once per process.
CHATGLM_TOKENIZER = None
CHATGLM_MODEL = None


class Baichuan_Client(BaseLLMModel):
    def __init__(self, model_name, user_name="") -> None:
        super().__init__(model_name=model_name, user=user_name)
        global CHATGLM_TOKENIZER, CHATGLM_MODEL
        print("__init__ Baichuan_Client")
        if CHATGLM_TOKENIZER is None or CHATGLM_MODEL is None:
            # Prefer a local copy under models/; otherwise pull from ModelScope.
            model_path = None
            if os.path.exists("models"):
                model_dirs = os.listdir("models")
                if model_name in model_dirs:
                    model_path = f"models/{model_name}"
            if model_path is not None:
                model_source = model_path
            else:
                model_source = snapshot_download(
                    f"baichuan-inc/{model_name}", revision="v1.0.4"
                )
            CHATGLM_TOKENIZER = AutoTokenizer.from_pretrained(
                model_source, device_map="auto", trust_remote_code=True, torch_dtype=torch.float16
            )
            quantified = "int4" in model_name  # set but not used further in this snippet
            model = AutoModelForCausalLM.from_pretrained(
                model_source, device_map="auto", trust_remote_code=True, torch_dtype=torch.float16
            )
            model.generation_config = GenerationConfig.from_pretrained(model_source)
            model = model.eval()
            CHATGLM_MODEL = model

    def _get_glm_style_input(self):
        # Split the conversation history into (previous turns, latest user query).
        print("_get_glm_style_input")
        print(f"the history is: {self.history}")
        history = [x["content"] for x in self.history]
        query = history.pop()
        print(f"the message is: {query}")
        return history, query

    def get_answer_at_once(self):
        # Non-streaming path: only the latest query is forwarded to the model.
        print("get_answer_at_once")
        history, query = self._get_glm_style_input()
        messages = [{"role": "user", "content": query}]
        now = datetime.now()
        print("get_answer_at_once start" + "++++++++" + now.strftime("%Y-%m-%d %H:%M:%S"))
        response = CHATGLM_MODEL.chat(CHATGLM_TOKENIZER, messages)
        now = datetime.now()
        print("get_answer_at_once end" + "++++++++" + now.strftime("%Y-%m-%d %H:%M:%S"))
        print(f"the response is: {response}")
        return response, len(response)

    def get_answer_stream_iter(self):
        # Streaming path: surface partial output to the UI as it accumulates.
        history, query = self._get_glm_style_input()
        messages = [{"role": "user", "content": query}]
        result = ""
        now = datetime.now()
        print("get_answer_stream_iter start" + "++++++++" + now.strftime("%Y-%m-%d %H:%M:%S"))
        for response in CHATGLM_MODEL.chat(CHATGLM_TOKENIZER, messages):
            print(f"the response is: {response}")
            result += response
            yield result
        now = datetime.now()
        print("get_answer_stream_iter end" + "++++++++" + now.strftime("%Y-%m-%d %H:%M:%S"))
```
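
Before wiring the client into the UI, it can help to exercise the same load-and-chat flow from a throwaway script; the sketch below assumes the local models/Baichuan2-7B-Chat copy from step 1 and an arbitrary prompt:

```python
# Standalone smoke test of the same load-and-chat flow used in Baichuan.py
# (hypothetical script; the local path and the prompt are assumptions).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from transformers.generation import GenerationConfig

model_source = "models/Baichuan2-7B-Chat"   # or a snapshot_download() result

tokenizer = AutoTokenizer.from_pretrained(model_source, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_source, device_map="auto", trust_remote_code=True, torch_dtype=torch.float16
)
model.generation_config = GenerationConfig.from_pretrained(model_source)
model = model.eval()

messages = [{"role": "user", "content": "你好,请简单介绍一下你自己。"}]
print(model.chat(tokenizer, messages))
```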

  5. The answer callback (streaming) switch controls whether get_answer_at_once or get_answer_stream_iter is called; a rough sketch of that dispatch follows.
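
As a mental model only (the names below are illustrative, not the actual BaseLLMModel implementation), the switch amounts to something like this:

```python
# Hedged sketch of the idea behind step 5 -- illustrative names, not the
# actual BaseLLMModel code.
def answer(client, question, stream=False):
    client.history.append({"role": "user", "content": question})
    if stream:
        # incremental updates for the chat UI
        yield from client.get_answer_stream_iter()
    else:
        response, _token_count = client.get_answer_at_once()
        yield response
```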

  6. Run it and check the result.
