VisionTransformer KeyError on Windows

  • Error:

```
KeyError: 'Transformer/encoderblock_0/MlpBlock_3/Dense_0kernel is not a file in the archive'
```
  • Fix:

Modify the function below. The root cause is that Linux and Windows use different path separators: `pjoin` (i.e. `os.path.join`) joins the key segments with `\` on Windows, but the pretrained `.npz` archive stores its keys with `/`, so the lookup fails. Concatenating the segments with an explicit `'/'` works on both platforms.
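The mismatch can be reproduced without a Windows machine by using `ntpath`, the Windows flavor of `os.path` (a minimal sketch; the key below is the one from the error message):

```python
import ntpath      # os.path resolves to this module on Windows
import posixpath   # os.path resolves to this module on Linux

ROOT = "Transformer/encoderblock_0"

# What pjoin produces on Windows -- backslash separators:
win_key = ntpath.join(ROOT, "MlpBlock_3/Dense_0", "kernel")
print(win_key)   # Transformer/encoderblock_0\MlpBlock_3/Dense_0\kernel

# What the .npz archive actually stores -- forward slashes:
npz_key = posixpath.join(ROOT, "MlpBlock_3/Dense_0", "kernel")
print(npz_key)   # Transformer/encoderblock_0/MlpBlock_3/Dense_0/kernel
```

Since `win_key` never equals any key in the archive, NumPy's `NpzFile` raises the `KeyError` shown above.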

```python
    def load_from(self, weights, n_block):
        ROOT = f"Transformer/encoderblock_{n_block}"
        with torch.no_grad():
            # Build archive keys with an explicit '/' instead of the original
            # pjoin (os.path.join), which joins with '\' on Windows and then
            # fails to match the keys stored in the .npz archive.
            query_weight = np2th(weights[ROOT + '/' + ATTENTION_Q + "/kernel"]).view(self.hidden_size, self.hidden_size).t()
            key_weight = np2th(weights[ROOT + '/' + ATTENTION_K + "/kernel"]).view(self.hidden_size, self.hidden_size).t()
            value_weight = np2th(weights[ROOT + '/' + ATTENTION_V + "/kernel"]).view(self.hidden_size, self.hidden_size).t()
            out_weight = np2th(weights[ROOT + '/' + ATTENTION_OUT + "/kernel"]).view(self.hidden_size, self.hidden_size).t()

            query_bias = np2th(weights[ROOT + '/' + ATTENTION_Q + "/bias"]).view(-1)
            key_bias = np2th(weights[ROOT + '/' + ATTENTION_K + "/bias"]).view(-1)
            value_bias = np2th(weights[ROOT + '/' + ATTENTION_V + "/bias"]).view(-1)
            out_bias = np2th(weights[ROOT + '/' + ATTENTION_OUT + "/bias"]).view(-1)

            self.attn.query.weight.copy_(query_weight)
            self.attn.key.weight.copy_(key_weight)
            self.attn.value.weight.copy_(value_weight)
            self.attn.out.weight.copy_(out_weight)
            self.attn.query.bias.copy_(query_bias)
            self.attn.key.bias.copy_(key_bias)
            self.attn.value.bias.copy_(value_bias)
            self.attn.out.bias.copy_(out_bias)

            mlp_weight_0 = np2th(weights[ROOT + '/' + FC_0 + "/kernel"]).t()
            mlp_weight_1 = np2th(weights[ROOT + '/' + FC_1 + "/kernel"]).t()
            mlp_bias_0 = np2th(weights[ROOT + '/' + FC_0 + "/bias"]).t()
            mlp_bias_1 = np2th(weights[ROOT + '/' + FC_1 + "/bias"]).t()

            self.ffn.fc1.weight.copy_(mlp_weight_0)
            self.ffn.fc2.weight.copy_(mlp_weight_1)
            self.ffn.fc1.bias.copy_(mlp_bias_0)
            self.ffn.fc2.bias.copy_(mlp_bias_1)

            self.attention_norm.weight.copy_(np2th(weights[ROOT + '/' + ATTENTION_NORM + "/scale"]))
            self.attention_norm.bias.copy_(np2th(weights[ROOT + '/' + ATTENTION_NORM + "/bias"]))
            self.ffn_norm.weight.copy_(np2th(weights[ROOT + '/' + MLP_NORM + "/scale"]))
            self.ffn_norm.bias.copy_(np2th(weights[ROOT + '/' + MLP_NORM + "/bias"]))
```
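An alternative that avoids editing every lookup is to redefine `pjoin` itself at the top of the file. This is a hypothetical sketch, assuming the original code does `from os.path import join as pjoin`; replacing that import with an explicit `'/'` join makes every existing `pjoin(...)` call platform-independent:

```python
# Hypothetical one-spot fix: instead of `from os.path import join as pjoin`,
# define pjoin to always join with '/', matching the .npz archive keys on
# both Windows and Linux.
def pjoin(*parts):
    return "/".join(parts)

key = pjoin("Transformer/encoderblock_0", "MlpBlock_3/Dense_0", "kernel")
print(key)  # Transformer/encoderblock_0/MlpBlock_3/Dense_0/kernel
```

With this in place, the original `load_from` body (the commented-out `pjoin` lines) can be kept unchanged.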