Understanding the Transformer

Regarding the Transformer: the roles of Q, K, and V suggest that it behaves more like a learnable query system. Perhaps earlier search-engine algorithms were related to this, or some branch of search algorithms works in a similar spirit.


The explanation below is quoted from: Can anyone help me to understand this image? - #2 by J_Johnson - nlp - PyTorch Forums

Embeddings - these are learnable weights where each token (a token could be a word, a sentence piece, a subword, a character, etc.) is converted into a vector of, say, 500 trainable values.
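
A rough sketch of that lookup in PyTorch (the vocabulary size and embedding dimension below are placeholder values, not taken from the post):

```python
import torch
import torch.nn as nn

vocab_size, d_model = 10_000, 512               # placeholder sizes for illustration
embedding = nn.Embedding(vocab_size, d_model)   # a learnable lookup table of vectors

token_ids = torch.tensor([[5, 42, 7]])          # (batch=1, seq_len=3) integer token ids
vectors = embedding(token_ids)                  # (1, 3, 512) trainable float vectors
```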

Positional Encoding - for each token, we want to tell the model where it sits in the sequence, order-wise, because linear layers on their own do not handle sequential information. So we pass this in manually by adding a vector of sine and cosine values (at different frequencies across the embedding dimensions) to each embedding vector.
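
A minimal sketch of the sinusoidal encoding, assuming the usual even/odd split of dimensions (sizes are placeholders):

```python
import math
import torch

def sinusoidal_positional_encoding(seq_len: int, d_model: int) -> torch.Tensor:
    # Each pair of dimensions gets a sine/cosine wave at a different frequency.
    position = torch.arange(seq_len, dtype=torch.float).unsqueeze(1)        # (seq_len, 1)
    div_term = torch.exp(torch.arange(0, d_model, 2, dtype=torch.float)
                         * (-math.log(10000.0) / d_model))                  # (d_model/2,)
    pe = torch.zeros(seq_len, d_model)
    pe[:, 0::2] = torch.sin(position * div_term)   # even dimensions
    pe[:, 1::2] = torch.cos(position * div_term)   # odd dimensions
    return pe

# It is added to the embeddings, not concatenated:
# x = embedding(token_ids) + sinusoidal_positional_encoding(3, 512)
```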

This sequence of vectors goes through an attention layer, which is basically a learnable, differentiable version of a database search, with queries, keys and values. In this case, we are "searching" for the most likely next token.
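
A minimal sketch of that "lookup" as scaled dot-product attention (generic code, not the exact implementation behind the diagram):

```python
import math
import torch
import torch.nn.functional as F

def scaled_dot_product_attention(q, k, v, mask=None):
    # softmax(Q K^T / sqrt(d_k)) V: compare each query against every key,
    # then take a weighted average of the values.
    d_k = q.size(-1)
    scores = q @ k.transpose(-2, -1) / math.sqrt(d_k)
    if mask is not None:
        scores = scores.masked_fill(~mask, float("-inf"))   # hide disallowed positions
    weights = F.softmax(scores, dim=-1)
    return weights @ v

q = k = v = torch.randn(1, 3, 64)             # (batch, seq_len, head_dim) toy tensors
out = scaled_dot_product_attention(q, k, v)   # (1, 3, 64)
```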

The feed-forward block is just basic linear layers (with a nonlinearity in between), applied to each embedding in the sequence separately (i.e. it operates on a 3-dim tensor instead of a 2-dim one).
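
Sketched in PyTorch (the hidden size d_ff=2048 is the default from the original paper, used here only as an example):

```python
import torch
import torch.nn as nn

d_model, d_ff = 512, 2048   # example sizes

# nn.Linear acts on the last dimension, so a (batch, seq_len, d_model) tensor
# is transformed position by position, independently for each token.
feed_forward = nn.Sequential(
    nn.Linear(d_model, d_ff),
    nn.ReLU(),
    nn.Linear(d_ff, d_model),
)

x = torch.randn(1, 3, d_model)
y = feed_forward(x)          # still (1, 3, d_model)
```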

Then the final linear layer is where we get our predicted next token: it outputs a vector of raw scores (logits) over the vocabulary, to which we apply a softmax so the values fall between 0 and 1 and sum to 1, i.e. a probability for each possible token.
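
A sketch of that final projection, with placeholder sizes:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

d_model, vocab_size = 512, 10_000          # placeholder sizes
to_logits = nn.Linear(d_model, vocab_size)

hidden = torch.randn(1, 3, d_model)        # decoder output for a 3-token sequence
logits = to_logits(hidden)                 # (1, 3, vocab_size) raw scores
probs = F.softmax(logits, dim=-1)          # each row now sums to 1
next_token = probs[:, -1].argmax(dim=-1)   # greedy choice of the predicted next token
```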

There are two sides in that diagram because, when it was developed, the architecture was used for language translation. Generative language models doing next-token prediction use only the Transformer decoder, not the encoder.
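
For a decoder-only model, the key ingredient is the causal (look-ahead) mask, which lets each position attend only to itself and earlier tokens. A generic sketch, reusing the attention function above:

```python
import torch

seq_len = 5
# True = allowed to attend. Lower-triangular: position i sees positions 0..i only.
causal_mask = torch.tril(torch.ones(seq_len, seq_len)).bool()

# e.g. scaled_dot_product_attention(q, k, v, mask=causal_mask)
```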

Here is a PyTorch tutorial that may help you walk through how it works.

Language Modeling with nn.Transformer and torchtext --- PyTorch Tutorials 2.0.1+cu117 documentation

