LLaMA: Open and Efficient Foundation Language Models

This paper is inspired by the Chinchilla scaling law, which found that for a fixed compute budget, the best performance is achieved not by the largest models, but by smaller models trained on more data. Accordingly, the paper proposes a collection of models ranging from 7B to 65B parameters, and these smaller models outperform much larger ones.

1. Architecture

It is based on the original transformer architecture and leverages several improvements proposed in subsequent large language models. The main changes are:

  • Pre-normalization: the input of each transformer sub-layer is normalized (with RMSNorm), instead of normalizing the output.
  • The SwiGLU activation function, instead of ReLU.
  • Rotary positional embeddings (RoPE), instead of absolute positional embeddings.
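To make these three changes concrete, here is a minimal pure-Python sketch of each idea; the function names and the toy list-based tensors are my own illustration, not the paper's code, and a real implementation would of course operate on batched tensors with learned weights.

```python
import math

def rms_norm(x, weight, eps=1e-6):
    """Pre-normalization: normalize the *input* of a sub-layer by its RMS
    (this is RMSNorm), rather than normalizing the sub-layer's output."""
    rms = math.sqrt(sum(v * v for v in x) / len(x) + eps)
    return [w * v / rms for w, v in zip(weight, x)]

def silu(v):
    """SiLU (swish): v * sigmoid(v), the gate nonlinearity inside SwiGLU."""
    return v / (1.0 + math.exp(-v))

def swiglu(x, w_gate, w_up):
    """SwiGLU feed-forward: silu(x @ W_gate) elementwise-multiplied by
    (x @ W_up), replacing the plain ReLU(x @ W) of the original transformer.
    w_gate / w_up are given as lists of weight columns."""
    gate = [silu(sum(xi * wi for xi, wi in zip(x, col))) for col in w_gate]
    up = [sum(xi * wi for xi, wi in zip(x, col)) for col in w_up]
    return [g * u for g, u in zip(gate, up)]

def rope_rotate(pair, position, dim_index=0, total_dims=2, theta=10000.0):
    """Rotary embedding: instead of *adding* an absolute position vector,
    rotate each (even, odd) feature pair by a position-dependent angle,
    so that query/key dot products depend on relative position."""
    angle = position * theta ** (-2.0 * dim_index / total_dims)
    c, s = math.cos(angle), math.sin(angle)
    x, y = pair
    return (x * c - y * s, x * s + y * c)
```

At position 0 the rotation angle is 0, so `rope_rotate((1.0, 0.0), 0)` returns the pair unchanged; at later positions the pair is rotated further, which is what encodes order.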

2. Efficient Implementation

  • An efficient implementation of causal multi-head attention, which avoids storing the attention weights and skips computing the key/query scores that are masked out by causality. I still need to explore the logic behind this further.
  • Reduce the amount of activations that are recomputed during the backward pass, by checkpointing the activations that are expensive to compute.
  • This is achieved by manually implementing the backward function for the transformer layers, instead of relying on PyTorch autograd.
  • Use model and sequence parallelism to reduce memory usage.
  • Overlap the computation of activations with the communication between GPUs as much as possible.
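For the first point above, a toy sketch may help show what "not computing the masked scores" means: for the query at position i, only keys 0..i are scored, so scores for future positions are never computed at all (rather than computed and then masked), and the full attention matrix is never materialized. This single-head, list-based version is my own illustration under those assumptions, not the paper's implementation.

```python
import math

def causal_attention(queries, keys, values):
    """Single-head causal attention over toy per-position vectors.
    For query position i, only scores against keys 0..i are computed;
    future (masked) positions are skipped entirely, and no full
    attention matrix is ever stored."""
    dim = len(queries[0])
    scale = 1.0 / math.sqrt(dim)
    outputs = []
    for i, q in enumerate(queries):
        # Only positions j <= i: masked scores are never computed.
        scores = [scale * sum(a * b for a, b in zip(q, keys[j]))
                  for j in range(i + 1)]
        # Numerically stable softmax over the visible positions.
        m = max(scores)
        exps = [math.exp(s - m) for s in scores]
        z = sum(exps)
        weights = [e / z for e in exps]
        out = [sum(w * values[j][d] for j, w in enumerate(weights))
               for d in range(dim)]
        outputs.append(out)
    return outputs
```

With uniform scores, position 0 attends only to itself while position 1 averages the first two values, which makes the causal structure easy to verify by hand.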