Dive into Deep Learning (PyTorch Edition): 5.3 Deferred Initialization

```python
import torch
from torch import nn
from d2l import torch as d2l
```

The multilayer perceptron instantiated below has an unknown input dimension, so the framework has not yet initialized any parameters; they display as `<UninitializedParameter>`.

```python
net = nn.Sequential(nn.LazyLinear(256), nn.ReLU(), nn.LazyLinear(10))

net[0].weight
```

```
c:\Software\Miniconda3\envs\d2l\lib\site-packages\torch\nn\modules\lazy.py:178: UserWarning: Lazy modules are a new feature under heavy development so changes to the API or functionality can happen at any moment.
  warnings.warn('Lazy modules are a new feature under heavy development '

<UninitializedParameter>
```
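Before the first forward pass, this lazy state can also be inspected programmatically. A minimal sketch, assuming `torch.nn.parameter.is_lazy` is available (added in recent PyTorch releases):

```python
import torch
from torch import nn

net = nn.Sequential(nn.LazyLinear(256), nn.ReLU(), nn.LazyLinear(10))

# Before any forward pass the weight is still an UninitializedParameter.
print(torch.nn.parameter.is_lazy(net[0].weight))  # True

# Its shape cannot be queried yet; accessing it raises a RuntimeError.
try:
    _ = net[0].weight.shape
except RuntimeError as err:
    print("not yet materialized:", err)
```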

Once the input dimension is known, the framework can initialize the layers one after another.

```python
X = torch.rand(2, 20)
net(X)

net[0].weight.shape
```

```
torch.Size([256, 20])
```
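After the forward pass has materialized the shapes, each `LazyLinear` behaves like an ordinary `nn.Linear`, so standard initialization routines can be applied. A sketch (the `init_normal` helper is illustrative, not from the book):

```python
import torch
from torch import nn

net = nn.Sequential(nn.LazyLinear(256), nn.ReLU(), nn.LazyLinear(10))
X = torch.rand(2, 20)
net(X)  # the first forward pass materializes all parameters

# Custom initializer, applicable only after materialization.
def init_normal(module):
    if isinstance(module, nn.Linear):
        nn.init.normal_(module.weight, mean=0.0, std=0.01)
        nn.init.zeros_(module.bias)

net.apply(init_normal)
print(net[0].weight.shape, net[2].weight.shape)
```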

Exercises

(1) What happens if you specify the input dimension of the first layer but not of subsequent layers? Is initialization performed immediately?

```python
net = nn.Sequential(
    nn.Linear(20, 256), nn.ReLU(),
    nn.LazyLinear(128), nn.ReLU(),
    nn.LazyLinear(10)
)
net[0].weight, net[2].weight, net[4].weight
```

```
c:\Software\Miniconda3\envs\d2l\lib\site-packages\torch\nn\modules\lazy.py:178: UserWarning: Lazy modules are a new feature under heavy development so changes to the API or functionality can happen at any moment.
  warnings.warn('Lazy modules are a new feature under heavy development '

(Parameter containing:
 tensor([[ 0.1332,  0.1372, -0.0939,  ..., -0.0579, -0.0911, -0.1820],
         [-0.1570, -0.0993, -0.0685,  ..., -0.0469, -0.0208,  0.0665],
         [ 0.0861,  0.1135,  0.1631,  ..., -0.1407,  0.1088, -0.2052],
         ...,
         [-0.1454, -0.0283, -0.1074,  ..., -0.2164, -0.2169,  0.1913],
         [-0.1617,  0.1206, -0.2119,  ..., -0.1862, -0.0951,  0.1535],
         [-0.0229, -0.2133, -0.1027,  ...,  0.1973,  0.1314,  0.1283]],
        requires_grad=True),
 <UninitializedParameter>,
 <UninitializedParameter>)
```
```python
net(X)  # deferred initialization of the remaining layers
net[0].weight.shape, net[2].weight.shape, net[4].weight.shape
```

```
(torch.Size([256, 20]), torch.Size([128, 256]), torch.Size([10, 128]))
```

(2) What happens if you specify mismatched dimensions?

```python
X = torch.rand(2, 10)
net(X)  # raises a RuntimeError: the input width no longer matches the network
```

```
---------------------------------------------------------------------------
RuntimeError                              Traceback (most recent call last)

Cell In[14], line 2
      1 X = torch.rand(2, 10)
----> 2 net(X)

File c:\Software\Miniconda3\envs\d2l\lib\site-packages\torch\nn\modules\module.py:1130, in Module._call_impl(self, *input, **kwargs)
   1126 # If we don't have any hooks, we want to skip the rest of the logic in
   1127 # this function, and just call forward.
   1128 if not (self._backward_hooks or self._forward_hooks or self._forward_pre_hooks or _global_backward_hooks
   1129         or _global_forward_hooks or _global_forward_pre_hooks):
-> 1130     return forward_call(*input, **kwargs)
   1131 # Do not call functions when jit is used
   1132 full_backward_hooks, non_full_backward_hooks = [], []

File c:\Software\Miniconda3\envs\d2l\lib\site-packages\torch\nn\modules\container.py:139, in Sequential.forward(self, input)
    137 def forward(self, input):
    138     for module in self:
--> 139         input = module(input)
    140     return input

File c:\Software\Miniconda3\envs\d2l\lib\site-packages\torch\nn\modules\module.py:1130, in Module._call_impl(self, *input, **kwargs)
   1126 # If we don't have any hooks, we want to skip the rest of the logic in
   1127 # this function, and just call forward.
   1128 if not (self._backward_hooks or self._forward_hooks or self._forward_pre_hooks or _global_backward_hooks
   1129         or _global_forward_hooks or _global_forward_pre_hooks):
-> 1130     return forward_call(*input, **kwargs)
   1131 # Do not call functions when jit is used
   1132 full_backward_hooks, non_full_backward_hooks = [], []

File c:\Software\Miniconda3\envs\d2l\lib\site-packages\torch\nn\modules\linear.py:114, in Linear.forward(self, input)
    113 def forward(self, input: Tensor) -> Tensor:
--> 114     return F.linear(input, self.weight, self.bias)

RuntimeError: mat1 and mat2 shapes cannot be multiplied (2x10 and 20x256)
```
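The failure is easy to diagnose: once a lazy layer has been materialized, its `in_features` is frozen, and every later input must match it. A small sketch:

```python
import torch
from torch import nn

net = nn.Sequential(nn.LazyLinear(256), nn.ReLU(), nn.LazyLinear(10))
net(torch.rand(2, 20))     # materializes the first layer with in_features=20
print(net[0].in_features)  # 20

try:
    net(torch.rand(2, 10))  # mismatched input width
except RuntimeError as err:
    print("error:", err)
```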

(3) What do you need to do if the inputs have varying dimensions?

Bring all inputs to a common width before the first layer: either pad shorter inputs up to the target dimension, or reduce longer ones (for example by truncation or projection).
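One simple convention is to zero-pad short inputs and truncate long ones to a fixed width; the `to_fixed_width` helper below is hypothetical, not from the book:

```python
import torch
import torch.nn.functional as F

def to_fixed_width(x, target_dim):
    """Zero-pad short inputs on the right; truncate long ones."""
    d = x.shape[-1]
    if d < target_dim:
        return F.pad(x, (0, target_dim - d))  # pad the last dimension
    return x[..., :target_dim]                # drop extra features

short = torch.rand(2, 10)
long = torch.rand(2, 30)
print(to_fixed_width(short, 20).shape)  # torch.Size([2, 20])
print(to_fixed_width(long, 20).shape)   # torch.Size([2, 20])
```

With inputs normalized this way, the already-initialized network from exercise (2) accepts them without shape errors.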
