Understanding High-Dimensional Tensors in PyTorch

Create a tensor:

```python
import torch

tensor = torch.rand(3, 5, 3, 2)
```

The output looks like this:

```python
tensor([[[[0.3844, 0.9532],
          [0.0787, 0.4187],
          [0.4144, 0.9552]],

         [[0.0713, 0.5281],
          [0.0230, 0.8433],
          [0.1113, 0.5927]],

         [[0.0040, 0.1001],
          [0.3837, 0.6088],
          [0.1752, 0.3184]],

         [[0.2762, 0.8417],
          [0.5438, 0.4406],
          [0.0529, 0.5175]],

         [[0.1038, 0.7948],
          [0.4991, 0.5155],
          [0.4651, 0.8095]]],


        [[[0.0377, 0.0249],
          [0.2440, 0.8501],
          [0.1176, 0.7303]],

         [[0.9979, 0.6738],
          [0.2486, 0.4152],
          [0.5896, 0.8879]],

         [[0.3499, 0.6918],
          [0.4399, 0.5192],
          [0.1783, 0.5962]],

         [[0.3021, 0.4297],
          [0.9558, 0.0046],
          [0.9994, 0.1249]],

         [[0.8348, 0.7249],
          [0.1525, 0.3867],
          [0.8992, 0.6996]]],


        [[[0.5918, 0.9135],
          [0.8205, 0.5719],
          [0.8127, 0.3856]],

         [[0.1870, 0.6190],
          [0.2991, 0.9424],
          [0.5405, 0.4200]],

         [[0.9396, 0.8072],
          [0.0319, 0.6586],
          [0.4849, 0.6193]],

         [[0.5268, 0.2794],
          [0.7877, 0.9502],
          [0.6553, 0.9574]],

         [[0.4079, 0.4648],
          [0.6375, 0.8829],
          [0.6280, 0.1463]]]])
```
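A quick way to confirm how the four dimensions are laid out is to inspect the tensor's shape attributes (a minimal sketch; since `torch.rand` is random, the values will differ from the printout above, but the shapes will not):

```python
import torch

tensor = torch.rand(3, 5, 3, 2)

# The shape lists the sizes of the four dimensions in order:
# 3 blocks, each containing 5 matrices of 3 rows and 2 columns.
print(tensor.shape)    # torch.Size([3, 5, 3, 2])
print(tensor.ndim)     # 4

# Total number of elements: 3 * 5 * 3 * 2 = 90
print(tensor.numel())  # 90
```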

Now suppose we want to retrieve

```python
tensor[0, 0, 0, 0]
```

Index 0 along the first dimension selects:

```python
[[[0.3844, 0.9532],
  [0.0787, 0.4187],
  [0.4144, 0.9552]],

 [[0.0713, 0.5281],
  [0.0230, 0.8433],
  [0.1113, 0.5927]],

 [[0.0040, 0.1001],
  [0.3837, 0.6088],
  [0.1752, 0.3184]],

 [[0.2762, 0.8417],
  [0.5438, 0.4406],
  [0.0529, 0.5175]],

 [[0.1038, 0.7948],
  [0.4991, 0.5155],
  [0.4651, 0.8095]]]
```

Within that block, index 0 along the second dimension selects:

```python
[[0.3844, 0.9532],
 [0.0787, 0.4187],
 [0.4144, 0.9552]]
```

Index 0 along the third dimension selects:

```python
[0.3844, 0.9532]
```

Finally, index 0 along the fourth dimension selects:

```python
0.3844
```
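Note that indexing with one integer per dimension yields a zero-dimensional tensor, not a plain Python number; `.item()` extracts the underlying value (a small sketch, random values):

```python
import torch

tensor = torch.rand(3, 5, 3, 2)

scalar = tensor[0, 0, 0, 0]
print(scalar.shape)    # torch.Size([]) -- a 0-d tensor

value = scalar.item()  # convert to a plain Python float
print(type(value))     # <class 'float'>
```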

Other cases

```python
tensor[-1]
```

This selects the last element along the first dimension:

```python
[[[0.5918, 0.9135],
  [0.8205, 0.5719],
  [0.8127, 0.3856]],

 [[0.1870, 0.6190],
  [0.2991, 0.9424],
  [0.5405, 0.4200]],

 [[0.9396, 0.8072],
  [0.0319, 0.6586],
  [0.4849, 0.6193]],

 [[0.5268, 0.2794],
  [0.7877, 0.9502],
  [0.6553, 0.9574]],

 [[0.4079, 0.4648],
  [0.6375, 0.8829],
  [0.6280, 0.1463]]]
```
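Negative indices count from the end, so for a first dimension of size 3, `tensor[-1]` is the same block as `tensor[2]` (a sketch with random values):

```python
import torch

tensor = torch.rand(3, 5, 3, 2)

# -1 counts from the end of the first dimension (size 3),
# so tensor[-1] and tensor[2] refer to the same 5x3x2 block.
assert torch.equal(tensor[-1], tensor[2])
print(tensor[-1].shape)  # torch.Size([5, 3, 2])
```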

```python
tensor[0, 1]
```

First, index 0 along the first dimension selects:

```python
[[[0.3844, 0.9532],
  [0.0787, 0.4187],
  [0.4144, 0.9552]],

 [[0.0713, 0.5281],
  [0.0230, 0.8433],
  [0.1113, 0.5927]],

 [[0.0040, 0.1001],
  [0.3837, 0.6088],
  [0.1752, 0.3184]],

 [[0.2762, 0.8417],
  [0.5438, 0.4406],
  [0.0529, 0.5175]],

 [[0.1038, 0.7948],
  [0.4991, 0.5155],
  [0.4651, 0.8095]]]
```

Then index 1 along the second dimension selects:

```python
[[0.0713, 0.5281],
 [0.0230, 0.8433],
 [0.1113, 0.5927]]
```
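Comma-separated indices apply one index per dimension, from left to right, and each integer index removes the dimension it indexes. `tensor[0, 1]` is therefore equivalent to the chained form `tensor[0][1]` (a sketch, random values):

```python
import torch

tensor = torch.rand(3, 5, 3, 2)

# tensor[0, 1] and tensor[0][1] select the same 3x2 matrix:
# dims 0 and 1 are consumed, leaving shape (3, 2).
assert torch.equal(tensor[0, 1], tensor[0][1])
print(tensor[0, 1].shape)  # torch.Size([3, 2])
```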

```python
tensor[:, 1, 0, 1]
```

The colon (`:`) keeps all elements along the first dimension:

```python
[[[0.3844, 0.9532],
  [0.0787, 0.4187],
  [0.4144, 0.9552]],

 [[0.0713, 0.5281],
  [0.0230, 0.8433],
  [0.1113, 0.5927]],

 [[0.0040, 0.1001],
  [0.3837, 0.6088],
  [0.1752, 0.3184]],

 [[0.2762, 0.8417],
  [0.5438, 0.4406],
  [0.0529, 0.5175]],

 [[0.1038, 0.7948],
  [0.4991, 0.5155],
  [0.4651, 0.8095]]],


[[[0.0377, 0.0249],
  [0.2440, 0.8501],
  [0.1176, 0.7303]],

 [[0.9979, 0.6738],
  [0.2486, 0.4152],
  [0.5896, 0.8879]],

 [[0.3499, 0.6918],
  [0.4399, 0.5192],
  [0.1783, 0.5962]],

 [[0.3021, 0.4297],
  [0.9558, 0.0046],
  [0.9994, 0.1249]],

 [[0.8348, 0.7249],
  [0.1525, 0.3867],
  [0.8992, 0.6996]]],


[[[0.5918, 0.9135],
  [0.8205, 0.5719],
  [0.8127, 0.3856]],

 [[0.1870, 0.6190],
  [0.2991, 0.9424],
  [0.5405, 0.4200]],

 [[0.9396, 0.8072],
  [0.0319, 0.6586],
  [0.4849, 0.6193]],

 [[0.5268, 0.2794],
  [0.7877, 0.9502],
  [0.6553, 0.9574]],

 [[0.4079, 0.4648],
  [0.6375, 0.8829],
  [0.6280, 0.1463]]]
```

Then index 1 along the second dimension, taken from each of the three blocks, selects:

```python
[[0.0713, 0.5281],
 [0.0230, 0.8433],
 [0.1113, 0.5927]]

[[0.9979, 0.6738],
 [0.2486, 0.4152],
 [0.5896, 0.8879]]

[[0.1870, 0.6190],
 [0.2991, 0.9424],
 [0.5405, 0.4200]]
```

Index 0 along the third dimension selects:

```python
[0.0713, 0.5281]
[0.9979, 0.6738]
[0.1870, 0.6190]
```

Finally, index 1 along the fourth dimension selects:

```python
0.5281
0.6738
0.6190
```

Final result:

```python
tensor([0.5281, 0.6738, 0.6190])
```
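The sliced result can be reproduced by collecting element `[1, 0, 1]` from each of the three blocks along the first dimension (a sketch; the random values differ, but the equivalence holds):

```python
import torch

tensor = torch.rand(3, 5, 3, 2)

sliced = tensor[:, 1, 0, 1]
print(sliced.shape)  # torch.Size([3]) -- one value per first-dim block

# Equivalent to stacking the scalar picked from each of the 3 blocks.
manual = torch.stack([tensor[i, 1, 0, 1] for i in range(3)])
assert torch.equal(sliced, manual)
```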