Understanding High-Dimensional Tensors in PyTorch


Create a tensor:

```python
import torch

tensor = torch.rand(3,5,3,2)  # a 4-D tensor with shape (3, 5, 3, 2)
```

The result is as follows:

```python
tensor([[[[0.3844, 0.9532],
          [0.0787, 0.4187],
          [0.4144, 0.9552]],

         [[0.0713, 0.5281],
          [0.0230, 0.8433],
          [0.1113, 0.5927]],

         [[0.0040, 0.1001],
          [0.3837, 0.6088],
          [0.1752, 0.3184]],

         [[0.2762, 0.8417],
          [0.5438, 0.4406],
          [0.0529, 0.5175]],

         [[0.1038, 0.7948],
          [0.4991, 0.5155],
          [0.4651, 0.8095]]],


        [[[0.0377, 0.0249],
          [0.2440, 0.8501],
          [0.1176, 0.7303]],

         [[0.9979, 0.6738],
          [0.2486, 0.4152],
          [0.5896, 0.8879]],

         [[0.3499, 0.6918],
          [0.4399, 0.5192],
          [0.1783, 0.5962]],

         [[0.3021, 0.4297],
          [0.9558, 0.0046],
          [0.9994, 0.1249]],

         [[0.8348, 0.7249],
          [0.1525, 0.3867],
          [0.8992, 0.6996]]],


        [[[0.5918, 0.9135],
          [0.8205, 0.5719],
          [0.8127, 0.3856]],

         [[0.1870, 0.6190],
          [0.2991, 0.9424],
          [0.5405, 0.4200]],

         [[0.9396, 0.8072],
          [0.0319, 0.6586],
          [0.4849, 0.6193]],

         [[0.5268, 0.2794],
          [0.7877, 0.9502],
          [0.6553, 0.9574]],

         [[0.4079, 0.4648],
          [0.6375, 0.8829],
          [0.6280, 0.1463]]]])
```
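Each level of nested brackets in this printout corresponds to one dimension. As a quick sanity check, the following sketch reuses the `tensor` created above (the actual values will differ on every run, since `torch.rand` is random) to show how indexing peels dimensions off one at a time:

```python
print(tensor.shape)           # torch.Size([3, 5, 3, 2])
print(tensor.ndim)            # 4 -- one dimension per level of nested brackets
print(tensor[0].shape)        # torch.Size([5, 3, 2]) -- first dimension indexed away
print(tensor[0, 0].shape)     # torch.Size([3, 2])
print(tensor[0, 0, 0].shape)  # torch.Size([2])
print(tensor[0, 0, 0, 0])     # a 0-d tensor (a single scalar)
```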

Now suppose I want to get:

```python
tensor[0,0,0,0]
```

Conceptually this works one dimension at a time. First, take element 0 along the first dimension:

```python
		[[[0.3844, 0.9532],
          [0.0787, 0.4187],
          [0.4144, 0.9552]],

         [[0.0713, 0.5281],
          [0.0230, 0.8433],
          [0.1113, 0.5927]],

         [[0.0040, 0.1001],
          [0.3837, 0.6088],
          [0.1752, 0.3184]],

         [[0.2762, 0.8417],
          [0.5438, 0.4406],
          [0.0529, 0.5175]],

         [[0.1038, 0.7948],
          [0.4991, 0.5155],
          [0.4651, 0.8095]]]
```

Then take element 0 along the second dimension:

```python
[[0.3844, 0.9532],
 [0.0787, 0.4187],
 [0.4144, 0.9552]]
```

Then element 0 along the third dimension:

```python
[0.3844, 0.9532]
```

Finally, element 0 along the fourth dimension:

```python
0.3844
```
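Indexing with a tuple of integers is equivalent to chaining single-dimension indexing, so the step-by-step walkthrough above produces the same element that `tensor[0,0,0,0]` returns in one shot. A minimal check, reusing the same `tensor`:

```python
a = tensor[0, 0, 0, 0]    # index all four dimensions at once
b = tensor[0][0][0][0]    # index one dimension at a time
print(torch.equal(a, b))  # True
print(a.item())           # plain Python float, e.g. 0.3844 for the values shown above
```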

Other cases

```python
tensor[-1]
```

This takes the last element along the first dimension:

```python
		[[[0.5918, 0.9135],
          [0.8205, 0.5719],
          [0.8127, 0.3856]],

         [[0.1870, 0.6190],
          [0.2991, 0.9424],
          [0.5405, 0.4200]],

         [[0.9396, 0.8072],
          [0.0319, 0.6586],
          [0.4849, 0.6193]],

         [[0.5268, 0.2794],
          [0.7877, 0.9502],
          [0.6553, 0.9574]],

         [[0.4079, 0.4648],
          [0.6375, 0.8829],
          [0.6280, 0.1463]]]
```
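Negative indices count from the end of a dimension, just as in plain Python lists, so here `tensor[-1]` is the same sub-tensor as `tensor[2]` (the first dimension has size 3). A minimal check, reusing the same `tensor`:

```python
print(torch.equal(tensor[-1], tensor[2]))  # True: last element along dim 0
print(tensor[-1].shape)                    # torch.Size([5, 3, 2])
```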

```python
tensor[0,1]
```

First, take element 0 along the first dimension:

```python
		[[[0.3844, 0.9532],
          [0.0787, 0.4187],
          [0.4144, 0.9552]],

         [[0.0713, 0.5281],
          [0.0230, 0.8433],
          [0.1113, 0.5927]],

         [[0.0040, 0.1001],
          [0.3837, 0.6088],
          [0.1752, 0.3184]],

         [[0.2762, 0.8417],
          [0.5438, 0.4406],
          [0.0529, 0.5175]],

         [[0.1038, 0.7948],
          [0.4991, 0.5155],
          [0.4651, 0.8095]]]
```

Then take element 1 along the second dimension:

```python
[[0.0713, 0.5281],
 [0.0230, 0.8433],
 [0.1113, 0.5927]]
```
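With two integer indices, the first two dimensions are consumed and a 3x2 sub-tensor remains. A quick check, reusing the same `tensor`:

```python
sub = tensor[0, 1]
print(sub.shape)                       # torch.Size([3, 2])
print(torch.equal(sub, tensor[0][1]))  # True: same as indexing one dim at a time
```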

```python
tensor[:,1,0,1]
```

The `:` keeps all elements along the first dimension:

```python
		[[[0.3844, 0.9532],
          [0.0787, 0.4187],
          [0.4144, 0.9552]],

         [[0.0713, 0.5281],
          [0.0230, 0.8433],
          [0.1113, 0.5927]],

         [[0.0040, 0.1001],
          [0.3837, 0.6088],
          [0.1752, 0.3184]],

         [[0.2762, 0.8417],
          [0.5438, 0.4406],
          [0.0529, 0.5175]],

         [[0.1038, 0.7948],
          [0.4991, 0.5155],
          [0.4651, 0.8095]]],


        [[[0.0377, 0.0249],
          [0.2440, 0.8501],
          [0.1176, 0.7303]],

         [[0.9979, 0.6738],
          [0.2486, 0.4152],
          [0.5896, 0.8879]],

         [[0.3499, 0.6918],
          [0.4399, 0.5192],
          [0.1783, 0.5962]],

         [[0.3021, 0.4297],
          [0.9558, 0.0046],
          [0.9994, 0.1249]],

         [[0.8348, 0.7249],
          [0.1525, 0.3867],
          [0.8992, 0.6996]]],


        [[[0.5918, 0.9135],
          [0.8205, 0.5719],
          [0.8127, 0.3856]],

         [[0.1870, 0.6190],
          [0.2991, 0.9424],
          [0.5405, 0.4200]],

         [[0.9396, 0.8072],
          [0.0319, 0.6586],
          [0.4849, 0.6193]],

         [[0.5268, 0.2794],
          [0.7877, 0.9502],
          [0.6553, 0.9574]],

         [[0.4079, 0.4648],
          [0.6375, 0.8829],
          [0.6280, 0.1463]]]
```

Then, within each of those, take element 1 along the second dimension:

```python
 		[[0.0713, 0.5281],
          [0.0230, 0.8433],
          [0.1113, 0.5927]]

		[[0.9979, 0.6738],
          [0.2486, 0.4152],
          [0.5896, 0.8879]]

		[[0.1870, 0.6190],
          [0.2991, 0.9424],
          [0.5405, 0.4200]]
```

Then element 0 along the third dimension:

```python
[0.0713, 0.5281]
[0.9979, 0.6738]
[0.1870, 0.6190]
```

Finally, element 1 along the fourth dimension:

```python
0.5281
0.6738
0.6190
```

The final result:

```python
tensor([0.5281, 0.6738, 0.6190])
```
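Because the first position uses the slice `:`, that dimension is kept rather than indexed away: for each of the 3 elements along the first dimension, PyTorch picks element 1 along the second dimension, element 0 along the third, and element 1 along the fourth, giving a 1-D tensor of length 3. A minimal sketch of this equivalence, reusing the same `tensor`:

```python
result = tensor[:, 1, 0, 1]
print(result.shape)  # torch.Size([3])

# The same values, gathered one outer element at a time:
manual = torch.stack([tensor[i, 1, 0, 1] for i in range(tensor.shape[0])])
print(torch.equal(result, manual))  # True
```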