PyTorch – torch.stack Parameters Explained with Examples
Author: StubbornHuang
Copyright notice: This is an original article by the site owner. If you repost it, please cite the original link!
Original title: Pytorch – torch.stack参数详解与使用
Original link: https://www.stubbornhuang.com/2217/
Published: 2022-07-27 09:02:05
Modified: 2022-07-27 09:02:05

1 torch.stack Parameters Explained with Examples
1.1 torch.stack
1. Function signature
torch.stack(tensors, dim=0, *, out=None) → Tensor
2. What it does
Stacks a sequence of tensors along a new dimension; all tensors in the sequence must have the same shape. Note that unlike torch.cat, which joins tensors along an existing dimension, torch.stack inserts a new dimension.
3. Parameters
- tensors: the sequence of tensors to stack
- dim: int, the dimension at which the new axis is inserted; must be between 0 and the number of dimensions of the input tensors (inclusive)
4. Return value
Returns the stacked tensor
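As a quick check of the same-shape requirement and the inserted dimension, here is a minimal sketch (the tensor names are illustrative, not from the article):

```python
import torch

a = torch.zeros(2, 3)
b = torch.zeros(2, 3)
c = torch.zeros(2, 4)  # different shape from a and b

# Same-shape tensors stack fine; a new dimension of size 2 appears at dim=0
print(torch.stack([a, b]).shape)  # torch.Size([2, 2, 3])

# Tensors with mismatched shapes cannot be stacked
try:
    torch.stack([a, c])
except RuntimeError as e:
    print('RuntimeError:', e)
```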
1.2 Using torch.stack
1.2.1 Stacking a sequence of 1-D tensors
import torch

if __name__ == '__main__':
    tensor1 = torch.tensor([1, 2, 3, 4])
    tensor2 = torch.tensor([5, 6, 7, 8])
    tensor3 = torch.tensor([9, 10, 11, 12])
    tensor4 = torch.tensor([13, 14, 15, 16])

    output0 = torch.stack([tensor1, tensor2, tensor3, tensor4], dim=0)
    output1 = torch.stack([tensor1, tensor2, tensor3, tensor4], dim=1)

    print('torch.stack dim=0:{0},{1}'.format(output0, output0.shape))
    print('torch.stack dim=1:{0},{1}'.format(output1, output1.shape))
Output
torch.stack dim=0:tensor([[ 1, 2, 3, 4],
[ 5, 6, 7, 8],
[ 9, 10, 11, 12],
[13, 14, 15, 16]]),torch.Size([4, 4])
torch.stack dim=1:tensor([[ 1, 5, 9, 13],
[ 2, 6, 10, 14],
[ 3, 7, 11, 15],
[ 4, 8, 12, 16]]),torch.Size([4, 4])
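The dim=1 result above can also be understood through an equivalence: torch.stack(ts, dim=d) behaves like unsqueezing each tensor at d and then concatenating with torch.cat. A small sketch of that relationship:

```python
import torch

tensor1 = torch.tensor([1, 2, 3, 4])
tensor2 = torch.tensor([5, 6, 7, 8])

stacked = torch.stack([tensor1, tensor2], dim=1)
# Equivalent: insert a new size-1 dimension at dim=1, then concatenate along it
catted = torch.cat([tensor1.unsqueeze(1), tensor2.unsqueeze(1)], dim=1)

print(torch.equal(stacked, catted))  # True
```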
1.2.2 Stacking a sequence of 2-D tensors
import torch

if __name__ == '__main__':
    tensor1 = torch.tensor([1, 2, 3, 4]).view(2, 2)
    tensor2 = torch.tensor([5, 6, 7, 8]).view(2, 2)
    tensor3 = torch.tensor([9, 10, 11, 12]).view(2, 2)
    tensor4 = torch.tensor([13, 14, 15, 16]).view(2, 2)

    output0 = torch.stack([tensor1, tensor2, tensor3, tensor4], dim=0)
    output1 = torch.stack([tensor1, tensor2, tensor3, tensor4], dim=1)
    output2 = torch.stack([tensor1, tensor2, tensor3, tensor4], dim=2)

    print('torch.stack dim=0:{0},{1}'.format(output0, output0.shape))
    print('torch.stack dim=1:{0},{1}'.format(output1, output1.shape))
    print('torch.stack dim=2:{0},{1}'.format(output2, output2.shape))
Output
torch.stack dim=0:tensor([[[ 1, 2],
[ 3, 4]],
[[ 5, 6],
[ 7, 8]],
[[ 9, 10],
[11, 12]],
[[13, 14],
[15, 16]]]),torch.Size([4, 2, 2])
torch.stack dim=1:tensor([[[ 1, 2],
[ 5, 6],
[ 9, 10],
[13, 14]],
[[ 3, 4],
[ 7, 8],
[11, 12],
[15, 16]]]),torch.Size([2, 4, 2])
torch.stack dim=2:tensor([[[ 1, 5, 9, 13],
[ 2, 6, 10, 14]],
[[ 3, 7, 11, 15],
[ 4, 8, 12, 16]]]),torch.Size([2, 2, 4])
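dim can also be given as a negative index, counted from the end of the result's shape. For the 2-D inputs above the result is 3-D, so dim=-1 is the same as dim=2 (a small sketch; the tensor names are illustrative):

```python
import torch

t1 = torch.tensor([1, 2, 3, 4]).view(2, 2)
t2 = torch.tensor([5, 6, 7, 8]).view(2, 2)

# Negative dims index into the *result*: a stack of 2-D tensors is 3-D,
# so dim=-1 refers to the last of its three dimensions, i.e. dim=2
print(torch.equal(torch.stack([t1, t2], dim=-1),
                  torch.stack([t1, t2], dim=2)))  # True
```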
1.2.3 Stacking a sequence of 3-D tensors
import torch

if __name__ == '__main__':
    tensor1 = torch.arange(0, 8).view(2, 2, 2)
    tensor2 = torch.arange(8, 16).view(2, 2, 2)
    tensor3 = torch.arange(16, 24).view(2, 2, 2)
    tensor4 = torch.arange(24, 32).view(2, 2, 2)

    output0 = torch.stack([tensor1, tensor2, tensor3, tensor4], dim=0)
    output1 = torch.stack([tensor1, tensor2, tensor3, tensor4], dim=1)
    output2 = torch.stack([tensor1, tensor2, tensor3, tensor4], dim=2)
    output3 = torch.stack([tensor1, tensor2, tensor3, tensor4], dim=3)

    print('torch.stack dim=0:{0},{1}'.format(output0, output0.shape))
    print('torch.stack dim=1:{0},{1}'.format(output1, output1.shape))
    print('torch.stack dim=2:{0},{1}'.format(output2, output2.shape))
    print('torch.stack dim=3:{0},{1}'.format(output3, output3.shape))
Output
torch.stack dim=0:tensor([[[[ 0, 1],
[ 2, 3]],
[[ 4, 5],
[ 6, 7]]],
[[[ 8, 9],
[10, 11]],
[[12, 13],
[14, 15]]],
[[[16, 17],
[18, 19]],
[[20, 21],
[22, 23]]],
[[[24, 25],
[26, 27]],
[[28, 29],
[30, 31]]]]),torch.Size([4, 2, 2, 2])
torch.stack dim=1:tensor([[[[ 0, 1],
[ 2, 3]],
[[ 8, 9],
[10, 11]],
[[16, 17],
[18, 19]],
[[24, 25],
[26, 27]]],
[[[ 4, 5],
[ 6, 7]],
[[12, 13],
[14, 15]],
[[20, 21],
[22, 23]],
[[28, 29],
[30, 31]]]]),torch.Size([2, 4, 2, 2])
torch.stack dim=2:tensor([[[[ 0, 1],
[ 8, 9],
[16, 17],
[24, 25]],
[[ 2, 3],
[10, 11],
[18, 19],
[26, 27]]],
[[[ 4, 5],
[12, 13],
[20, 21],
[28, 29]],
[[ 6, 7],
[14, 15],
[22, 23],
[30, 31]]]]),torch.Size([2, 2, 4, 2])
torch.stack dim=3:tensor([[[[ 0, 8, 16, 24],
[ 1, 9, 17, 25]],
[[ 2, 10, 18, 26],
[ 3, 11, 19, 27]]],
[[[ 4, 12, 20, 28],
[ 5, 13, 21, 29]],
[[ 6, 14, 22, 30],
[ 7, 15, 23, 31]]]]),torch.Size([2, 2, 2, 4])
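Finally, torch.unbind can be viewed as the inverse of torch.stack: it removes a dimension and returns the slices along it as a tuple, recovering the original tensors. A sketch using the 3-D tensors from the last example:

```python
import torch

t1 = torch.arange(0, 8).view(2, 2, 2)
t2 = torch.arange(8, 16).view(2, 2, 2)

stacked = torch.stack([t1, t2], dim=0)  # shape [2, 2, 2, 2]
a, b = torch.unbind(stacked, dim=0)     # recover the original tensors

print(torch.equal(a, t1) and torch.equal(b, t2))  # True
```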