PyTorch Concatenation and Splitting


**cat/stack**

cat concatenates tensors along an existing, specified dimension;

stack joins tensors along a newly created dimension.

In [1]: import torch

In [2]: a = torch.rand(4, 32, 8)

In [3]: b = torch.rand(5, 32, 8)

In [4]: torch.cat([a, b], dim=0).shape
Out[4]: torch.Size([9, 32, 8])

In [5]: a1 = torch.rand(4, 3, 32, 32)

In [6]: a2 = torch.rand(5, 3, 32, 32)

In [7]: torch.cat([a1, a2], dim=0).shape
Out[7]: torch.Size([9, 3, 32, 32])

In [8]: a2 = torch.rand(4, 1, 32, 32)

In [9]: torch.cat([a1, a2], dim=1).shape
Out[9]: torch.Size([4, 4, 32, 32])

In [10]: a1 = torch.rand(4, 3, 16, 32)

In [11]: a2 = torch.rand(4, 3, 16, 32)

In [12]: torch.cat([a1, a2], dim=2).shape
Out[12]: torch.Size([4, 3, 32, 32])


In [13]: torch.stack([a1, a2], dim=2).shape
Out[13]: torch.Size([4, 3, 2, 16, 32])

In [14]: a = torch.rand(32, 8)

In [15]: b = torch.rand(32, 8)

In [16]: torch.stack([a, b], dim=0).shape
Out[16]: torch.Size([2, 32, 8])
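The session above implicitly relies on the shape rules of the two functions. As a quick illustrative sketch (not part of the original session): torch.cat only requires the sizes to match in every dimension other than the one being concatenated, while torch.stack requires all input tensors to have exactly the same shape, because it inserts a brand-new dimension.

import torch

a = torch.rand(4, 32, 8)
b = torch.rand(5, 32, 8)

# cat: sizes may differ only along the concatenation dimension (dim=0 here).
print(torch.cat([a, b], dim=0).shape)      # torch.Size([9, 32, 8])

# stack: every tensor must have exactly the same shape, because a new
# dimension of size len(tensors) is inserted at the given position.
try:
    torch.stack([a, b], dim=0)
except RuntimeError as err:
    print("stack failed:", err)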

**split/chunk**

split divides a tensor either into pieces of a fixed length or into pieces with the lengths you specify.

chunk divides a tensor by count: the argument you pass is the number of pieces to split into.


In [17]: a = torch.rand(32, 8)

In [18]: b = torch.rand(32, 8)

In [19]: c = torch.stack([a, b], dim=0)

In [20]: a.shape, b.shape, c.shape
Out[20]: (torch.Size([32, 8]), torch.Size([32, 8]), torch.Size([2, 32, 8]))

In [21]: aa, bb = c.split([1, 1], dim=0)

In [22]: aa.shape, bb.shape
Out[22]: (torch.Size([1, 32, 8]), torch.Size([1, 32, 8]))

In [23]: aa, bb = c.split(1, dim=0)

In [24]: aa.shape, bb.shape
Out[24]: (torch.Size([1, 32, 8]), torch.Size([1, 32, 8]))

In [25]: aa, bb = c.split(2, dim=0)

---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
<ipython-input-25-27b6f4946a79> in <module>
----> 1 aa, bb = c.split(2, dim=0)

ValueError: not enough values to unpack (expected 2, got 1)
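The error is expected: c has size 2 along dim 0, so c.split(2, dim=0) produces a single piece of length 2, and unpacking that one-element tuple into two variables fails. A minimal check (not from the original post, rebuilding the same c as above):

import torch

c = torch.stack([torch.rand(32, 8), torch.rand(32, 8)], dim=0)  # same shape as the c above
pieces = c.split(2, dim=0)
print(len(pieces))        # 1
print(pieces[0].shape)    # torch.Size([2, 32, 8])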

In [32]: a = torch.rand(6, 32, 8)

In [33]: aa, bb, cc = a.split(2, dim=0)  # split into pieces of a fixed length

In [34]: aa.shape, bb.shape, cc.shape
Out[34]: (torch.Size([2, 32, 8]), torch.Size([2, 32, 8]), torch.Size([2, 32, 8]))

In [35]: aa, bb, cc = a.split([1, 3, 2], dim=0)  # split into the lengths actually needed

In [36]: aa.shape, bb.shape, cc.shape
Out[36]: (torch.Size([1, 32, 8]), torch.Size([3, 32, 8]), torch.Size([2, 32, 8]))

In [26]: aa, bb = c.chunk(2, dim=0)

In [27]: aa.shape, bb.shape
Out[27]: (torch.Size([1, 32, 8]), torch.Size([1, 32, 8]))

In [37]: aa, bb, cc = a.chunk(3, dim=0)

In [38]: aa.shape, bb.shape, cc.shape
Out[38]: (torch.Size([2, 32, 8]), torch.Size([2, 32, 8]), torch.Size([2, 32, 8]))
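One behaviour the session does not show (an illustrative sketch, not part of the original post) is what happens when the dimension size is not evenly divisible: split keeps every piece at the requested length except the last, while chunk computes the piece size as ceil(size / chunks), so it may even return fewer pieces than requested.

import torch

a = torch.rand(7, 32, 8)

# split(3): pieces of length 3 until the tensor runs out -> lengths 3, 3, 1
print([t.shape[0] for t in a.split(3, dim=0)])   # [3, 3, 1]

# chunk(5): piece size is ceil(7 / 5) = 2 -> lengths 2, 2, 2, 1,
# i.e. only 4 pieces come back even though 5 were requested.
print([t.shape[0] for t in a.chunk(5, dim=0)])   # [2, 2, 2, 1]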
