PyTorch - Basic Operations
Basic
By selecting different configuration options, the tool on the PyTorch site shows you the install command for the latest wheel that matches your host platform. For example, on a Mac, it generates the corresponding pip3 command.
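A quick way to confirm the install worked is to import the package and query it (a minimal check, not from the original post):

import torch
print(torch.__version__)            # Installed PyTorch version
print(torch.cuda.is_available())    # True if a CUDA-capable GPU can be used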
Run the following code and you should see an un-initialized 2x3 Tensor printed out. A Tensor is a data structure representing a multi-dimensional array. It is similar to a NumPy ndarray, and its size is equivalent to the shape of the NumPy ndarray.
import torch
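For example, a minimal snippet along these lines creates and prints the un-initialized Tensor (the variable name x is assumed here):

x = torch.Tensor(2, 3)   # Un-initialized: holds whatever values were in memory
print(x)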
This printout represents the Tensor type and its size (dimension: 2x3).
Sample programs:
import torch

# Initialize
x = torch.Tensor(2, 3)   # An un-initialized Tensor: holds whatever values were in memory
y = torch.rand(2, 3)     # A 2x3 Tensor initialized with uniform random values

# Operations
z1 = x + y
z2 = torch.add(x, y)     # Same as above
Operations
The syntax of a tensor operation:
torch.is_tensor(obj)     # True if obj is a PyTorch Tensor
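A small usage sketch of the check above:

x = torch.Tensor(2, 3)
torch.is_tensor(x)       # True
torch.is_storage(x)      # False: x is a Tensor, not a Storage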
In-place operation
x.add_(y)                # In-place version of x + y: methods ending with _ modify the tensor itself
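A short sketch contrasting the out-of-place and in-place forms (names x, y assumed):

x = torch.ones(2, 3)
y = torch.ones(2, 3)
z = x.add(y)             # Out-of-place: x is unchanged, z holds the result
x.add_(y)                # In-place: x itself now holds the result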
out
We can assign the operation result to a variable. Alternatively, most operation functions accept an out parameter that stores the result in an existing Tensor.
r1 = torch.Tensor(2, 3)
torch.add(x, y, out=r1)   # Store the result in the pre-allocated r1
r2 = torch.add(x, y)      # Return the result as a new Tensor
Indexing
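NumPy-style indexing and slicing work on Tensors; a short self-contained sketch:

x = torch.arange(6).view(2, 3)
print(x[1, 2])            # Single element: row 1, column 2
print(x[:, 1])            # Slicing: the second column
x[:, 0] = 0               # Slices can also be assigned to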
During the conversion, the ndarray and the Tensor share the same memory storage. Changing a value on either side affects the other.
# Conversion
import numpy as np

a = np.array([1, 2, 3])
v = torch.from_numpy(a)   # Convert a numpy array to a Tensor (shares memory with a)
a[0] = -1                 # The change is visible in v as well
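The reverse conversion is symmetric (sketch):

b = v.numpy()             # Convert the Tensor back to a numpy ndarray
v.add_(1)                 # An in-place change to v shows up in b too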
Tensor meta-data
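A minimal sketch of the usual meta-data calls (the tensor x is assumed):

x = torch.rand(2, 3)
print(x.size())           # torch.Size([2, 3])
print(x.dim())            # Number of dimensions: 2
print(x.type())           # The tensor type, e.g. torch.FloatTensor
print(torch.numel(x))     # Total number of elements: 6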
Reshape Tensor
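Reshaping is done with view; a sketch (the result shares data with the original tensor):

x = torch.arange(12)
y = x.view(3, 4)          # Reshape to 3x4
z = x.view(-1, 6)         # -1 lets PyTorch infer that dimension: 2x6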
Create a Tensor
To increase the reproducibility of results, we often set the random seed to a specific value first.
torch.manual_seed(1)
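Beyond setting the seed, a few common creation calls (a sketch of typical usage):

v = torch.rand(2, 3)                  # Uniform random values in [0, 1)
v = torch.randn(2, 3)                 # Values drawn from the standard normal distribution
v = torch.eye(3)                      # 3x3 identity matrix
v = torch.zeros(2, 2)                 # All zeros
v = torch.linspace(1, 10, steps=10)   # 10 evenly spaced points between 1 and 10
v = torch.randperm(4)                 # A random permutation of 0..3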
Tensor type
x = torch.randn(5, 3).type(torch.FloatTensor)   # Explicitly cast a random 5x3 Tensor to FloatTensor
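Casts can also use the shorthand methods (sketch):

x = x.type(torch.DoubleTensor)   # Explicit cast to 64-bit float
x = x.long()                     # Shorthand cast to LongTensor
x = x.float()                    # Back to FloatTensor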
# 1 1 1
# 2 2 2
# 3 3 3
v = torch.ones(3, 3)
v[1].fill_(2)
v[2].fill_(3)
# 0 1 2
# 3 4 5
# 6 7 8
v = torch.arange(9)
v = v.view(3, 3)
Initialize a ByteTensor
c = torch.ByteTensor([0, 1, 1, 0])
Summary
Creation Ops
~~~~~~~~~~~~~~~~~~~~~~
.. autofunction:: eye
.. autofunction:: from_numpy
.. autofunction:: linspace
.. autofunction:: logspace
.. autofunction:: ones
.. autofunction:: ones_like
.. autofunction:: arange
.. autofunction:: range
.. autofunction:: zeros
.. autofunction:: zeros_like
# 0 1 2
# 3 4 5
# 6 7 8
v = torch.arange(9)
v = v.view(3, 3)
Concatenate, stack
# Concatenation
torch.cat((x, x, x), 0) # Concatenate in the 0 dimension
# Stack
r = torch.stack((v, v))          # Stack along a new dimension: size 2x3x3
# Gather element
# torch.gather(input, dim, index, out=None)
# out[i][j][k] = input[index[i][j][k]][j][k] # if dim == 0
# out[i][j][k] = input[i][index[i][j][k]][k] # if dim == 1
# out[i][j][k] = input[i][j][index[i][j][k]] # if dim == 2
# 0 1
# 4 3
# 8 7
r = torch.gather(v, 1, torch.LongTensor([[0,1],[1,0],[2,1]]))
Split a Tensor
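A sketch of splitting the 3x3 Tensor v defined earlier with torch.split and torch.chunk:

r = torch.split(v, 2, dim=0)     # Chunks with 2 rows each: sizes 2x3 and 1x3
r = torch.chunk(v, 3, dim=1)     # Split into 3 chunks along dim 1: three 3x1 tensors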
# Index select
# 0 2
# 3 5
# 6 8
indices = torch.LongTensor([0, 2])
r = torch.index_select(v, 1, indices) # Select columns 0 and 2
# Masked select
# 0 0 0
# 1 1 1
# 1 1 1
mask = v.ge(3)
# Size 6: 3 4 5 6 7 8
r = torch.masked_select(v, mask)
# Un-squeeze a dimension
x = torch.Tensor([1, 2, 3])
r = torch.unsqueeze(x, 0) # Size: 1x3
r = torch.unsqueeze(x, 1) # Size: 3x1
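The opposite operation, torch.squeeze, removes dimensions of size 1 (sketch):

y = torch.zeros(2, 1, 3)
r = torch.squeeze(y)             # Size: 2x3
r = torch.squeeze(y, 1)          # Only squeeze dimension 1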
Non-zero elements
# Non-zero
# [torch.LongTensor of size 8x2]
# [i, j] index for non-zero elements
# 0 1
# 0 2
# 1 0
# 1 1
# 1 2
# 2 0
# 2 1
# 2 2
r = torch.nonzero(v)
take
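torch.take treats the input as a flattened 1-D tensor and picks elements by linear index; a sketch following the documented behavior:

src = torch.Tensor([[4, 3, 5],
                    [6, 7, 8]])
r = torch.take(src, torch.LongTensor([0, 2, 5]))   # 4 5 8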
transpose
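A sketch of transposing the 3x3 Tensor v:

r = torch.transpose(v, 0, 1)     # Swap dimensions 0 and 1
r = v.t()                        # Shorthand for a 2-D transpose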
Summary
Distribution
# bernoulli
r = torch.Tensor(2, 2).uniform_(0, 1)   # 2x2 uniform random values in [0, 1]
r = torch.bernoulli(r)   # Size 2x2: Bernoulli with probability p stored in the elements of r
# Multinomial
w = torch.Tensor([0, 4, 8, 2]) # Create a tensor of weights
r = torch.multinomial(w, 4, replacement=True) # Size 4: 3, 2, 1, 2
# Normal distribution
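# A sketch (not from the original): draw from element-wise means and standard deviations
r = torch.normal(torch.zeros(2, 3), torch.ones(2, 3))   # Size 2x3, mean 0, std 1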
Summary
Random sampling
----------------------------------
.. autofunction:: manual_seed - Set a manual seed
.. autofunction:: initial_seed - Return the initial random seed
.. autofunction:: get_rng_state
.. autofunction:: set_rng_state
.. autodata:: default_generator
.. autofunction:: bernoulli
.. autofunction:: multinomial
.. autofunction:: normal
.. autofunction:: rand
.. autofunction:: randn
.. autofunction:: randperm
There are a few more in-place random sampling functions defined on Tensors as well.
Point-wise operations
Math operations
f = torch.FloatTensor([-1, -2, 3])
r = torch.abs(f) # 1 2 3
# Element-wise divide
r = torch.div(v, v+0.03)
# Element-wise multiple
r = torch.mul(v, v)
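A few more element-wise operations in the same spirit (sketch, still operating on v):

r = torch.clamp(v, min=2, max=6)   # Clamp every element into the range [2, 6]
r = torch.pow(v, 2)                # Element-wise square
r = torch.sqrt(v + 1)              # Element-wise square root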
Summary
Pointwise Ops
~~~~~~~~~~~~~~~~~~~~~~
.. autofunction:: abs
.. autofunction:: acos - arc cosine
.. autofunction:: add
.. autofunction:: addcdiv - element wise: t1 + s * t2/t3
.. autofunction:: addcmul - element wise: t1 + s * t2 * t3
.. autofunction:: asin - arc sin
.. autofunction:: atan
.. autofunction:: atan2
.. autofunction:: ceil - ceiling
.. autofunction:: clamp - clamp elements into a range
.. autofunction:: cos
.. autofunction:: cosh
.. autofunction:: div - divide
.. autofunction:: erf - Gaussian error function
.. autofunction:: erfinv - inverse error function
.. autofunction:: exp
.. autofunction:: expm1 - exponential of each element minus 1
.. autofunction:: floor
.. autofunction:: fmod - element wise remainder of division
.. autofunction:: frac - fraction part 3.4 -> 0.4
.. autofunction:: lerp - linear interpolation
.. autofunction:: log - natural log
.. autofunction:: log1p - y = log(1 + x)
.. autofunction:: mul - multiply
.. autofunction:: neg
.. autofunction:: pow
.. autofunction:: reciprocal - 1/x
.. autofunction:: remainder - remainder of division
.. autofunction:: round
.. autofunction:: rsqrt - the reciprocal of the square-root
.. autofunction:: sigmoid - sigmoid(x)
.. autofunction:: sign
.. autofunction:: sin
.. autofunction:: sinh
.. autofunction:: sqrt
.. autofunction:: tan
.. autofunction:: tanh
.. autofunction:: trunc - truncated integer
Reduction operations
# Accumulate sum
# 0 1 2
# 3 5 7
# 9 12 15
r = torch.cumsum(v, dim=0)
# L-P norm
r = torch.dist(v, v+3, p=2) # L-2 norm: ((3^2)*9)^(1/2) = 9.0
# Mean
# 1 4 7
r = torch.mean(v, 1) # Size 3: Mean in dim 1
# Sum
# 3 12 21
r = torch.sum(v, 1) # Sum over dim 1
# 36
r = torch.sum(v)
Summary
Reduction Ops
~~~~~~~~~~~~~~~~~~~~~~
.. autofunction:: cumprod - accumulate product of elements x1*x2*x3...
.. autofunction:: cumsum
.. autofunction:: dist - L-p norm
.. autofunction:: mean
.. autofunction:: median
.. autofunction:: mode
.. autofunction:: norm - L-p norm
.. autofunction:: prod - accumulate product
.. autofunction:: std - compute standard deviation
.. autofunction:: sum
.. autofunction:: var - variance of all elements
Comparison operations
# Size 3x3: Element-wise comparison
r = torch.eq(v, v)
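A few related element-wise comparisons (sketch, still operating on the 3x3 Tensor v):

r = torch.ne(v, v)       # Element-wise not-equal: all zeros here
r = torch.ge(v, 3)       # 1 where the element is >= 3, otherwise 0
r = torch.equal(v, v)    # True: same size and same elements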
Sort
# Sort
# Second tuple store the index
# (
# 0 1 2
# 3 4 5
# 6 7 8
# [torch.FloatTensor of size 3x3]
# ,
# 0 1 2
# 0 1 2
# 0 1 2
# [torch.LongTensor of size 3x3]
# )
r = torch.sort(v, 1)
# Top k
# (
# 2 5 8
# [torch.FloatTensor of size 3x1]
# ,
# 2 2 2
# [torch.LongTensor of size 3x1]
# )
r = torch.topk(v, 1)
Comparison Ops
~~~~~~~~~~~~~~~~~~~~~~
.. autofunction:: eq - Compare elements
.. autofunction:: equal - True if 2 tensors are the same
.. autofunction:: ge - Element-wise greater or equal comparison
.. autofunction:: gt
.. autofunction:: kthvalue - k-th smallest element
.. autofunction:: le
.. autofunction:: lt
.. autofunction:: max
.. autofunction:: min
.. autofunction:: ne
.. autofunction:: sort
.. autofunction:: topk - top k
Matrix, vector multiplication
# Matrix x vector
mat = torch.randn(2, 4)   # Size 2x4
vec = torch.randn(4)      # Size 4
r = torch.mv(mat, vec)    # Size 2
# Matrix x Matrix
# Size 2x4
mat1 = torch.randn(2, 3)
mat2 = torch.randn(3, 4)
r = torch.mm(mat1, mat2)
v1 = torch.arange(1, 4) # Size 3
v2 = torch.arange(1, 3) # Size 2
r = torch.ger(v1, v2)   # Outer product: size 3x2
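Two more products in the same family (sketch):

# Dot product of two 1-D tensors
r = torch.dot(torch.Tensor([4, 2]), torch.Tensor([3, 1]))   # 4*3 + 2*1 = 14

# Batched matrix multiply: 10 pairs of 3x4 and 4x5 matrices
batch1 = torch.randn(10, 3, 4)
batch2 = torch.randn(10, 4, 5)
r = torch.bmm(batch1, batch2)    # Size 10x3x5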
Other
Cross product
m1 = torch.ones(3, 5)
m2 = torch.ones(3, 5)
v1 = torch.ones(3)
# Cross product
# Size 3x5
r = torch.cross(m1, m2)
Diagonal matrix
# Diagonal matrix
# Size 3x3
r = torch.diag(v1)
Histogram
# Histogram
# [0, 2, 1, 0]
torch.histc(torch.FloatTensor([1, 2, 1]), bins=4, min=0, max=3)
Renormalization
# Renormalize
# Renormalize each sub-tensor along dim 0 (each row) so that its L-1 norm is at most 1
# 0.0000 0.3333 0.6667
# 0.2500 0.3333 0.4167
# 0.2857 0.3333 0.3810
r = torch.renorm(v, 1, 0, 1)
Summary
Other Operations
~~~~~~~~~~~~~~~~~~~~~~
.. autofunction:: cross - cross product
.. autofunction:: diag - convert vector to diagonal matrix
.. autofunction:: histc - histogram
.. autofunction:: renorm - renormalize a tensor
.. autofunction:: trace - tr(M)
.. autofunction:: tril - lower triangle of 2-D matrix
.. autofunction:: triu - upper triangle
Tensors
----------------------------------
.. autofunction:: is_tensor
.. autofunction:: is_storage
.. autofunction:: set_default_tensor_type
.. autofunction:: numel
.. autofunction:: set_printoptions
Serialization
----------------------------------
.. autofunction:: save - Saves an object to a disk file
.. autofunction:: load - Loads an object saved with torch.save() from a file
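A minimal save/load round trip (sketch; the file name is arbitrary):

x = torch.rand(2, 3)
torch.save(x, 'tensor.pt')       # Serialize to disk
y = torch.load('tensor.pt')      # Load it back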
Parallelism
----------------------------------
.. autofunction:: get_num_threads - Gets the number of OpenMP threads used for parallelizing CPU operations
.. autofunction:: set_num_threads
Spectral Ops
~~~~~~~~~~~~~~~~~~~~~~
.. autofunction:: stft - Short-time Fourier transform
.. autofunction:: hann_window - Hann window function
.. autofunction:: hamming_window - Hamming window function
.. autofunction:: bartlett_window - Bartlett window function