A tensor in PyTorch is similar to a NumPy array, with the difference that a tensor can use the power of a GPU whereas a NumPy array cannot. To normalize a tensor, we transform it so that its mean becomes 0 and its standard deviation becomes 1. Since the variance is the square of the standard deviation, the variance also becomes 1.
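Concretely, every element of the tensor is transformed as

normalized_t = (t - mean(t)) / std(t)

where mean(t) and std(t) are the mean and standard deviation of the original tensor.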
We can take the following steps to normalize a tensor to 0 mean and 1 variance.
Stepwise Implementation
Step 1: Import required libraries
The only required library is PyTorch, a machine learning library for Python. We can import it using the code below. Please make sure that you have already installed it.
Python3
import torch
Step 2: Create PyTorch Tensor
There are different ways to create a tensor in PyTorch. Create the tensor that you want to normalize to 0 mean and 1 variance. Here we have created a float tensor of size 5. You can follow the article Tensors in PyTorch for other ways to create a tensor.
Python3
t = torch.tensor([1., 2., 3., 4., 5.])
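As a side note, there is more than one way to build this same tensor; the snippet below is a minimal, optional sketch (the names t and t2 are only illustrative) using torch.arange and torch.linspace.
Python3
# Equivalent ways to create the float tensor [1., 2., 3., 4., 5.]
t = torch.arange(1., 6.)              # steps by 1 from 1. up to (but not including) 6.
t2 = torch.linspace(1., 5., steps=5)  # 5 evenly spaced values from 1. to 5.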
Step 3: Calculate the Mean, Standard Deviation (Std), and Variance of the tensor
We compute the mean, standard deviation (std), and variance before normalizing the tensor. We will use the mean and std to normalize the tensor in the next step. We also compute the variance here so that we can compare it with the variance after normalization.
Python3
mean, std, var = torch.mean(t), torch.std(t), torch.var(t)
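Note that torch.std() and torch.var() apply Bessel's correction by default (they divide by N - 1 rather than N), so the variance reported here is simply the square of the reported standard deviation. The snippet below is an optional, quick check of that relationship.
Python3
# std squared should match var, since both use the same (unbiased) estimator
print(torch.isclose(std ** 2, var))  # tensor(True)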
Step 4: Normalize the Tensor using Mean and Standard Deviation
To normalize the input tensor, we first subtract the mean from the tensor and then divide the result by the standard deviation. Print the tensor to see what it looks like after normalization.
Python3
t = (t - mean) / std
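The same idea extends to tensors with more than one dimension by computing the statistics along a chosen dimension. The snippet below is a minimal sketch (the 2D tensor x is a hypothetical example, not part of the main program) that standardizes each column to 0 mean and 1 standard deviation.
Python3
# Hypothetical 2D example: normalize each column separately
x = torch.tensor([[1., 2.], [3., 4.], [5., 6.]])
col_mean = torch.mean(x, dim=0, keepdim=True)  # shape (1, 2)
col_std = torch.std(x, dim=0, keepdim=True)    # shape (1, 2)
x_norm = (x - col_mean) / col_std              # broadcasts over rows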
Step 5: Calculate again Mean and Variance to Verify 0 Mean and 1 Variance
We compute the mean, std, and variance again to verify that after normalizing the tensor, the mean is 0 and the variance is 1.
Python3
mean, std, var = torch.mean(t), torch.std(t), torch.var(t)
print("Mean, std and Var after normalize:\n", mean, std, var)
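If you prefer a programmatic check over reading the printed values, the optional snippet below uses torch.allclose to confirm that the mean is numerically 0 and the variance is 1 (a small tolerance absorbs floating-point error).
Python3
# Verify 0 mean and 1 variance up to floating-point tolerance
print(torch.allclose(mean, torch.tensor(0.), atol=1e-6))  # True
print(torch.allclose(var, torch.tensor(1.), atol=1e-6))   # True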
Example:
Now let’s look at the complete code for the above steps: normalize an input tensor to 0 mean and 1 variance, and verify that after normalization the mean is 0 and the variance is 1. Notice how the input tensor is transformed into a new tensor after normalization.
Python3
# Python program to normalize a tensor to
# 0 mean and 1 variance

# Step 1: Importing torch
import torch

# Step 2: creating a torch tensor
t = torch.tensor([1., 2., 3., 4., 5.])
print("Tensor before Normalize:\n", t)

# Step 3: Computing the mean, std and variance
mean, std, var = torch.mean(t), torch.std(t), torch.var(t)
print("Mean, Std and Var before Normalize:\n", mean, std, var)

# Step 4: Normalizing the tensor
t = (t - mean) / std
print("Tensor after Normalize:\n", t)

# Step 5: Again compute the mean, std and variance
# after Normalize
mean, std, var = torch.mean(t), torch.std(t), torch.var(t)
print("Mean, std and Var after normalize:\n", mean, std, var)
Output:
Tensor before Normalize:
 tensor([1., 2., 3., 4., 5.])
Mean, Std and Var before Normalize:
 tensor(3.) tensor(1.5811) tensor(2.5000)
Tensor after Normalize:
 tensor([-1.2649, -0.6325, 0.0000, 0.6325, 1.2649])
Mean, std and Var after normalize:
 tensor(0.) tensor(1.) tensor(1.)