PyTorch | Using BatchNorm2d for batch normalization / instance normalization of a Variable

Source: Internet
Author: User
Tags pytorch

Update: I have since found that newer PyTorch versions ship instance normalization directly, so you no longer need to build it yourself.
-- 2017.5.25

The nn.Module subclass _BatchNorm (defined in torch.nn.modules.batchnorm) can be used to achieve various kinds of normalization.

In the docs you can see there are three kinds of normalization layers, but in fact they all inherit from this _BatchNorm class, so if we look at BatchNorm2d, the others follow by analogy.

Take a look at the documentation first.

If it is not clear, no matter; an example will explain it.
A BatchNorm2d instance is created as follows:

import torch.nn as nn
norm = nn.BatchNorm2d(fea_num, affine=False).cuda()

Here fea_num is the size of the dimension that is "pulled out": statistics are computed separately for each index along that dimension, with all the other dimensions flattened into one long strip and normalized together. fea_num corresponds to dimension 1 of the input (counting from 0), so the two values must be equal. .cuda() puts the module on the GPU. In the case of ordinary batch normalization:

Take an input of shape (batchsize, channel, height, width) = (4, 3, 5, 5); fea_num corresponds to channel. For channel=0, one mean and variance are computed and one normalization is done; likewise for channel=1 and channel=2.
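The per-channel arithmetic can be written out by hand. A minimal NumPy sketch of what BatchNorm2d(channel, affine=False) computes in training mode (the function name batchnorm2d and the eps value are my own choices for illustration):

```python
import numpy as np

def batchnorm2d(x, eps=1e-5):
    """Per-channel batch normalization of a (B, C, H, W) array.

    For each channel c, the mean and variance are taken over the
    batch, height, and width dimensions, i.e. everything except C.
    """
    mean = x.mean(axis=(0, 2, 3), keepdims=True)   # one mean per channel
    var = x.var(axis=(0, 2, 3), keepdims=True)     # one variance per channel
    return (x - mean) / np.sqrt(var + eps)

x = np.random.randn(4, 3, 5, 5)
y = batchnorm2d(x)
# After normalization each channel has ~zero mean and ~unit variance.
print(y.mean(axis=(0, 2, 3)))
```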
During training there are two parameters to learn, gamma and beta, so when gamma and beta should be learnable parameters, create and use the BatchNorm layer like this:

# input is a CUDA float Variable of batchsize x channel x height x width
# training mode
norm = nn.BatchNorm2d(channel).cuda()  # affine=True by default
input = norm(input)

Note: initialize the learnable parameters correctly before training. In test/eval mode, you should call .eval() so that the learned parameters and running statistics are used as-is. A test example for building the input:

import numpy as np
import torch
from torch.autograd import Variable

bs = 2
c = 3
h = 2
w = 2
input = np.arange(bs * c * h * w)
input.resize(bs, c, h, w)
input = Variable(torch.from_numpy(input).clone().float()).cuda().contiguous()
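The train/eval distinction above can be seen in a short sketch (assuming torch is installed; shown on CPU for simplicity, add .cuda() as in the examples above — modern PyTorch accepts plain tensors where Variable is used here):

```python
import torch
import torch.nn as nn

norm = nn.BatchNorm2d(3)   # affine=True by default: gamma & beta are learned
x = torch.randn(4, 3, 5, 5)

norm.train()               # training mode: normalize with batch statistics
y_train = norm(x)

norm.eval()                # eval mode: normalize with the running statistics
y_eval = norm(x)

# In training mode each channel of the output has ~zero mean;
# in eval mode the running estimates are used instead.
print(float(y_train.mean()), float(y_eval.mean()))
```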

If you do not need the learnable parameters gamma and beta, it is simply:

# input is a CUDA float Variable of batchsize x channel x height x width
norm = nn.BatchNorm2d(channel, affine=False).cuda()
input = norm(input)
Other kinds of normalization, such as instance normalization

The input is still (batchsize, channel, height, width) = (4, 3, 5, 5). Suppose we want to pull out the batchsize dimension instead, so that for each instance (batchsize = 0..3) the (3, 5, 5) 3D tensor is normalized with a single mean and variance. How do we do that? It is actually very simple: just swap dimensions 0 and 1 of the input.

# input is a CUDA float Variable of batchsize x channel x height x width
instancenorm = nn.BatchNorm2d(bs, affine=False).cuda()
input = input.transpose(0, 1).contiguous()
input = instancenorm(input)
input = input.transpose(0, 1).contiguous()
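Why the transpose trick works can be checked in pure NumPy: after swapping dimensions 0 and 1, the batch dimension sits where BatchNorm2d expects channels, so per-"channel" statistics are really per-instance statistics. A minimal sketch (eps value assumed):

```python
import numpy as np

eps = 1e-5
x = np.random.randn(4, 3, 5, 5)   # (batchsize, channel, height, width)

# Direct instance normalization: one mean/var per batch element,
# taken over its whole (channel, height, width) block.
mean = x.mean(axis=(1, 2, 3), keepdims=True)
var = x.var(axis=(1, 2, 3), keepdims=True)
direct = (x - mean) / np.sqrt(var + eps)

# Transpose trick: normalize per index of the new dimension 1
# (the original batch dimension), then transpose back.
xt = x.transpose(1, 0, 2, 3)                  # (channel, batchsize, H, W)
m = xt.mean(axis=(0, 2, 3), keepdims=True)
v = xt.var(axis=(0, 2, 3), keepdims=True)
trick = ((xt - m) / np.sqrt(v + eps)).transpose(1, 0, 2, 3)

print(np.allclose(direct, trick))
```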

Note: set the affine parameter according to your needs. Also note that, unlike the ordinary batch-normalization case, if .contiguous() is not used here it is possible to get an error:

RuntimeError: Assertion `THCTensor_(isContiguous)(state, t)' failed. at **/pytorch/torch/lib/THCUNN/generic/BatchNormalization.cu:20

All in all, remember: the BatchNorm layer's fea_num must equal the size of the dimension of the input you pull out, and that dimension must be dimension 1 of the input; if it is not, rearrange with resize, transpose, unsqueeze, and the like, and you are good.
