Autograd
1. Deep-learning algorithms fundamentally compute derivatives via backpropagation, and PyTorch's Autograd module implements this. Autograd automatically provides differentiation for every operation on a Tensor, avoiding the complexity of computing derivatives by hand.
2. `autograd.Variable` is the core class of Autograd. It is a thin wrapper around a Tensor and supports almost all Tensor operations. Once a Tensor is wrapped in a Variable, calling its `.backward()` runs backpropagation and computes all gradients automatically.
3. A Variable has three main attributes:
   - `data`: the Tensor the Variable wraps;
   - `grad`: the gradient of `data`; `grad` is itself a Variable, not a Tensor, and has the same shape as `data`;
   - `grad_fn`: points to a Function object, which is used during backpropagation to compute the gradient of the input.
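Before walking through the code below, here is a minimal sketch (not from the original walkthrough) of `.backward()` computing a non-trivial derivative: for y = sum(x²), Autograd returns ∂y/∂x = 2x.

```python
import torch as t
from torch.autograd import Variable

# Illustrative example: differentiate y = sum(x ** 2); the gradient is 2 * x
x = Variable(t.tensor([1.0, 2.0, 3.0]), requires_grad=True)
y = (x ** 2).sum()
y.backward()          # populates x.grad with dy/dx
print(x.grad)         # tensor([2., 4., 6.])
```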
Code walkthrough
```python
#!/usr/bin/env python
# -*- coding: utf-8 -*-
# Author: Monkey
import torch as t
from torch.autograd import Variable

x = Variable(t.ones(2, 2), requires_grad=True)
print(x)
# tensor([[1., 1.],
#         [1., 1.]], requires_grad=True)

y = x.sum()
print(y)
# tensor(4., grad_fn=<SumBackward0>)

# grad_fn points to a Function object, which is used during
# backpropagation to compute the gradient of the input
print(y.grad_fn)
# <SumBackward0 object at 0x000002d4240ab860>

# retain_graph=True keeps the graph alive so backward() can be called again below
y.backward(retain_graph=True)
print(x.grad)
# tensor([[1., 1.],
#         [1., 1.]])

y.backward(retain_graph=True)
print(x.grad)
# tensor([[2., 2.],
#         [2., 2.]])

y.backward(retain_graph=True)
print(x.grad)
# tensor([[3., 3.],
#         [3., 3.]])
```

`grad` accumulates during backpropagation: each call to `backward()` adds the newly computed gradient to the gradient already stored, so gradients must be zeroed before each backward pass.
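The `grad_fn` objects link together into the backward graph: each node's `next_functions` attribute points at the Function objects of its inputs. A minimal sketch, using the modern Tensor API (`requires_grad` set directly on the tensor):

```python
import torch as t

x = t.ones(2, 2, requires_grad=True)
y = (x * 2).sum()

# The last operation (sum) is the root of the backward graph...
print(y.grad_fn)                    # <SumBackward0 object at 0x...>
# ...and it links back to the Function of the preceding multiplication
print(y.grad_fn.next_functions)     # ((<MulBackward0 object at 0x...>, 0),)
```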
```python
# zero_() clears the accumulated gradient in place
print(x.grad.data.zero_())
# tensor([[0., 0.],
#         [0., 0.]])

y.backward(retain_graph=True)
print(x.grad)
# tensor([[1., 1.],
#         [1., 1.]])
```
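Because gradients accumulate, a typical training loop zeroes them at the start of every iteration. A minimal sketch with a made-up one-parameter problem (fit w so that 3·w ≈ 6) and a hand-rolled gradient-descent update:

```python
import torch as t
from torch.autograd import Variable

w = Variable(t.tensor([0.0]), requires_grad=True)
for step in range(100):
    if w.grad is not None:
        w.grad.data.zero_()            # clear the gradient left by the previous step
    loss = (w * 3.0 - 6.0) ** 2
    loss.backward()
    w.data -= 0.01 * w.grad.data       # plain gradient-descent update
print(w.data)                          # converges close to tensor([2.])
```

When using an optimizer from `torch.optim`, `optimizer.zero_grad()` plays the same role as the `zero_()` call here.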
```python
# Operations give the same values on a Variable and on its underlying .data tensor
m = Variable(t.ones(4, 5))
n = t.cos(m)
print(m)
# tensor([[1., 1., 1., 1., 1.],
#         [1., 1., 1., 1., 1.],
#         [1., 1., 1., 1., 1.],
#         [1., 1., 1., 1., 1.]])
print(n)
# tensor([[0.5403, 0.5403, 0.5403, 0.5403, 0.5403],
#         [0.5403, 0.5403, 0.5403, 0.5403, 0.5403],
#         [0.5403, 0.5403, 0.5403, 0.5403, 0.5403],
#         [0.5403, 0.5403, 0.5403, 0.5403, 0.5403]])

m_tensor_cos = t.cos(m.data)
print(m_tensor_cos)
# tensor([[0.5403, 0.5403, 0.5403, 0.5403, 0.5403],
#         [0.5403, 0.5403, 0.5403, 0.5403, 0.5403],
#         [0.5403, 0.5403, 0.5403, 0.5403, 0.5403],
#         [0.5403, 0.5403, 0.5403, 0.5403, 0.5403]])
```
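Note that since PyTorch 0.4, `Variable` has been merged into `Tensor`, so the wrapper used throughout this walkthrough is no longer needed: a tensor created with `requires_grad=True` behaves the same way. A sketch of the equivalent modern code:

```python
import torch as t

# Modern equivalent: no Variable wrapper; requires_grad is set on the tensor itself
x = t.ones(2, 2, requires_grad=True)
y = x.sum()
y.backward()
print(x.grad)
# tensor([[1., 1.],
#         [1., 1.]])
```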
Autograd: automatic differentiation