1. Rationale (illustrated here with image generation as the example)
Suppose there are two networks, G (generator) and D (discriminator). Their roles are:
G: the generator network. It receives a random noise z and generates an image from this noise, denoted G(z);
D: the discriminator network. It judges whether an image is "real".
Its input is x, where x is an image; the output D(x) is the probability that x is a real image.
If D(x) = 1, the image is certainly real; if D(x) = 0, it cannot possibly be real.
2. During training, the goal of the generator network **G** is to generate images realistic enough to fool the discriminator network D,
while the goal of **D** is to tell G's generated images apart from real ones.
G and D thus form a dynamic "game".
3. The ideal outcome: G generates images G(z) that look "real" enough that D can no longer tell whether an image produced by G is real or fake, i.e. D(G(z)) = 0.5. Our goal is then achieved: we obtain a generative model G that can be used to generate images.
4. Mathematical formulation: see the first GAN paper, Ian Goodfellow et al., "Generative Adversarial Networks", arxiv.org/abs/1406.2661.
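For reference, the central minimax objective from that paper (with p_data the data distribution and p_z the noise prior) is:

```latex
\min_G \max_D V(D, G) =
  \mathbb{E}_{x \sim p_{\mathrm{data}}(x)}\big[\log D(x)\big]
  + \mathbb{E}_{z \sim p_z(z)}\big[\log\big(1 - D(G(z))\big)\big]
```

D is trained to maximize V (assign high probability to real x, low to G(z)), while G is trained to minimize it.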
5. Algorithm: train D and G alternately with stochastic gradient descent. The details are also in the paper above.
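To make the alternating updates concrete, here is a toy, self-contained numpy sketch. It is illustrative only, not the paper's exact procedure: the "data" is a 1-D Gaussian, G is a linear map of the noise, D is a logistic-regression discriminator, and G uses the non-saturating log D(G(z)) update the paper also suggests.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(t):
    return 1.0 / (1.0 + np.exp(-t))

# Generator parameters: G(z) = a*z + b  (toy 1-D "images")
a, b = 1.0, 0.0
# Discriminator parameters: D(x) = sigmoid(w*x + c)
w, c = 0.1, 0.0
lr = 0.05

for step in range(2000):
    # --- D step: ascend log D(x) + log(1 - D(G(z))) ---
    x = rng.normal(4.0, 0.5)           # one real sample
    z = rng.normal()                   # noise
    g = a * z + b                      # one fake sample G(z)
    d_real, d_fake = sigmoid(w * x + c), sigmoid(w * g + c)
    # gradients of the D objective w.r.t. w and c
    gw = (1 - d_real) * x - d_fake * g
    gc = (1 - d_real) - d_fake
    w += lr * gw
    c += lr * gc

    # --- G step: ascend log D(G(z)), gradient flows through D ---
    z = rng.normal()
    g = a * z + b
    d_fake = sigmoid(w * g + c)
    ga = (1 - d_fake) * w * z
    gb = (1 - d_fake) * w
    a += lr * ga
    b += lr * gb

zs = rng.normal(size=1000)
print("mean of G(z):", np.mean(a * zs + b))
```

With these settings the mean of G(z) typically drifts toward the real mean (4.0) as G chases the regions D considers "real", but simple GAN training like this can also oscillate rather than converge cleanly.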
6. DCGAN: how it works
The best-performing model for image-processing applications in deep learning is the CNN. How can CNNs and GANs be combined? The answer is DCGAN.
The principle is the same as for a plain GAN; the G and D above are simply replaced by two convolutional neural networks (CNNs). It is not a direct swap, however: DCGAN makes several changes to the CNN architecture to improve sample quality and convergence speed. These changes are:
A. Remove the pooling layers. The G network uses transposed convolutions for upsampling, and the D network replaces pooling with strided convolutions.
B. Use batch normalization in both D and G.
C. Remove the fully connected (FC) layers, making the networks fully convolutional.
D. The G network uses ReLU as its activation function, except for the last layer, which uses tanh.
E. The D network uses LeakyReLU as its activation function.
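Points A, D, and E can be illustrated with a toy 1-D numpy sketch (illustrative only: a real DCGAN uses learned 2-D convolutions in a deep-learning framework, and the kernel values here are made up):

```python
import numpy as np

# Activations used by DCGAN (standard definitions):
def relu(x):
    return np.maximum(0.0, x)

def leaky_relu(x, alpha=0.2):   # DCGAN uses slope 0.2 in D
    return np.where(x >= 0, x, alpha * x)

# (A) Strided convolution as learned downsampling in D.
# A stride-2 convolution halves the spatial size like 2x pooling,
# but with learnable weights.
x = np.arange(8, dtype=float)          # input signal of length 8
k = np.array([0.25, 0.5, 0.25])        # a 3-tap kernel (illustrative)
stride = 2
down = np.array([np.dot(x[i:i + 3], k)
                 for i in range(0, len(x) - 2, stride)])
print(down.shape)                      # (3,): downsampled

# (A) Transposed convolution as learned upsampling in G: each input
# value spreads the kernel into stride-2-spaced output positions.
up = np.zeros(len(down) * stride + len(k) - stride)
for i, v in enumerate(down):
    up[i * stride : i * stride + len(k)] += v * k
print(up.shape)                        # (7,): upsampled again

# (D, E) G's hidden layers use ReLU and its output layer tanh, which
# bounds generated values in (-1, 1); D's layers use LeakyReLU.
print(np.tanh(up))                     # all values in (-1, 1)
```

Bounding G's output with tanh matches images that are preprocessed to the [-1, 1] range; LeakyReLU in D keeps a small gradient flowing for negative activations instead of zeroing them as ReLU does.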