Model
Training Mode
In models that use a dropout layer, some neurons are randomly deactivated during the training phase to keep the model from over-fitting. At evaluation (inference) time, all of these neurons are re-enabled and take part in processing the data, so the model has to be switched between the two modes explicitly:
To switch the model to training mode:
model.train()
To switch the model to evaluation mode:
model.eval()
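The effect is easy to see with a minimal sketch (the toy model, its layer sizes, and the dropout probability below are arbitrary values chosen only for illustration):

import torch
import torch.nn as nn

# a toy model containing a dropout layer
model = nn.Sequential(nn.Linear(8, 8), nn.Dropout(p=0.5))
x = torch.ones(1, 8)

model.train()        # training mode: dropout randomly zeroes activations
print(model(x))      # output changes from run to run

model.eval()         # evaluation mode: all neurons participate
print(model(x))      # output is the same on every run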
The Connection Between the Convolution Layer and the Fully Connected Layer
A convolution layer outputs a feature map, which is a stack of two-dimensional planes, while the fully connected layer that follows expects a one-dimensional vector. How do we seamlessly connect the output of the convolution layer to the input of the fully connected layer? Flatten it with a view call in the model's forward function, as in the following code:
import torch.nn as nn
import torch.nn.functional as F

class Net(nn.Module):
    def __init__(self):
        super(Net, self).__init__()
        # channel counts 16 and 32 are assumed example values
        self.conv1 = nn.Conv2d(3, 16, kernel_size=5, stride=2)
        self.bn1 = nn.BatchNorm2d(16)
        self.conv2 = nn.Conv2d(16, 32, kernel_size=5, stride=2)
        self.bn2 = nn.BatchNorm2d(32)
        self.fc1 = nn.Linear(3808, 448)
        self.dropout = nn.Dropout()
        self.head = nn.Linear(448, 2)

    def forward(self, x):
        x = F.relu(self.bn1(self.conv1(x)))
        x = F.relu(self.bn2(self.conv2(x)))
        # it's very important: without this view we would not know how to
        # connect the stacked conv feature maps to the 1-D input of fc1
        x = x.view(-1, 3808)
        x = self.fc1(x)
        x = self.dropout(x)   # dropout before the output layer (placement assumed)
        return self.head(x)
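With the assumed channel counts above, an input of size 3 x 40 x 80 is one size that yields a 32 x 7 x 17 feature map after the two convolutions, i.e. 32 * 7 * 17 = 3808 values, matching the input of fc1. A minimal usage sketch under that assumption:

import torch

net = Net()
net.eval()                         # evaluation mode: dropout disabled, forward pass is deterministic
dummy = torch.randn(1, 3, 40, 80)  # one 3-channel 40x80 input
out = net(dummy)
print(out.shape)                   # torch.Size([1, 2])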