The Keras framework is concise and elegant, and its design is a model of good API design; TensorFlow, by comparison, is bloated, complicated, and confusing. Admittedly, some of Keras's peripheral components, such as callbacks, datasets, and preprocessing, feel over-engineered, but the core of Keras is excellent: its design keeps the system highly extensible and the code logic clear. Still, there are a few small details that, if you cannot work them out, give the inner workings of Keras a "magical" feel. One example is the topic of this article: stop_training in callbacks.
Keras's `model.fit()` function accepts a list of callbacks. Different callback hooks are triggered at different stages of training. These stages are:
- Training start and end
- Batch start and end
- Epoch start and end
Every element of the callback list is an instance of a class derived from `Callback`, and each derived class can selectively override the six hook functions corresponding to the stages above.
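As a concrete illustration, here is a minimal custom callback that overrides all six hooks and simply logs when each stage is reached. The hook names (`on_train_begin`, `on_epoch_end`, and so on) are the actual method names of the Keras `Callback` base class; the class name and the print messages are just placeholders.

```python
from keras.callbacks import Callback


class LoggingCallback(Callback):
    """A minimal callback that overrides the six stage hooks."""

    def on_train_begin(self, logs=None):
        print("training started")

    def on_train_end(self, logs=None):
        print("training finished")

    def on_epoch_begin(self, epoch, logs=None):
        print("epoch %d started" % epoch)

    def on_epoch_end(self, epoch, logs=None):
        print("epoch %d finished" % epoch)

    def on_batch_begin(self, batch, logs=None):
        print("batch %d started" % batch)

    def on_batch_end(self, batch, logs=None):
        print("batch %d finished" % batch)


# Passed to fit() like any other callback:
# model.fit(x, y, epochs=3, callbacks=[LoggingCallback()])
```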
One of the most common operations inside a callback is: `callback_model.stop_training = True` (or `False`).
Here `callback_model` is the model reference that every callback instance holds (it is handed to the callback through `set_model()`), and its type is `Model`.
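For example, a hypothetical callback along the following lines stops training once the training loss falls below a threshold. The `StopOnLoss` name and the threshold value are made up for illustration, but assigning `self.model.stop_training = True` inside `on_epoch_end` is exactly how built-in callbacks such as `EarlyStopping` request an early stop.

```python
from keras.callbacks import Callback


class StopOnLoss(Callback):
    """Stop training once the monitored loss drops below a threshold.

    Illustrative sketch: the class name and threshold are assumptions,
    but the stop_training flag is the real mechanism used by built-in
    callbacks such as EarlyStopping.
    """

    def __init__(self, threshold=0.05):
        super(StopOnLoss, self).__init__()
        self.threshold = threshold

    def on_epoch_end(self, epoch, logs=None):
        logs = logs or {}
        current_loss = logs.get("loss")
        if current_loss is not None and current_loss < self.threshold:
            # This is the "magic" assignment discussed in this article.
            self.model.stop_training = True
```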
However, `Model` does not have a `stop_training` member variable; `Model` inherits from `Network`, and `Network` does not define this member variable either. The only places `stop_training` shows up are callbacks.py, where the callback interface is defined, and training_arrays.py, where the training loop is carried out. The attribute therefore seems to have appeared out of thin air.
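To see why it only looks magical, here is a heavily abridged, non-verbatim sketch of what the fit loop in training_arrays.py does with the flag: it is created by plain attribute assignment before the epoch loop, and checked once per epoch after the callbacks have run.

```python
# Heavily abridged sketch of the flag's life cycle in the Keras fit loop
# (training_arrays.py); this is not verbatim source code.
def fit_loop(model, fit_function, fit_inputs, epochs, callbacks):
    # The loop may talk to a dedicated callback model if one exists.
    callback_model = getattr(model, 'callback_model', None) or model

    callbacks.set_model(callback_model)
    callbacks.on_train_begin()

    # The attribute is created here by ordinary assignment.
    callback_model.stop_training = False

    for epoch in range(epochs):
        callbacks.on_epoch_begin(epoch)
        # ... run the batches of this epoch ...
        callbacks.on_epoch_end(epoch)

        # Any callback may have flipped the flag in on_epoch_end
        # (e.g. EarlyStopping), which ends training early.
        if callback_model.stop_training:
            break

    callbacks.on_train_end()
```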
In fact, `Model`'s parent class `Network` implements `__setattr__`, and that implementation ultimately falls back to ordinary Python attribute assignment, so new attributes can be "hung" onto a network object at will. `Model` inherits this behavior, which is why `stop_training` can simply be assigned from outside the class.
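The mechanism can be reproduced with a toy class that has nothing to do with Keras: as long as a custom `__setattr__` ultimately delegates to the default attribute assignment, any new attribute, including `stop_training`, can be attached to an instance on the fly.

```python
# Toy illustration (not Keras source) of a class that overrides
# __setattr__ yet still accepts attributes that were never declared,
# because it delegates to the default behavior in the end.
class TinyNetwork(object):
    def __setattr__(self, name, value):
        # A real framework would do bookkeeping here (e.g. track layers);
        # the key point is the final delegation to object.__setattr__.
        print("setting attribute %r" % name)
        super(TinyNetwork, self).__setattr__(name, value)


class TinyModel(TinyNetwork):
    pass


model = TinyModel()
model.stop_training = True      # no prior declaration needed
print(model.stop_training)      # True
```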