Keras Loss Function Summary

Source: Internet
Author: User
Tags: theano, keras

Objective Functions (Losses)

The objective function, or loss function, is one of the two required arguments when compiling a model:

model.compile(loss='mean_squared_error', optimizer='sgd')
You can specify an objective function either by passing the name of a predefined objective, or by passing a Theano/TensorFlow symbolic function. A custom function should return a scalar value for each data point and take the following two arguments:

y_true: the true labels, a Theano/TensorFlow tensor
y_pred: the predictions, a Theano/TensorFlow tensor of the same shape as y_true
from keras import losses

model.compile(loss=losses.mean_squared_error, optimizer='sgd')
The actual optimization objective is the mean of the per-data-point loss values.
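This contract can be sketched with a minimal example: a custom loss written as a plain function of (y_true, y_pred) that returns one scalar per data point, which the framework then averages. NumPy stands in here for the Theano/TensorFlow backend, so the per-sample/mean relationship is easy to verify; in a real model you would use backend operations instead.

```python
import numpy as np

def mean_squared_error(y_true, y_pred):
    """Per-sample MSE: one scalar per data point (axis=-1 averages
    over the output dimensions, not over the samples)."""
    return np.mean(np.square(y_pred - y_true), axis=-1)

y_true = np.array([[1.0, 0.0], [0.0, 1.0]])
y_pred = np.array([[0.9, 0.1], [0.2, 0.8]])

per_sample = mean_squared_error(y_true, y_pred)  # shape (2,): one loss per sample
overall = np.mean(per_sample)                    # the value actually optimized
```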

Please refer to the implementation of the objective functions for more information.

Available objective functions

mean_squared_error or mse
mean_absolute_error or mae
mean_absolute_percentage_error or mape
mean_squared_logarithmic_error or msle
squared_hinge
hinge
categorical_hinge
binary_crossentropy (also known as logarithmic loss, logloss)
logcosh
categorical_crossentropy: also known as multi-class logarithmic loss. Note that when using this objective function, the labels must be converted to binary matrices of shape (nb_samples, nb_classes)
sparse_categorical_crossentropy: as above, but accepts sparse (integer) labels. Note that when using this function, the labels still need the same number of dimensions as the output, so you may need to add a dimension to the label data: np.expand_dims(y, -1)
kullback_leibler_divergence: the information gain from the predicted probability distribution Q to the true probability distribution P, used to measure the difference between the two distributions
poisson: the mean of (predictions - targets * log(predictions))
cosine_proximity: the negative of the mean cosine similarity between the predictions and the true labels
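The relationship between the two cross-entropy variants can be checked numerically. The sketch below (plain NumPy, not the Keras implementation) computes categorical cross-entropy from one-hot labels and the sparse variant from the equivalent integer labels, assuming the predictions are already valid probability distributions; both yield the same per-sample values.

```python
import numpy as np

probs = np.array([[0.7, 0.2, 0.1],
                  [0.1, 0.8, 0.1]])   # predicted class probabilities, rows sum to 1
one_hot = np.array([[1, 0, 0],
                    [0, 1, 0]])       # labels for categorical_crossentropy
sparse = np.array([0, 1])             # the same labels for the sparse variant

# categorical cross-entropy: -sum(one_hot * log(p)) per sample
cat = -np.sum(one_hot * np.log(probs), axis=-1)

# sparse variant: pick out the log-probability of the true class directly
sp = -np.log(probs[np.arange(len(sparse)), sparse])
```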
Note: when using categorical_crossentropy as the objective function, the labels should be in multi-class (one-hot encoded) form, not single integer values. You can use the to_categorical utility function to perform the conversion. For example:

from keras.utils.np_utils import to_categorical

categorical_labels = to_categorical(int_labels, num_classes=None)
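To make the conversion concrete, here is a minimal NumPy sketch of the transformation to_categorical performs; the helper name to_one_hot is ours, not a Keras API. Each integer label becomes a row with a 1 in the corresponding column.

```python
import numpy as np

def to_one_hot(int_labels, num_classes=None):
    """NumPy sketch of what to_categorical produces: row i is the
    one-hot encoding of int_labels[i]."""
    int_labels = np.asarray(int_labels)
    if num_classes is None:
        num_classes = int(int_labels.max()) + 1
    out = np.zeros((len(int_labels), num_classes))
    out[np.arange(len(int_labels)), int_labels] = 1.0
    return out

one_hot = to_one_hot([0, 2, 1], num_classes=3)
```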
