tf.train.exponential_decay(): the exponential decay method in TensorFlow

Source: Internet
Author: User
Tags: integer division

 
exponential_decay(learning_rate, global_step, decay_steps, decay_rate, staircase=False, name=None)

Usage:

 
tf.train.exponential_decay()

In TensorFlow, exponential_decay() is an exponential decay function applied to the learning rate.

During model training, it is recommended to gradually reduce the learning rate as training progresses. This function requires a global_step value to compute the decayed learning rate.

This function returns the decayed learning rate. It is computed as:

decayed_learning_rate = learning_rate * decay_rate ^ (global_step / decay_steps)
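As a sanity check, the formula can be evaluated in plain Python without TensorFlow; the numeric values below are illustrative, not taken from the original article.

```python
# Plain-Python sketch of the exponential decay formula above
# (illustrative hyperparameter values, not from the TensorFlow source).

def exponential_decay(learning_rate, global_step, decay_steps, decay_rate,
                      staircase=False):
    """Return learning_rate * decay_rate ** (global_step / decay_steps)."""
    exponent = global_step / decay_steps
    if staircase:
        # Integer division turns the schedule into a step function.
        exponent = global_step // decay_steps
    return learning_rate * decay_rate ** exponent

# After one full decay interval the rate has been multiplied by decay_rate once:
print(exponential_decay(0.1, 100000, 100000, 0.96))  # approximately 0.096
```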

Parameters:

learning_rate: the initial learning rate.

global_step: the global step used in the decay computation; must not be negative. Each batch fed to the model counts as one step.

decay_steps: the decay speed; must not be negative. It is the number of steps between learning-rate updates, i.e. the learning rate is updated once every decay_steps steps.

decay_rate: the decay coefficient (the base α in α ^ t); see the function's equation above.

 
If the parameter 'staircase' is 'True', 'global_step / decay_steps' is an integer division, and the decayed learning rate follows a staircase (step) function.
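To see the effect of the staircase mode, compare true division with integer division in the exponent. This is a minimal sketch with made-up values, not code from the article.

```python
# With staircase behavior, the exponent is floored, so the learning rate stays
# constant within each interval of decay_steps steps.
# (Hyperparameter values are illustrative, not from the original article.)
learning_rate, decay_rate, decay_steps = 0.1, 0.96, 1000

step = 500  # halfway through the first decay interval
continuous = learning_rate * decay_rate ** (step / decay_steps)   # decays smoothly
staircase = learning_rate * decay_rate ** (step // decay_steps)   # still the initial rate

print(continuous)  # slightly below 0.1
print(staircase)   # exactly 0.1 until step reaches decay_steps
```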

 

import tensorflow as tf  # TensorFlow 1.x API

global_step = tf.Variable(0, trainable=False)
starter_learning_rate = 0.1
learning_rate = tf.train.exponential_decay(starter_learning_rate, global_step,
                                           100000, 0.96, staircase=True)
# Passing global_step to minimize() will increment it at each step.
learning_step = (
    tf.train.GradientDescentOptimizer(learning_rate)
    .minimize(...my loss..., global_step=global_step)
)


You can use the tf.train.exponential_decay function to implement an exponentially decaying learning rate:

1. First use a large learning rate (objective: quickly reach a reasonably good solution);

2. Then gradually reduce the learning rate over iterations (objective: make the model more stable late in training).
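The two steps above (start large, then shrink) can be seen by printing the schedule at a few checkpoints; this is a plain-Python sketch with illustrative hyperparameters, not output from the article.

```python
# Illustrative decay schedule: the rate starts large and shrinks over training.
initial_rate, decay_rate, decay_steps = 0.1, 0.96, 100

checkpoints = (0, 100, 500, 1000)
rates = [initial_rate * decay_rate ** (step / decay_steps)
         for step in checkpoints]
for step, rate in zip(checkpoints, rates):
    print(f"step {step:5d}: learning rate {rate:.5f}")
```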

