TensorFlow: Get Variables & Print Weights (and Other Methods)


When using TensorFlow, we often need to get the value of a variable, for example to print the weights of a layer. Usually we can fetch the variable directly through the name we gave it. However, when we build the network with a third-party library, there are cases where we never define those variables ourselves, because the library creates them automatically. For example, when using TensorFlow's slim library:
def resnet_stack(images, output_shape, hparams, scope=None):
  """Create a resnet style transfer block.

  Args:
    images: [batch-size, height, width, channels] image tensor to feed as input
    output_shape: output image shape in form [height, width, channels]
    hparams: hparams objects
    scope: variable scope

  Returns:
    Images after processing with resnet blocks.
  """
  end_points = {}
  if hparams.noise_channel:
    # separate the noise for visualization
    end_points['noise'] = images[:, :, :, -1]
  assert images.shape.as_list()[1:3] == output_shape[0:2]

  with tf.variable_scope(scope, 'resnet_style_transfer', [images]):
    with slim.arg_scope(
        [slim.conv2d],
        normalizer_fn=slim.batch_norm,
        kernel_size=[hparams.generator_kernel_size] * 2,
        stride=1):
      net = slim.conv2d(
          images,
          hparams.resnet_filters,
          normalizer_fn=None,
          activation_fn=tf.nn.relu)
      for block in range(hparams.resnet_blocks):
        net = resnet_block(net, hparams)
        end_points['resnet_block_{}'.format(block)] = net

      net = slim.conv2d(
          net,
          output_shape[-1],
          kernel_size=[1, 1],
          normalizer_fn=None,
          activation_fn=tf.nn.tanh,
          scope='conv_out')
  end_points['transferred_images'] = net
  return net, end_points
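Note that the convolution kernels here are created inside slim.conv2d, so the calling code never holds a direct reference to them. A quick way to see what was actually created is to list the trainable variables after building the graph (a minimal sketch; the exact names depend on the scope argument and on slim's default naming, so treat the name in the comment as an assumption):

import tensorflow as tf

# Assuming the graph has already been built, e.g.:
#   net, end_points = resnet_stack(images, output_shape, hparams)
# list every trainable variable that slim registered.
for v in tf.trainable_variables():
    print(v.name, v.shape)

# With slim's defaults, the first convolution's kernel usually shows up under
# a name such as 'resnet_style_transfer/Conv/weights:0' (assumption - verify
# it against the printed list).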

Suppose we want to get the weights of the first convolutional layer.
During training, these variables are collected by TensorFlow in tf.trainable_variables(), so we can iterate over tf.trainable_variables() and print each variable's name (or inspect the names by scope), and then use tf.get_default_graph().get_tensor_by_name() to fetch the variable. A simple example:
import tensorflow as tf

with tf.variable_scope("generate"):
    with tf.variable_scope("resnet_stack"):
        # For simplicity's sake, plain tf.Variable is used here
        # instead of a third-party library.
        bias = tf.Variable(0.0, name="bias")
        weight = tf.Variable(0.0, name="weight")

for tv in tf.trainable_variables():
    print(tv.name)

b = tf.get_default_graph().get_tensor_by_name("generate/resnet_stack/bias:0")
w = tf.get_default_graph().get_tensor_by_name("generate/resnet_stack/weight:0")

with tf.Session() as sess:
    tf.global_variables_initializer().run()
    print(sess.run(b))
    print(sess.run(w))

The results are as follows:
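Both variables were initialized to 0.0, so the loop prints the two names and the session prints their values:

generate/resnet_stack/bias:0
generate/resnet_stack/weight:0
0.0
0.0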



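Going back to the slim network at the top, the same two steps apply: find the name of the first convolution's kernel with tf.trainable_variables(), then fetch and evaluate it. The name below is an assumption based on slim's default naming under the 'resnet_style_transfer' scope; check it against the printed variable list before relying on it.

import tensorflow as tf

graph = tf.get_default_graph()
# Assumed name of the first conv kernel created by slim.conv2d; verify it by
# printing tf.trainable_variables() first.
first_conv_w = graph.get_tensor_by_name('resnet_style_transfer/Conv/weights:0')

with tf.Session() as sess:
    # An initializer is used here for illustration; restore from a checkpoint
    # instead if you want the trained weights.
    tf.global_variables_initializer().run()
    print(sess.run(first_conv_w))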