Getting Started with TensorFlow 1.4 (11): The TensorBoard Histogram Dashboard


The TensorBoard histogram dashboard shows how the distribution of a tensor in your TensorFlow graph changes over time. It does this by displaying many histograms of the tensor, taken at different points in time.

One, a basic example

Start with a simple case: a normally-distributed variable whose mean changes over time. TensorFlow can generate such values directly with the tf.random_normal op, and a tf.summary.histogram op records them for TensorBoard. Here is the code:

import tensorflow as tf

k = tf.placeholder(tf.float32)

# Generate a normal distribution whose mean shifts over time
mean_moving_normal = tf.random_normal(shape=[1000], mean=(5*k), stddev=1)
# Record this distribution as a histogram summary
tf.summary.histogram("normal/moving_mean", mean_moving_normal)

# Set up a session and a writer for the summary event files
sess = tf.Session()
writer = tf.summary.FileWriter("/tmp/histogram_example")

summaries = tf.summary.merge_all()

# Run 400 steps and write the summaries to disk
N = 400
for step in range(N):
  k_val = step / float(N)
  summ = sess.run(summaries, feed_dict={k: k_val})
  writer.add_summary(summ, global_step=step)

Once the loop has run, start a TensorBoard instance pointed at the log directory:

tensorboard --logdir=/tmp/histogram_example

In addition, you may notice that the histogram slices are not always evenly spaced in step count or time. This is because TensorBoard uses reservoir sampling to keep only a subset of all the histograms, in order to conserve memory. Reservoir sampling guarantees that every sample has an equal probability of being included, but because it is a randomized algorithm, the samples it keeps do not fall at evenly spaced steps.
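To make the reservoir-sampling idea concrete, here is a minimal sketch of the classic "Algorithm R" in plain Python. This is an illustration of the technique, not TensorBoard's actual implementation; the function name and parameters are our own.

```python
import random

def reservoir_sample(stream, k, rng=random):
    """Keep a uniform random sample of k items from a stream of unknown length."""
    sample = []
    for i, item in enumerate(stream):
        if i < k:
            # Fill the reservoir with the first k items
            sample.append(item)
        else:
            # Item i is kept with probability k / (i + 1)
            j = rng.randrange(i + 1)
            if j < k:
                sample[j] = item
    return sample

# Each of the 400 steps has an equal chance of being kept,
# but the kept steps are not evenly spaced.
print(reservoir_sample(range(400), 10))
```

Running this a few times shows the effect described above: the retained steps are uniformly distributed over the run, yet never form a regular grid.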

Two, overlay mode


Select "overlay" under Histogram Mode to switch views.

In overlay mode, the visualization rotates 45 degrees, so that the individual histogram slices are no longer spread out in time but are all drawn on the same y-axis.

Each slice is now a separate line on the chart, and the y-axis shows the count of items in each bucket. Darker lines are older (earlier) steps, and lighter lines are more recent. As before, you can hover your mouse over the chart to see additional information.

In general, the overlay visualization is useful when you want to directly compare the counts of different histograms.

Three, multimodal distributions

The histogram dashboard is also great for visualizing multimodal distributions. Let's construct a simple bimodal distribution by concatenating the outputs of two different normal distributions. The code looks like this:

import tensorflow as tf

k = tf.placeholder(tf.float32)

# Make a normal distribution with a shifting mean
mean_moving_normal = tf.random_normal(shape=[1000], mean=(5*k), stddev=1)
# Record that distribution into a histogram summary
tf.summary.histogram("normal/moving_mean", mean_moving_normal)

# Make a normal distribution with shrinking variance
variance_shrinking_normal = tf.random_normal(shape=[1000], mean=0, stddev=1-(k))
# Record that distribution too
tf.summary.histogram("normal/shrinking_variance", variance_shrinking_normal)

# Combine both of those distributions into one dataset
normal_combined = tf.concat([mean_moving_normal, variance_shrinking_normal], 0)
# Add another histogram summary to record the combined distribution
tf.summary.histogram("normal/bimodal", normal_combined)

summaries = tf.summary.merge_all()

# Set up a session and summary writer
sess = tf.Session()
writer = tf.summary.FileWriter("/tmp/histogram_example")

# Run 400 steps and write the summaries to disk
N = 400
for step in range(N):
  k_val = step / float(N)
  summ = sess.run(summaries, feed_dict={k: k_val})
  writer.add_summary(summ, global_step=step)

You will recognize the "moving mean" normal distribution from the example above; now we also have a "shrinking variance" distribution. Side by side, they look like this:

When we connect them together, we get a chart that clearly shows the different bimodal structures:
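If you want to see the bimodal structure outside of TensorBoard, a rough NumPy sketch of the same concatenation shows the two peaks directly. The seed, bin count, and the tiny stddev floor are arbitrary choices for illustration; k is fixed at its final value from the loop.

```python
import numpy as np

rng = np.random.RandomState(0)
k = 1.0  # the final k value reached by the 400-step loop

# The two components from the example: a shifted mean and a shrinking variance
mean_moving = rng.normal(loc=5 * k, scale=1.0, size=1000)
variance_shrinking = rng.normal(loc=0.0, scale=max(1 - k, 1e-3), size=1000)
combined = np.concatenate([mean_moving, variance_shrinking])

counts, edges = np.histogram(combined, bins=20)
peak_bins = np.argsort(counts)[-2:]  # the two tallest buckets
print(sorted(edges[peak_bins]))      # one edge near 0, the other near 5
```

The two tallest buckets sit near 0 and near 5, which is exactly the two-peaked shape the dashboard renders.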

Four, some more distributions

For fun, let's generate and visualize a few more distributions, and then combine them all into one chart. Here's the code we'll use:

import tensorflow as tf

k = tf.placeholder(tf.float32)

# Make a normal distribution with a shifting mean
mean_moving_normal = tf.random_normal(shape=[1000], mean=(5*k), stddev=1)
tf.summary.histogram("normal/moving_mean", mean_moving_normal)

# Make a normal distribution with shrinking variance
variance_shrinking_normal = tf.random_normal(shape=[1000], mean=0, stddev=1-(k))
tf.summary.histogram("normal/shrinking_variance", variance_shrinking_normal)

# Combine both of those distributions into one dataset
normal_combined = tf.concat([mean_moving_normal, variance_shrinking_normal], 0)
tf.summary.histogram("normal/bimodal", normal_combined)

# Add a gamma distribution
gamma = tf.random_gamma(shape=[1000], alpha=k)
tf.summary.histogram("gamma", gamma)

# And a poisson distribution
poisson = tf.random_poisson(shape=[1000], lam=k)
tf.summary.histogram("poisson", poisson)

# And a uniform distribution
uniform = tf.random_uniform(shape=[1000], maxval=k*10)
tf.summary.histogram("uniform", uniform)

# Finally, combine everything together!
all_distributions = [mean_moving_normal, variance_shrinking_normal,
                     gamma, poisson, uniform]
all_combined = tf.concat(all_distributions, 0)
tf.summary.histogram("all_combined", all_combined)

summaries = tf.summary.merge_all()

# Set up a session and summary writer
sess = tf.Session()
writer = tf.summary.FileWriter("/tmp/histogram_example")

# Run 400 steps and write the summaries to disk
N = 400
for step in range(N):
  k_val = step / float(N)
  summ = sess.run(summaries, feed_dict={k: k_val})
  writer.add_summary(summ, global_step=step)

Gamma distribution

Uniform distribution

Poisson distribution

The Poisson distribution is defined over the integers, so all the generated values are exact integers. Histogram compression moves the data into floating-point bins, which causes the visualization to show small bumps over the integer values rather than perfect spikes.
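You can reproduce this effect without TensorBoard: the samples really are integers, but binning them with floating-point bucket edges smears each integer's mass over a narrow interval. A small NumPy sketch (seed and bin count chosen arbitrarily):

```python
import numpy as np

rng = np.random.RandomState(0)
samples = rng.poisson(lam=2.0, size=1000).astype(np.float32)

# Every sample is an exact integer...
assert np.all(samples == np.round(samples))

# ...but histogram buckets with floating-point edges spread each integer's
# mass into a narrow bin, which renders as a bump rather than a perfect spike.
counts, edges = np.histogram(samples, bins=30)
print(counts)
```

Most of the 30 buckets are empty; the non-empty ones cluster around the integer values, which is the "small bump" pattern the dashboard shows.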

All Together Now

Finally, we can concatenate all of the data into one rather odd-looking combined curve.

