TensorFlow Application Fizzbuzz


FizzBuzz solved in roughly 60 characters of Python 2:

for x in range(101):print "Fizz"[x%3*4::]+"Buzz"[x%5*4::]or x
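
The trick is the slice start index: x%3*4 is 0 when x is divisible by 3, so "Fizz"[0:] keeps the whole word, while any other start index (4 or more) is past the end of the string and yields "". The same applies to "Buzz" with x%5*4, and because an empty concatenation is falsy, `or x` falls back to the number itself. A minimal Python 3 restatement (this expanded form is mine, not part of the original post):

# Expanded Python 3 restatement of the one-liner above (illustration only).
for x in range(101):
    # x % 3 * 4 is 0 when x is divisible by 3, so the slice keeps "Fizz";
    # otherwise it starts past the end of the string and yields "".
    fizz = "Fizz"[x % 3 * 4:]
    buzz = "Buzz"[x % 5 * 4:]
    # An empty string is falsy, so `or x` prints the number itself.
    print(fizz + buzz or x)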

The solution below uses TensorFlow instead. Compared with the one-liner above it is far more complex, but it is an interesting exercise, well suited to learning TensorFlow, and it encourages divergent thinking about how widely TensorFlow can be applied.


Reprint: please cite the original address: http://www.cnblogs.com/SSSR/p/5630497.html

The complete code is given below.

For a detailed, step-by-step explanation of the approach, see: http://joelgrus.com/2016/05/23/fizz-buzz-in-tensorflow/

# -*- coding: utf-8 -*-
"""
Created on Wed June 10:57:41 2016

@author: ubuntu
"""
# Fizz Buzz in TensorFlow!
# see http://joelgrus.com/2016/05/23/fizz-buzz-in-tensorflow/
import numpy as np
import tensorflow as tf

num_digits = 10

# Represent each input by an array of its binary digits.
def binary_encode(i, num_digits):
    return np.array([i >> d & 1 for d in range(num_digits)])

# One-hot encode the desired outputs: [number, "fizz", "buzz", "fizzbuzz"]
def fizz_buzz_encode(i):
    if   i % 15 == 0: return np.array([0, 0, 0, 1])
    elif i % 5  == 0: return np.array([0, 0, 1, 0])
    elif i % 3  == 0: return np.array([0, 1, 0, 0])
    else:             return np.array([1, 0, 0, 0])

# Our goal is to produce fizzbuzz for the numbers 1 to 100. So it would be
# unfair to include these in our training data. Accordingly, the training data
# corresponds to the numbers 101 to (2 ** num_digits - 1).
trX = np.array([binary_encode(i, num_digits) for i in range(101, 2 ** num_digits)])
trY = np.array([fizz_buzz_encode(i)          for i in range(101, 2 ** num_digits)])

# We'll want to randomly initialize weights.
def init_weights(shape):
    return tf.Variable(tf.random_normal(shape, stddev=0.01))

# Our model is a standard 1-hidden-layer multi-layer perceptron with relu
# activation. The softmax (which turns arbitrary real-valued outputs into
# probabilities) gets applied in the cost function.
def model(X, w_h, w_o):
    h = tf.nn.relu(tf.matmul(X, w_h))
    return tf.matmul(h, w_o)

# Our variables. The input has width num_digits, and the output has width 4.
X = tf.placeholder("float", [None, num_digits])
Y = tf.placeholder("float", [None, 4])

# How many units in the hidden layer.
num_hidden = 100

# Initialize the weights.
w_h = init_weights([num_digits, num_hidden])
w_o = init_weights([num_hidden, 4])

# Predict y given x using the model.
py_x = model(X, w_h, w_o)

# We'll train our model by minimizing a cost function.
cost = tf.reduce_mean(tf.nn.softmax_cross_entropy_with_logits(py_x, Y))
train_op = tf.train.GradientDescentOptimizer(0.05).minimize(cost)

# And we'll make predictions by choosing the largest output.
predict_op = tf.argmax(py_x, 1)

# Finally, we need a way to turn a prediction (and an original number)
# into a fizz buzz output.
def fizz_buzz(i, prediction):
    return [str(i), "fizz", "buzz", "fizzbuzz"][prediction]

batch_size = 128

# Launch the graph in a session.
with tf.Session() as sess:
    tf.initialize_all_variables().run()

    for epoch in range(10000):
        # Shuffle the data before each training iteration.
        p = np.random.permutation(range(len(trX)))
        trX, trY = trX[p], trY[p]

        # Train in batches of 128 inputs.
        for start in range(0, len(trX), batch_size):
            end = start + batch_size
            sess.run(train_op, feed_dict={X: trX[start:end], Y: trY[start:end]})

        # And print the current accuracy on the training data.
        print(epoch, np.mean(np.argmax(trY, axis=1) ==
                             sess.run(predict_op, feed_dict={X: trX, Y: trY})))

    # And now for some fizz buzz.
    numbers = np.arange(1, 101)
    teX = np.transpose(binary_encode(numbers, num_digits))
    teY = sess.run(predict_op, feed_dict={X: teX})

    output = np.vectorize(fizz_buzz)(numbers, teY)
    print(output)
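
A note on API versions: the script above targets the TensorFlow releases current in 2016. If you run it on a later 1.x release (an assumption about your environment, not part of the original post), two calls typically need the small adjustments sketched below; on TensorFlow 2.x the whole script would additionally have to go through the tf.compat.v1 compatibility layer.

# Hedged adjustments for newer TensorFlow 1.x releases (not in the original script).
# softmax_cross_entropy_with_logits now expects keyword arguments:
cost = tf.reduce_mean(
    tf.nn.softmax_cross_entropy_with_logits(logits=py_x, labels=Y))

# tf.initialize_all_variables() was deprecated in favor of:
sess.run(tf.global_variables_initializer())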

  
