First, a simple offline regression: least squares implemented with TensorFlow. The code works as follows:
# encoding: utf-8
import tensorflow as tf
import numpy as np

# Generate 100 training points on the line y = 0.1 * x + 0.55.
x_data = np.random.rand(100).astype(np.float32)
y_data = x_data * 0.1 + 0.55

# Create the TensorFlow structure: one weight and one bias to learn.
weights = tf.Variable(tf.random_uniform([1], -1.0, 1.0))
biases = tf.Variable(tf.zeros([1]))
y = weights * x_data + biases

# Mean-squared-error loss, minimized by gradient descent.
loss = tf.reduce_mean(tf.square(y - y_data))
op = tf.train.GradientDescentOptimizer(0.5)
train = op.minimize(loss)

init = tf.initialize_all_variables()
sess = tf.Session()
sess.run(init)

for i in xrange(241):
    sess.run(train)
    if i % 20 == 0:
        print(i, sess.run(weights), sess.run(biases))

The results are:

(0, array([-0.17141712], dtype=float32), array([0.96016634], dtype=float32))
(20, array([-0.00157562], dtype=float32), array([0.60436034], dtype=float32))
(40, array([0.07296405], dtype=float32), array([0.56446886], dtype=float32))
(60, array([0.09280396], dtype=float32), array([0.55385113], dtype=float32))
(80, array([0.09808466], dtype=float32), array([0.55102503], dtype=float32))
(100, array([0.09949023], dtype=float32), array([0.55027282], dtype=float32))
(120, array([0.09986429], dtype=float32), array([0.55007261], dtype=float32))
(140, array([0.09996392], dtype=float32), array([0.55001932], dtype=float32))
(160, array([0.09999041], dtype=float32), array([0.55000514], dtype=float32))
(180, array([0.09999743], dtype=float32), array([0.55000138], dtype=float32))
(200, array([0.09999929], dtype=float32), array([0.55000037], dtype=float32))
(220, array([0.0999998], dtype=float32), array([0.55000013], dtype=float32))
(240, array([0.09999982], dtype=float32), array([0.55000013], dtype=float32))

The fitted weight converges to 0.1 and the bias to 0.55, matching the line used to generate the data.
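As a sanity check on the gradient-descent fit above, the same slope and intercept can also be recovered in closed form with NumPy's least-squares solver. This is a sketch, not part of the original code; the data generation mirrors the script, with a fixed seed added for reproducibility:

```python
import numpy as np

np.random.seed(0)
x_data = np.random.rand(100).astype(np.float32)
y_data = x_data * 0.1 + 0.55

# Solve min ||A w - y||^2 with design matrix A = [x, 1]
# (first column gives the slope, second the intercept).
A = np.stack([x_data, np.ones_like(x_data)], axis=1)
(slope, intercept), *_ = np.linalg.lstsq(A, y_data, rcond=None)
print(slope, intercept)  # ~0.1 and ~0.55
```

Because the data is noiseless, the closed-form solution lands on the generating parameters immediately, whereas gradient descent needs a couple hundred steps to get there.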
The softmax case (MNIST classification) is as follows:
from tensorflow.examples.tutorials.mnist import input_data
import tensorflow as tf

mnist = input_data.read_data_sets("mnist_data/", one_hot=True)

x = tf.placeholder("float", [None, 784])
w = tf.Variable(tf.zeros([784, 10]))
b = tf.Variable(tf.zeros([10]))
y = tf.nn.softmax(tf.matmul(x, w) + b)    # model prediction
y_ = tf.placeholder("float", [None, 10])  # ground-truth labels

cross_entropy = -tf.reduce_sum(y_ * tf.log(y))
train_step = tf.train.GradientDescentOptimizer(0.01).minimize(cross_entropy)

init = tf.global_variables_initializer()
sess = tf.Session()
sess.run(init)

for i in range(10000):
    # The source was truncated here; this is the standard mini-batch
    # training loop from the TensorFlow MNIST tutorial.
    batch_xs, batch_ys = mnist.train.next_batch(100)
    sess.run(train_step, feed_dict={x: batch_xs, y_: batch_ys})
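To make the loss above concrete: tf.nn.softmax turns a vector of logits into probabilities that sum to 1, and the cross-entropy line is -sum(y_ * log(y)) against a one-hot label. A minimal NumPy sketch of both (the example logits here are made up for illustration):

```python
import numpy as np

def softmax(z):
    # Subtract the max logit for numerical stability;
    # the result is positive and sums to 1.
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

logits = np.array([2.0, 1.0, 0.1])   # hypothetical scores for 3 classes
probs = softmax(logits)

# Cross-entropy against a one-hot label, matching -sum(y_ * log(y)).
y_true = np.array([1.0, 0.0, 0.0])
loss = -np.sum(y_true * np.log(probs))
```

With a one-hot label, the sum collapses to -log of the probability assigned to the true class, so the loss shrinks as that probability approaches 1.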