TensorFlow training on MNIST (1)
First, I ran into a problem: when downloading the MNIST training data, the code reported an error:
urllib.error.URLError: <urlopen error [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:748)>
This happens because Python 2.7.9 introduced a new behavior: when urllib.urlopen opens an https link, it now verifies the SSL certificate. If the target site uses a self-signed certificate, it raises urllib2.URLError: <urlopen error [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:581)>. Details can be found in PEP 476 (https://www.python.org/dev/peps/pep-0476). There are two solutions:
1. Use ssl to create an unverified context and pass it as the context parameter to urlopen:
import ssl
import urllib2

context = ssl._create_unverified_context()
print urllib2.urlopen("https://www.12306.cn/mormhweb/", context=context).read()
2. Disable certificate verification globally:
import ssl
import urllib2

ssl._create_default_https_context = ssl._create_unverified_context
print urllib2.urlopen("https://www.12306.cn/mormhweb/").read()
Thanks to the original article, see: http://bookshadow.com/weblog/2015/04/22/sae-python-weibo-sdk-certificate-verify-failed/
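For the MNIST case specifically, a minimal sketch (not from the original post) of applying the second, global workaround right before the download, so the data fetch succeeds without certificate checks:

# Sketch: disable https certificate verification globally, then download MNIST.
# Assumes the input_data.py module described in the next step is on the path.
import ssl
import input_data

ssl._create_default_https_context = ssl._create_unverified_context  # skip certificate checks
mnist = input_data.read_data_sets("MNIST_data/", one_hot=True)      # download now works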
Now to the main topic:
1. Create input_data.py to download MNIST data.
from __future__ import absolute_import
from __future__ import division
from __future__ import print_function

import gzip
import os
import tempfile

import numpy
from six.moves import urllib
from six.moves import xrange  # pylint: disable=redefined-builtin
import tensorflow as tf
from tensorflow.contrib.learn.python.learn.datasets.mnist import read_data_sets
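A quick usage sketch (my own illustration, not part of the original script): read_data_sets downloads the four MNIST archives into the given directory on first use and returns train/validation/test splits as flattened 28x28 images.

# Sketch: inspect what read_data_sets returns (directory name is illustrative).
import input_data

mnist = input_data.read_data_sets("MNIST_data/", one_hot=True)
print(mnist.train.images.shape)   # (55000, 784) flattened 28x28 pixel images
print(mnist.train.labels.shape)   # (55000, 10) one-hot labels
print(mnist.test.images.shape)    # (10000, 784)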
2. Build and train the softmax regression model.
#!/usr/bin/env python3
# -*- coding: UTF-8 -*-
import input_data
import tensorflow as tf

# MNIST data input
mnist = input_data.read_data_sets("MNIST_data/", one_hot=True)

x = tf.placeholder(tf.float32, [None, 784])  # image input vector
W = tf.Variable(tf.zeros([784, 10]))         # weights, initialized to zero
b = tf.Variable(tf.zeros([10]))              # bias, initialized to zero

# Build the model: y is the prediction, y_ is the actual label
y = tf.nn.softmax(tf.matmul(x, W) + b)
y_ = tf.placeholder("float", [None, 10])

# Compute the cross entropy
cross_entropy = -tf.reduce_sum(y_ * tf.log(y))

# Train with gradient descent at a learning rate of 0.01
train_step = tf.train.GradientDescentOptimizer(0.01).minimize(cross_entropy)

# Add an initialization op for the model defined above
init = tf.initialize_all_variables()

# Launch the model and initialize the variables
sess = tf.Session()
sess.run(init)

# Train the model for 1000 steps; each step randomly grabs a batch of 100 training examples
for i in range(1000):
    batch_xs, batch_ys = mnist.train.next_batch(100)
    sess.run(train_step, feed_dict={x: batch_xs, y_: batch_ys})

''' Evaluate the model '''
# Check whether the predicted label matches the actual label
correct_prediction = tf.equal(tf.argmax(y, 1), tf.argmax(y_, 1))
accuracy = tf.reduce_mean(tf.cast(correct_prediction, "float"))

# Compute the accuracy of the trained model on the test set
print(sess.run(accuracy, feed_dict={x: mnist.test.images, y_: mnist.test.labels}))
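After training, it can help to look at a single prediction. A minimal sketch (my own addition, reusing the sess, x and y defined above while the session is still open):

# Sketch: run the trained model on the first test image and compare
# the predicted digit with the true label.
import numpy as np

first_image = mnist.test.images[:1]                        # shape (1, 784)
predicted = sess.run(tf.argmax(y, 1), feed_dict={x: first_image})
actual = np.argmax(mnist.test.labels[0])                   # one-hot label -> digit
print("predicted:", predicted[0], "actual:", actual)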
See: http://blog.csdn.net/willduan1/article/details/52024254