Python machine learning library Keras: autoencoder encoding and feature compression

Source: Internet
Author: User
Tags: keras


Keras can use a deep network to implement encoding: compressing the n-dimensional features of each sample down to k features. This also serves as a form of feature selection. For example, a handwritten-digit image contains 784 pixels, i.e. 784 features. Suppose we want to represent each image with just two features: can handwritten digits still be distinguished in a two-dimensional space?
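To make the idea of compression concrete, here is a minimal NumPy sketch (not part of the article's code) that projects a batch of 784-feature samples down to 2 features with a random linear map. A trained autoencoder learns such a mapping instead of picking it at random:

```python
import numpy as np

rng = np.random.default_rng(0)

# A batch of 5 flattened 28x28 images: each sample has 784 features.
images = rng.random((5, 784))

# A hypothetical linear encoder: project each 784-d sample down to 2-d.
W = rng.standard_normal((784, 2))
codes = images @ W

print(images.shape)  # (5, 784)
print(codes.shape)   # (5, 2)
```

Each sample keeps its identity as a row, but is now described by only two numbers instead of 784.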

An autoencoder is unsupervised learning: it imitates the way the human brain abstracts features layer by layer. Two points characterize this learning process. First, it is unsupervised, meaning the training data need no labels; the model learns from the content of the data itself and extracts the features that occur frequently. Second, the abstraction is layered: features are abstracted step by step, layer after layer.

An autoencoder can encode its input using its own higher-order features. It is in fact a neural network whose input and target output are identical; borrowing the idea of sparse coding, its goal is to reconstruct the input from sparse higher-order features.
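As a rough illustration of that objective, the following NumPy sketch (with hypothetical random weights, not the trained network from this article) compresses inputs to 2 dimensions, reconstructs them, and measures the mean-squared reconstruction error that training would minimize:

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.random((10, 784))              # 10 samples, 784 features each

# Hypothetical linear encoder/decoder weights; a real autoencoder learns these.
W_enc = rng.standard_normal((784, 2)) * 0.01
W_dec = rng.standard_normal((2, 784)) * 0.01

code = x @ W_enc                       # compress: 784 -> 2
x_hat = code @ W_dec                   # reconstruct: 2 -> 784

# The training objective: make the reconstruction match the input.
mse = np.mean((x - x_hat) ** 2)
print(mse)
```

Training adjusts the encoder and decoder weights jointly to drive this reconstruction error down, so the 2-d code must retain as much of the input as possible.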

The example uses the handwriting read/write py file and the handwritten-digit data files.

Reference GitHub: https://github.com/626626cdllp/kears/tree/master/autoencoder

import numpy as np
np.random.seed(1337)  # for reproducibility
import mnist  # the handwriting read/write helper module
from keras.models import Model  # generic Model
from keras.layers import Dense, Input
import matplotlib.pyplot as plt

# Load the training feature set and one-hot encoded label set, max 60000
x_train, y_train = mnist.get_training_data_set(60000, True, False)
# Load the test feature set and one-hot encoded label set, max 10000
x_test, y_test = mnist.get_test_data_set(10000, True, False)

x_train = np.array(x_train).astype(bool)  # convert to black and white
y_train = np.array(y_train)
x_test = np.array(x_test).astype(bool)    # convert to black and white
y_test = np.array(y_test)
print('Dimensions of the training set:', x_train.shape, y_train.shape)  # (60000, 784) (60000, 10)
print('Dimensions of the test set:', x_test.shape, y_test.shape)        # (10000, 784) (10000, 10)

# Compress the feature dimension to 2
encoding_dim = 2

# This is our input placeholder
input_img = Input(shape=(784,))

# Encoding layers
encoded = Dense(128, activation='relu')(input_img)
encoded = Dense(64, activation='relu')(encoded)
encoded = Dense(10, activation='relu')(encoded)
encoder_output = Dense(encoding_dim)(encoded)

# Decoding layers
decoded = Dense(10, activation='relu')(encoder_output)
decoded = Dense(64, activation='relu')(decoded)
decoded = Dense(128, activation='relu')(decoded)
decoded = Dense(784, activation='tanh')(decoded)

# Build the autoencoder model
autoencoder = Model(inputs=input_img, outputs=decoded)

# Build the encoder model
encoder = Model(inputs=input_img, outputs=encoder_output)

# Compile the autoencoder
autoencoder.compile(optimizer='adam', loss='mse')

# Use the training features as both input and target, so the encoder
# and decoder are trained together
autoencoder.fit(x_train, x_train, epochs=200, batch_size=256, shuffle=True)

# Plot the 2-D encodings of the test set
encoded_imgs = encoder.predict(x_test)
print(encoded_imgs)
plt.scatter(encoded_imgs[:, 0], encoded_imgs[:, 1], c=y_test, s=6)
plt.colorbar()
plt.show()

One trick used in the code is converting the grayscale images to black-and-white images, which simplifies the model. The result plot shows that, after compression to two dimensions, the encodings of different handwritten digits are well separated. Since the digits can still be distinguished in two dimensions, this two-dimensional representation is an effective compression of the original 784 dimensions.
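The grayscale-to-black-and-white conversion relies on NumPy's bool cast, which maps zero pixels to False and every nonzero pixel to True. A small standalone example:

```python
import numpy as np

# A tiny "grayscale image" with pixel intensities in [0, 255].
gray = np.array([[0, 30, 200],
                 [255, 10, 90]])

# astype(bool): 0 -> False, any nonzero value -> True,
# turning the grayscale image into a black-and-white one.
bw = gray.astype(bool)
print(bw)
```

After the cast, every pixel carries only one bit of information, which makes the reconstruction target much simpler for the network.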
