Unsupervised Learning: Use Cases
Contents
- Visualization
- K-means Clustering
- Transfer Learning
- K-nearest Neighbors
The features learned by deep neural networks can be used for classification, clustering and regression.
Neural nets are simply universal approximators that use non-linearities. They produce "good" features by learning to reconstruct data through pretraining, or through backpropagation. In the latter case, neural nets can plug into arbitrary loss functions to map inputs to outputs.
The features learned by neural networks can be fed into a variety of other algorithms, including traditional machine-learning algorithms that group input, softmax/logistic regression that classifies it, or simple regression that predicts a value.
So you can think of neural networks as feature producers that plug modularly into other functions. For example, you could have a convolutional neural network learn image features on ImageNet with supervised training, and then take the activations/features learned by that network and feed them into a second algorithm that would learn to group images.
Here is a list of use cases for features generated by neural networks:
Visualization
t-distributed stochastic neighbor embedding (t-SNE) is an algorithm used to reduce high-dimensional data into two or three dimensions, which can then be represented in a scatterplot. t-SNE is used for finding latent trends in data. Deeplearning4j relies on t-SNE for some visualizations, and it's an interesting end point for neural network features. For more information and downloads, see this page on t-SNE.
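As an illustration, here is a minimal sketch of projecting network features to two dimensions with t-SNE. It uses Python's scikit-learn and matplotlib rather than Deeplearning4j's own tooling, and the `features` array is a hypothetical stand-in for activations extracted from a trained net.

```python
import numpy as np
import matplotlib.pyplot as plt
from sklearn.manifold import TSNE

rng = np.random.default_rng(0)
features = rng.normal(size=(500, 128))   # stand-in for network activations
labels = rng.integers(0, 10, size=500)   # stand-in class labels for coloring

# Reduce the 128-dimensional features to 2 dimensions for plotting.
embedding = TSNE(n_components=2, perplexity=30, random_state=0).fit_transform(features)

plt.scatter(embedding[:, 0], embedding[:, 1], c=labels, s=8, cmap="tab10")
plt.title("t-SNE embedding of network features")
plt.show()
```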
Renders - Deeplearning4j relies on visual renders as heuristics to monitor how well a neural network is learning. That is, renders are used to debug. They help us visualize activations over time, and activations over time are an indicator of what and how much the network is learning.
K-means Clustering
K-means is an algorithm used for automatically labeling activations based on their raw distances from other input in a vector space. There is no target or loss function; K-means picks so-called centroids. K-means creates centroids through a repeated averaging of all the data points, and classifies new data by its proximity to a given centroid. Each centroid is associated with a label. This is an example of unsupervised learning (learning lacking a loss function) that applies labels.
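To make that concrete, here is a minimal K-means sketch using scikit-learn (rather than Deeplearning4j); the `activations` matrix is a hypothetical stand-in for features produced by a neural network.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
activations = rng.normal(size=(1000, 64))  # stand-in for network features

# Fit 10 centroids by repeated averaging of the points assigned to each.
km = KMeans(n_clusters=10, n_init=10, random_state=0).fit(activations)

print(km.labels_[:10])            # cluster label for each training point
print(km.cluster_centers_.shape)  # (10, 64): one centroid per cluster

# New data is classified by its proximity to the nearest centroid.
new_points = rng.normal(size=(5, 64))
print(km.predict(new_points))
```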
Transfer Learning
Transfer learning takes the activations of one neural network and puts them to use as features for another algorithm or classifier. For example, you can take the model of a convnet trained on ImageNet and pass fresh images through it into another algorithm, such as K-nearest neighbors. The strict definition of transfer learning is just that: taking a model trained on one set of data and plugging it into another problem.
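Here is a hedged sketch of that pipeline in Python, using a pretrained Keras ResNet50 as the feature extractor (in place of a DL4J convnet) and scikit-learn's K-nearest neighbors as the second algorithm; the `images` and `labels` arrays are hypothetical stand-ins for your own data.

```python
import numpy as np
import tensorflow as tf
from sklearn.neighbors import KNeighborsClassifier

# Pretrained convnet with the classification head removed; global average
# pooling turns each image into a single 2048-dimensional feature vector.
extractor = tf.keras.applications.ResNet50(
    weights="imagenet", include_top=False, pooling="avg")

images = np.random.rand(32, 224, 224, 3).astype("float32") * 255.0  # stand-in
labels = np.random.randint(0, 4, size=32)                           # stand-in

features = extractor.predict(
    tf.keras.applications.resnet50.preprocess_input(images))

# Plug the learned features into a second algorithm: K-nearest neighbors.
knn = KNeighborsClassifier(n_neighbors=3).fit(features, labels)
print(knn.predict(features[:5]))
```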
K-nearest Neighbors
This algorithm serves the purposes of classification and regression, and relies on a kd-tree. A kd-tree is a data structure for storing a finite set of points from a k-dimensional space. It partitions a space of arbitrary dimensions into a tree, which may also be called a vantage point tree. Kd-trees subdivide a space with a tree structure, and you navigate the tree to find the closest points. The label associated with the closest points is applied to the input.
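As a minimal sketch (assuming scikit-learn rather than a DL4J implementation), here is nearest-neighbor lookup via a kd-tree over hypothetical training vectors:

```python
import numpy as np
from sklearn.neighbors import KDTree

rng = np.random.default_rng(0)
train_vectors = rng.normal(size=(200, 2))   # finite set of points in 2-D space
train_labels = rng.integers(0, 3, size=200)

tree = KDTree(train_vectors)                # partitions the space into a tree

# Navigate the tree to find the 3 points closest to a new input.
x = np.array([[0.5, -0.2]])
dist, idx = tree.query(x, k=3)

# Apply the label of the closest points to the input (majority vote).
votes = train_labels[idx[0]]
print(np.bincount(votes).argmax())
```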
Let your input and training examples be vectors. Training vectors might be arranged in a binary tree like so:
If you were to visualize those nodes in two dimensions, partitioning space at each branch, then the kd-tree would look like this:
Now, let's say you place a new input, X, in the tree's partitioned space. This allows you to identify both the parent and the child of that space within the tree. The X then constitutes the center of a circle whose radius is the distance to the child node of that space. By definition, only other nodes within the circle's circumference can be nearer.
And finally, if you want to make art with kd-trees, you could do a lot worse than this:
(Hat tip to Andrew Moore of CMU for his excellent diagrams.)
Other Resources
- Introduction to Deep Neural Networks
- Iris Tutorial
- Deeplearning4j Quickstart Examples
- ND4J: NumPy for the JVM