Paper notes: Large Scale Distributed Semi-Supervised Learning Using Streaming Approximation


Large Scale Distributed Semi-Supervised Learning Using Streaming Approximation

Google

Official blog link: https://research.googleblog.com/2016/10/graph-powered-machine-learning-at-google.html

  

These notes are about a large-scale distributed semi-supervised learning framework based on streaming approximation, from Google.

  Abstract: Traditional graph-based semi-supervised learning (SSL) methods do not scale to large data volumes and large label spaces, because their computation and memory grow linearly with the number of edges |E| and the number of distinct labels m. To handle large-scale labeling problems, recent work proposed sketch-based methods that approximate the label distribution of each node, reducing the space complexity from O(m) to O(log m) under certain conditions.

This paper presents a novel streaming graph-based SSL approximation that effectively captures the sparsity of the label distribution and further reduces the per-node space complexity to O(1). It also proposes a distributed version of the algorithm that can handle very large amounts of data. Experiments on real-world datasets show that the proposed method achieves significant memory reductions compared with existing methods. Finally, the paper proposes a graph augmentation strategy combined with a semi-supervised deep learning framework, which yields better semi-supervised learning results on natural-language applications.
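A minimal sketch (not the paper's exact algorithm) of the core idea behind the O(1) approximation: instead of storing a full m-dimensional label distribution per node, keep only the k highest-scoring labels plus a single residual mass, so per-node memory stays constant regardless of m. The function name, the truncation rule, and the example labels below are illustrative assumptions.

# Illustrative sketch only: approximate a label distribution by keeping
# its top-k entries plus one bucket for the remaining mass.
def sparsify(label_scores, k=3):
    """label_scores: dict mapping label -> score; returns (top_k dict, residual mass)."""
    ranked = sorted(label_scores.items(), key=lambda kv: kv[1], reverse=True)
    top_k = dict(ranked[:k])
    residual = sum(score for _, score in ranked[k:])
    return top_k, residual

dense = {"sports": 0.52, "soccer": 0.31, "news": 0.09, "politics": 0.05, "arts": 0.03}
top, rest = sparsify(dense, k=2)
print(top)   # {'sports': 0.52, 'soccer': 0.31}
print(rest)  # 0.17 (mass of all remaining labels)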

  Introduction: SSL is used to train a prediction system with a small amount of labeled data and a large amount of unlabeled data. Its significance lies in the fact that labeled data is always scarce and labeling is tedious and time-consuming, while unlabeled data is abundant; how to combine limited labeled data with massive unlabeled data to further improve the performance of existing models is a problem worth studying.

The limitations of different SSL methods mainly show up as expensive computational cost. For example, transductive SVMs and graph-based SSL algorithms are well-known families of SSL methods. The core idea of graph-based methods is to construct and smooth a graph in which nodes are samples and edges link related samples, with edge weights derived from the similarity between nodes. Graph-based methods rely on label propagation: label information from the existing seed nodes is spread through the graph. These methods usually converge quickly, and their time and space complexity are linear in the number of edges and the number of labels.
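A toy illustration of label propagation (not the paper's distributed implementation): seed nodes keep their labels, and every other node repeatedly replaces its distribution with a weighted average of its neighbors' distributions. The graph, seed labels, and iteration count are made-up assumptions.

# Toy label propagation on a small weighted graph (illustrative only).
# graph: node -> {neighbor: edge_weight}; seeds: node -> {label: 1.0}
def propagate(graph, seeds, iters=10):
    labels = {v: dict(seeds.get(v, {})) for v in graph}
    for _ in range(iters):
        new_labels = {}
        for v, nbrs in graph.items():
            if v in seeds:                      # seed nodes are clamped to their labels
                new_labels[v] = dict(seeds[v])
                continue
            agg, total_w = {}, sum(nbrs.values()) or 1.0
            for u, w in nbrs.items():
                for lbl, score in labels[u].items():
                    agg[lbl] = agg.get(lbl, 0.0) + w * score
            new_labels[v] = {lbl: s / total_w for lbl, s in agg.items()}
        labels = new_labels
    return labels

g = {"a": {"b": 1.0}, "b": {"a": 1.0, "c": 1.0}, "c": {"b": 1.0}}
print(propagate(g, seeds={"a": {"pos": 1.0}, "c": {"neg": 1.0}}))
# node "b" ends up with {'pos': 0.5, 'neg': 0.5}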

However, in some scenarios the number of samples and the number of labels are both extremely large, and regular graph-based SSL methods cannot handle them. In general, each node starts with a sparse label distribution, but as the number of iterations increases the distributions become dense. Talukdar and Cohen [1] recently proposed a method that tries to overcome the label-scale problem by using a Count-Min Sketch to approximate each node's labels and their scores, which keeps the memory complexity very low. However, in real-world applications, the labels actually attached to each node (some number k) and each node's connections are sparse even though the total label space is huge, i.e., k is much smaller than m. Clearly, exploiting this label sparsity can significantly reduce the complexity in practice.
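A hedged illustration of the Count-Min Sketch idea used by Talukdar and Cohen [1]: label scores are hashed into a few small arrays, so memory grows like O(log m) instead of O(m), at the cost of one-sided over-estimation when reading a score back. The hash function, width, and depth below are arbitrary assumptions, not the paper's settings.

import hashlib

# Minimal Count-Min Sketch for (label -> score), illustrative only.
class CountMinSketch:
    def __init__(self, width=64, depth=4):
        self.width, self.depth = width, depth
        self.table = [[0.0] * width for _ in range(depth)]

    def _index(self, label, row):
        h = hashlib.md5(f"{row}:{label}".encode()).hexdigest()
        return int(h, 16) % self.width

    def add(self, label, score):
        for row in range(self.depth):
            self.table[row][self._index(label, row)] += score

    def estimate(self, label):
        # May over-estimate due to hash collisions; never under-estimates.
        return min(self.table[row][self._index(label, row)] for row in range(self.depth))

cms = CountMinSketch()
cms.add("sports", 0.7)
cms.add("news", 0.3)
print(round(cms.estimate("sports"), 2))  # ~0.7 (possibly slightly higher)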

  Contributions:

1. A new graph propagation algorithm for general-purpose SSL is presented.

2. The algorithm can handle cases with very large numbers of labels. Its core is an approximation that effectively captures the sparsity of the label distribution while ensuring that labels are still propagated accurately.

3. A parallel (distributed) version of the algorithm is proposed, which scales well to large graphs.

4. An efficient linear-time graph construction strategy is proposed that can effectively combine multiple signals and move dynamically from sparse to dense representations.

5. In particular, for graphs whose nodes represent text, latent semantic embeddings associated with those nodes can be learned from raw text alone using state-of-the-art deep learning techniques.

These embeddings augment the original graph, and running graph SSL on the augmented graph yields a noticeable improvement (a toy sketch of this augmentation follows).
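One way to picture the augmentation step (contribution 5): add edges between text nodes whose learned embeddings are similar. The embedding source, the cosine-similarity rule, and the threshold below are purely illustrative assumptions, not the paper's setup.

import numpy as np

# Illustrative graph augmentation: connect text nodes whose (precomputed)
# semantic embeddings have cosine similarity above a threshold.
def augment_graph(graph, embeddings, threshold=0.8):
    nodes = list(embeddings)
    for i, u in enumerate(nodes):
        for v in nodes[i + 1:]:
            a, b = embeddings[u], embeddings[v]
            sim = float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))
            if sim >= threshold:
                graph.setdefault(u, {})[v] = sim
                graph.setdefault(v, {})[u] = sim
    return graph

emb = {"doc1": np.array([0.9, 0.1]), "doc2": np.array([0.8, 0.2]), "doc3": np.array([0.0, 1.0])}
print(augment_graph({}, emb))  # doc1-doc2 edge added; doc3 stays isolated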

  Graph-based semi-supervised learning:

  Preliminary: The goal is to produce a soft assignment of labels to every node in a graph G = (V, E, W).

  

  Graph SSL optimization:

Learn a label distribution $\hat{Y}$ by minimizing the following objective:

$$C(\hat{Y}) = \mu_1 \sum_{v \in V_l} s_v \|\hat{Y}_v - Y_v\|^2 + \mu_2 \sum_{v \in V} \sum_{u \in N(v)} w_{uv} \|\hat{Y}_v - \hat{Y}_u\|^2 + \mu_3 \sum_{v \in V} \|\hat{Y}_v - U\|^2$$

where $N(v)$ denotes the neighbors of node $v$, $Y_v$ is the seed label distribution of $v$ (with $s_v$ indicating seed nodes), and $U$ is the prior distribution over all labels. The first term fits the seed labels, the second smooths labels across edges, and the third regularizes toward the prior.
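An objective of this form can be minimized with a standard iterative (Jacobi-style) update in which each node's distribution is set to a weighted combination of its seed label, its neighbors' current distributions, and the prior U. The sketch below is a generic illustration with made-up weights mu1/mu2/mu3 and dict-based distributions, not the paper's distributed implementation.

# Illustrative per-node update for the graph SSL objective (not the paper's code).
# y_hat, y_seed: node -> {label: score}; graph: node -> {neighbor: weight}; u: prior dict.
def update_node(v, y_hat, y_seed, graph, u, mu1=1.0, mu2=0.1, mu3=0.01):
    agg, norm = {}, 0.0
    if v in y_seed:                                   # seed-label term
        for lbl, s in y_seed[v].items():
            agg[lbl] = agg.get(lbl, 0.0) + mu1 * s
        norm += mu1
    for nbr, w in graph.get(v, {}).items():           # neighbor-smoothness term
        for lbl, s in y_hat.get(nbr, {}).items():
            agg[lbl] = agg.get(lbl, 0.0) + mu2 * w * s
        norm += mu2 * w
    for lbl, s in u.items():                          # prior / regularization term
        agg[lbl] = agg.get(lbl, 0.0) + mu3 * s
    norm += mu3
    return {lbl: s / norm for lbl, s in agg.items()}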

  

 
