Two Methods for Setting the Initial Value of a Keras Embedding

Source: Internet
Author: User
Tags: theano, keras

Random initialization of embedding

    from keras.models import Sequential
    from keras.layers import Embedding
    import numpy as np

    model = Sequential()
    model.add(Embedding(1000, 64, input_length=10))
    # The model will take as input an integer matrix of size (batch, input_length).
    # The largest integer (i.e. word index) in the input should be no larger
    # than 999 (vocabulary size).
    # Now model.output_shape == (None, 10, 64), where None is the batch dimension.
    input_array = np.random.randint(1000, size=(32, 10))
    model.compile('rmsprop', 'mse')
    output_array = model.predict(input_array)
    print(output_array)
    assert output_array.shape == (32, 10, 64)
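To sanity-check what the layer actually holds, the randomly initialized table can be read back out. A minimal sketch, assuming the model built above (get_weights() returns a list of numpy arrays):

    # Sketch: inspect the randomly initialized embedding matrix.
    weights = model.get_layer(index=0).get_weights()  # list of numpy arrays
    print(weights[0].shape)  # (1000, 64): one 64-dim vector per word index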

The first method for setting the initial value of the embedding: use the weights parameter.

    import numpy as np
    import keras

    m = keras.models.Sequential()
    """
    You can use the weights parameter to specify the initial weights.
    Because no gradient can flow back through the integer indices of an
    embedding lookup, it makes no sense to place an Embedding in a middle
    layer; Embedding can only be used as the first layer.
    How weights gets bound to the embeddings variable is rather convoluted;
    weights is a list.
    """
    embedding = keras.layers.Embedding(input_dim=3, output_dim=2, input_length=1,
                                       weights=[np.arange(3 * 2).reshape(3, 2)],
                                       mask_zero=True)
    m.add(embedding)  # once added, Embedding's build() function is called
    print(keras.backend.get_value(embedding.embeddings))
    m.compile(keras.optimizers.RMSprop(), keras.losses.mse)
    print(m.predict([1, 2, 2, 1, 2, 0]))
    print(m.get_layer(index=0).get_weights())
    print(keras.backend.get_value(embedding.embeddings))

The second method for setting the initial value of the embedding: use an initializer.

    import numpy as np
    import keras

    m = keras.models.Sequential()
    """
    The second method for setting the initial weights of an Embedding:
    use keras.initializers.Constant.
    """
    embedding = keras.layers.Embedding(
        input_dim=3, output_dim=2, input_length=1,
        embeddings_initializer=keras.initializers.Constant(
            np.arange(3 * 2, dtype=np.float32).reshape(3, 2)))
    m.add(embedding)
    print(keras.backend.get_value(embedding.embeddings))
    m.compile(keras.optimizers.RMSprop(), keras.losses.mse)
    print(m.predict([1, 2, 2, 1, 2]))
    print(m.get_layer(index=0).get_weights())
    print(keras.backend.get_value(embedding.embeddings))

The key difficulty lies in figuring out how weights is passed into the Embedding.embeddings tensor.

Embedding is a class that inherits from Layer. Layer accepts a weights parameter, which is a list whose elements are all numpy arrays. When the layer constructor is called, the weights argument is stored in the _initial_weights attribute.

base_layer.py, Layer class constructor:

        if 'weights' in kwargs:
            self._initial_weights = kwargs['weights']
        else:
            self._initial_weights = None
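This storage step is easy to verify. A small sketch (it peeks at the private _initial_weights attribute, so it is tied to this version of Keras):

    import numpy as np
    import keras

    # Sketch: the constructor only stores weights; no variable exists yet.
    emb = keras.layers.Embedding(input_dim=3, output_dim=2,
                                 weights=[np.arange(6).reshape(3, 2)])
    print(emb._initial_weights)  # [array([[0, 1], [2, 3], [4, 5]])]
    print(emb.built)             # False: no variables created yet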

When the embedding layer is added to the model and connected to the previous layer, layer(previous_tensor) is invoked, where layer is the Embedding instance. Embedding is a class that inherits from Layer and does not override the __call__() method; Layer implements __call__(), and the parent's __call__() invokes the subclass's call() method to obtain the result. So what ultimately runs is Layer.__call__(). In this method, the framework first checks whether the layer has already been built (via the self.built boolean).

Layer.__call__() is the crucial function here.

    def __call__(self, inputs, **kwargs):
        """Wrapper around self.call(), for handling internal references.

        If a Keras tensor is passed:
            - We call self._add_inbound_node().
            - If necessary, we `build` the layer to match
              the _keras_shape of the input(s).
            - We update the _keras_shape of every input tensor with
              its new shape (obtained via self.compute_output_shape).
              This is done as part of _add_inbound_node().
            - We update the _keras_history of the output tensor(s)
              with the current layer.
              This is done as part of _add_inbound_node().

        # Arguments
            inputs: Can be a tensor or list/tuple of tensors.
            **kwargs: Additional keyword arguments to be passed to `call()`.

        # Returns
            Output of the layer's `call` method.

        # Raises
            ValueError: in case the layer is missing shape information
                for its `build` call.
        """
        if isinstance(inputs, list):
            inputs = inputs[:]
        with K.name_scope(self.name):
            # Handle layer building (weight creating, input spec locking).
            if not self.built:
                # If build() has not run yet, run it before calling call().
                # Raise exceptions in case the input is not compatible
                # with the input_spec specified in the layer constructor.
                self.assert_input_compatibility(inputs)

                # Collect input shapes to build layer.
                input_shapes = []
                for x_elem in to_list(inputs):
                    if hasattr(x_elem, '_keras_shape'):
                        input_shapes.append(x_elem._keras_shape)
                    elif hasattr(K, 'int_shape'):
                        input_shapes.append(K.int_shape(x_elem))
                    else:
                        raise ValueError('You tried to call layer "' +
                                         self.name +
                                         '". This layer has no information'
                                         ' about its expected input shape, '
                                         'and thus cannot be built. '
                                         'You can build it manually via: '
                                         '`layer.build(batch_input_shape)`')
                self.build(unpack_singleton(input_shapes))
                # This statement is actually redundant, because
                # self.build() has already set built to True.
                self.built = True

                # Load weights that were specified at layer instantiation.
                if self._initial_weights is not None:
                    # If weights was passed in, assign it to the layer's
                    # variables, overwriting the values the initializer
                    # assigned inside build().
                    self.set_weights(self._initial_weights)

            # Raise exceptions in case the input is not compatible
            # with the input_spec set at build time.
            self.assert_input_compatibility(inputs)

            # Handle mask propagation.
            previous_mask = _collect_previous_mask(inputs)
            user_kwargs = copy.copy(kwargs)
            if not is_all_none(previous_mask):
                # The previous layer generated a mask.
                if has_arg(self.call, 'mask'):
                    if 'mask' not in kwargs:
                        # If mask is explicitly passed to __call__,
                        # we should override the default mask.
                        kwargs['mask'] = previous_mask

            # Handle automatic shape inference (only useful for Theano).
            input_shape = _collect_input_shape(inputs)

            # Actually call the layer,
            # collecting output(s), mask(s), and shape(s).
            output = self.call(inputs, **kwargs)
            output_mask = self.compute_mask(inputs, previous_mask)

            # If the layer returns tensors from its inputs, unmodified,
            # we copy them to avoid loss of tensor metadata.
            output_ls = to_list(output)
            inputs_ls = to_list(inputs)
            output_ls_copy = []
            for x in output_ls:
                if x in inputs_ls:
                    x = K.identity(x)
                output_ls_copy.append(x)
            output = unpack_singleton(output_ls_copy)

            # Inferring the output shape is only relevant for Theano.
            if all([s is not None for s in to_list(input_shape)]):
                output_shape = self.compute_output_shape(input_shape)
            else:
                if isinstance(input_shape, list):
                    output_shape = [None for _ in input_shape]
                else:
                    output_shape = None

            if (not isinstance(output_mask, (list, tuple)) and
                    len(output_ls) > 1):
                # Augment the mask to match the length of the output.
                output_mask = [output_mask] * len(output_ls)

            # Add an inbound node to the layer, so that it keeps track
            # of the call and of all new variables created during the call.
            # This also updates the layer history of the output tensor(s).
            # If the input tensor(s) had no previous Keras history,
            # this does nothing.
            self._add_inbound_node(input_tensors=inputs,
                                   output_tensors=output,
                                   input_masks=previous_mask,
                                   output_masks=output_mask,
                                   input_shapes=input_shape,
                                   output_shapes=output_shape,
                                   arguments=user_kwargs)

            # Apply activity regularizer if any:
            if (hasattr(self, 'activity_regularizer') and
                    self.activity_regularizer is not None):
                with K.name_scope('activity_regularizer'):
                    regularization_losses = [
                        self.activity_regularizer(x)
                        for x in to_list(output)]
                self.add_loss(regularization_losses,
                              inputs=to_list(inputs))
        return output
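The lazy-build behaviour described above can be observed directly. A minimal sketch, assuming the same Keras 2 API as the source shown:

    import numpy as np
    import keras

    emb = keras.layers.Embedding(input_dim=3, output_dim=2,
                                 weights=[np.arange(6).reshape(3, 2)])
    print(emb.built)  # False: build() has not run yet

    m = keras.models.Sequential()
    m.add(emb)        # triggers Layer.__call__ -> build() -> set_weights()
    print(emb.built)  # True
    print(keras.backend.get_value(emb.embeddings))  # the np.arange values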

If the layer has not been built, the build() function of the Embedding class is called automatically. Embedding.build() knows nothing about weights: it creates the variable with self.embeddings_initializer, which defaults to random initialization when no initializer is passed in. If an initializer is supplied, this is the step where the weights get their values. If both embeddings_initializer and weights are given, the value from weights is applied afterwards by set_weights() and overwrites Embedding#embeddings.

embeddings.py, Embedding class build() function:

    def build(self, input_shape):
        self.embeddings = self.add_weight(
            shape=(self.input_dim, self.output_dim),
            initializer=self.embeddings_initializer,
            name='embeddings',
            regularizer=self.embeddings_regularizer,
            constraint=self.embeddings_constraint,
            dtype=self.dtype)
        self.built = True
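The precedence claim above is easy to check empirically. A minimal sketch that deliberately passes conflicting values for the two parameters:

    import numpy as np
    import keras

    # Sketch: build() first fills the variable from the initializer
    # (zeros here); Layer.__call__ then overwrites it via set_weights().
    emb = keras.layers.Embedding(
        input_dim=3, output_dim=2, input_length=1,
        embeddings_initializer=keras.initializers.Zeros(),
        weights=[np.arange(6, dtype=np.float32).reshape(3, 2)])
    m = keras.models.Sequential()
    m.add(emb)
    print(keras.backend.get_value(emb.embeddings))  # arange values, not zeros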

In summary, using weights to assign values to layer variables is a common approach in Keras, but it is not very intuitive. Keras encourages the more explicit initializer route over weights.
