RNNs in TensorFlow: A Practical Guide and Undocumented Features


In a previous tutorial series I went over some of the theory behind Recurrent Neural Networks (RNNs) and the implementation of a simple RNN from scratch. That's a useful exercise, but in practice we use libraries like TensorFlow, which come with high-level primitives for dealing with RNNs.

With these primitives, using an RNN should be as easy as calling a function, right? Unfortunately, that's not quite the case. In this post I want to go over some of the best practices for working with RNNs in TensorFlow, especially the functionality that isn't well documented on the official site.

The post comes with a GitHub repository that contains Jupyter notebooks with minimal examples for:

  • Using tf.SequenceExample
  • Batching and padding
  • Dynamic RNN
  • Bidirectional dynamic RNN
  • RNN cells and cell wrappers
  • Masking the loss

Preprocessing Data: Use tf.SequenceExample

Check out the tf.SequenceExample Jupyter notebook here!

RNNs are used for sequential data that has inputs and/or outputs at multiple time steps. TensorFlow comes with a protocol buffer definition to deal with such data: tf.SequenceExample.

You could load data directly from your Python/numpy arrays, but it's probably in your best interest to use tf.SequenceExample instead. This data structure consists of a "context" for non-sequential features and "feature_lists" for sequential features. It's somewhat verbose (it blew up my latest dataset by 10x), but it comes with a few benefits that are worth it:

  • Easy distributed training. Split up your data into multiple TFRecord files, each containing many SequenceExamples, and use TensorFlow's built-in support for distributed training.
  • Reusability. Other people can re-use your model by bringing their own data into tf.SequenceExample format. No model code changes required.
  • Use of TensorFlow data loading pipeline functions like tf.parse_single_sequence_example. Libraries like tf.learn also come with convenience functions that expect you to feed data in protocol buffer format.
  • Separation of data preprocessing and model code. Using tf.SequenceExample forces you to separate your data preprocessing and TensorFlow model code. This is good practice, as your model shouldn't make any assumptions about the input data it gets.

In practice, you write a little script that converts your data into tf.SequenceExample format and then writes one or more TFRecord files. These TFRecord files are parsed by TensorFlow to become the input to your model:

  • Convert your data into tf.SequenceExample format
  • Write one or more TFRecord files with the serialized data
  • Use tf.TFRecordReader to read examples from the file
  • Parse each example using tf.parse_single_sequence_example (not in the official docs yet)
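The parsing step above can be sketched as follows. This is a minimal illustration, not the post's own code: it builds one SequenceExample in memory instead of reading from a TFRecord file, the feature names "length", "tokens", and "labels" are my own choices, and I use the tf.io.* names that modern TensorFlow exposes (the post's tf.parse_single_sequence_example is the TF 1.x spelling of the same function).

```python
import tensorflow as tf

# Build a toy SequenceExample to parse. In a real pipeline this serialized
# string would come from a TFRecord file instead.
ex = tf.train.SequenceExample()
ex.context.feature["length"].int64_list.value.append(3)
for token, label in zip([1, 2, 3], [0, 1, 0]):
    ex.feature_lists.feature_list["tokens"].feature.add().int64_list.value.append(token)
    ex.feature_lists.feature_list["labels"].feature.add().int64_list.value.append(label)

# Schemas: context features are fixed-size, sequence features have one
# entry per time step.
context_features = {"length": tf.io.FixedLenFeature([], dtype=tf.int64)}
sequence_features = {
    "tokens": tf.io.FixedLenSequenceFeature([], dtype=tf.int64),
    "labels": tf.io.FixedLenSequenceFeature([], dtype=tf.int64),
}

context, sequences = tf.io.parse_single_sequence_example(
    serialized=ex.SerializeToString(),
    context_features=context_features,
    sequence_features=sequence_features,
)
```

After parsing, context["length"] is a scalar tensor and sequences["tokens"] / sequences["labels"] are 1-D tensors with one element per time step, ready to be batched and padded.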

sequences = [[1, 2, 3], [4, 5, 1], [1, 2]]
label_sequences = [[0, 1, 0], [1, 0, 0], [1, 1]]
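The toy data above can be packed into SequenceExamples along these lines. This is a sketch using the tf.train.SequenceExample protobuf API; the feature names "length", "tokens", and "labels" are illustrative choices, not required by the format.

```python
import tensorflow as tf

def make_example(sequence, labels):
    # One SequenceExample per (sequence, labels) pair.
    ex = tf.train.SequenceExample()
    # Non-sequential features go into the context.
    ex.context.feature["length"].int64_list.value.append(len(sequence))
    # Sequential features go into feature_lists, one Feature per time step.
    fl_tokens = ex.feature_lists.feature_list["tokens"]
    fl_labels = ex.feature_lists.feature_list["labels"]
    for token, label in zip(sequence, labels):
        fl_tokens.feature.add().int64_list.value.append(token)
        fl_labels.feature.add().int64_list.value.append(label)
    return ex

sequences = [[1, 2, 3], [4, 5, 1], [1, 2]]
label_sequences = [[0, 1, 0], [1, 0, 0], [1, 1]]
examples = [make_example(s, l) for s, l in zip(sequences, label_sequences)]
```

Each example can then be serialized with ex.SerializeToString() and written to disk with tf.io.TFRecordWriter, which gives you the TFRecord files described above.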
