TensorFlow 2.0 Quick Start Guide

By: Holdroyd
Overview of this book

TensorFlow is one of the most popular machine learning frameworks in Python. With this book, you will improve your knowledge of the latest TensorFlow features and learn to perform supervised and unsupervised machine learning and to train neural networks. After an overview of what's new in TensorFlow 2.0 Alpha, the book moves on to setting up your machine learning environment using the TensorFlow library. You will perform popular supervised machine learning tasks using techniques such as linear regression, logistic regression, and clustering. You will get familiar with unsupervised learning for autoencoder applications. The book will also show you how to train effective neural networks using straightforward examples in a variety of different domains. By the end of the book, you will have been exposed to a wide variety of machine learning and neural network techniques in TensorFlow.
Table of Contents (15 chapters)

1. Section 1: Introduction to TensorFlow 2.0 Alpha
5. Section 2: Supervised and Unsupervised Learning in TensorFlow 2.0 Alpha
7. Unsupervised Learning Using TensorFlow 2
8. Section 3: Neural Network Applications of TensorFlow 2.0 Alpha
13. Converting from tf1.12 to tf2

Building and instantiating our model

As we have seen previously, one technique for building a model is to pass the required layers into the tf.keras.Sequential() constructor. In this instance, we have three layers: an embedding layer, an RNN layer, and a dense layer.

The first, the embedding layer, is a lookup table of vectors, with one vector for the numeric value of each character; its dimension is embedding_dimension. The middle, recurrent layer is a GRU of size recurrent_nn_units. The last layer is a dense output layer with vocabulary_length units.

What the model does is look up the embedding, run the GRU for a single time step using the embedding for input, and pass this to the dense layer, which generates logits (log odds) for the next character.
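The excerpt stops at the logits themselves. Purely as an illustration (this step is not shown in the text), one common way to turn such logits into a sampled next-character id is `tf.random.categorical`, which samples in proportion to the exponentiated logits; the vocabulary size of 65 here is a made-up placeholder:

```python
import tensorflow as tf

# Hypothetical logits for one batch element over a 65-character vocabulary
logits = tf.random.normal([1, 65])

# Sample one character id; higher logits are proportionally more likely
next_char_id = int(tf.random.categorical(logits, num_samples=1)[0, 0])
```

The sampled id can then be fed back into the model as the input for the next time step, which is how character-level generation loops typically proceed.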

[Diagram: the embedding layer feeds the GRU, whose output feeds the dense output layer.]

The code that implements this model is, therefore, as follows:

def build_model...
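The listing is truncated in this excerpt, so the following is a minimal sketch reconstructed from the description above, not the book's verbatim code. The parameter names vocabulary_length, embedding_dimension, and recurrent_nn_units come from the text; return_sequences=True is an assumption typical of character-level models (the book's version may also fix the batch size and use a stateful GRU for generation):

```python
import tensorflow as tf

def build_model(vocabulary_length, embedding_dimension, recurrent_nn_units):
    model = tf.keras.Sequential([
        # Lookup table: one embedding_dimension-sized vector per character id
        tf.keras.layers.Embedding(vocabulary_length, embedding_dimension),
        # Recurrent middle layer
        tf.keras.layers.GRU(recurrent_nn_units, return_sequences=True),
        # Logits over the vocabulary for the next character
        tf.keras.layers.Dense(vocabulary_length),
    ])
    return model
```

Calling the model on a batch of character-id sequences of shape (batch, time) yields logits of shape (batch, time, vocabulary_length).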
