Hands-On Deep Learning Algorithms with Python

By: Sudharsan Ravichandiran
4.1 (13)
Overview of this book

Deep learning is one of the most popular domains in AI, allowing you to develop multi-layered models of varying complexity. This book introduces you to popular deep learning algorithms, from basic to advanced, and shows you how to implement them from scratch using TensorFlow. Throughout the book, you will gain insights into each algorithm, the mathematical principles behind it, and how to implement it in the best possible manner. The book starts by explaining how you can build your own neural networks, and then introduces TensorFlow, the powerful Python-based library for machine learning and deep learning. Moving on, you will get up to speed with gradient descent variants such as NAG, AMSGrad, AdaDelta, Adam, and Nadam. The book then provides insights into recurrent neural networks (RNNs) and LSTMs, and shows how to generate song lyrics with an RNN. Next, you will master the math necessary to work with convolutional and capsule networks, widely used for image recognition tasks. You will also learn how machines understand the semantics of words and documents using CBOW, skip-gram, and PV-DM. Finally, you will explore GANs, including InfoGAN and LSGAN, and autoencoders, such as contractive autoencoders and VAEs. By the end of this book, you will be equipped with the skills you need to implement deep learning in your own projects.
Table of Contents (17 chapters)
Section 1: Getting Started with Deep Learning
Section 2: Fundamental Deep Learning Algorithms
Section 3: Advanced Deep Learning Algorithms

Chapter 5 - Improvements to the RNN

  1. A Long Short-Term Memory (LSTM) cell is a variant of the RNN cell that resolves the vanishing gradient problem by using special structures called gates. Gates keep information in memory for as long as it is required; they learn what information to keep and what information to discard from the memory.
  2. An LSTM cell consists of three types of gates: the forget gate, the input gate, and the output gate (see the sketch after this list). The forget gate decides what information should be removed from the cell state (memory). The input gate decides what information should be stored in the cell state. The output gate decides what information should be taken from the cell state to give as an output.
  3. The cell state, also called the internal memory, is where all the information is stored.
  4. While backpropagating the LSTM network...
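
To make the role of the three gates concrete, here is a minimal NumPy sketch of a single LSTM cell step. It is not the book's TensorFlow implementation; the weight and gate names (W['f'], W['i'], W['g'], W['o']) and the dimensions are assumptions chosen purely for illustration.

import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_cell_step(x_t, h_prev, c_prev, W, b):
    """One forward step of an LSTM cell.

    x_t    : input at time t, shape (input_dim,)
    h_prev : previous hidden state, shape (hidden_dim,)
    c_prev : previous cell state (internal memory), shape (hidden_dim,)
    W, b   : dicts of weight matrices / bias vectors for the forget (f),
             input (i), output (o) gates and the candidate state (g);
             the names are illustrative, not from the book.
    """
    z = np.concatenate([h_prev, x_t])           # combined input seen by every gate

    f_t = sigmoid(W['f'] @ z + b['f'])          # forget gate: what to erase from memory
    i_t = sigmoid(W['i'] @ z + b['i'])          # input gate: what new information to store
    g_t = np.tanh(W['g'] @ z + b['g'])          # candidate values for the cell state
    o_t = sigmoid(W['o'] @ z + b['o'])          # output gate: what to expose from memory

    c_t = f_t * c_prev + i_t * g_t              # updated cell state (internal memory)
    h_t = o_t * np.tanh(c_t)                    # new hidden state / output

    return h_t, c_t

# Tiny usage example with random weights (illustrative only)
rng = np.random.default_rng(0)
input_dim, hidden_dim = 3, 4
W = {k: 0.1 * rng.standard_normal((hidden_dim, hidden_dim + input_dim))
     for k in ('f', 'i', 'g', 'o')}
b = {k: np.zeros(hidden_dim) for k in ('f', 'i', 'g', 'o')}

h, c = np.zeros(hidden_dim), np.zeros(hidden_dim)
h, c = lstm_cell_step(rng.standard_normal(input_dim), h, c, W, b)
print(h.shape, c.shape)   # (4,) (4,)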
