Neural Networks with Keras Cookbook
By: V Kishore Ayyadevara
Overview of this book

This book will take you from the basics of neural networks to advanced implementations of architectures using a recipe-based approach. We will learn how neural networks work and how various hyperparameters affect a network's accuracy, and leverage neural networks for structured and unstructured data. Later, we will learn how to classify and detect objects in images. We will also learn to use transfer learning for multiple applications, including a self-driving car using Convolutional Neural Networks. We will generate images by leveraging GANs and by performing image encoding. Additionally, we will perform text analysis using word vector-based techniques. Later, we will use Recurrent Neural Networks and LSTMs to implement chatbot and machine translation systems. Finally, you will learn about transcribing images and audio, generating captions, and using Deep Q-learning to build an agent that plays the Space Invaders game. By the end of this book, you will have developed the skills to choose and customize multiple neural network architectures for the various deep learning problems you might encounter.
Table of Contents (18 chapters)

Generating captions, using beam search

In the previous section on caption generation, we decoded by taking the word with the highest probability at each time step. In this section, we'll improve the predicted captions by using beam search.

Getting ready

Beam search works as follows:

  • Extract the probabilities of the various words in the first time step (where the VGG16 features of the picture and the start token are the input)
  • Instead of outputting only the most probable word, we'll consider the top three probable words
  • We'll proceed to the next time step, where we extract the top three words in this time step
  • We'll loop through the top three predictions in the first time step, passing each as an input to...
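The steps above can be sketched generically. Note this is a minimal illustrative sketch, not the book's code: the `next_probs` callable and the toy transition table stand in for the trained caption model's softmax output conditioned on the VGG16 image features and the words generated so far.

```python
import heapq
import math

def beam_search(start_token, next_probs, beam_width=3, max_len=5, end_token="<end>"):
    # Each beam entry is (cumulative log-probability, word sequence so far).
    beams = [(0.0, [start_token])]
    for _ in range(max_len):
        candidates = []
        for logp, seq in beams:
            if seq[-1] == end_token:
                # Finished sequences are carried over unchanged.
                candidates.append((logp, seq))
                continue
            # `next_probs(seq)` maps each candidate next word to its probability;
            # in the chapter's setting this would come from the caption model.
            for word, p in next_probs(seq).items():
                candidates.append((logp + math.log(p), seq + [word]))
        # Keep only the top `beam_width` sequences by total log-probability.
        beams = heapq.nlargest(beam_width, candidates, key=lambda c: c[0])
    return beams[0][1]

# Toy transition table standing in for the model's predicted distributions.
table = {
    ("<start>",): {"a": 0.6, "b": 0.4},
    ("<start>", "a"): {"x": 0.5, "y": 0.5},
    ("<start>", "b"): {"z": 1.0},
    ("<start>", "a", "x"): {"<end>": 1.0},
    ("<start>", "a", "y"): {"<end>": 1.0},
    ("<start>", "b", "z"): {"<end>": 1.0},
}
best = beam_search("<start>", lambda seq: table[tuple(seq)], beam_width=3, max_len=3)
print(best)  # -> ['<start>', 'b', 'z', '<end>']
```

This toy table also shows why the beam helps: greedy decoding commits to "a" (probability 0.6) in the first step and can reach at best a 0.3-probability caption, while keeping the top candidates lets the 0.4-probability "b z" path win on total probability.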