Deep Learning Quick Reference

By Mike Bernico

Overview of this book

Deep learning has become essential for entering the world of artificial intelligence. This book makes deep learning techniques accessible, practical, and relevant to practicing data scientists, moving them from academia to the real world through practical examples. You will learn how TensorBoard is used to monitor the training of deep neural networks and how to solve binary classification problems with deep learning. You will then learn to optimize the hyperparameters of your deep learning models. The book goes on to cover the practical implementation of training CNNs, RNNs, and LSTMs with word embeddings, as well as seq2seq models built from scratch. Later, it explores advanced topics such as using a Deep Q-Network to train an autonomous agent and using two adversarial networks to generate artificial images that appear real. For implementation, we look at popular Python-based deep learning frameworks such as Keras and TensorFlow. Each chapter provides best practices and safe choices to help you make the right decisions while training deep neural networks. By the end of this book, you will be able to solve real-world problems quickly with deep neural networks.

1D CNNs for natural language processing

Way back in Chapter 7, Training a CNN From Scratch, we used convolutions to slide a window over regions of an image to learn complex visual features. This allowed us to learn important local visual features, regardless of where in the picture those features might have been, and then hierarchically learn more and more complex features as our network got deeper. We typically used a 3 x 3 or 5 x 5 filter on a 2D or 3D image. You may want to review Chapter 7, Training a CNN From Scratch, if you are feeling rusty on your understanding of convolution layers and how they work.
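
As a quick reminder of that idea, a small 2D convolutional stack in Keras might look like the following sketch. This is illustrative only; the input shape, filter counts, and number of classes are assumptions, not the model from Chapter 7.

```python
from tensorflow.keras import Sequential
from tensorflow.keras.layers import Input, Conv2D, MaxPooling2D, Flatten, Dense

model = Sequential([
    Input(shape=(28, 28, 1)),              # e.g. a 28 x 28 grayscale image (assumed)
    # 32 filters of size 3 x 3 slide over the image and learn local features.
    Conv2D(32, (3, 3), activation='relu'),
    MaxPooling2D((2, 2)),
    # Deeper layers combine those local features into more complex ones.
    Conv2D(64, (3, 3), activation='relu'),
    MaxPooling2D((2, 2)),
    Flatten(),
    Dense(10, activation='softmax'),       # e.g. 10 classes (assumed)
])
```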

It turns out that we can use the same strategy on a sequence of words. Here, our 2D matrix is the output from an embedding layer. Each row represents a word, and all the elements in that row are its word vector. Continuing with the preceding example, we would have a 10 x...
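
A minimal sketch of that idea in Keras follows. Only the 10-word sequence length comes from the example above; the vocabulary size, embedding dimension, and filter count are placeholder values rather than the book's.

```python
from tensorflow.keras import Sequential
from tensorflow.keras.layers import Input, Embedding, Conv1D, GlobalMaxPooling1D, Dense

vocab_size = 20000      # assumed vocabulary size
embedding_dim = 100     # assumed word-vector length (one row per word)
sequence_length = 10    # 10 words per document, as in the example above

model = Sequential([
    Input(shape=(sequence_length,)),
    # Output shape is (batch, sequence_length, embedding_dim): a 2D matrix
    # per document, with each row holding one word's vector.
    Embedding(vocab_size, embedding_dim),
    # A 1D convolution slides a window over 3 consecutive word vectors at a
    # time, the sequence analogue of a 3 x 3 filter sliding over an image.
    Conv1D(filters=128, kernel_size=3, activation='relu'),
    # Keep each filter's strongest response anywhere in the sequence.
    GlobalMaxPooling1D(),
    Dense(1, activation='sigmoid'),        # e.g. a binary sentiment label (assumed)
])
model.compile(optimizer='adam', loss='binary_crossentropy', metrics=['accuracy'])
```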
