Deep Learning with PyTorch Quick Start Guide

By: David Julian
Overview of this book

PyTorch is extremely powerful and yet easy to learn. It provides advanced features such as support for multiprocessor, distributed, and parallel computation. This book is an excellent entry point for those wanting to explore deep learning with PyTorch and harness its power. It will introduce you to the PyTorch deep learning library and teach you how to train deep learning models without any hassle. We will set up the deep learning environment using PyTorch, and then train and deploy different types of deep learning models, such as CNNs, RNNs, and autoencoders. You will learn how to optimize models by tuning hyperparameters and how to use PyTorch in multiprocessor and distributed environments. We will discuss long short-term memory (LSTM) networks and build a language model to predict text. By the end of this book, you will be familiar with PyTorch's capabilities and be able to use the library to train your neural networks with relative ease.

Optimization techniques

The torch.optim package contains a number of optimization algorithms, and each of these algorithms has several parameters that we can use to fine-tune deep learning models. Optimization is a critical component in deep learning, so it is no surprise that the choice of optimization technique can be key to a model's performance. Remember, the optimizer's role is to store and update the parameter state based on the gradients of the loss function calculated during backpropagation.
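To make this role concrete, the following is a minimal sketch of the standard PyTorch update cycle. The tiny linear model, the dummy data, and the learning rate are illustrative choices here, not values from the book:

import torch
import torch.nn as nn
import torch.optim as optim

model = nn.Linear(4, 1)                             # illustrative model
criterion = nn.MSELoss()
optimizer = optim.SGD(model.parameters(), lr=0.01)  # illustrative lr

x = torch.randn(8, 4)          # dummy inputs
y = torch.randn(8, 1)          # dummy targets

optimizer.zero_grad()          # clear gradients from the previous step
loss = criterion(model(x), y)  # forward pass and loss calculation
loss.backward()                # compute gradients of the loss
optimizer.step()               # update parameter state from those gradients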

Optimizer algorithms

A number of optimization algorithms besides SGD are available in PyTorch. The following code shows the constructor of one such algorithm, with its default parameter values:

from torch import optim
optim.Adadelta(params, lr=1.0, rho=0.9, eps=1e-06, weight_decay=0)

The Adadelta algorithm is based on stochastic gradient...
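As a brief usage sketch, assuming the model defined in the earlier snippet, an optimizer is constructed from the model's parameters; any torch.optim algorithm can be swapped in this way without changing the rest of the training loop:

# Assumes `model` from the earlier sketch; the values shown are the defaults.
optimizer = optim.Adadelta(model.parameters(), lr=1.0, rho=0.9,
                           eps=1e-06, weight_decay=0)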
