Hands-On Neural Network Programming with C#

By: Matt Cole

Overview of this book

Neural networks have made a surprising comeback in the last few years and have brought tremendous innovation to the world of artificial intelligence. The goal of this book is to provide C# programmers with practical guidance on solving complex computational challenges using neural networks and C# libraries such as CNTK and TensorFlowSharp. It takes you on a step-by-step practical journey, covering everything from the mathematical and theoretical aspects of neural networks to building deep neural networks into your applications with C# and the .NET Framework. The book begins with a quick refresher on neural networks. You will learn how to build a neural network from scratch using packages such as Encog, AForge, and Accord, and explore concepts and techniques such as deep networks, perceptrons, optimization algorithms, convolutional networks, and autoencoders. You will also learn ways to add intelligent features to your .NET apps, such as facial and motion detection, object detection and labeling, language understanding, knowledge, and intelligent search. Throughout the book, you will work on interesting demonstrations that make it easier to implement complex neural networks in your enterprise applications.

Long short-term memory

Long short-term memory (LSTM) networks are a specialized form of recurrent neural network with the ability to retain long-term memory of things they have encountered in the past. In an LSTM, each neuron is replaced by what is known as a memory unit: a cell with a recurrent self-connection, surrounded by gates that activate and deactivate it at the appropriate times.
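
To make the structure concrete, here is a minimal C# sketch of a single memory unit stepping through a short sequence. It is a scalar toy under stated assumptions, not the book's code or any library's API: the field names (wfx, wfh, and so on) and weight values are illustrative, and a real LSTM works with vectors and weight matrices rather than individual doubles.

using System;

class LstmUnit
{
    // Illustrative scalar weights: input weight, recurrent weight, and bias
    // for each gate. A real implementation uses matrices and vectors.
    double wfx = 0.5, wfh = 0.1, bf = 0.0;   // forget gate
    double wix = 0.6, wih = 0.2, bi = 0.0;   // input gate
    double wox = 0.4, woh = 0.3, bo = 0.0;   // output gate
    double wcx = 0.7, wch = 0.2, bc = 0.0;   // candidate cell value

    double cell;    // long-term memory, carried by the recurrent self-connection
    double hidden;  // the unit's output from the previous step

    static double Sigmoid(double x) => 1.0 / (1.0 + Math.Exp(-x));

    public double Step(double x)
    {
        double forget = Sigmoid(wfx * x + wfh * hidden + bf);  // how much old memory to keep
        double input  = Sigmoid(wix * x + wih * hidden + bi);  // how much new information to store
        double output = Sigmoid(wox * x + woh * hidden + bo);  // how much memory to expose
        double candidate = Math.Tanh(wcx * x + wch * hidden + bc);

        cell = forget * cell + input * candidate;  // gated update of the self-connected cell
        hidden = output * Math.Tanh(cell);
        return hidden;
    }

    static void Main()
    {
        var unit = new LstmUnit();
        foreach (double x in new[] { 1.0, 0.5, -0.3, 0.8 })
            Console.WriteLine($"input {x,5:F2} -> output {unit.Step(x):F4}");
    }
}

The gates are what let the unit decide, at each time step, whether to keep, overwrite, or expose its memory; the cell value itself plays the role of the recurrent self-connection described above.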

If we step back for a second and look at the back-propagation phase of a regular recurrent network, the gradient signal can end up being multiplied many times, once per time step, by the weight matrix of the synapses between the neurons within the hidden layer. What does this mean exactly? It means that the magnitude of those weights has a strong impact on the learning process, and this can be both good and bad.

If the weights are small, they can lead to what is known as vanishing gradients, where the signal shrinks with every multiplication until the earlier time steps receive almost no learning signal at all; if they are large, the gradient can instead grow out of control and explode.
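
A toy calculation makes the effect visible. The following sketch (illustrative numbers, not code from the book) multiplies a gradient signal by a fixed recurrent weight once per time step, which is exactly the repeated multiplication described above:

using System;

class GradientDemo
{
    static void Main()
    {
        // Simulate back-propagation through 20 time steps: the gradient is
        // multiplied by the recurrent weight once per step.
        foreach (double weight in new[] { 0.5, 1.5 })
        {
            double gradient = 1.0;
            Console.WriteLine($"recurrent weight = {weight}");
            for (int step = 1; step <= 20; step++)
            {
                gradient *= weight;
                if (step % 5 == 0)
                    Console.WriteLine($"  after {step,2} steps: {gradient:E3}");
            }
        }
    }
}

With a weight of 0.5 the gradient collapses toward zero within a few steps (vanishing), while a weight of 1.5 grows without bound (exploding). The LSTM's gated, self-connected cell is designed so that the error signal can flow back through many time steps without being squeezed through this multiplication every time.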
