Intelligent Projects Using Python

By: Pattanayak

Overview of this book

This book is a perfect companion if you want to build insightful projects from leading AI domains using Python. The book covers the detailed implementation of projects from all the core disciplines of AI. We start by covering the basics of how to create smart systems using machine learning and deep learning techniques. You will apply various neural network architectures, such as CNNs, RNNs, and LSTMs, to solve critical real-world challenges. You will learn to train a model to detect diabetic retinopathy conditions in the human eye and create an intelligent system for performing video-to-text translation. You will use the transfer learning technique in the healthcare domain and implement style transfer using GANs. Later, you will learn to build AI-based recommendation systems, a mobile app for sentiment analysis, and a powerful chatbot for handling customer service. You will implement AI techniques in the cybersecurity domain to generate CAPTCHAs. Later, you will train and build autonomous vehicles that drive themselves using reinforcement learning. You will use libraries from the Python ecosystem, such as TensorFlow and Keras, to bring together the core aspects of machine learning, deep learning, and AI. By the end of this book, you will be able to build your own smart models for tackling any kind of AI problem without any hassle.
Table of Contents (12 chapters)

Building a sequence-to-sequence model

The architecture of the sequence-to-sequence model that we will use for building the chatbot is a slight modification of the basic sequence-to-sequence architecture illustrated previously in Figure 8.2. The modified architecture is shown in the following diagram (Figure 8.3):

Figure 8.3: Sequence-to-sequence model

Instead of feeding the hidden state and the cell state of the last step of the encoder only into the initial hidden and cell states of the decoder LSTM, we feed the encoder's final hidden state at each input step of the decoder. To predict the target word, w_t, at any step, t, the inputs are the previous target word, w_{t-1}, from step t-1, and the hidden state, h_{t-1}.
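The data flow of this modification can be sketched numerically: the encoder's final hidden state is tiled across the decoder's time steps and concatenated with each decoder input embedding, so every step sees both the previous target word and the encoder summary. This is a minimal NumPy sketch of that wiring only (the dimensions and variable names are illustrative assumptions, not the book's code), with the actual LSTM computation omitted:

```python
import numpy as np

# Toy dimensions (assumed for illustration)
T_dec = 4    # number of decoder time steps
d_emb = 8    # embedding size of each target word w_{t-1}
d_hid = 16   # hidden-state size of the encoder LSTM

# Pretend final hidden state of the encoder LSTM (would come from the encoder)
h_f = np.random.randn(d_hid)

# Embeddings of the previous target words w_{t-1}, one per decoder step
dec_emb = np.random.randn(T_dec, d_emb)

# Modified architecture: repeat h_f across all T_dec steps and concatenate
# it with the word embedding, so the decoder input at step t is [w_{t-1}; h_f]
h_f_repeated = np.tile(h_f, (T_dec, 1))          # shape (T_dec, d_hid)
dec_inputs = np.concatenate([dec_emb, h_f_repeated], axis=1)

print(dec_inputs.shape)  # (4, 24): each step carries d_emb + d_hid features
```

In Keras, the same effect is typically achieved with a `RepeatVector` on the encoder state followed by a `Concatenate` with the decoder's embedding sequence before the decoder LSTM.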
