The Handbook of NLP with Gensim

By Chris Kuo

Overview of this book

Navigating the terrain of NLP research and applying it practically can be a formidable task made easy with The Handbook of NLP with Gensim. This book demystifies NLP and equips you with hands-on strategies spanning healthcare, e-commerce, finance, and more to enable you to leverage Gensim in real-world scenarios. You’ll begin by exploring motives and techniques for extracting text information like bag-of-words, TF-IDF, and word embeddings. This book will then guide you on topic modeling using methods such as Latent Semantic Analysis (LSA) for dimensionality reduction and discovering latent semantic relationships in text data, Latent Dirichlet Allocation (LDA) for probabilistic topic modeling, and Ensemble LDA to enhance topic modeling stability and accuracy. Next, you’ll learn text summarization techniques with Word2Vec and Doc2Vec to build the modeling pipeline and optimize models using hyperparameters. As you get acquainted with practical applications in various industries, this book will inspire you to design innovative projects. Alongside topic modeling, you’ll also explore named entity handling and NER tools, modeling procedures, and tools for effective topic modeling applications. By the end of this book, you’ll have mastered the techniques essential to create applications with Gensim and integrate NLP into your business processes.
Table of Contents (24 chapters)

  • Part 1: NLP Basics
  • Part 2: Latent Semantic Analysis/Latent Semantic Indexing
  • Part 3: Word2Vec and Doc2Vec
  • Part 4: Topic Modeling with Latent Dirichlet Allocation
  • Part 5: Comparison and Applications

Determining the optimal number of topics

What defines a topic? A topic should be distinctive enough to represent a concept and the words associated with that concept. On the other hand, if a topic is a mixed BoW that is not concrete enough, it is better to split it into two or more topics. As a result, the closeness of the words in a topic is an important measure: words in the same topic should be close to each other.

In NLP, the metric that measures the closeness of the words in a topic is called the coherence score. In Chapter 5, Cosine Similarity, we learned about cosine similarity, which measures the similarity between any two words. The coherence score is the average or median of the pairwise similarities of the top words in a topic. This definition was given by Röder, Both, and Hinneburg (2015) [2]. There are three metrics for computing the coherence score, as outlined here (a short code sketch follows this list):

  • Content Vectors (CV): The default metric in Gensim
  • UMass: A more popular...
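
To make this concrete, here is a minimal sketch of how a coherence score can be computed with Gensim's CoherenceModel and used to compare candidate topic counts. The toy corpus, the range of num_topics values, and the LDA training settings are illustrative assumptions rather than the book's own example; only the CoherenceModel API and the c_v metric name come from Gensim itself.

```python
from gensim.corpora import Dictionary
from gensim.models import CoherenceModel, LdaModel

# Toy tokenized corpus (illustrative assumption, not the book's dataset)
texts = [
    ["cat", "dog", "pet", "vet", "animal"],
    ["dog", "puppy", "bark", "pet", "leash"],
    ["cat", "kitten", "pet", "meow", "animal"],
    ["stock", "market", "price", "trade", "invest"],
    ["bank", "loan", "invest", "market", "interest"],
    ["price", "stock", "trade", "bank", "profit"],
]

dictionary = Dictionary(texts)
corpus = [dictionary.doc2bow(text) for text in texts]

# Train an LDA model for each candidate number of topics and score it
# with the c_v coherence metric.
for num_topics in (2, 3, 4):
    lda = LdaModel(
        corpus=corpus,
        id2word=dictionary,
        num_topics=num_topics,
        random_state=42,
        passes=10,
    )
    cm = CoherenceModel(
        model=lda, texts=texts, dictionary=dictionary, coherence="c_v"
    )
    print(f"num_topics={num_topics}, c_v coherence={cm.get_coherence():.4f}")
```

A common heuristic is to choose the number of topics at which the coherence curve peaks or begins to level off; with such a tiny corpus the scores are only illustrative, but the same loop applies to a real dataset.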
