Natural Language Processing with Java

By Richard M. Reese

Overview of this book

Natural Language Processing (NLP) allows you to take any sentence and identify patterns, special names, company names, and more. The second edition of Natural Language Processing with Java teaches you how to perform language analysis with the help of Java libraries, while constantly gaining insights from the outcomes. You’ll start by understanding how NLP and its various concepts work. Having got to grips with the basics, you’ll explore important tools and libraries in Java for NLP, such as CoreNLP, OpenNLP, Neuroph, and Mallet. You’ll then start performing NLP on different inputs and tasks, such as tokenization, model training, parts-of-speech tagging, and parse trees. You’ll learn about statistical machine translation, summarization, dialog systems, complex searches, supervised and unsupervised NLP, and more. By the end of this book, you’ll have learned more about NLP, neural networks, and various other trained models in Java for enhancing the performance of NLP applications.

Representing Text with Features

Text contains features that need to be extracted with their context in mind, but processing an entire section of text at once in order to capture that context is very difficult for machines.

In this chapter, we will see how text is represented using N-grams and what role they play in capturing context. We will also look at word embedding, in which words are mapped to vectors of real numbers so that machines can understand and process them more effectively. Given the amount of text involved, this can lead to the problem of high dimensionality, so we will then see how to reduce the dimensions of these vectors in a way that preserves context.
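To make the N-gram representation concrete, here is a minimal Java sketch (not taken from the book's code; the class name, sample sentence, and whitespace-style tokens are illustrative assumptions) that slides a window over a token list to produce bigrams and trigrams:

import java.util.ArrayList;
import java.util.List;

public class NGramDemo {

    // Collect all n-grams (sequences of n consecutive tokens) from a token list.
    static List<String> nGrams(List<String> tokens, int n) {
        List<String> grams = new ArrayList<>();
        for (int i = 0; i <= tokens.size() - n; i++) {
            grams.add(String.join(" ", tokens.subList(i, i + n)));
        }
        return grams;
    }

    public static void main(String[] args) {
        // Tokens from a naive whitespace split; a real pipeline would use a proper tokenizer.
        List<String> tokens = List.of("the", "quick", "brown", "fox", "jumps");
        System.out.println(nGrams(tokens, 2)); // [the quick, quick brown, brown fox, fox jumps]
        System.out.println(nGrams(tokens, 3)); // [the quick brown, quick brown fox, brown fox jumps]
    }
}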

In this chapter, we will cover the following topics:

  • N-grams
  • Word embedding
  • GloVe
  • word2vec
  • Dimensionality reduction
  • Principal component analysis
  • t-Distributed stochastic neighbor embedding...
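
As a rough illustration of the word-embedding idea mentioned above, the following Java sketch (again not from the book) compares hand-made, hypothetical word vectors using cosine similarity; real GloVe or word2vec vectors would be learned from a corpus rather than written by hand:

public class EmbeddingDemo {

    // Cosine similarity between two vectors; values close to 1.0 mean similar direction.
    static double cosine(double[] a, double[] b) {
        double dot = 0, normA = 0, normB = 0;
        for (int i = 0; i < a.length; i++) {
            dot += a[i] * b[i];
            normA += a[i] * a[i];
            normB += b[i] * b[i];
        }
        return dot / (Math.sqrt(normA) * Math.sqrt(normB));
    }

    public static void main(String[] args) {
        // Hypothetical 4-dimensional vectors made up for illustration only;
        // real GloVe or word2vec embeddings typically have 50 to 300 dimensions.
        double[] king  = {0.8, 0.1, 0.7, 0.3};
        double[] queen = {0.7, 0.2, 0.8, 0.3};
        double[] apple = {0.1, 0.9, 0.0, 0.6};

        System.out.printf("king ~ queen: %.3f%n", cosine(king, queen));
        System.out.printf("king ~ apple: %.3f%n", cosine(king, apple));
    }
}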
