Go Machine Learning Projects

By: Xuanyi Chew
Overview of this book

Go is the perfect language for machine learning: it helps you describe complex algorithms clearly, and it helps developers understand how to run efficient, optimized code. This book will teach you how to implement machine learning in Go to build programs that are easy to deploy and code that is not only easy to understand and debug, but whose performance can also be measured. The book begins by guiding you through setting up your machine learning environment with Go libraries and capabilities. You will then plunge into regression analysis of a real-life house pricing dataset and build a classification model in Go to classify emails as spam or ham. Using Gonum, Gorgonia, and STL, you will explore time series analysis and decomposition, and clean up your personal Twitter timeline by clustering tweets. In addition to this, you will learn how to recognize handwriting using neural networks and convolutional neural networks. Lastly, you'll learn how to choose the most appropriate machine learning algorithms for your projects with the help of a facial detection project. By the end of this book, you will have developed a solid machine learning mindset, a strong grasp of the powerful Go toolkit, and a sound understanding of the practical implementation of machine learning algorithms in real-world projects.

Putting it all together

Now we have all the pieces. Let's look at how to put it all together:

  1. We first ingest the dataset and then split the data into training and cross-validation sets. A full k-fold cross-validation would split the dataset into ten parts; we won't do that here. Instead, we'll do a single-fold cross-validation by holding out a third of the shuffled data for cross-validation (a sketch of the shuffle helper appears after these steps):
typ := "bare"
examples, err := ingest(typ)
log.Printf("errs %v", err)
log.Printf("Examples loaded: %d", len(examples))
// Shuffle before splitting so the held-out third is a random sample.
shuffle(examples)
cvStart := len(examples) - len(examples)/3
cv := examples[cvStart:]      // cross-validation set: the last third
examples = examples[:cvStart] // training set: the first two-thirds
  2. We then train the classifier and check whether it can correctly predict the examples it was trained on:
c := New()
c.Train(examples)

var corrects, totals float64
for _, ex := range examples {
    // log.Printf...
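    // Sketch of how the check might continue; Predict, ex.Document, and
    // ex.Class are assumed names for the classifier's prediction method,
    // the example's tokens, and its true label, not confirmed by this text.
    class := c.Predict(ex.Document)
    if class == ex.Class {
        corrects++
    }
    totals++
}
// Training accuracy; running the same loop over cv (the held-out third)
// would estimate how well the classifier generalizes to unseen examples.
log.Printf("Training accuracy: %v", corrects/totals)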
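
Step 1 calls a shuffle helper that is defined elsewhere in the chapter. As a rough sketch of what such a helper might look like, one way to write it uses math/rand's Shuffle; the Example type name here is an assumption from context, not taken from this excerpt:

// shuffle randomly permutes the examples in place, so that the held-out
// third is not biased by the order in which the source files were read.
// Sketch only: the Example type name is assumed; requires "math/rand".
func shuffle(examples []Example) {
    rand.Shuffle(len(examples), func(i, j int) {
        examples[i], examples[j] = examples[j], examples[i]
    })
}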
