Machine Learning with Swift

By: Alexander Sosnovshchenko, Jojo Moolayil, Oleksandr Baiev
Overview of this book

Machine learning promises to bring increased intelligence to software by helping us learn from and analyse information efficiently, and to discover patterns that humans cannot. This book will be your guide on an exciting journey into machine learning using the popular Swift language. The first part of the book covers machine learning basics to develop a lasting intuition about fundamental concepts. The second part explores various supervised and unsupervised statistical learning techniques and how to implement them in Swift, while the third part walks you through deep learning techniques with the help of typical real-world cases. The last part dives into advanced topics such as model compression and GPU acceleration, and offers recommendations for avoiding common mistakes during machine learning application development. By the end of the book, you'll be able to develop intelligent applications written in Swift that can learn for themselves.
Table of Contents (14 chapters)

Choosing the number of clusters

If you don't know in advance how many clusters you have, how do you choose the optimal k? This is essentially a chicken-and-egg problem. Several approaches are popular; we'll discuss one of them: the elbow method.

Do you remember the mysterious WCSS (within-cluster sum of squares) that we calculated on every iteration of k-means? This measure tells us how far the points in each cluster are from their centroid. We can calculate it for several different values of k and plot the results. The plot usually looks similar to the following graph:

Figure 4.3: WCSS plotted against the number of clusters
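To see where such a curve comes from, here is a minimal, self-contained sketch (not the book's implementation): run k-means for several values of k on a toy 2-D dataset and report the WCSS for each. The dataset and the deterministic "first k points" initialization are illustrative assumptions.

```swift
// Toy 2-D dataset: two visibly separated groups (illustrative values).
let points: [[Double]] = [
    [1.0, 1.0], [1.2, 0.8], [0.8, 1.1], [1.1, 1.3],   // one tight group
    [8.0, 8.0], [8.2, 7.9], [7.8, 8.3], [8.1, 8.1]    // another tight group
]

func squaredDistance(_ a: [Double], _ b: [Double]) -> Double {
    zip(a, b).reduce(0.0) { $0 + ($1.0 - $1.1) * ($1.0 - $1.1) }
}

func kMeansWCSS(_ points: [[Double]], k: Int, iterations: Int = 20) -> Double {
    var centroids = Array(points.prefix(k))           // deterministic init
    var assignments = [Int](repeating: 0, count: points.count)
    for _ in 0..<iterations {
        // Assignment step: each point joins its nearest centroid.
        for (i, p) in points.enumerated() {
            assignments[i] = (0..<k).min {
                squaredDistance(p, centroids[$0]) < squaredDistance(p, centroids[$1])
            }!
        }
        // Update step: each centroid moves to the mean of its members.
        for c in 0..<k {
            let members = points.enumerated()
                .filter { assignments[$0.offset] == c }
                .map { $0.element }
            guard !members.isEmpty else { continue }
            centroids[c] = (0..<members[0].count).map { d in
                members.reduce(0.0) { $0 + $1[d] } / Double(members.count)
            }
        }
    }
    // WCSS: total squared distance from every point to its assigned centroid.
    return points.enumerated().reduce(0.0) {
        $0 + squaredDistance($1.element, centroids[assignments[$1.offset]])
    }
}

for k in 1...4 {
    print("k = \(k)  WCSS = \(kMeansWCSS(points, k: k))")
}
```

On data like this, WCSS drops sharply from k = 1 to k = 2 (the true number of groups) and only creeps downward afterwards, which is exactly the bend the elbow method looks for.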

This plot should remind you of the similar plots of loss functions from Chapter 3, K-Nearest Neighbors Classifier. It shows how well our model fits the data. The idea of the elbow method is to choose the k value after which the result is not...
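One simple way to automate this heuristic (a sketch under assumptions, not the book's code) is to stop increasing k once the relative drop in WCSS falls below a chosen threshold; the WCSS values and the 20% threshold below are hypothetical.

```swift
// Hypothetical WCSS values for k = 1...6 (illustrative numbers only).
let wcssByK: [Double] = [120.0, 45.0, 18.0, 15.0, 13.5, 12.8]

func elbowK(_ wcss: [Double], threshold: Double = 0.2) -> Int {
    for i in 1..<wcss.count {
        let relativeDrop = (wcss[i - 1] - wcss[i]) / wcss[i - 1]
        // Going from k = i to k = i + 1 no longer helps much:
        // the elbow is at k = i.
        if relativeDrop < threshold { return i }
    }
    return wcss.count
}

print("elbow at k = \(elbowK(wcssByK))")   // prints "elbow at k = 3"
```

In practice the threshold is a judgment call; eyeballing the plot, as the chapter does, is often just as effective.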
