In the sections so far, we have covered how differential privacy can be implemented in standard machine learning classifiers. In this section, we will cover how it can be applied to neural networks.
Differentially private stochastic gradient descent (DP-SGD) is a technique used in machine learning to train models on sensitive or private data without revealing the data itself. It is based on the concept of differential privacy, which guarantees that an algorithm's output distribution changes only negligibly when any single individual's record is added to or removed from the training data, so the trained model reveals almost nothing about any one individual.
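For concreteness, here is the standard (ε, δ)-differential privacy guarantee (this is the textbook definition due to Dwork et al., not notation introduced in this book): a randomized mechanism M satisfies (ε, δ)-differential privacy if, for all datasets D and D′ differing in a single individual's record, and for every set of outputs S,

\Pr[M(D) \in S] \;\le\; e^{\varepsilon} \, \Pr[M(D') \in S] + \delta

Smaller values of ε and δ correspond to a stronger privacy guarantee.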
DP-SGD is a variation of the stochastic gradient descent (SGD) algorithm, which is commonly used for training deep neural networks. In SGD, the algorithm updates the model parameters by computing the gradient of the loss function on a small, randomly selected subset (or "batch") of the training data. This is done iteratively until the algorithm converges.
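DP-SGD (introduced by Abadi et al. in "Deep Learning with Differential Privacy") modifies this loop in two ways: each example's gradient is clipped to a fixed L2 norm, and calibrated Gaussian noise is added before the parameter update. The following is a minimal NumPy sketch of one such update step, using binary logistic regression as a stand-in for a neural network so the example stays self-contained. All names here (dp_sgd_step, clip_norm, noise_multiplier) are illustrative, not from this book; production implementations such as Opacus (PyTorch) or TensorFlow Privacy additionally handle Poisson subsampling and privacy accounting.

import numpy as np

def dp_sgd_step(w, X_batch, y_batch, lr=0.1, clip_norm=1.0,
                noise_multiplier=1.1, rng=None):
    """One DP-SGD update for binary logistic regression (illustrative).

    Per-example gradients are clipped to an L2 norm of `clip_norm`,
    summed, perturbed with Gaussian noise of scale
    `noise_multiplier * clip_norm`, then averaged for the update.
    """
    rng = rng or np.random.default_rng(0)
    clipped = []
    for x, y in zip(X_batch, y_batch):
        # Per-example gradient of the logistic loss w.r.t. the weights.
        pred = 1.0 / (1.0 + np.exp(-x @ w))
        g = (pred - y) * x
        # Clip so no single example can influence the update too much.
        g = g / max(1.0, np.linalg.norm(g) / clip_norm)
        clipped.append(g)
    # Add calibrated Gaussian noise to the sum of clipped gradients.
    noise = rng.normal(0.0, noise_multiplier * clip_norm, size=w.shape)
    g_tilde = (np.sum(clipped, axis=0) + noise) / len(X_batch)
    return w - lr * g_tilde

# Usage on random data, purely to show the call pattern.
rng = np.random.default_rng(42)
X = rng.normal(size=(32, 5))
y = rng.integers(0, 2, size=32).astype(float)
w = np.zeros(5)
for _ in range(100):
    w = dp_sgd_step(w, X, y, rng=rng)

The key design point is that clipping bounds each individual's contribution to the update, which is what lets the added noise translate into a formal privacy guarantee; the noise_multiplier then controls the trade-off between the strength of that guarantee and model accuracy.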