Generative AI Foundations in Python
In the previous chapter, we established the key distinction between generative and discriminative models. Discriminative models focus on predicting outputs by learning p(output | input), that is, the conditional probability of an expected output given an input or set of inputs. In contrast, generative models, such as the Generative Pre-trained Transformer (GPT), generate text by predicting the next token (a partial word, whole word, or punctuation mark) using p(next token | previous tokens), the probabilities of possible continuations given the current context. Tokens are represented as embedding vectors that capture latent features and rich semantic dependencies learned through extensive training.
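To make this concrete, the following sketch (not taken from the book) queries a small pre-trained GPT-2 model for p(next token | previous tokens) over a short context. It assumes the Hugging Face transformers library and PyTorch are installed; the model name "gpt2", the example context, and the top-5 display are illustrative choices only.

import torch
from transformers import GPT2LMHeadModel, GPT2TokenizerFast

# Load a small pre-trained autoregressive transformer (GPT-2) and its tokenizer
tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")
model.eval()

# Tokenize the context: each token becomes an integer ID mapped to an embedding vector
context = "The cat sat on the"
input_ids = tokenizer(context, return_tensors="pt").input_ids

# Forward pass; logits have shape (batch, sequence_length, vocabulary_size)
with torch.no_grad():
    logits = model(input_ids).logits

# p(next token | previous tokens): softmax over the logits at the final position
next_token_probs = torch.softmax(logits[0, -1], dim=-1)

# Print the five most likely continuations with their probabilities
top_probs, top_ids = next_token_probs.topk(5)
for prob, token_id in zip(top_probs, top_ids):
    print(f"{tokenizer.decode([int(token_id)])!r}: {prob.item():.3f}")

Sampling or greedily selecting from this distribution, appending the chosen token to the context, and repeating is what produces generated text one token at a time.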
We briefly surveyed leading generative approaches, including Generative Adversarial Networks (GANs), Variational Autoencoders (VAEs), diffusion models, and autoregressive transformers. Each...