The Handbook of NLP with Gensim

The beta distribution is a distribution over the probability of two outcomes, 1 and 0. The Dirichlet distribution extends the beta distribution to multiple outcomes: where the beta distribution describes a single probability (with its complement), the Dirichlet distribution describes k probabilities that sum to 1.
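The relationship between the two distributions can be sketched with NumPy's random generator; the shape parameters below are illustrative choices, not values from the text:

```python
import numpy as np

rng = np.random.default_rng(42)

# Beta distribution: a distribution over a single probability p
# for two outcomes (1 and 0); the other probability is 1 - p.
p = rng.beta(a=2.0, b=2.0)

# Dirichlet distribution: a distribution over k probabilities
# that sum to 1. Here k = 6, e.g. the six faces of a die.
probs = rng.dirichlet(alpha=[1.0] * 6)

print(p)            # a single probability in (0, 1)
print(probs)        # six probabilities
print(probs.sum())  # the six probabilities sum to 1
```

Note that the beta distribution is exactly the Dirichlet distribution with k = 2, which is why the latter is called its extension.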
When we do not know the actual probabilities of the k choices, we can form a prior guess for them. In Bayesian terminology, this prior knowledge is called the prior probability. Suppose you are at a casino table betting on die rolls and you do not know the probabilities of rolling “1”, “2”, “3”, “4”, “5”, or “6.” If the casino is reputable, you may believe all six faces are equally likely: [1/6, 1/6, 1/6, 1/6, 1/6, 1/6]. This becomes your “prior belief” in Bayesian inference. The outcomes of the die...
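The uniform guess above can be encoded as a symmetric Dirichlet prior and then updated with observed rolls. A minimal sketch, assuming NumPy; the concentration values and observed counts are invented for illustration:

```python
import numpy as np

# Prior belief: all six faces equally likely -> symmetric Dirichlet(alpha).
# alpha = 1 for each face is a hypothetical (weak) prior strength.
alpha = np.ones(6)

# The prior mean matches the "fair die" guess [1/6, ..., 1/6].
prior_mean = alpha / alpha.sum()

# Suppose we then observe some rolls (counts of faces 1..6).
counts = np.array([3, 1, 2, 2, 1, 1])

# The Dirichlet is conjugate to the multinomial: the posterior
# is simply Dirichlet(alpha + counts).
posterior_alpha = alpha + counts
posterior_mean = posterior_alpha / posterior_alpha.sum()

print(prior_mean)      # uniform: each entry is 1/6
print(posterior_mean)  # shifted toward the observed frequencies
```

This conjugacy, where observed counts just add to the prior parameters, is the same mechanism that makes the Dirichlet a convenient prior over topic and word distributions in LDA.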