Contact

Eitan Farchi, Verification & Quality Technologies, IBM Research - Haifa

Generative models - Here we'll cover LDA in this week's meeting. Back to Bayesian inference -

This is part of our deep dive into generative models, which will eventually loop us back to Bayesian networks (BN), but will also shed light on GAN approaches. Here are some background notes and relevant resources:

Generative models

Under the generative model approach we attempt to model the joint distribution p(x, y). Given x, and applying Bayes' rule to our model, we classify x as the y for which p(y | x) is largest.
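For concreteness, here is a minimal sketch of this decision rule; the priors and the two one-dimensional Gaussian class-conditionals are made up for illustration:

```python
from scipy.stats import norm

priors = {0: 0.6, 1: 0.4}                  # assumed class priors p(y)
densities = {0: norm(loc=0.0, scale=1.0),  # assumed p(x | y=0)
             1: norm(loc=2.0, scale=1.0)}  # assumed p(x | y=1)

def classify(x):
    # argmax over y of p(x | y) p(y); p(x) is shared by all classes,
    # so it can be dropped from the comparison.
    return max(priors, key=lambda y: densities[y].pdf(x) * priors[y])

print(classify(0.3))  # -> 0
print(classify(1.8))  # -> 1
```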

A straightforward application of Bayes' rule is to directly estimate the probabilities in p(y | x) p(x) = p(x | y) p(y). With the typically large number of dimensions of the vector x, density estimation of the required quantities is hard. See the first 30 mins of this video for details.
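As a quick numeric sanity check of the identity above (the values are made up):

```python
# Verify p(y | x) p(x) = p(x | y) p(y) for a binary y at a single x.
p_y = {0: 0.7, 1: 0.3}          # priors p(y)
p_x_given_y = {0: 0.2, 1: 0.5}  # likelihoods p(x | y) at this x

p_x = sum(p_x_given_y[y] * p_y[y] for y in p_y)  # p(x) by marginalization
p_y_given_x = {y: p_x_given_y[y] * p_y[y] / p_x for y in p_y}

for y in p_y:
    assert abs(p_y_given_x[y] * p_x - p_x_given_y[y] * p_y[y]) < 1e-12
print(p_y_given_x)  # the posterior; note it sums to 1
```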

As modeling the joint distribution p(x, y) is hard, simplifying assumptions are introduced, leading to different, more concrete classification techniques.

LDA

LDA models each p(x | y) as a Gaussian distribution. This StatQuest video describes how the means and standard deviations of the class distributions are used to maximize the separation between the classes over the training set.
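The separation idea the video describes is Fisher's criterion; here is a from-scratch sketch for two classes in two dimensions, on synthetic data (not from the video):

```python
# Choose the projection direction w that maximizes the distance between the
# projected class means relative to the within-class scatter (Fisher's criterion).
import numpy as np

rng = np.random.default_rng(0)
X0 = rng.normal([0, 0], 1.0, size=(100, 2))  # class 0 samples
X1 = rng.normal([3, 1], 1.0, size=(100, 2))  # class 1 samples

m0, m1 = X0.mean(axis=0), X1.mean(axis=0)
# Within-class scatter: sum of the per-class (unnormalized) covariance matrices.
S_w = (X0 - m0).T @ (X0 - m0) + (X1 - m1).T @ (X1 - m1)
# Closed-form maximizer of the Fisher criterion for two classes.
w = np.linalg.solve(S_w, m1 - m0)
w /= np.linalg.norm(w)
print("projection direction:", w)
```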

The second 30 mins. of the above lecture derive LDA and explain what happens when the covariance matrices of all the classes equal the identity I.
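To make the derivation concrete, here is a minimal sketch of the resulting decision rule under a shared covariance Sigma (my own illustrative code, not the lecture's):

```python
import numpy as np

def fit_lda(X, y):
    # Per-class means, priors, and a pooled (shared) covariance estimate.
    classes = np.unique(y)
    mus = {c: X[y == c].mean(axis=0) for c in classes}
    priors = {c: np.mean(y == c) for c in classes}
    centered = np.vstack([X[y == c] - mus[c] for c in classes])
    sigma_inv = np.linalg.inv(centered.T @ centered / (len(X) - len(classes)))
    return classes, mus, priors, sigma_inv

def predict_lda(x, classes, mus, priors, sigma_inv):
    # Linear discriminant: delta_y(x) = x' S^-1 mu_y - 0.5 mu_y' S^-1 mu_y + log p(y).
    # When Sigma = I this reduces to nearest-class-mean classification plus the prior term.
    def delta(c):
        m = mus[c]
        return x @ sigma_inv @ m - 0.5 * m @ sigma_inv @ m + np.log(priors[c])
    return max(classes, key=delta)

# Usage on synthetic data:
rng = np.random.default_rng(0)
X = np.vstack([rng.normal([0, 0], 1.0, size=(100, 2)),
               rng.normal([3, 1], 1.0, size=(100, 2))])
y = np.array([0] * 100 + [1] * 100)
print(predict_lda(np.array([2.5, 0.5]), *fit_lda(X, y)))  # -> 1
```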

The estimation of the covariance matrix of a random vector is explained in detail here.
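In short, the standard estimator centers the samples and averages their outer products; a quick sketch on synthetic data:

```python
import numpy as np

rng = np.random.default_rng(3)
X = rng.normal(size=(500, 3))          # 500 samples of a 3-D random vector

Xc = X - X.mean(axis=0)                # subtract the sample mean
cov_manual = Xc.T @ Xc / (len(X) - 1)  # unbiased 1/(n-1) normalization
cov_numpy = np.cov(X, rowvar=False)    # numpy's estimator, rows = samples

assert np.allclose(cov_manual, cov_numpy)
print(cov_manual)
```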

See Chapter 24 of the Understanding book for broader coverage of generative models.

The background required for the Gaussian distribution and the covariance matrix is covered here.