Fortunately, there are a few special cases for which closed-form solutions exist. Perhaps the most important of these is the Gaussian distribution. As you may remember from previous courses, a Gaussian PDF \(\mathcal{N}(\boldsymbol{\mu},\boldsymbol{\Sigma})\) is defined by two parameters:
a mean \(\boldsymbol{\mu}\) that defines the center of the distribution, and
a covariance matrix \(\boldsymbol{\Sigma}\) that defines the spread along, and the correlation between, the individual dimensions.
What makes the Gaussian case special is that the marginalization and conditioning of a Gaussian PDF always return another Gaussian PDF. In fact, both operations reduce to simple manipulations of the mean and the covariance:
Fig. 58 The marginal and conditional PDFs of a Gaussian PDF are also Gaussian PDFs.
Marginalizing out \(x_{2}\) then reduces to simply deleting the corresponding entries in \(\boldsymbol{\mu}\) and the corresponding rows and columns in \(\boldsymbol{\Sigma}\):
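Concretely, suppose the joint is written in block form (the partition notation below is introduced here for illustration):

\[
\begin{pmatrix}\boldsymbol{x}_{1}\\ \boldsymbol{x}_{2}\end{pmatrix}
\sim \mathcal{N}\!\left(
\begin{pmatrix}\boldsymbol{\mu}_{1}\\ \boldsymbol{\mu}_{2}\end{pmatrix},
\begin{pmatrix}\boldsymbol{\Sigma}_{11} & \boldsymbol{\Sigma}_{12}\\
\boldsymbol{\Sigma}_{21} & \boldsymbol{\Sigma}_{22}\end{pmatrix}
\right).
\]

Then the marginal over \(\boldsymbol{x}_{1}\) is simply \(p(\boldsymbol{x}_{1}) = \mathcal{N}(\boldsymbol{\mu}_{1}, \boldsymbol{\Sigma}_{11})\): every block involving \(\boldsymbol{x}_{2}\) is deleted. A quick numerical sanity check of this rule, using made-up numbers rather than an example from the text:

```python
import numpy as np

# Hypothetical 2-D Gaussian, chosen only for illustration.
mu = np.array([1.0, -2.0])
Sigma = np.array([[2.0, 0.8],
                  [0.8, 1.0]])

rng = np.random.default_rng(0)
samples = rng.multivariate_normal(mu, Sigma, size=200_000)

# Marginalizing out x2 keeps mu[0] and Sigma[0, 0], so the empirical
# mean and variance of the first coordinate should be close to
# mu[0] = 1.0 and Sigma[0, 0] = 2.0.
emp_mean = samples[:, 0].mean()
emp_var = samples[:, 0].var()
print(emp_mean, emp_var)
```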
Likewise, conditioning can be implemented by manipulating the mean and the covariance matrix; in this case, the result is given by two short linear-algebra equations. Let \(p(\boldsymbol{x},\boldsymbol{y})\) be a Gaussian joint distribution: