Posterior and MAP Estimation for Normal Distribution with Unknown Mean and Precision
This video explains how to perform posterior and MAP estimation for a Normal distribution when both the mean and precision are unknown, highlighting the role of prior knowledge in regularizing the parameters. Notes are linked below.
About this video
Everything is unknown! Both parameters of a Normal/Gaussian need prior knowledge in order to be regularized. This video presents how. Here are the notes: https://raw.githubusercontent.com/Ceyron/machine-learning-and-simulation/main/english/essential_pmf_pdf/univariate_normal_posterior_and_map_unknown_mean_and_precision.pdf
In earlier videos we put a prior on only one of the two parameters of the Normal/Gaussian. That, in turn, meant we had certain knowledge of the other parameter, i.e., it was fixed. This is not an unrealistic scenario, but in practice we may be unsure about both parameters at the same time, which means we need to incorporate prior knowledge over both in order to regularize our parameter estimation.
This increases the complexity of the derivation by quite a bit. The joint prior over the two parameters is called a Normal-Gamma or Gauss-Gamma distribution. In this video we will identify and introduce this distribution. Also take a look at the interactive web plots here: https://share.streamlit.io/ceyron/numeric-notes/main/english/essential_pmf_pdf/normal_gamma_interactive_plot.py
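For reference, the Normal-Gamma prior has the functional form below (a sketch in the mu_0, tau_0, alpha_0, beta_0 parameterization used in the interactive plot, where tau_0 scales the precision of the Normal factor as in the standard conjugate setup):

```latex
p(\mu, \tau)
= \mathcal{N}\!\left(\mu \mid \mu_0, (\tau_0 \tau)^{-1}\right)\,
  \mathrm{Gamma}(\tau \mid \alpha_0, \beta_0)
\propto \tau^{1/2} e^{-\frac{\tau_0 \tau}{2}(\mu - \mu_0)^2}\,
  \tau^{\alpha_0 - 1} e^{-\beta_0 \tau}
```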
Once we have found this distribution, we can define the Directed Graphical Model of our problem. This lets us express the joint distribution, from which we can derive the posterior. Since the Normal-Gamma/Gauss-Gamma is a conjugate prior, the posterior is again a Normal-Gamma.
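Concretely, with N data points and sample mean \bar{x}, the conjugate update gives the posterior parameters below (a standard result, stated here as a sketch in the same parameterization):

```latex
\mu_N = \frac{\tau_0 \mu_0 + N \bar{x}}{\tau_0 + N}, \qquad
\tau_N = \tau_0 + N, \qquad
\alpha_N = \alpha_0 + \frac{N}{2},
\beta_N = \beta_0
  + \frac{1}{2}\sum_{i=1}^{N}(x_i - \bar{x})^2
  + \frac{\tau_0 N (\bar{x} - \mu_0)^2}{2(\tau_0 + N)}
```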
We will identify its parameters and derive its mode as a point estimate, the so-called Maximum-A-Posteriori (MAP) estimate.
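Setting the gradient of the log-posterior to zero yields the joint mode (valid for alpha_N > 1/2; again a sketch under the parameterization above):

```latex
\mu_{\mathrm{MAP}} = \mu_N, \qquad
\tau_{\mathrm{MAP}} = \frac{\alpha_N - \frac{1}{2}}{\beta_N}
```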
In the last part we will look at an example in TensorFlow Probability. Here, we use an artificial dataset to compare the Maximum Likelihood Estimate (MLE) and the Maximum-A-Posteriori (MAP) estimate for both clean and corrupted data. We will also implement a Normal-Gamma/Gauss-Gamma prior and posterior.
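A minimal Python sketch of this workflow (not the video's exact code; the function name normal_gamma_posterior, the prior hyperparameters, and the synthetic data are illustrative, and the updates follow the parameterization above):

```python
import numpy as np
import tensorflow_probability as tfp

tfd = tfp.distributions

# Illustrative synthetic dataset (the video uses its own artificial data)
rng = np.random.default_rng(42)
x = rng.normal(loc=2.0, scale=0.5, size=20)

# MLE: sample mean and the reciprocal of the (biased) sample variance
mu_mle = x.mean()
tau_mle = 1.0 / x.var()

def normal_gamma_posterior(x, mu_0, tau_0, alpha_0, beta_0):
    """Conjugate Normal-Gamma update for unknown mean and precision."""
    n, x_bar = len(x), x.mean()
    mu_n = (tau_0 * mu_0 + n * x_bar) / (tau_0 + n)
    tau_n = tau_0 + n
    alpha_n = alpha_0 + n / 2.0
    beta_n = (beta_0 + 0.5 * np.sum((x - x_bar) ** 2)
              + tau_0 * n * (x_bar - mu_0) ** 2 / (2.0 * (tau_0 + n)))
    return mu_n, tau_n, alpha_n, beta_n

mu_n, tau_n, alpha_n, beta_n = normal_gamma_posterior(x, 0.0, 1.0, 2.0, 1.0)

# MAP: the joint mode of the Normal-Gamma posterior
mu_map = mu_n
tau_map = (alpha_n - 0.5) / beta_n

# The marginal posterior over the precision can be wrapped in a
# TFP distribution, e.g. for plotting or sampling:
tau_posterior = tfd.Gamma(concentration=alpha_n, rate=beta_n)
```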
If you enjoyed the video, then feel free to buy me a coffee ;) https://www.buymeacoffee.com/MLsim
If you are interested in the Python code I used to create the interactive plot, you can find it on my GitHub: https://github.com/Ceyron/numeric-notes/blob/main/english/essential_pmf_pdf/normal_gamma_interactive_plot.py
-------
📝 : Check out the GitHub Repository of the channel, where I upload all the handwritten notes and source-code files (contributions are very welcome): https://github.com/Ceyron/machine-learning-and-simulation
📢 : Follow me on LinkedIn or Twitter for updates on the channel and other cool Machine Learning & Simulation stuff: https://www.linkedin.com/in/felix-koehler and https://twitter.com/felix_m_koehler
💸 : If you want to support my work on the channel, you can become a Patreon here: https://www.patreon.com/MLsim
-------
Timestamps:
00:00 Introduction
01:02 Problem of noisy data and MLE
01:48 Task of this video
02:27 Finding the (Multivariate) Prior
07:56 Identifying a Normal-Gamma
09:03 Functional Form of the Prior
10:07 Normal-Gamma Plot: Intro
11:06 Normal-Gamma Plot: Changing mu_0
11:26 Normal-Gamma Plot: Changing tau_0
12:26 Normal-Gamma Plot: alpha_0 & beta_0
13:09 Directed Graphical Model
15:49 The joint distribution
17:55 The Posterior by Bayes' Rule
19:15 Deriving the Posterior
22:24 Simplifying the Exponent
26:28 Deriving the Posterior (cont.)
29:45 Identifying the Posterior Normal-Gamma
33:17 MAP: Optimization Problem
34:15 MAP: Log-Posterior
35:37 MAP: Maximizing for Mu
36:16 MAP: Maximizing for Tau
37:51 MAP: Plugging in the values
39:35 MAP for Standard Deviation
40:34 Discussing the MAP
41:25 TFP: Creating a dataset
42:51 TFP: MLE
43:46 TFP: Encoding prior knowledge
44:38 TFP: MAP
46:25 TFP: MLE vs. MAP
47:07 TFP: Defining a Normal-Gamma Generator
48:36 TFP: Creating the prior
49:12 TFP: Calculating posterior's parameters
50:41 TFP: Creating the posterior
51:07 TFP: Comparing prior and posterior
52:12 TFP: Mode of the posterior
52:32 TFP: Corrupt data
55:32 Outro
Video Information
Views: 2.3K
Likes: 43
Duration: 56:15
Published: May 9, 2021
Quality: HD