This is an audio version of the Wikipedia Article:
https://en.wikipedia.org/wiki/Prior_probability
00:02:03 1 Informative priors
00:03:11 2 Weakly informative priors
00:03:54 3 Uninformative priors
00:05:50 4 Improper priors
00:23:21 4.1 Examples
00:26:36 5 Notes
Listening is a more natural way of learning than reading. Written language only began around 3200 BC, but spoken language has existed far longer.
Learning by listening is a great way to:
- increase imagination and understanding
- improve your listening skills
- improve your spoken accent
- learn while on the move
- reduce eye strain
Now learn the vast amount of general knowledge available on Wikipedia through audio (audio article). You could even learn subconsciously by playing the audio while you sleep! If you plan to listen a lot, you could try using bone-conduction headphones or a standard speaker instead of earphones.
Listen on Google Assistant through Extra Audio:
https://assistant.google.com/services/invoke/uid/0000001a130b3f91
Other Wikipedia audio articles at:
https://www.youtube.com/results?search_query=wikipedia+tts
Upload your own Wikipedia articles through:
https://github.com/nodef/wikipedia-tts
Speaking Rate: 0.9153442364928753
Voice name: en-AU-Wavenet-B
"I cannot teach anybody anything, I can only make them think."
- Socrates
SUMMARY
=======
In Bayesian statistical inference, a prior probability distribution, often simply called the prior, of an uncertain quantity is the probability distribution that would express one's beliefs about this quantity before some evidence is taken into account. For example, the prior could be the probability distribution representing the relative proportions of voters who will vote for a particular politician in a future election. The unknown quantity may be a parameter of the model or a latent variable rather than an observable variable.
Bayes' theorem calculates the renormalized pointwise product of the prior and the likelihood function to produce the posterior probability distribution, which is the conditional distribution of the uncertain quantity given the data.
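For concreteness, here is a minimal sketch (using hypothetical coin-flip data, not taken from the article) of that renormalized pointwise product on a discrete grid of parameter values: posterior ∝ prior × likelihood, rescaled so it sums to one.

```python
import numpy as np

# Candidate values of an unknown Bernoulli success probability p.
p_grid = np.linspace(0.01, 0.99, 99)

# A flat prior over the grid, normalized to sum to 1.
prior = np.ones_like(p_grid)
prior /= prior.sum()

# Hypothetical data: 7 successes in 10 trials; Bernoulli likelihood at each p.
successes, trials = 7, 10
likelihood = p_grid**successes * (1 - p_grid)**(trials - successes)

posterior = prior * likelihood   # pointwise product of prior and likelihood
posterior /= posterior.sum()     # renormalize: Bayes' theorem on a grid

print(p_grid[np.argmax(posterior)])  # posterior mode, near 0.7
```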
Similarly, the prior probability of a random event or an uncertain proposition is the unconditional probability that is assigned before any relevant evidence is taken into account.
Priors can be created using a number of methods. A prior can be determined from past information, such as previous experiments. A prior can be elicited from the purely subjective assessment of an experienced expert. An uninformative prior can be created to reflect a balance among outcomes when no information is available. Priors can also be chosen according to some principle, such as symmetry or maximizing entropy given constraints; examples are the Jeffreys prior or Bernardo's reference prior. When a family of conjugate priors exists, choosing a prior from that family simplifies calculation of the posterior distribution.
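As one illustration of a principle-based choice (a standard result, stated here for concreteness rather than quoted from the article): for the success probability θ of a Bernoulli trial, the Jeffreys prior is proportional to the square root of the Fisher information, which works out to a Beta(1/2, 1/2) distribution.

```latex
% Jeffreys prior for the Bernoulli parameter \theta, where
% I(\theta) = 1 / (\theta (1 - \theta)) is the Fisher information.
p(\theta) \propto \sqrt{I(\theta)} = \theta^{-1/2} (1 - \theta)^{-1/2},
\qquad \theta \in (0, 1)
```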
Parameters of prior distributions are a kind of hyperparameter. For example, if one uses a beta distribution to model the distribution of the parameter p of a Bernoulli distribution, then:
- p is a parameter of the underlying system (Bernoulli distribution), and
- α and β are parameters of the prior distribution (beta distribution); hence hyperparameters.

Hyperparameters themselves may have hyperprior distributions expressing beliefs about their values. A Bayesian model with more than one level of prior like this is called a hierarchical Bayes model.
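A minimal sketch of how these hyperparameters behave under a conjugate update (the counts below are hypothetical): with a Beta(α, β) prior on p and k successes in n Bernoulli trials, the posterior is Beta(α + k, β + n − k).

```python
# Conjugate Beta-Bernoulli update; alpha and beta are the hyperparameters.
alpha, beta = 2.0, 2.0   # Beta prior hyperparameters (assumed values)
k, n = 7, 10             # hypothetical data: 7 successes in 10 trials

alpha_post = alpha + k        # posterior hyperparameters, in closed form
beta_post = beta + (n - k)

posterior_mean = alpha_post / (alpha_post + beta_post)
print(posterior_mean)         # 9/14, about 0.643
```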