Bayesian learning, Bayesian statistics, and Bayesian analysis
All of them are built around one of the most important statistical concepts:
Bayes' Theorem
Here is Bayes' Theorem clearly explained
🧵
Bayes' Theorem describes the probability of an event, based on prior knowledge of conditions that might be related to the event.
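In symbols:

P(A|B) = P(B|A) · P(A) / P(B)

where A is the hypothesis we care about and B is the evidence we observe.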
In Bayes’ theorem, we have three elements:
1. Prior
2. Likelihood
3. Posterior
Let's discuss each!
1. The prior
The prior, P(A), is what we believe about A before observing B.
2. The Likelihood
The likelihood is the new information an event gives us.
It's a conditional probability, P(B|A).
It tells us how likely B is if A is true.
3. The Posterior
The posterior, P(A|B), is the updated (and more accurate) probability we calculate from the prior and the likelihood.
It tells us how likely A is given B.
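Mapping the elements onto the formula:

posterior = likelihood × prior / evidence
P(A|B) = P(B|A) · P(A) / P(B)

The denominator P(B), the evidence, is the total probability of observing B under every hypothesis. It isn't one of the three elements above, but it's what normalizes the posterior into a proper probability.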
In the disease example from yesterday:
The prior is 0.0001, since the only information we have is that one out of every 10,000 people is sick.
The new information is that you took a test and it came back positive.
twitter.com/levikul09/status/1610546613189238784?s=20
Using the prior and the likelihood, we can calculate the posterior.
In this case, we get 0.0098.
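Here is a minimal sketch of that calculation in Python. The test's accuracy isn't restated in this thread, so the sketch assumes 99% sensitivity and a 1% false-positive rate, which reproduce the 0.0098 figure:

# Posterior probability of disease after a positive test, via Bayes' Theorem.
prior = 0.0001          # P(disease): 1 in 10,000 people is sick
sensitivity = 0.99      # P(positive | disease) -- assumed, not from the thread
false_positive = 0.01   # P(positive | healthy) -- assumed, not from the thread

# Evidence: total probability of a positive test under both hypotheses
evidence = sensitivity * prior + false_positive * (1 - prior)

# Posterior: P(disease | positive) = likelihood * prior / evidence
posterior = sensitivity * prior / evidence

print(f"P(disease | positive) = {posterior:.4f}")  # prints 0.0098

So even after a positive result from a 99%-accurate test, the chance of actually being sick is under 1%: the tiny prior dominates.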
In general, Bayes' Theorem lets us reverse a conditional probability: knowing how likely B is given A, we can derive how likely A is given B.