Causal Diagram: Confronting the Achilles’ Heel in Observational Data
From Bayes’s rule to inverse probability

Photo by Андрей Сизов on Unsplash

“The Book of Why” Chapters 3 & 4, a Read with Me series

In my previous two articles, I kicked off the “Read with Me” series and covered the first two chapters of “The Book of Why” by Judea Pearl. Those articles discuss why causality is needed to enable human-like decision-making and introduce the Ladder of Causation, which lays the foundation for the discussions to come. In this article, we will explore the keyholes that open the door from the first to the second rung of the ladder, allowing us to move beyond probability and into causal thinking. We will go from Bayes’s rule to Bayesian networks and, finally, to causal diagrams.

As a fan of detective novels, my favorite series is Sherlock Holmes. I still remember all the days and nights I spent reading them without noticing the time pass by. Years later, most of the case details have faded from my memory, but like everyone else, I still remember the famous quote:

When you have eliminated the impossible, whatever remains, however improbable, must be the truth.

Translating this quote into the language of statistics, there are two kinds of probabilities: forward probability and inverse probability. Following Sherlock Holmes’s deductive reasoning, detective work is simply finding the murderer with the highest inverse probability.
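To make the detective analogy concrete, here is a minimal Python sketch with made-up suspects and numbers (my own illustration, not from the book). The forward probability is P(evidence | suspect), how likely the observed evidence is if a given suspect is guilty; the inverse probability is P(suspect | evidence), and the "detective" simply picks the suspect with the highest posterior:

```python
# Forward probability:  P(evidence | suspect) -- how likely the evidence is if this suspect did it.
# Inverse probability:  P(suspect | evidence) -- how likely this suspect is, given the evidence.
# All numbers below are hypothetical, for illustration only.

priors = {"butler": 0.2, "gardener": 0.5, "chauffeur": 0.3}        # prior belief P(suspect)
likelihoods = {"butler": 0.9, "gardener": 0.1, "chauffeur": 0.4}   # forward P(evidence | suspect)

# Bayes's rule: P(suspect | evidence) is proportional to P(evidence | suspect) * P(suspect).
unnormalized = {s: likelihoods[s] * priors[s] for s in priors}
evidence = sum(unnormalized.values())                # P(evidence), the normalizing constant
posteriors = {s: p / evidence for s, p in unnormalized.items()}

# "Whatever remains, however improbable": pick the suspect with the highest inverse probability.
culprit = max(posteriors, key=posteriors.get)
print(posteriors)   # {'butler': 0.514..., 'gardener': 0.142..., 'chauffeur': 0.342...}
print(culprit)      # 'butler'
```

With these assumed numbers the butler comes out on top even though his prior was the lowest, because the evidence is far more likely under his guilt.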

Photo by Markus Winkler on Unsplash

Going from forward probability to inverse probability, we are not merely flipping the order of the variables; we are also imposing a causal relationship. As briefly discussed in the previous article, Bayes’s rule provides a bridge that connects objective data (evidence) with subjective opinions (prior belief). Based on Bayes’s rule, we can calculate conditional probabilities for any two variables. For any variables A and B, given that B has happened, the probability of A happening is:

P(A|B) = P(A&B)/P(B)
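As a quick sanity check of this definition, here is a minimal Python sketch using an assumed joint distribution over two binary events (the numbers are illustrative, not from the book). It also shows that flipping the conditioning, P(B|A), generally gives a different answer, which is why the jump from forward to inverse probability needs care:

```python
# A minimal sketch of P(A|B) = P(A & B) / P(B) on an assumed joint distribution.
# A: "the street is wet", B: "it rained".
joint = {
    (True, True): 0.27,   # wet street and rain
    (True, False): 0.13,  # wet street, no rain (e.g., a sprinkler ran)
    (False, True): 0.03,  # rain, but the street dried quickly
    (False, False): 0.57, # neither
}

def p_A(a):
    """Marginal P(A = a): sum the joint probabilities over all values of B."""
    return sum(p for (aa, _), p in joint.items() if aa == a)

def p_B(b):
    """Marginal P(B = b): sum the joint probabilities over all values of A."""
    return sum(p for (_, bb), p in joint.items() if bb == b)

def p_A_given_B(a, b):
    """Conditional P(A = a | B = b) = P(A = a, B = b) / P(B = b)."""
    return joint[(a, b)] / p_B(b)

def p_B_given_A(b, a):
    """Conditional P(B = b | A = a) = P(A = a, B = b) / P(A = a)."""
    return joint[(a, b)] / p_A(a)

print(p_A_given_B(True, True))  # P(wet | rain) = 0.27 / 0.30 = 0.9
print(p_B_given_A(True, True))  # P(rain | wet) = 0.27 / 0.40 = 0.675
```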
