# Conditional dependence

In probability theory, conditional dependence is a relationship between two or more events that are dependent when a third event occurs. For example, if $\displaystyle{ A }$ and $\displaystyle{ B }$ are two events that individually increase the probability of a third event $\displaystyle{ C, }$ and do not directly affect each other, then initially (when it has not been observed whether or not the event $\displaystyle{ C }$ occurs) $\displaystyle{ \operatorname{P}(A \mid B) = \operatorname{P}(A) \quad \text{ and } \quad \operatorname{P}(B \mid A) = \operatorname{P}(B) }$ ($\displaystyle{ A \text{ and } B }$ are independent).

But suppose that now $\displaystyle{ C }$ is observed to occur. If event $\displaystyle{ B }$ is also known to occur, the probability that $\displaystyle{ A }$ occurred decreases: $\displaystyle{ B }$ already accounts for $\displaystyle{ C, }$ so the positive relation of $\displaystyle{ A }$ to $\displaystyle{ C }$ is less needed as an explanation for the occurrence of $\displaystyle{ C }$ (symmetrically, knowing that $\displaystyle{ A }$ occurred decreases the probability of $\displaystyle{ B }$). This phenomenon is often called "explaining away". The two events $\displaystyle{ A }$ and $\displaystyle{ B }$ are now conditionally negatively dependent on each other: given $\displaystyle{ C, }$ the probability of each is decreased by the occurrence of the other. We have $\displaystyle{ \operatorname{P}(A \mid C \text{ and } B) \lt \operatorname{P}(A \mid C). }$

Conditional dependence is different from conditional independence, in which two events (which may or may not be dependent unconditionally) become independent given the occurrence of a third event.
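To see the contrast concretely, here is a minimal sketch (a hypothetical common-cause example, not from the article): a fair coin $C$ is tossed, and given its outcome, events $A$ and $B$ each occur independently with probability $9/10$ (if $C$ occurs) or $1/10$ (if not). Then $A$ and $B$ are unconditionally dependent but conditionally independent given $C$. All probabilities here are illustrative assumptions.

```python
from fractions import Fraction

# Hypothetical common-cause setup (illustrative, not from the article):
# P(C) = 1/2; given C, A and B each occur independently with prob. 9/10;
# given not-C, each occurs independently with prob. 1/10.
half, hi, lo = Fraction(1, 2), Fraction(9, 10), Fraction(1, 10)

p_A = half * hi + half * lo              # P(A) = 1/2 (by symmetry, P(B) too)
p_B = p_A
p_AB = half * hi * hi + half * lo * lo   # P(A and B) = 41/100
p_A_given_B = p_AB / p_B                 # = 41/50, not equal to P(A) = 1/2

# Given C, independence holds by construction:
p_A_given_C = hi
p_A_given_B_and_C = (hi * hi) / hi       # = 9/10 = P(A | C)

assert p_A_given_B != p_A                    # unconditionally dependent
assert p_A_given_B_and_C == p_A_given_C      # conditionally independent given C
```

This is the mirror image of the situation in this article, where $A$ and $B$ start out independent and become dependent once $C$ is observed.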

## Example

In essence, probability is influenced by a person's information about the possible occurrence of an event. For example, let event $\displaystyle{ A }$ be 'I have a new phone', event $\displaystyle{ B }$ be 'I have a new watch', and event $\displaystyle{ C }$ be 'I am happy', and suppose that having either a new phone or a new watch increases the probability of my being happy. Assume that the event $\displaystyle{ C }$ has occurred, meaning 'I am happy'. Now if another person sees my new watch, they will reason that my new watch already makes my happiness likely, so there is less need to attribute my happiness to a new phone.

To make the example more numerically specific, suppose that there are four possible states $\displaystyle{ \Omega = \left\{ s_1, s_2, s_3, s_4 \right\}, }$ given in the middle four columns of the following table, in which the occurrence of event $\displaystyle{ A }$ is signified by a $\displaystyle{ 1 }$ in row $\displaystyle{ A }$ and its non-occurrence is signified by a $\displaystyle{ 0, }$ and likewise for $\displaystyle{ B }$ and $\displaystyle{ C. }$ That is, $\displaystyle{ A = \left\{ s_2, s_4 \right\}, B = \left\{ s_3, s_4 \right\}, }$ and $\displaystyle{ C = \left\{ s_2, s_3, s_4 \right\}. }$ The probability of $\displaystyle{ s_i }$ is $\displaystyle{ 1/4 }$ for every $\displaystyle{ i. }$

| Event | $\displaystyle{ \operatorname{P}(s_1)=1/4 }$ | $\displaystyle{ \operatorname{P}(s_2)=1/4 }$ | $\displaystyle{ \operatorname{P}(s_3)=1/4 }$ | $\displaystyle{ \operatorname{P}(s_4)=1/4 }$ | Probability of event |
|---|---|---|---|---|---|
| $\displaystyle{ A }$ | 0 | 1 | 0 | 1 | $\displaystyle{ \tfrac{1}{2} }$ |
| $\displaystyle{ B }$ | 0 | 0 | 1 | 1 | $\displaystyle{ \tfrac{1}{2} }$ |
| $\displaystyle{ C }$ | 0 | 1 | 1 | 1 | $\displaystyle{ \tfrac{3}{4} }$ |

and so

| Event | $\displaystyle{ s_1 }$ | $\displaystyle{ s_2 }$ | $\displaystyle{ s_3 }$ | $\displaystyle{ s_4 }$ | Probability of event |
|---|---|---|---|---|---|
| $\displaystyle{ A \cap B }$ | 0 | 0 | 0 | 1 | $\displaystyle{ \tfrac{1}{4} }$ |
| $\displaystyle{ A \cap C }$ | 0 | 1 | 0 | 1 | $\displaystyle{ \tfrac{1}{2} }$ |
| $\displaystyle{ B \cap C }$ | 0 | 0 | 1 | 1 | $\displaystyle{ \tfrac{1}{2} }$ |
| $\displaystyle{ A \cap B \cap C }$ | 0 | 0 | 0 | 1 | $\displaystyle{ \tfrac{1}{4} }$ |

In this example, $\displaystyle{ C }$ occurs if and only if at least one of $\displaystyle{ A, B }$ occurs. Unconditionally (that is, without reference to $\displaystyle{ C }$), $\displaystyle{ A }$ and $\displaystyle{ B }$ are independent of each other because $\displaystyle{ \operatorname{P}(A) }$—the sum of the probabilities associated with a $\displaystyle{ 1 }$ in row $\displaystyle{ A }$—is $\displaystyle{ \tfrac{1}{2}, }$ while $\displaystyle{ \operatorname{P}(A\mid B) = \operatorname{P}(A \text{ and } B) / \operatorname{P}(B) = \tfrac{1/4}{1/2} = \tfrac{1}{2} = \operatorname{P}(A). }$ But conditional on $\displaystyle{ C }$ having occurred (the states $\displaystyle{ s_2, s_3, s_4, }$ i.e. the last three state columns in the table), we have $\displaystyle{ \operatorname{P}(A \mid C) = \operatorname{P}(A \text{ and } C) / \operatorname{P}(C) = \tfrac{1/2}{3/4} = \tfrac{2}{3} }$ while $\displaystyle{ \operatorname{P}(A \mid C \text{ and } B) = \operatorname{P}(A \text{ and } C \text{ and } B) / \operatorname{P}(C \text{ and } B) = \tfrac{1/4}{1/2} = \tfrac{1}{2} \lt \operatorname{P}(A \mid C). }$ Since in the presence of $\displaystyle{ C }$ the probability of $\displaystyle{ A }$ is affected by the presence or absence of $\displaystyle{ B, }$ the events $\displaystyle{ A }$ and $\displaystyle{ B }$ are mutually dependent conditional on $\displaystyle{ C. }$
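The calculations above can be checked mechanically by enumerating the four equally likely states from the table. The following sketch uses exact rational arithmetic via Python's standard-library `fractions` module; the state labels and helper names (`prob`, `cond`) are ours, not from the article.

```python
from fractions import Fraction

# The four states from the table, each with probability 1/4.
# A state is a triple (a, b, c) of 0/1 indicators for events A, B, C.
states = {
    "s1": (0, 0, 0),
    "s2": (1, 0, 1),
    "s3": (0, 1, 1),
    "s4": (1, 1, 1),
}
p = {s: Fraction(1, 4) for s in states}

def prob(pred):
    """Probability of the set of states satisfying pred."""
    return sum((p[s] for s in states if pred(*states[s])), Fraction(0))

def cond(pred, given):
    """Conditional probability P(pred | given)."""
    joint = prob(lambda a, b, c: pred(a, b, c) and given(a, b, c))
    return joint / prob(given)

A = lambda a, b, c: a == 1
B = lambda a, b, c: b == 1
C = lambda a, b, c: c == 1
B_and_C = lambda a, b, c: b == 1 and c == 1

# Unconditionally, A and B are independent:
assert prob(A) == cond(A, B) == Fraction(1, 2)

# Conditional on C, observing B lowers the probability of A:
assert cond(A, C) == Fraction(2, 3)
assert cond(A, B_and_C) == Fraction(1, 2)
assert cond(A, B_and_C) < cond(A, C)
```

Enumerating the sample space directly like this sidesteps any floating-point rounding and mirrors the "sum the columns" reasoning used in the text.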