Suppose that P(B)P(B^{c}) > 0. Then the events A and B are independent if and only if P(A\mid B)=P(A\mid B^{c}).
First, if A and B are independent, then A and B^c are also independent, by Theorem 6.
THEOREM 6
(i) If the events A_1, A_2 are independent, then so are all three pairs of events: A_{1},A_{2}^{c};\; A_{1}^{c},A_{2};\; A_{1}^{c},A_{2}^{c}.
(ii) More generally, if the events A_{1},\ldots, A_{n} are independent, then so are the events A_{1}^{\prime},\ldots ,A_{n}^{\prime} where A_{i}^\prime stands either for A_{i} or A_{i}^{c},\,i=1,\ldots,n.
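As a quick check of part (i) for one of these pairs, say A_{1},A_{2}^{c} (a brief sketch; the remaining pairs are handled in the same way), write A_{1}=(A_{1}\cap A_{2})\cup(A_{1}\cap A_{2}^{c}), a union of disjoint events, so that
P(A_{1}\cap A_{2}^{c})=P(A_{1})-P(A_{1}\cap A_{2})=P(A_{1})-P(A_{1})P(A_{2})=P(A_{1})[1-P(A_{2})]=P(A_{1})P(A_{2}^{c}),
which is precisely the independence of A_{1} and A_{2}^{c}.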
Thus, P(A\mid B^{c})=\frac{P(A\cap B^{c})}{P(B^{c})}=\frac{P(A)P(B^{c})}{P(B^{c})}=P(A). Since also P(A\mid B)=\frac{P(A\cap B)}{P(B)}=\frac{P(A)P(B)}{P(B)}=P(A), the equality P(A\mid B)=P(A\mid B^{c}) holds.
Next, P(A\mid B)=P(A\mid B^{c}) is equivalent to \frac{P(A\cap B)}{P(B)}=\frac{P(A\cap B^{c})}{P(B^{c})}, or P(A\cap B)P(B^{c})=P(A\cap B^{c})P(B), or P(A\cap B)[1-P(B)]=P(A\cap B^{c})P(B), or P(A\cap B)-P(A\cap B)P(B)=P(A\cap B^{c})P(B), or P(A\cap B)=[P(A\cap B)+P(A\cap B^{c})]P(B)=P(A)P(B), since (A\cap B)\cup(A\cap B^{c})=A. Thus, A and B are independent.
REMARK 3 The condition P(A\mid B)=P(A\mid B^{c}) for independence of the events A and B is intuitively quite natural: it says that the (conditional) probability of A remains the same no matter which one of B or B^{c} is given.
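For instance, roll a fair die once and let A=\{2,4,6\} (the outcome is even) and B=\{1,2,3,4\}. Then P(A)=\frac{1}{2}, P(B)=\frac{2}{3}, and P(A\cap B)=P(\{2,4\})=\frac{1}{3}=P(A)P(B), so A and B are independent. Accordingly, P(A\mid B)=\frac{1/3}{2/3}=\frac{1}{2} and P(A\mid B^{c})=\frac{P(\{6\})}{P(\{5,6\})}=\frac{1/6}{1/3}=\frac{1}{2}, so the two conditional probabilities coincide, and both equal P(A), as the result asserts.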