# Is the Bayes Factor a Misnomer?

edited March 2018

I often see the Bayes Factor (BF) defined as the ratio of two quantities: the probability of the data given one statistical hypothesis, and the probability of the data given the alternative hypothesis. Thus, BF = P(data | H1) / P(data | H2). But why is this Bayesian? Wouldn't a true Bayesian want to know the ratio of the posterior probabilities--i.e., the ratio of the probability of H1 given the data to the probability of H2 given the data--such that The_Real_BF = P(H1 | data) / P(H2 | data)? After all, isn't Bayes' theorem's principal accomplishment to provide a way to derive posterior probabilities of hypotheses?

-- Richard Anderson

• Hi Richard,

You can multiply the Bayes factor by the prior model odds (p(H1)/p(H2)), which gives you the posterior model odds. The BF gives you the extent to which the data should change your beliefs. So yes, the prior model odds matter, but since they are so subjective, people usually report the BF and then consider the prior odds issue separately.
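The relationship E.J. describes can be sketched in a few lines of Python (the numbers here are purely illustrative, not from the thread):

```python
# Posterior model odds = Bayes factor * prior model odds.
# Illustrative values: a BF of 6 in favor of H1, and equal prior odds.
bf_12 = 6.0              # BF = p(data | H1) / p(data | H2)
prior_odds = 0.5 / 0.5   # p(H1) / p(H2)

posterior_odds = bf_12 * prior_odds  # p(H1 | data) / p(H2 | data)
print(posterior_odds)    # 6.0
```

With equal prior odds the multiplication leaves the BF unchanged; with, say, prior_odds = 0.2, the same BF of 6 would yield posterior odds of only 1.2.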

Cheers,
E.J.

• Thanks! That clarifies things a lot for me. One more related question. In the context of statistical analyses, I almost always take the prior model odds to be equal: p(H1) = p(H2). So am I correct to think that in this kind of (equal-priors) situation, p(data | H1) / p(data | H2) = p(H1 | data) / p(H2 | data)?

• That is correct.
E.J.
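The equal-priors equality can be checked numerically. A minimal sketch, assuming a made-up binomial setup (7 heads in 10 flips, with two hypothetical point hypotheses about the heads probability):

```python
from math import comb

# Binomial likelihood of k=7 heads in n=10 flips given heads probability p.
def binom_lik(p, k=7, n=10):
    return comb(n, k) * p**k * (1 - p)**(n - k)

lik1 = binom_lik(0.5)   # p(data | H1), H1: p = 0.5
lik2 = binom_lik(0.7)   # p(data | H2), H2: p = 0.7
bf = lik1 / lik2        # Bayes factor

# Equal prior model probabilities: p(H1) = p(H2) = 0.5.
prior1 = prior2 = 0.5
evidence = lik1 * prior1 + lik2 * prior2
post1 = lik1 * prior1 / evidence   # p(H1 | data)
post2 = lik2 * prior2 / evidence   # p(H2 | data)

# With equal priors, the posterior odds equal the Bayes factor.
assert abs(bf - post1 / post2) < 1e-12
```

The common prior and the shared evidence term cancel in the ratio, which is why the equality holds for any likelihoods, not just this binomial example.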