2 ways to isolate effects in order to calculate Bayes Factors
I like the Bayesian approach, but I don't want to abandon conventional null hypothesis testing altogether.
So I want to run a conventional ANOVA and report the Bayes Factors for each effect / interaction:
e.g. "the factor A reliably modulated performance (F(x,y) = xxx, p = xxx, BF10 = xxx)".
But I am not exactly sure how I should calculate the BFs.
For two-group comparisons this is simple, because only a single alternative model is compared with the null.
Hence, the BF for that model is the BF for the effect of group (group 1 vs. group 2).
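Since the two-group BF is just a t-test BF, it can be sketched directly. The snippet below is a simplified implementation of the JZS Bayes factor (Rouder et al., 2009), integrating over the prior scale parameter g numerically; the Cauchy scale r = 0.707 and the numerical details are my assumptions and need not match your software's defaults.

```python
import numpy as np
from scipy import integrate

def jzs_bf10(t, n1, n2, r=0.707):
    """JZS Bayes factor BF10 for a two-sample t statistic,
    following Rouder et al. (2009). r is the Cauchy prior scale
    on effect size (assumed here, check your software's default)."""
    nu = n1 + n2 - 2
    n_eff = n1 * n2 / (n1 + n2)          # effective sample size

    # Likelihood of t under H0 (delta = 0), up to a shared constant
    null_like = (1 + t**2 / nu) ** (-(nu + 1) / 2)

    # Marginal likelihood under H1: delta | g ~ N(0, g) with an
    # inverse-chi^2(1) prior on g scaled by r^2, integrated out
    def integrand(g):
        return ((1 + n_eff * g) ** -0.5
                * (1 + t**2 / ((1 + n_eff * g) * nu)) ** (-(nu + 1) / 2)
                * r / np.sqrt(2 * np.pi) * g ** -1.5
                * np.exp(-r**2 / (2 * g)))

    alt_like, _ = integrate.quad(integrand, 0, np.inf)
    return alt_like / null_like
```

Feeding in the t statistic from a conventional two-sample test then yields the BF that would be reported next to the F and p values.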
In ANOVAs with several factors it is slightly more complicated.
1) One way to get the BF for all main effects and interactions is to isolate each effect by averaging the data across conditions. For example, in a two-way ANOVA with factors A and B, one could average across the levels of B to isolate the main effect of A, whose BF can then be calculated as described above.
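To make the collapsing step concrete, here is a small sketch with simulated (made-up) within-subject data; only the averaging itself is the point:

```python
import numpy as np

# Simulated within-subject data (purely illustrative): one row per
# subject, columns ordered A1B1, A1B2, A2B1, A2B2.
rng = np.random.default_rng(1)
data = rng.normal(size=(30, 4))
data[:, 2:] += 0.8                  # build in a main effect of A

# Average across the levels of B to isolate the main effect of A
a1 = data[:, [0, 1]].mean(axis=1)   # subject means at level A1
a2 = data[:, [2, 3]].mean(axis=1)   # subject means at level A2

# a1 vs a2 is now a simple two-condition comparison, so the
# single-model BF from the two-group case applies directly.
diff = a2 - a1
```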
2) Another, somewhat simpler, way is to put the entire data set into a Bayesian ANOVA, but to test only models containing a single effect of interest (e.g., factor A). For example, if you have a matrix with 4 columns (factors A and B, each with 2 levels), you would put this entire matrix into JASP. If you then enter only factor A as a "Component" of the Bayesian ANOVA, you get the BF for factor A.
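To illustrate the model-comparison logic behind this approach, here is a sketch using the BIC approximation to the Bayes factor (Wagenmakers, 2007) instead of JASP's default JZS priors, so the numbers will not match JASP's output; the design, effect size, and between-subjects simplification are all assumptions for illustration:

```python
import numpy as np

def bic(y, X):
    """BIC of an ordinary least-squares fit of y on X (Gaussian model)."""
    n, k = X.shape
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss = np.sum((y - X @ beta) ** 2)
    return n * np.log(rss / n) + k * np.log(n)

# Made-up 2x2 data in long format (between subjects for simplicity)
rng = np.random.default_rng(0)
a = np.repeat([0.0, 1.0], 40)              # factor A codes
b = np.tile(np.repeat([0.0, 1.0], 20), 2)  # factor B: present in the
                                           # data, entered in no model
y = 1.0 * a + rng.normal(size=80)          # build in an effect of A only

intercept = np.ones_like(y)
bic_null = bic(y, np.column_stack([intercept]))      # null model
bic_a = bic(y, np.column_stack([intercept, a]))      # model with A only

# BIC approximation: BF10 ~ exp((BIC_null - BIC_A) / 2)
bf10_a = np.exp((bic_null - bic_a) / 2)
```

The comparison is between the null model and the single model containing only factor A, while the full data set (including the B conditions) stays in place, which mirrors what entering only one component in JASP does.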
Both approaches are equivalent for conventional analyses: they return the same p-values. However, the BFs obtained with the two approaches are not identical. In fact, the BFs obtained with 2) are more extreme, that is, they provide stronger evidence either for or against the null model.
I assume this is because approach 2) takes into account data that are not directly associated with the effect of interest, namely the information that is lost when you collapse across conditions. So I suspect that approach 2) uses "more" information than 1) to compute the BF.
Is this correct? Would you recommend approach 2) over approach 1)?
Any advice or comments are highly appreciated.