JASP & Bayes for the absolute beginner

Hello,

I am trying to jump into learning Bayesian analyses with JASP, as it was suggested I do so for a manuscript. I am an absolute beginner with both JASP and Bayesian statistics, so I am very lost. Could someone recommend some resources for me? I read through a JASP tutorial and the program seems very easy to use, and I can already see how to run the Bayesian repeated measures ANOVA I need. What I don't understand is what ANYTHING in the output means (or how I would report any of it), and I can't seem to find any PDFs or videos that walk you through that. Any advice, links, or resources would be greatly appreciated.

Thank you!!

Comments

  • Thank you so much! Those were very helpful.

    I am working with a repeated measures ANOVA that, under null hypothesis testing, has two significant main effects but a nonsignificant interaction. I have now run a Bayesian repeated measures ANOVA and got this output (attached).

    Would it be correct to say that the model with the two main effects and the interaction is 0.268 times as likely as the model with only the two main effects, and that this therefore counts as evidence against the interaction?

    Could you also tell me why the interaction does not appear in a row by itself?

    THANK YOU for your continued help!!

    Ana


  • Yes, so the data are 1/0.268 = 3.73 times more likely under the two main effects model than under the model that also includes the interaction.

    JASP uses the principle of marginality, which means that if the interaction is included in the model, so are the constituent main effects.

    Cheers,

    E.J.
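
    (Not JASP output, just a way to check the arithmetic above: a minimal Python sketch of the inversion E.J. describes. The variable names are illustrative, not JASP column names.)

```python
# Bayes factor for the model that adds the interaction, compared with the
# two-main-effects model (the 0.268 read off Ana's JASP output).
bf_interaction_vs_main = 0.268

# Inverting the Bayes factor flips the direction of the comparison:
# how much more likely the data are under the two-main-effects model.
bf_main_vs_interaction = 1 / bf_interaction_vs_main

print(round(bf_main_vs_interaction, 2))  # 3.73
```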

  • Thank you!! One last question. Is this 3.73 then considered moderate evidence that the two main effects model is more likely?

  • Yes, although the labels are only heuristic guidelines; I would not bet my house on such evidence. One way to interpret the number: if H0 and H1 were equally likely a priori (0.50 each), a BF of 3.73 gives a posterior model probability for H0 of 3.73/4.73 = 0.79, leaving 0.21 for H1.

    E.J.
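
    (Again just a sketch of the arithmetic, assuming equal prior model probabilities of 0.50 as in E.J.'s example; the variable names are illustrative.)

```python
# Bayes factor in favour of H0 (the two-main-effects model), from above.
bf_01 = 3.73

# Equal prior model probabilities, as in E.J.'s example: P(H0) = P(H1) = 0.50.
prior_h0 = 0.50
prior_h1 = 0.50

# Posterior odds = Bayes factor * prior odds. With equal priors the prior
# odds are 1, so the posterior probability of H0 reduces to BF / (BF + 1).
posterior_odds = bf_01 * (prior_h0 / prior_h1)
posterior_h0 = posterior_odds / (posterior_odds + 1)

print(round(posterior_h0, 2))      # 0.79
print(round(1 - posterior_h0, 2))  # 0.21 for H1
```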
