Can I use lmBF() for logistic regression?


I need to perform a logistic regression with two categorical predictor variables (two levels each). I am trying to figure out whether I can use the lmBF() function from the BayesFactor package to do this. I could not find any information on this in the documentation. Bringing up ?regressionBF in R gives me this information:

The vector of observations y is assumed to be distributed as: y ~ Normal(α·1 + Xβ, σ²I).

This suggests to me that binomial ys are not appropriate.

I went ahead and tried it anyway, and lmBF() happily fits the models and returns results; I just don't know whether those results actually mean anything. Specifically, I compared the output of

glmer(y ~ f1 + f2 + f1:f2 + (1|subj) + (1|item), data = data, family = binomial)

with the output of

lmBF(y ~ f1 + f2 + f1:f2, whichRandom = c("subj", "item"), data = data)

and they corresponded quite closely.
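My hunch for why they might line up (this is my own reasoning, not anything from the BayesFactor documentation): around p = 0.5 the inverse-logit function is nearly linear, with slope 1/4, so a Gaussian linear model fit to the 0/1 responses can track the logistic fit quite closely as long as the fitted probabilities stay moderate. A quick numerical sketch:

# Inverse-logit is approximately linear near a logit of 0 (i.e. near p = 0.5):
invlogit <- function(x) 1 / (1 + exp(-x))
x <- seq(-1, 1, by = 0.01)
tangent <- 0.5 + x / 4              # tangent-line approximation at x = 0
max(abs(invlogit(x) - tangent))     # stays below 0.02 on [-1, 1]

So for logits within roughly ±1 (probabilities between about 0.27 and 0.73), the two models should give very similar fitted values, which would explain the close correspondence I am seeing.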

I also constructed a simpler example (without the random effects) to test whether the outcomes agree, and they seem to:

library(BayesFactor)

set.seed(123)  # for reproducibility
data <- data.frame(y  = rbinom(100, 1, .5),
                   f1 = as.factor(sample(rep(LETTERS[1:2], 50))),
                   f2 = as.factor(sample(rep(letters[1:2], 50))))

# Traditional logistic regression:
m.trad <- glm(y ~ f1 + f2, family = binomial(link = "logit"), data = data)

# Using lmBF (spelling out the predictors, since lmBF does not expand the `.` shorthand):
m.bf <- lmBF(y ~ f1 + f2, data = data)
chains <- posterior(m.bf, iterations = 10000)
coeff.est <- colMeans(chains)

# Comparing estimates for an observation with f1 = B and f2 = b
invlogit <- function(x) 1 / (1 + exp(-x))

# Trad. glm (predicted probability, back-transformed from the logit scale):
invlogit(sum(coef(m.trad)))  # intercept + f1B + f2b

# lmBF (the linear model estimates the probability directly):
coeff.est['mu'] + coeff.est['f1-B'] + coeff.est['f2-b']

Can someone put my mind at ease and confirm that I am doing this right, and that lmBF() does return meaningful parameter estimates (etc.) for binomially distributed ys?

Thanks a lot!

  • Florian
