Bayesian logistic regression: solving dependency problems
Hi,
I have two experiment replications, R1 and R2 (with some modifications in R2), deployed to intersecting (but not identical) populations of participants at two different time points.
I fit a Bayesian logistic regression to the R1 data and want to improve our estimates by fitting another regression to R2, using the R1 posteriors as priors for the factors shared between R1 and R2.
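A minimal sketch of this posterior-as-prior scheme under a Gaussian (Laplace) approximation, using only NumPy. The data are simulated and all names are illustrative, not from the actual experiments:

```python
import numpy as np

def fit_map(X, y, prior_mean, prior_prec, n_iter=50):
    """MAP fit of a logistic regression with a Gaussian prior (Newton's method).
    Returns the MAP estimate and the Laplace-approximate posterior covariance."""
    beta = prior_mean.astype(float).copy()
    H = prior_prec
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-X @ beta))
        grad = X.T @ (y - p) - prior_prec @ (beta - prior_mean)
        H = X.T @ (X * (p * (1 - p))[:, None]) + prior_prec
        beta = beta + np.linalg.solve(H, grad)
    return beta, np.linalg.inv(H)

rng = np.random.default_rng(0)
n, d = 500, 3
beta_true = np.array([0.5, -1.0, 0.25])

# R1: fit with a weak N(0, 10 I) prior.
X1 = rng.normal(size=(n, d))
y1 = rng.binomial(1, 1.0 / (1.0 + np.exp(-X1 @ beta_true)))
b1, S1 = fit_map(X1, y1, np.zeros(d), np.eye(d) / 10.0)

# R2: use the R1 posterior (b1, S1) as the prior for the shared factors.
X2 = rng.normal(size=(n, d))
y2 = rng.binomial(1, 1.0 / (1.0 + np.exp(-X2 @ beta_true)))
b2, S2 = fit_map(X2, y2, b1, np.linalg.inv(S1))

# The R2 posterior is tighter than the R1 posterior, as expected.
print(np.diag(S1), np.diag(S2))
```

In a full MCMC workflow (Stan/BUGS) you would pass the R1 posterior means and standard deviations in as prior hyperparameters instead of carrying the whole covariance, at the cost of discarding posterior correlations.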
Since the R1 and R2 populations intersect, what are the potential consequences for the R2 posteriors? I assume we might get overly narrow credible intervals, since the overlapping participants' data effectively enter the analysis twice? Anything else? Are there ways to estimate the size of this effect?
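The narrow-CI concern can be made concrete in a conjugate Gaussian toy model (an illustration I'm substituting for the logistic setting, since it has a closed form): feeding the same observations in twice via a posterior-as-prior step halves the posterior variance even though no new information was added.

```python
import numpy as np

rng = np.random.default_rng(1)
y = rng.normal(1.0, 1.0, size=100)   # data from the overlapping participants
n, sigma2 = len(y), 1.0              # known observation variance

# Correct posterior variance for the mean (flat prior): sigma2 / n.
var_once = sigma2 / n

# Posterior-as-prior with the SAME data reused: Gaussian precisions add,
# so the data precision n / sigma2 is counted twice.
var_twice = 1.0 / (n / sigma2 + n / sigma2)

print(var_once, var_twice)  # intervals shrink by sqrt(2) for free
```

In the real design only part of the R2 sample overlaps with R1, so the over-shrinkage would be milder, but in the same direction.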
What are good ways to alleviate the problem?
E.g., does it make sense to resample the R2 dataset to increase variability before fitting the model on R2?
Fit only to the R2 participants who didn’t participate in R1 (provided there are enough of them)?
Or adjust the weight of the priors (e.g., by widening them)?
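For the last option, one standard technique (my suggestion, not from the post) is a power-prior-style discount: raising the R1 likelihood contribution to a power a in (0, 1], which under a Gaussian approximation of the R1 posterior simply inflates its covariance by 1/a. A sketch with made-up numbers:

```python
import numpy as np

def discount_prior(mean, cov, a):
    """Down-weight an R1-posterior-derived Gaussian prior by a factor a in (0, 1].
    a = 1 keeps the full R1 information; smaller a widens the prior."""
    if not 0.0 < a <= 1.0:
        raise ValueError("a must be in (0, 1]")
    return mean, cov / a

m = np.array([0.5, -1.0])
C = np.array([[0.04, 0.00],
              [0.00, 0.09]])

# a = 0.5 ("count R1 half"): prior standard deviations grow by sqrt(2).
m_half, C_half = discount_prior(m, C, 0.5)
print(np.sqrt(np.diag(C_half)) / np.sqrt(np.diag(C)))
```

A defensible choice of a here would be tied to the overlap: e.g., discount roughly in proportion to the fraction of R2 participants who also appear in R1.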
Potential specifications for a multilevel version are also appreciated, but the sample size in R1/R2 is limited and I’m afraid a multilevel solution might be underpowered, undermining the idea of aggregating the two studies.
Thank you so much!
Comments
Interesting. What is reasonable to do depends on the extent to which R1 and R2 are the same. If they are completely the same, then you effectively have twice the number of observations for the intersecting participants, so you basically estimate the same individual-level effect. However, there are "some modifications" in R2, and the overlapping design should allow you to estimate the degree to which the parameters ought to differ.

So in a way this is a missing data problem, where the complete subset is the intersecting participants, and the incomplete subset is the participants who did only R1 or only R2. In BUGS/Stan, the "missing" participants could be coded "NA". You then model the difference between R1 and R2 as a separate parameter -- perhaps you'd even want to entertain a model which says that there are no meaningful differences between R1 and R2.

That's one way to think about this at least.
E.J.
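The joint-model idea in the comment can be encoded directly in the design matrix: stack R1 and R2 and add interaction columns that pick up an R1-R2 difference (call it delta) only for R2 rows. A sketch with made-up shapes and names:

```python
import numpy as np

rng = np.random.default_rng(2)
n1, n2, d = 200, 250, 3
X1 = rng.normal(size=(n1, d))   # R1 design matrix
X2 = rng.normal(size=(n2, d))   # R2 design matrix

# Indicator: 0 for R1 rows, 1 for R2 rows.
r = np.concatenate([np.zeros(n1), np.ones(n2)])
X = np.vstack([X1, X2])

# Joint design: columns [X, r * X]. The first d coefficients are the shared
# beta; the last d are delta, the R2-specific deviation. A spike-and-slab or
# point-mass prior on delta = 0 expresses "no meaningful difference".
X_joint = np.hstack([X, r[:, None] * X])

print(X_joint.shape)  # (450, 6)
```

Fitting a single model to this stacked design also avoids the double-counting issue above, since each observation enters the likelihood exactly once.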