The Generalized Linear Mixed Model (GLMM) personality of the Fit Model platform enables you to analyze models that have complex covariance structures and a variety of response distributions. The available distributions are normal, exponential, gamma, lognormal, beta, binomial, Poisson, and negative binomial, so you can fit categorical and count responses as well as continuous responses. The GLMM framework is useful for the following types of model structures:
• Randomized complete and incomplete block designs
• Split-plot experiments
• Random coefficient models
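In each of these structures, the fitted model can be written in a common form. The following is a sketch in generic notation (not JMP's own):

\[
g\bigl(\mathrm{E}[\,y \mid u\,]\bigr) = X\beta + Zu, \qquad u \sim N(0, G),
\]

where the response \(y\), conditional on the random effects \(u\), follows one of the distributions listed above, \(X\beta\) holds the fixed effects, \(Zu\) holds the random effects with covariance matrix \(G\), and \(g\) is a link function matched to the response distribution.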
The GLMM personality is a combination of two existing approaches: the linear mixed model framework and the generalized linear model framework.
Linear mixed models are fit in JMP using the Standard Least Squares or Mixed Model personalities of the Fit Model platform. Fitting a linear mixed model enables you to accurately represent random effects in the model. However, these models assume that the response variable is Gaussian, that is, continuous and unbounded. This assumption is problematic if you want to fit random effects but have a non-Gaussian response.
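For reference, a sketch of the linear mixed model in the same generic notation, which shows where the Gaussian assumption enters:

\[
y = X\beta + Zu + \varepsilon, \qquad u \sim N(0, G), \qquad \varepsilon \sim N(0, R),
\]

so the response itself is normally distributed, and there is no natural way to represent a bounded or discrete response.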
Generalized linear models are fit in JMP using the Generalized Linear Model or Generalized Regression personalities of the Fit Model platform. Fitting a generalized linear model enables you to model non-Gaussian responses, such as discrete count data or binary data. However, you cannot include random effects.
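A sketch of the generalized linear model in the same notation:

\[
g\bigl(\mathrm{E}[y]\bigr) = X\beta,
\]

where \(y\) follows a distribution such as binomial or Poisson and the link \(g\) maps its mean onto the linear predictor. The predictor contains fixed effects only; there is no \(Zu\) term.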
Combining the linear mixed model and the generalized linear model frameworks enables you to test hypotheses and obtain accurate estimates for non-Gaussian response distributions when you also have random effects. For example, you can fit a logistic regression with random effects using the GLMM personality.
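As an illustration of the idea (not of JMP itself), the sketch below fits a logistic regression with a random block intercept in Python using statsmodels. The data set and column names are hypothetical, and statsmodels estimates this model with variational Bayes rather than the likelihood-based approach a GLMM platform typically uses; the example is only meant to show the model structure.

```python
import numpy as np
import pandas as pd
from statsmodels.genmod.bayes_mixed_glm import BinomialBayesMixedGLM

# Hypothetical data: a binary outcome measured under a treatment,
# with observations grouped into blocks (the random effect).
rng = np.random.default_rng(0)
n_blocks, n_per_block = 20, 15
block = np.repeat(np.arange(n_blocks), n_per_block)
treatment = rng.integers(0, 2, size=block.size)
block_effect = rng.normal(0, 1, size=n_blocks)[block]   # random intercepts
logit = -0.5 + 1.0 * treatment + block_effect
y = rng.binomial(1, 1 / (1 + np.exp(-logit)))
data = pd.DataFrame({"y": y, "treatment": treatment, "block": block})

# Logistic regression with a fixed treatment effect and a random
# intercept for each block (a variance component on block).
model = BinomialBayesMixedGLM.from_formula(
    "y ~ treatment",               # fixed effects
    {"block": "0 + C(block)"},     # random effects (variance components)
    data,
)
result = model.fit_vb()            # variational Bayes fit
print(result.summary())
```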