Fitting Personalities

In the Fit Model launch window, select the fitting and analysis method by specifying a personality. Based on the response (or responses) and the factors that you enter, JMP makes an initial guess at the desired personality, but you can alter this selection in the Personality menu.

The following fitting personalities are available:

Standard Least Squares

Fits models where the response is continuous. Techniques include regression, analysis of variance, analysis of covariance, mixed models, and analysis of designed experiments. See “Standard Least Squares Models” and “Emphasis Options for Standard Least Squares”.
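In scripted workflows, the personality can also be set directly in a JSL Fit Model call. The following is a minimal sketch, assuming the Big Class sample data table; substituting a different personality name (for example, "Stepwise" or "Nominal Logistic") launches that analysis instead.

    // Minimal JSL sketch: launch Fit Model with an explicit personality
    dt = Open( "$SAMPLE_DATA/Big Class.jmp" );
    obj = dt << Fit Model(
        Y( :weight ),
        Effects( :height, :age ),
        Personality( "Standard Least Squares" ),
        Run
    );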

Stepwise

Facilitates variable selection for standard least squares analyses and for ordinal logistic analyses (or nominal logistic analyses with a binary response). For continuous responses, cross validation, p-value, BIC, and AICc criteria are provided, along with options for fitting all possible models and for model averaging. For logistic fits, p-value, BIC, and AICc criteria are provided. See Stepwise Regression Models.

Generalized Regression

Fits generalized linear models using regularized, also known as penalized, regression techniques. The regularization techniques include ridge regression, the lasso, the adaptive lasso, the elastic net, and the adaptive elastic net. The response distributions include the normal, binomial, Poisson, zero-inflated Poisson, negative binomial, zero-inflated negative binomial, and gamma. See Generalized Regression Models and Specify a Distribution.

Mixed Model

Fits a wide variety of linear models for continuous responses with complex covariance structures (a scripted sketch follows the list below). The situations addressed include:

Split plot experiments

Random coefficients models

Repeated measures designs

Spatial data

Correlated response data

See Mixed Models.
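As a rough illustration of the Mixed Model launch, the JSL sketch below marks one term as a random effect. The data table and column names are hypothetical, and the Random Effects argument reflects the usual launch-window roles.

    // Hypothetical sketch: Mixed Model personality with a random subject effect
    // (table and column names are assumed, not taken from this documentation)
    dt = Open( "mydata.jmp" );
    obj = dt << Fit Model(
        Y( :y ),
        Effects( :treatment ),          // fixed effect
        Random Effects( :subject ),     // random effect, for example subjects or blocks
        Personality( "Mixed Model" ),
        Run
    );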

Generalized Linear Mixed Model

Fits generalized linear mixed models for non-Gaussian response variables with random effects, such as blocking. The response distributions include the binomial and Poisson. See Generalized Linear Mixed Models.

Manova

Fits models that involve multiple continuous Y variables. Techniques include multivariate analysis of variance, repeated measures, discriminant analysis, and canonical correlations. See Multivariate Response Models.

Loglinear Variance

For a continuous Y variable, constructs models for both the mean and the variance. You can specify different sets of effects for the two models. See Loglinear Variance Models.

Nominal Logistic

Fits a logistic regression model to a nominal response. See Logistic Regression Models.

Ordinal Logistic

Fits a logistic regression model to an ordinal response. See Logistic Regression Models.
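Both logistic personalities are launched the same way; JMP suggests Nominal Logistic or Ordinal Logistic based on the modeling type of the response column. A minimal JSL sketch, again assuming the Big Class sample data table (where :sex is nominal):

    // Minimal sketch: nominal response, so the Nominal Logistic personality applies
    dt = Open( "$SAMPLE_DATA/Big Class.jmp" );
    obj = dt << Fit Model(
        Y( :sex ),
        Effects( :height, :weight ),
        Personality( "Nominal Logistic" ),
        Run
    );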

Proportional Hazard

Fits a semiparametric regression model (the Cox proportional hazards model) to assess the effect of explanatory variables on survival times, taking censoring into account.

You can also launch this personality by selecting Analyze > Reliability and Survival > Fit Proportional Hazards. See Fit Proportional Hazards in Reliability and Survival Methods.
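A hypothetical JSL sketch of this launch, assuming a table with a :Time response, a 0/1 :censor column, and a :treatment effect (all names are assumptions):

    // Hypothetical sketch: Cox proportional hazards fit with censoring
    dt = Open( "survival.jmp" );            // assumed table name
    obj = dt << Fit Model(
        Y( :Time ),
        Censor( :censor ),                  // censor role from the launch window
        Effects( :treatment ),
        Personality( "Proportional Hazard" ),
        Run
    );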

Parametric Survival

Fits a parametric regression model to survival times. Use this option if you have survival times that can be expressed as a function of one or more explanatory variables. Takes into account various survival distributions and censoring.

You can also launch this personality by selecting Analyze > Reliability and Survival > Fit Parametric Survival. See Fit Parametric Survival in Reliability and Survival Methods.

Generalized Linear Model

Fits generalized linear models using various distribution and link functions. Techniques include logistic, Poisson, and exponential regression. See Generalized Linear Models.
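As a sketch of how a Poisson regression might be scripted with this personality, the example below assumes a hypothetical count data table; the Distribution and Link Function arguments reflect launch-window choices and are stated here as assumptions rather than taken from this documentation.

    // Hypothetical sketch: Poisson regression via the Generalized Linear Model personality
    dt = Open( "counts.jmp" );              // assumed table name
    obj = dt << Fit Model(
        Y( :count ),
        Effects( :x1, :x2 ),
        Personality( "Generalized Linear Model" ),
        Distribution( Poisson ),            // assumed argument names for the
        Link Function( Log ),               // distribution and link choices
        Run
    );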

Partial Least Squares

Fits models to one or more Y variables using latent factors. This permits models to be fit when explanatory variables (X variables) are highly correlated, or when there are more X variables than observations.

You can also launch a partial least squares analysis by selecting Analyze > Multivariate Methods > Partial Least Squares. See Partial Least Squares Models in Multivariate Methods.
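A hypothetical JSL sketch of the Fit Model launch with several correlated predictors (the table and column names are assumptions):

    // Hypothetical sketch: Partial Least Squares with many correlated X variables
    dt = Open( "spectra.jmp" );             // assumed table name
    obj = dt << Fit Model(
        Y( :y1, :y2 ),
        Effects( :x1, :x2, :x3, :x4, :x5 ),
        Personality( "Partial Least Squares" ),
        Run
    );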

Response Screening

Automates the process of conducting tests for linear model effects across a large number of responses. Test results and summary statistics are presented in data tables and plots. A False-Discovery Rate (FDR) approach guards against incorrect declarations of significance. A robust estimation method reduces the sensitivity of tests to outliers.

In JMP Pro, the Response Screening personality also enables you to include random effects in your models. You can specify traditional variance component models or models with grouped regressors.

Note: This personality allows only continuous responses. Response Screening for individual factors is also available by selecting Analyze > Screening > Response Screening. This platform supports categorical responses, and also provides equivalence tests and tests of practical significance. See Response Screening in Predictive and Specialized Modeling.
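A hypothetical JSL sketch of the Fit Model Response Screening personality, testing the same model effects across several response columns (the table and column names are assumptions; in practice there are often dozens or hundreds of responses):

    // Hypothetical sketch: screen many responses against a common set of effects
    dt = Open( "assay.jmp" );               // assumed table name
    obj = dt << Fit Model(
        Y( :y1, :y2, :y3, :y4 ),
        Effects( :treatment, :batch ),
        Personality( "Response Screening" ),
        Run
    );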
