K-Fold Crossvalidation
In K-Fold crossvalidation, the entire set of observations is partitioned into K subsets, called folds. Each fold is treated as a holdback sample, with the remaining observations used as the training set.
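The following sketch (written in Python rather than JSL, purely for illustration) shows one way this partitioning might be carried out; the k_fold_splits helper, the fold count, and the random seed are assumptions for the example and are not part of JMP.

import numpy as np

# Conceptual sketch (not JMP code) of partitioning n observations into K folds.
# Each fold serves once as the holdback sample; the remaining rows form the training set.
def k_fold_splits(n_obs, k, seed=0):
    """Yield (training_rows, holdback_rows) index pairs, one pair per fold."""
    rng = np.random.default_rng(seed)
    folds = np.array_split(rng.permutation(n_obs), k)
    for fold in folds:
        training = np.setdiff1d(np.arange(n_obs), fold)
        yield training, fold

# Example: 10 observations with K = 5 gives five holdback samples of 2 rows each.
for training, holdback in k_fold_splits(10, 5):
    print(len(training), "training rows,", len(holdback), "holdback rows")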
Unconstrained optimization of the crossvalidation RSquare value tends to overfit models. To address this tendency, the K-Fold crossvalidation stopping rule terminates stepping when improvement in the crossvalidation RSquare is minimal. Specifically, the stopping rule selects a model for which none of the next ten models have a crossvalidation RSquare showing an improvement of more than 0.005 units.
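The stopping rule can be viewed as a look-ahead scan over the sequence of crossvalidation RSquare values recorded for successive models. The sketch below is a conceptual Python illustration of that rule, not JMP's implementation; select_model, lookahead, and tolerance are placeholder names, with the look-ahead window and threshold set to the values stated above.

# Conceptual sketch (not JMP code) of the look-ahead stopping rule described above.
# cv_rsquares is assumed to hold the crossvalidation RSquare of each successive model.
def select_model(cv_rsquares, lookahead=10, tolerance=0.005):
    """Return the index of the first model that none of the next `lookahead`
    models improve on by more than `tolerance` RSquare units."""
    for i, r2 in enumerate(cv_rsquares):
        window = cv_rsquares[i + 1 : i + 1 + lookahead]
        if all(r2_next - r2 <= tolerance for r2_next in window):
            return i
    return len(cv_rsquares) - 1  # fall back to the last model fit

# Example: prints 4, because none of the ten models after index 4
# improve its crossvalidation RSquare by more than 0.005.
print(select_model([0.42, 0.55, 0.61, 0.64, 0.645, 0.646, 0.644, 0.647,
                    0.646, 0.645, 0.644, 0.648, 0.646, 0.645, 0.644]))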
When you select the K Fold Crossvalidation option, a Crossvalidation report appears. The results in this report update as you split the decision tree. If you instead click Go, the outline shows the results for the final model.
Crossvalidation Report
The Crossvalidation report shows the following:
k-fold
Number of folds.
-2LogLike or SSE
Gives twice the negative log-likelihood (-2LogLikelihood) values when the response is categorical. Gives the sum of squared errors (SSE) when the response is continuous. The first row gives results averaged over the folds. The second row gives results for the single model fit to all observations.
RSquare
The first row gives the RSquare value averaged over the folds. The second row gives the RSquare value for the single model fit to all observations. Both rows are illustrated in the sketch following this list.
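For a continuous response, the two rows of the report could be computed roughly as in the following conceptual Python sketch; the crossvalidation_rows helper and the fit and predict callables are placeholders for illustration, not JMP functions.

import numpy as np

# Conceptual sketch (not JMP output) of the two rows of the Crossvalidation report
# for a continuous response. fit(X, y) and predict(model, X) are placeholder callables.
def crossvalidation_rows(X, y, fit, predict, k=5, seed=0):
    rng = np.random.default_rng(seed)
    folds = np.array_split(rng.permutation(len(y)), k)

    def sse_rsq(y_true, y_pred):
        sse = np.sum((y_true - y_pred) ** 2)
        return sse, 1 - sse / np.sum((y_true - y_true.mean()) ** 2)

    # First row: SSE and RSquare averaged over the k holdback folds.
    per_fold = []
    for fold in folds:
        train = np.setdiff1d(np.arange(len(y)), fold)
        model = fit(X[train], y[train])
        per_fold.append(sse_rsq(y[fold], predict(model, X[fold])))
    folded_sse, folded_rsq = np.mean(per_fold, axis=0)

    # Second row: the single model fit to all observations.
    overall_sse, overall_rsq = sse_rsq(y, predict(fit(X, y), X))

    return (folded_sse, folded_rsq), (overall_sse, overall_rsq)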