Agreement Reports
Note: The Kappa statistic expresses the degree of agreement. The closer Kappa is to 1, the stronger the agreement; values near 0 indicate little agreement beyond what would be expected by chance.
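In its general chance-corrected form, Kappa compares the observed proportion of agreement with the proportion expected by chance (the exact estimator can differ by report; this form is shown only for orientation):

\[ \kappa = \frac{P_o - P_e}{1 - P_e} \]

where P_o is the observed proportion of agreement and P_e is the proportion of agreement expected by chance.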
The Agreement Report shows agreement summarized for each rater and overall agreement. This report is a numeric form of the data presented in the second chart in the Attribute Gauge Chart report. See Attribute Gauge Chart.
The Agreement Comparisons report compares each rater with every other rater, using Kappa statistics. Each rater is also compared with the standard, but only if you specified a Standard variable in the launch window.
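For two raters, a pairwise Kappa can be computed from their classifications of the same items. The following Python sketch is illustrative only (it is not JSL and not necessarily the exact estimator used in this report); it implements the standard Cohen's Kappa calculation:

from collections import Counter

def cohens_kappa(ratings_a, ratings_b):
    # Chance-corrected agreement between two raters on the same items.
    if len(ratings_a) != len(ratings_b):
        raise ValueError("Both raters must rate the same items.")
    n = len(ratings_a)
    # Observed agreement: proportion of items the raters classify identically.
    p_o = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    # Expected agreement: probability of a match under each rater's marginal rates.
    freq_a = Counter(ratings_a)
    freq_b = Counter(ratings_b)
    p_e = sum(freq_a[c] * freq_b[c] for c in freq_a) / n ** 2
    return (p_o - p_e) / (1 - p_e)

# Example: two raters classify six parts as pass (P) or fail (F).
print(cohens_kappa(list("PPFFPF"), list("PPFFFF")))  # about 0.67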
The Agreement within Raters report shows the number of items that were inspected. The confidence intervals are score confidence intervals (as suggested by Agresti and Coull, 1998). The Number Matched is the number of inspected items for which the rater agreed with himself or herself on every inspection of that item. The Rater Score is the Number Matched divided by the Number Inspected.
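As an illustration of a score-type interval, the Python sketch below computes a rater score and its confidence interval from the Number Matched and Number Inspected. It assumes the Wilson form of the score interval, which may differ in detail from the interval JMP reports:

from math import sqrt

def rater_score_interval(n_matched, n_inspected, z=1.96):
    # Rater Score (Number Matched / Number Inspected) with a Wilson score
    # confidence interval (assumed form, shown for illustration).
    p_hat = n_matched / n_inspected
    denom = 1 + z ** 2 / n_inspected
    center = (p_hat + z ** 2 / (2 * n_inspected)) / denom
    half = (z / denom) * sqrt(p_hat * (1 - p_hat) / n_inspected
                              + z ** 2 / (4 * n_inspected ** 2))
    return p_hat, center - half, center + half

# Example: a rater matched their own ratings on 47 of 50 inspected items.
score, lower, upper = rater_score_interval(47, 50)
print(round(score, 3), round(lower, 3), round(upper, 3))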
The Agreement across Categories report shows how much the classifications agree beyond what would be expected by chance. It assesses agreement among a fixed number of raters who classify the same items.
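One common chance-corrected statistic for a fixed number of raters classifying the same items is Fleiss' Kappa. The Python sketch below is illustrative only and is not necessarily the exact statistic displayed in this report:

def fleiss_kappa(counts):
    # counts[i][j] = number of raters assigning item i to category j;
    # every row must sum to the same number of raters.
    n_items = len(counts)
    n_raters = sum(counts[0])
    # Per-item agreement: proportion of agreeing rater pairs for that item.
    per_item = [(sum(c * c for c in row) - n_raters)
                / (n_raters * (n_raters - 1)) for row in counts]
    p_bar = sum(per_item) / n_items
    # Chance agreement from the overall category proportions.
    category_totals = [sum(col) for col in zip(*counts)]
    p_e = sum((t / (n_items * n_raters)) ** 2 for t in category_totals)
    return (p_bar - p_e) / (1 - p_e)

# Example: 4 items, 3 raters, 2 categories (entries are rater counts).
print(fleiss_kappa([[3, 0], [2, 1], [0, 3], [1, 2]]))  # about 0.33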