Summary
Computes a confusion matrix based on errors of omission and commission, then derives a kappa index of agreement between the classified map and data that is considered to be ground truth.
This tool uses the outputs from the Create Accuracy Assessment Points tool or the Update Accuracy Assessment Points tool.
Usage
This tool computes a confusion matrix using the randomly generated accuracy assessment points from the Create Accuracy Assessment Points tool. It calculates the user's accuracy and producer's accuracy for each class, as well as an overall kappa index of agreement. The accuracy rates range from 0 to 1, with 1 representing 100 percent accuracy. Although these measures express accuracy, they are traditionally referred to as error rates; to maintain convention, this document refers to them as error rates.
User's accuracy shows false positives, where pixels are incorrectly classified as a known class when they should have been classified as something else. An example would be where the classified image says a pixel is impervious but the ground truth says it is forest. The impervious class has extra pixels that it should not have according to the ground truth data.
User's accuracy is also referred to as errors of commission or Type I error. The data to compute this error rate is read from the rows of the table.
The Total row shows the number of points that should have been identified as a given class, according to the ground truth data.
Producer's accuracy shows false negatives, where pixels of a known class are classified as something other than that class. An example would be where the classified image says a pixel is forest, but the ground truth says it is impervious. In this case, the impervious class is missing pixels according to the ground truth data.
Producer's accuracy is also referred to as errors of omission or Type II error. The data to compute this error rate is read down the columns of the table.
The Total column shows the number of points that were identified as a given class, according to the classified map.
The kappa index of agreement gives an overall assessment of the accuracy of the classification.
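As a minimal sketch of the arithmetic behind these measures (the 3-by-3 matrix and its values below are hypothetical), the following stand-alone Python reads user's accuracy along the rows, producer's accuracy down the columns, and derives the kappa index from the observed and chance agreement:
matrix = [
    [50,  3,  2],  # points classified as class 1
    [ 4, 45,  1],  # points classified as class 2
    [ 6,  2, 37],  # points classified as class 3
]
n = float(sum(sum(row) for row in matrix))          # total number of points
row_totals = [sum(row) for row in matrix]           # Total column: classified counts
col_totals = [sum(col) for col in zip(*matrix)]     # Total row: ground truth counts
diagonal = [matrix[i][i] for i in range(len(matrix))]

# User's accuracy: correct points divided by all points classified as the class
users = [diagonal[i] / float(row_totals[i]) for i in range(len(matrix))]

# Producer's accuracy: correct points divided by all ground truth points of the class
producers = [diagonal[i] / float(col_totals[i]) for i in range(len(matrix))]

# Kappa: observed agreement corrected for the agreement expected by chance
p_o = sum(diagonal) / n
p_e = sum(row_totals[i] * col_totals[i] for i in range(len(matrix))) / (n * n)
kappa = (p_o - p_e) / (1 - p_e)

print(users)
print(producers)
print(kappa)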
Syntax
ComputeConfusionMatrix (in_accuracy_assessment_points, out_confusion_matrix)
Parameter | Explanation | Data Type
--- | --- | ---
in_accuracy_assessment_points | The accuracy assessment point feature class, created from the Create Accuracy Assessment Points tool, containing the CLASSIFIED and GROUND_TRUTH fields. | Feature Layer
out_confusion_matrix | The output confusion matrix in table format. The format of the table is determined by the output location and path. By default, the output is a geodatabase table. If the path is not in a geodatabase, specify a .dbf extension to save it in dBASE format. | Table
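To illustrate how the output path controls the table format (the paths below are hypothetical), the same tool call can write either a geodatabase table or a dBASE table:
import arcpy
from arcpy.sa import *
arcpy.CheckOutExtension("Spatial")
# Writing inside a geodatabase produces a geodatabase table
ComputeConfusionMatrix("aapnt2.shp", "C:/data/results.gdb/confusion")
# A .dbf extension outside a geodatabase produces a dBASE table
ComputeConfusionMatrix("aapnt2.shp", "C:/data/confm.dbf")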
Code sample
ComputeConfusionMatrix example 1 (stand-alone script)
This example computes the confusion matrix based on accuracy assessment points.
import arcpy
from arcpy.sa import *
# Check out the Spatial Analyst extension required by the tool
arcpy.CheckOutExtension("Spatial")
# Compute the confusion matrix from the accuracy assessment points
ComputeConfusionMatrix("aapnt2.shp", "confm.dbf")
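ComputeConfusionMatrix example 2 (stand-alone script)
A minimal follow-up sketch that inspects the confusion matrix table, assuming the confm.dbf output from example 1 exists in the current workspace. The field names are read from the table itself rather than assumed.
import arcpy
# Print the field names, then each row of the confusion matrix table
with arcpy.da.SearchCursor("confm.dbf", "*") as cursor:
    print(cursor.fields)
    for row in cursor:
        print(row)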
Environments
Licensing information
- ArcGIS Desktop Basic: Requires Spatial Analyst
- ArcGIS Desktop Standard: Requires Spatial Analyst
- ArcGIS Desktop Advanced: Requires Spatial Analyst