Available with a Production Mapping license.
Reviewer batch jobs are created with ArcGIS Data Reviewer for Desktop and can be linked to the data currently loaded in your map through the Batch Job dialog box. You can associate several batch jobs with the product library and assign each of them to one of the following categories:
- Attribute—This type of batch job contains checks that can be run through the Feature Manager, either in batch from the context menu on the Create or Update tabs, or when attributes are edited and applied to a single feature.
- Spatial—This type of batch job contains checks that can be run using the Validate Current Selection or Validate Current Extent tool.
- Reviewer—This type of batch job contains checks that can be run using various tools in a Reviewer session.
All of these batch jobs contain checks for conditions that are specific to a particular data model or product specification. For instance, the product specification could indicate that all building area features must be no smaller than 100 square meters or that the NAM attribute cannot contain a null value. Each of these conditions can be specified as a check in a spatial or attribute batch job.
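As a rough illustration of what such conditions amount to, the following Python sketch expresses the two example rules directly with an arcpy cursor. This is not how Reviewer evaluates checks; the feature class path and field names are assumptions, and the area test assumes a projected coordinate system measured in meters.

```python
import arcpy

# Illustration only: the two example conditions expressed as an arcpy cursor.
# Reviewer checks are configured in batch jobs rather than scripted; the
# feature class path and field names below are assumptions.
buildings = r"C:\Data\Production.gdb\BuildingA"  # assumed building area feature class

with arcpy.da.SearchCursor(buildings, ["OID@", "SHAPE@AREA", "NAM"]) as cursor:
    for oid, area, nam in cursor:
        # Building area features should be no smaller than 100 square meters
        # (assumes a projected coordinate system with meter units).
        if area < 100:
            print("Feature {0}: area {1:.1f} is smaller than 100 square meters".format(oid, area))
        # The NAM attribute cannot contain a null value.
        if nam is None:
            print("Feature {0}: NAM is null".format(oid))
```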
Attribute batch jobs
The Attribute batch job contains Reviewer checks that search for specific attribute combinations or values in the data. For example, the Domain check searches the selected features for values that do not comply with rules specified in the schema for domains. This could include domains that have null values or values that fall outside the specified range.
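As a sketch of the kind of comparison the Domain check performs, the Python below reads a field's domain with arcpy.da.ListDomains and flags null values, values missing from a coded value domain, and values outside a range domain. The workspace, feature class, field, and domain names are assumptions for the example.

```python
import arcpy

# Illustration only: flag values that do not comply with a field's domain.
# Workspace, feature class, field, and domain names are assumptions.
workspace = r"C:\Data\Production.gdb"
feature_class = workspace + r"\RoadL"
field = "RTT"                 # assumed field that uses a domain
domain_name = "RouteType"     # assumed domain assigned to that field

domain = next(d for d in arcpy.da.ListDomains(workspace) if d.name == domain_name)

with arcpy.da.SearchCursor(feature_class, ["OID@", field]) as cursor:
    for oid, value in cursor:
        if value is None:
            print("Feature {0}: {1} is null".format(oid, field))
        elif domain.domainType == "CodedValue" and value not in domain.codedValues:
            print("Feature {0}: value {1} is not in domain {2}".format(oid, value, domain_name))
        elif domain.domainType == "Range" and not (domain.range[0] <= value <= domain.range[1]):
            print("Feature {0}: value {1} is outside the domain range".format(oid, value))
```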
Only certain Reviewer checks can be included in an Attribute batch job.
The checks included in this batch job can be run through the Update tab on the Manage Features window.
When the checks in this batch job are run after you change attributes on a feature, the Validation Results window appears and displays the check name as well as a description of the validation errors for the feature. If the checks are run on the current selection, the Browse Features window automatically opens so you can browse the results.
Error Severity
When a check is configured in a batch job, you can specify a severity value for that check to indicate how critical the issue is. When an error is found using the Feature Manager, any attribute changes to a feature are not committed until the error is fixed. Feature Manager also provides warnings, which return a message but do not prevent the changes from being committed. Whether a check is treated as an error or a warning is determined by the severity specified on the check and by the Error Severity value chosen when the batch job is associated with the product library; at that point, you can specify which severity values are returned as errors.
For example, you have two checks for your data. The first check has a severity of 2, and the second check has a severity of 5. If you choose the 1–3 option for the Error Severity value on the Batch Jobs dialog box, features that fail the first check will be returned as an error, and features that fail the second check will be returned as a warning.
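The classification itself reduces to a simple threshold comparison. The sketch below only restates the rule from the example above; it is not Reviewer code, and the default threshold mirrors the 1–3 option.

```python
# Illustration of the error/warning rule described above.
# severity: the severity value configured on the check.
# error_threshold: the upper bound of the Error Severity option chosen when
# the batch job is associated with the product library (3 for the 1-3 option).
def classify(severity, error_threshold=3):
    return "error" if severity <= error_threshold else "warning"

print(classify(2))  # error   - attribute changes are blocked until fixed
print(classify(5))  # warning - a message is returned, changes can still be committed
```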
Tips for including checks in an Attribute batch job
- The checks in the Attribute batch jobs cannot have the Always Run on Full Database check box checked.
- It is not recommended that you include checks for which you cannot specify a feature class, such as the Connectivity Rules and Unique ID checks, in the Attribute batch job.
Spatial batch jobs
The Spatial batch job contains checks that validate spatial components of features as well as attributes. For instance, the Geometry on Geometry check can be used to check for building point features that have accidentally been digitized on lakes, or the Domain check can be used to ensure that there are no null values in fields that use domains.
The checks included in this batch job are run using the Validate Current Selection or Validate Current Extent tool.
When these checks are run on the selected features, the results appear in the Browse Features window.
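As an illustration of the spatial condition behind the Geometry on Geometry example above (building points digitized on lakes), the following sketch uses a standard geoprocessing spatial selection rather than the Reviewer check itself; the feature class paths and layer names are assumptions.

```python
import arcpy

# Illustration only: find building points that fall on lake polygons using a
# spatial selection. Feature class paths and layer names are assumptions.
arcpy.MakeFeatureLayer_management(r"C:\Data\Production.gdb\BuildingP", "buildings_lyr")
arcpy.MakeFeatureLayer_management(r"C:\Data\Production.gdb\LakeA", "lakes_lyr")

# Select building points that intersect lake features.
arcpy.SelectLayerByLocation_management("buildings_lyr", "INTERSECT", "lakes_lyr")
count = int(arcpy.GetCount_management("buildings_lyr").getOutput(0))
print("{0} building points fall on lake features".format(count))
```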
Tips for including checks in a Spatial batch job
- The checks in the Spatial batch jobs cannot have the Always Run on Full Database check box checked.
- The following checks can only be included in a Reviewer batch job and therefore cannot be included in a Spatial batch job:
- Custom check—because there is no way to know whether the check will work on the selected set
- Unique ID check—because the check uses a workspace as input, not a feature class
- Connectivity check—because the check uses a workspace as input, not a feature class
- Sampling check—because the check uses a workspace as input, not a feature class
- Metadata check—because the check does not return a feature as the result
- Topology rules check—because the check requires validating the topology, which can take an extensive amount of time
Reviewer batch jobs
The Reviewer batch job contains any checks that can be run in a Reviewer session. This includes checks that cannot be included in the Attribute and Spatial batch jobs, such as the Metadata and Table to Table Attribute checks. For example, if you have a table that contains the valid values for an attribute, you can compare those values against the corresponding attribute values in your feature classes.
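A minimal sketch of that comparison is shown below, assuming a standalone table of valid values and a feature class whose attribute should match it; all table, feature class, and field names are assumptions, and this illustrates only the comparison the check performs, not the check itself.

```python
import arcpy

# Illustration only: compare feature attribute values against a table of
# valid values. Table, feature class, and field names are assumptions.
valid_table = r"C:\Data\Production.gdb\ValidRouteCodes"
feature_class = r"C:\Data\Production.gdb\RoadL"

valid_codes = {row[0] for row in arcpy.da.SearchCursor(valid_table, ["CODE"])}

with arcpy.da.SearchCursor(feature_class, ["OID@", "RTT"]) as cursor:
    for oid, code in cursor:
        if code not in valid_codes:
            print("Feature {0}: value {1} is not in the valid value table".format(oid, code))
```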
The checks in this batch job can be executed using the Reviewer Batch Validate tool or the Execute Reviewer Batch Job geoprocessing tool.
Learn more about the Reviewer Batch Validate tool
Learn more about the Execute Reviewer Batch Job tool
When the batch job in this category is run, the check results are written directly to the Reviewer table.
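A minimal sketch of running a batch job with the Execute Reviewer Batch Job geoprocessing tool follows. The Reviewer workspace, session string, and file paths are assumptions; the session must already exist, and the results are written to the Reviewer table as described above.

```python
import arcpy

# Minimal sketch: run a Reviewer batch job with the Execute Reviewer Batch Job
# geoprocessing tool. Paths and the session string are assumptions, and the
# session must already exist in the Reviewer workspace.
arcpy.CheckOutExtension("datareviewer")

reviewer_workspace = r"C:\Data\Reviewer.gdb"
session = "Session 1 : QC Session"              # "Session <ID> : <name>" format
batch_job = r"C:\Data\BatchJobs\SpatialChecks.rbj"
production_workspace = r"C:\Data\Production.gdb"

result = arcpy.ExecuteReviewerBatchJob_Reviewer(
    reviewer_workspace, session, batch_job, production_workspace)

# The results of the checks are written to the Reviewer table; the tool returns
# a record set that references the batch run it created.
print(result.getOutput(0))

arcpy.CheckInExtension("datareviewer")
```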
Batch jobs and the product library
To associate batch jobs with the current map data, a product library and production database must be specified. If you are not using a product library, you can upgrade a database to a product library and the current data workspace to a production database. Once those two workspaces have been defined on the Product Library window, you can create a new data model version based on all the feature classes in the production database.
However, if you are using an existing product library that contains data models, you can also choose to use an existing data model or create a new version of one.
When you associate the batch jobs with the new data model version, you are checking them into the product library.
Learn more about associating batch jobs with data model versions