Available with a Production Mapping license.
Data conversion is the process of moving data from one format to another, whether from one data model to another or from one data format to another. With large amounts of geographic data already available, it is often more cost-effective to convert existing data than to re-create it. The tools in Esri Production Mapping allow you to convert existing datasets to other data formats or load them into an existing data model. Data conversion is also common after data production, when data is distributed to others.
With data models, you are typically moving data between two schemas that contain similar feature classes. With data formats, you are moving data from another format altogether, such as shapefiles, coverages, or Vector Product Format (VPF) sources, into a geodatabase. You can also convert data in a geodatabase to shapefiles or coverages, or import shapefiles and coverages into a geodatabase.
Before data conversion begins
Quality control is an important consideration when performing data conversion. Before importing data, it is recommended that you verify that features are appropriately attributed and geometrically valid so that bad data is not imported and features are not omitted because of invalid geometries. Running ArcGIS Data Reviewer checks is one way to verify data integrity before performing data conversion.
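If you script your quality control, the same checks can be run with the Data Reviewer geoprocessing tools before any loading begins. The following is a minimal sketch that assumes a Reviewer workspace, a session, and a batch job file (.rbj) already exist; the paths and session name are placeholders, and the tool signature should be confirmed against the Data Reviewer toolbox in your ArcGIS installation.

```python
import arcpy

# Placeholders; substitute your own Reviewer workspace, session, batch job,
# and source data (all names here are illustrative assumptions).
reviewer_gdb = r"C:\QC\Reviewer.gdb"
session = "Session 1 : PreConversionQC"   # session string format may differ
batch_job = r"C:\QC\PreConversionChecks.rbj"
source_gdb = r"C:\Data\SourceData.gdb"

# Data Reviewer is a separate extension; check it out before running checks.
arcpy.CheckOutExtension("datareviewer")

# Run the batch job against the source workspace so invalid geometry or
# attribution is flagged before any data is converted.
result = arcpy.ExecuteReviewerBatchJob_Reviewer(
    reviewer_gdb, session, batch_job, source_gdb)
print(result.getMessages())

arcpy.CheckInExtension("datareviewer")
```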
Moving data from one data model to another
One form of data conversion is transferring data between data models or product specifications; that is, you are moving the data between two data models that have similar feature classes and fields. For example, there may be a more recent version of an existing data model that you want to use for production. You can load the data into a geodatabase that contains the schema for the newer version of the data model.
To move data between different data models, the following must be true for both the source data you are using and the target schema or workspace:
- The geodatabase schema must be defined appropriately for the data.
- Common fields must exist in both schemas so they can be mapped to each other.
Once these conditions are met, you can create a cross-reference database that can be used to map the source to a target workspace. The cross-reference database is a template for loading data from one schema into another and provides information on the source and target tables and fields that correspond to each other. This creates a reusable means of loading data between two particular schemas in a consistent manner. For example, if you have several workspaces that need to be converted from data model A to data model B, you can set up one cross-reference database and use it for all of them.
The cross-reference database can be created using the Create Cross-Reference tool, which automatically creates the tables for the cross-reference database based on the source and target workspaces you define. The mapping between the source and target workspaces can be refined using a mapping file, which allows you to match table and field names that differ between the two. For example, it can indicate that the Bridge_L table in the source workspace needs to be mapped to the BridgeL table in the target workspace.
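For scripted workflows, this step can be automated with arcpy. The following is a sketch only: the paths are placeholders, the extension code Foundation and the tool name CreateCrossReference_production are assumptions, and the parameter order should be confirmed against the Production Mapping toolbox for your release.

```python
import arcpy

# Production Mapping tools require the extension (extension code assumed
# to be "Foundation"; adjust if your installation differs).
arcpy.CheckOutExtension("Foundation")

# Placeholder paths for illustration only.
source_gdb   = r"C:\Data\DataModelA.gdb"
target_gdb   = r"C:\Data\DataModelB.gdb"
cross_ref    = r"C:\Data\A_to_B_CrossReference.mdb"  # output cross-reference database
mapping_file = r"C:\Data\A_to_B_Mapping.xls"         # optional table/field name mapping

# Assumed arcpy exposure of the Create Cross-Reference tool; confirm the
# name and signature (for example, with arcpy.Usage) before relying on it.
arcpy.CreateCrossReference_production(source_gdb, target_gdb, cross_ref, mapping_file)

arcpy.CheckInExtension("Foundation")
```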
After the cross-reference database is created, you can optionally validate it against the target workspace into which you want to load data. The Preload Validate tool validates the cross-reference database against the source and target workspaces you want to use; if these differ from the workspaces used to create the cross-reference database, this step ensures the data will load successfully into the target workspace. Once the process is finished, you can view a log file that indicates which tables and fields are present in the cross-reference database but not in the source or target workspace.
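A scripted version of the validation might look like the following sketch; the tool name PreloadValidate_production, the parameter order, and the paths are assumptions to verify against the tool's documentation.

```python
import arcpy

arcpy.CheckOutExtension("Foundation")  # assumption: Production Mapping extension code

# Placeholder paths; these can be the workspaces you intend to load between,
# which may differ from the ones used to create the cross-reference database.
source_gdb = r"C:\Data\DataModelA.gdb"
target_gdb = r"C:\Data\DataModelB.gdb"
cross_ref  = r"C:\Data\A_to_B_CrossReference.mdb"

# Assumed arcpy exposure of the Preload Validate tool; confirm the tool name
# and parameter order in the Production Mapping toolbox before running.
arcpy.PreloadValidate_production(source_gdb, target_gdb, cross_ref)

# The tool's messages and log file list tables and fields that appear in the
# cross-reference database but not in the source or target workspace.
print(arcpy.GetMessages())

arcpy.CheckInExtension("Foundation")
```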
The last part of the process for moving from one data model to another is to use the Data Loader tool to load the data into a geodatabase. This requires the cross-reference database as well as a source and target workspace. The source workspace contains the features you want to load, while the target workspace is the one you want to populate with the data. After the loading process is complete, you can view a log file to see the number of features that were loaded and whether any errors were encountered.
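If the Data Loader is exposed as a geoprocessing tool in your Production Mapping toolbox, the loading step could be scripted as in the following sketch; the tool name DataLoader_production, its parameters, and the paths are assumptions, and in some releases loading is performed from the Data Loader dialog instead.

```python
import arcpy

arcpy.CheckOutExtension("Foundation")  # assumption: Production Mapping extension code

cross_ref  = r"C:\Data\A_to_B_CrossReference.mdb"  # created earlier
source_gdb = r"C:\Data\DataModelA.gdb"             # workspace with the features to load
target_gdb = r"C:\Data\DataModelB.gdb"             # workspace to populate

# Hypothetical arcpy exposure of the Data Loader; treat this call as
# illustrative and verify the tool name and signature in your release.
arcpy.DataLoader_production(cross_ref, source_gdb, target_gdb)

# Check the messages and log file for feature counts and any errors.
print(arcpy.GetMessages())

arcpy.CheckInExtension("Foundation")
```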
Importing data into a geodatabase
Data can be imported from sources other than another geodatabase, such as data stored in shapefiles, coverages, or VPF files.
Importing data from shapefiles and coverages
The process for importing data from a shapefile or coverage into a geodatabase is similar to the process described for moving data from one data model to another. However, for the data to be successfully imported, fields must be manually mapped to the target geodatabase. A cross-reference database is required to map the fields between the shapefiles or coverages you want to import and the target schema. You can create a mapping file that instructs the Data Loader to map dissimilar fields, for example, the Elevation field to the ZV2 field.
The mapping file can also be used as a workaround for certain limitations of the shapefile format:
- There is a 10-character limit on field names. For example, the field name could be Jurisdicti in the shapefile and be mapped to the Jurisdiction field in the target schema.
- There are no domains in shapefiles. For example, the value of Hard/Paved for a shapefile with roads could be mapped to the domain value of 1 in the target schema.
To test the cross-reference database against the shapefiles or coverages and the geodatabase, you can use the Preload Validate tool. This allows you to see which shapefiles or coverages are not mentioned in the cross-reference file as well as which items mentioned in the cross-reference file are not in the source or target workspaces. If the cross-reference database you are using with the Preload Validate tool has been preconfigured as part of a solution, you may have shapefiles that are not part of the cross-reference database.
After you establish that the shapefiles or coverages you want to import are compatible with the cross-reference database and are included in the target workspace, you can use the Data Loader to import the data. In this scenario, you use the directory that contains the shapefiles or coverages as the source. The target must always be a geodatabase in this case because you can only use the Data Loader to load data into a geodatabase.
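Scripted, this scenario differs from geodatabase-to-geodatabase loading only in the source workspace, which points to the folder that contains the shapefiles or coverages. As in the earlier sketches, the tool name and parameters are assumptions to verify.

```python
import arcpy

arcpy.CheckOutExtension("Foundation")  # assumption: Production Mapping extension code

shapefile_dir = r"C:\Data\Shapefiles"                     # folder of shapefiles as the source
target_gdb    = r"C:\Data\TargetSchema.gdb"               # target must be a geodatabase
cross_ref     = r"C:\Data\Shp_to_Gdb_CrossReference.mdb"  # maps shapefile fields to the schema

# Hypothetical call mirroring the geodatabase-to-geodatabase sketch above;
# only the source differs, pointing at the shapefile (or coverage) workspace.
arcpy.DataLoader_production(cross_ref, shapefile_dir, target_gdb)
print(arcpy.GetMessages())

arcpy.CheckInExtension("Foundation")
```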
Importing VPF data
VPF data can be imported using the VPF To Geodatabase geoprocessing tool. There are several source formats you can import data from: Digital Nautical Chart (DNC), Vector Map Level 1 (VMap1), Vector Map Level 2 (VMap2), Urban Vector Map (UVMap), Foundation Feature Data (FFD), and Vector Interim Terrain Data (VITD) sources.
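Because this is a geoprocessing tool, the import can be scripted. The following sketch uses placeholder paths, and the assumed arcpy name VPFToGeodatabase_production and parameter order should be checked against the tool's help in your installation.

```python
import arcpy

arcpy.CheckOutExtension("Foundation")  # assumption: Production Mapping extension code

# Placeholder inputs: a VPF library (for example, a DNC library folder) and a
# target geodatabase built on the matching product schema.
vpf_source = r"C:\Data\DNC17\DNC17"
target_gdb = r"C:\Data\DNC_Target.gdb"

# Assumed arcpy name and parameter order for the VPF To Geodatabase tool;
# confirm both against the tool's documentation before relying on them.
arcpy.VPFToGeodatabase_production(vpf_source, target_gdb)
print(arcpy.GetMessages())

arcpy.CheckInExtension("Foundation")
```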
Exporting data from a geodatabase to other formats
The last type of data conversion that can be done with Production Mapping is exporting data from a geodatabase to another format. Geodatabases can be exported to shapefiles or coverages using the Production Geodatabase To Shapefile tool or Geodatabase To Coverage command, respectively. When you create shapefiles, you can choose different options for the export, such as whether to include the feature class or subtype name, the attribute values or descriptions, and the Unicode format to use. This allows you to maintain the attribution on the features when they are in the shapefile format.
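Because Production Geodatabase To Shapefile is a geoprocessing tool, the export can also be scripted. The following sketch uses placeholder paths, and the assumed arcpy name ProductionGeodatabaseToShapefile_production and its option keywords should be verified against the tool's syntax in your release.

```python
import arcpy

arcpy.CheckOutExtension("Foundation")  # assumption: Production Mapping extension code

source_gdb = r"C:\Data\Production.gdb"   # geodatabase to export
out_folder = r"C:\Data\ShapefileExport"  # folder that receives the shapefiles

# Assumed arcpy exposure of the Production Geodatabase To Shapefile tool.
# Export options (feature class vs. subtype names, attribute values vs.
# descriptions, Unicode format) are additional tool parameters; check the
# tool's syntax for the exact keywords before adding them here.
arcpy.ProductionGeodatabaseToShapefile_production(source_gdb, out_folder)
print(arcpy.GetMessages())

arcpy.CheckInExtension("Foundation")
```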
With the Geodatabase To Coverage command, you can export the selectable map layers as a coverage, or export the feature classes in a geodatabase to separate coverages based on a mapping file. When you export the selectable map layers, you can only create a single coverage that contains all the layers. When you export the feature classes using a mapping file, the feature classes can be exported to several different thematic coverages. For example, the Road, Bridge, and Railroad features could be exported to a Transportation coverage. With both methods for exporting data as coverages, you can define a tolerance, which is the minimum distance between coordinates before they are considered equal.
After the data conversion process is complete
Once the data conversion process is finished, it is recommended that you verify that the import or export completed successfully.
What to verify for imported data
If the data has been imported, verify that
- The data has been imported correctly.
- The attribution is correct on the data.
After you have confirmed that the attribution is correct, you can edit the data or proceed to data validation tasks.
What to verify for exported data
If the data has been exported, verify that
- The data has been exported correctly.
- The output can be used as part of a deliverable.
After you have confirmed that the output is correct, you can modify the files as needed or prepare them to be distributed as part of a deliverable.