Summary
Downloads live traffic data from a web service and stores it in a dynamic traffic format (DTF) file, which network datasets can read for live-traffic analysis and display.
Usage
You need to have an account with one of the data providers listed in the Provider parameter before this tool can download traffic data.
By scheduling this tool to run at regular intervals or strategic times, you can ensure the latest traffic data is available. A common scheduling method is to use Windows Task Scheduler (or a cron job on Linux) to run a Python script that calls this tool.
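The scheduled wrapper script can stay minimal. As a sketch (the function names, folder path, and interval below are illustrative, not part of the tool's API), it can check the age of the newest DTF file in the output folder and download only when the data has gone stale:

```python
import os
import time

EXPECTED_UPDATE_INTERVAL = 15  # minutes; match the tool's expected_update_interval


def minutes_since_newest_dtf(folder):
    """Age in minutes of the newest .dtf file in folder, or None if there is none."""
    mtimes = [os.path.getmtime(os.path.join(folder, name))
              for name in os.listdir(folder) if name.lower().endswith(".dtf")]
    if not mtimes:
        return None
    return (time.time() - max(mtimes)) / 60.0


def update_is_due(folder, interval=EXPECTED_UPDATE_INTERVAL):
    """True when no DTF exists yet or the newest one is older than interval."""
    age = minutes_since_newest_dtf(folder)
    return age is None or age >= interval


# In the scheduled script, the staleness check would guard the tool call, e.g.:
# if update_is_due("C:/data/TrafficData"):
#     arcpy.na.UpdateTrafficData(provider, username, password, "#",
#                                "C:/data/TrafficData")
```

This avoids redundant downloads if the scheduler fires more often than the provider refreshes its data.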
Syntax
UpdateTrafficData(provider, user_name, password, regions, traffic_data_output_folder, expected_update_interval, prediction_cutoff, compress_data, {maximum_file_age}, {speed_unit})
Parameter | Explanation | Data Type |
provider | The name of your traffic data provider and the region. | String |
user_name | The user name authorized by the data provider to download traffic data. The tool fails to execute if the user name cannot be authenticated by the data provider. If the Provider parameter is TomTom North America or TomTom Europe, use "APIKEY" as the parameter value. | String |
password | The password provided by the data provider to download traffic data. The tool fails to execute if the password cannot be authenticated by the data provider. | Encrypted String |
regions [regions,...] | The regions for which to download traffic data. To download all available regions, enter "#". | String |
traffic_data_output_folder | The folder in which the DTF file will be created. If the folder is empty, the tool creates a TrafficIndex.xml file along with the DTF file. On subsequent runs, the tool updates TrafficIndex.xml and creates a new DTF file. If you download data from multiple data providers, specify a unique folder for each provider. | Folder |
expected_update_interval | The time interval (in minutes) after which the downloaded traffic data is no longer up to date and the data provider makes refreshed data available. After this interval has elapsed, it is recommended that you rerun the tool and download the latest data. | Long |
prediction_cutoff | The time interval (in minutes) of predictive traffic data the tool processes. Data providers may supply predictive data for the next 24 hours, week, or other time period. This parameter limits the amount of predictive traffic data the tool processes, which speeds up tool execution. For example, if your traffic provider has a prediction depth of 24 hours but you need only one hour of prediction depth, you can save time by specifying 60 here. | Long |
compress_data | Specifies whether the downloaded traffic data will be compressed to reduce file size: COMPRESS or NO_COMPRESS. | Boolean |
maximum_file_age (Optional) | The time interval (in minutes) for which the traffic files (that is, the DTF files) will be kept in the traffic data output folder. This parameter facilitates deleting traffic files that are no longer required. When the tool is rerun, any traffic data files older than the maximum file age are deleted automatically. The default value is 720 minutes (12 hours). | Long |
speed_unit (Optional) | The speed units of the data when downloading from a custom traffic feed. This parameter is ignored if you are using one of the standard commercial feeds documented in the Provider parameter. Specifying the proper speed units of a custom traffic feed ensures the network dataset interprets the speed values correctly. | String |
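The cleanup behavior described for maximum_file_age can be sketched in plain Python. This is illustrative only (the function name and file matching are assumptions for the sketch); the tool performs this deletion itself when it is rerun:

```python
import os
import time


def delete_old_traffic_files(folder, maximum_file_age=720):
    """Delete .dtf files older than maximum_file_age minutes; return their names.

    Mirrors the documented default of 720 minutes (12 hours). Files that are
    not DTF files, such as TrafficIndex.xml, are left untouched.
    """
    cutoff = time.time() - maximum_file_age * 60
    removed = []
    for name in os.listdir(folder):
        path = os.path.join(folder, name)
        if name.lower().endswith(".dtf") and os.path.getmtime(path) < cutoff:
            os.remove(path)
            removed.append(name)
    return sorted(removed)
```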
Code sample
UpdateTrafficData example 1 (Python window)
The following Python window script demonstrates how to use the UpdateTrafficData tool.
import arcpy
arcpy.na.UpdateTrafficData("NAVTEQ North America", "myUserName", "myPassword",
                           ["New England", "New York/Northern NJ/Connecticut"],
                           "d:/data/dtfs", 15, 120, "NO_COMPRESS", 720)
UpdateTrafficData example 2 (workflow)
The following stand-alone Python script demonstrates how the UpdateTrafficData tool can be used to download live traffic data from a provider. The script can be run as a Windows Task Scheduler task or a cron job on Linux so that the live traffic data is updated at regular intervals.
# Name: UpdateTrafficData_Workflow.py
# Description: Downloads live traffic data from a provider. The script can be run
#              as a Windows Task Scheduler task or a cron job on Linux so that
#              the live traffic data is updated at regular intervals.

# Import system modules
import arcpy

# Set local variables
trafficDataFolder = "C:/data/TrafficData"
provider = "NAVTEQ North America"
username = "Replace with your username"
password = "Replace with your password"

# Download data for all regions
regions = "#"

try:
    # Update traffic data. Use default values for the remaining parameters.
    arcpy.na.UpdateTrafficData(provider, username, password, regions,
                               trafficDataFolder)
    print("Script completed successfully.")

except Exception as e:
    # If an error occurred, print the line number and error message
    import traceback, sys
    tb = sys.exc_info()[2]
    print("An error occurred on line %i" % tb.tb_lineno)
    print(str(e))
Environments
This tool does not use any geoprocessing environments.
Licensing information
- Basic: Yes
- Standard: Yes
- Advanced: Yes