Available with 3D Analyst license.
A video layer needs a connection to the source data (a file, folder, or service) that can provide the individual video frames to be draped on the surface of the globe. For the video layer to be displayed, the video source must exist and ArcGlobe must be able to access it.
A video's source information must include both the type of video and the location of the data.
There are three general types of video sources, represented by the following tag groups:
- <VideoFilesSource>: A video stored in a single logical file
- <ImageFolderSource>: A video stored as a set of ordered images in a single folder
- <CustomSource>: A video type that has been made available through custom coding
The video source path can be in any of the following formats:
- A local file name or path, such as C:\Project1\Camera1.mpg
- A UNC file name or path, such as \\myServer\Videos\Camera8_Frames
- A relative file name or path (starting from the AGV file location), such as .\CustomVideoLayer.dll
Each of the three general types of video sources is described in more detail below.
Video files source
This option is designed for videos that are stored in a supported video file format on disk, such as AVI or MPG. The AGV file will identify this video source by containing the connection information inside these XML tags: <VideoSource FrameSourceType="File"> and </VideoSource>, as shown in the example XML text below.
The primary data source tag for this video layer type is the <VideoFilePath> element, which defines the location of the video file. You can include multiple <VideoFilePath> tags to connect several video files in a single video layer.
You will also need to define the following:
- The real-world refresh rate (in milliseconds) in the <FrameRequestRate> tag
- This value is in milliseconds and represents the length of time ArcGlobe should wait before requesting the next frame of the video. For example, a value of 50 would mean that ArcGlobe would request a frame every 0.05 seconds, thereby playing the video at a speed of 20 frames per second.
- Note that this is the "play speed" of the video, which can be different from the real-world time units that each frame represents.
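The relationship between the <FrameRequestRate> value and the resulting playback speed is a simple reciprocal. The following short sketch (Python, illustrative only and not part of any AGV file) shows the conversion used throughout this topic:
# Illustrative conversion between a desired playback speed (in frames per
# second) and a <FrameRequestRate> value (in milliseconds).

def frame_request_rate_ms(frames_per_second):
    # Milliseconds ArcGlobe should wait between frame requests.
    return 1000.0 / frames_per_second

def playback_fps(request_rate_ms):
    # Playback speed implied by a given <FrameRequestRate> value.
    return 1000.0 / request_rate_ms

print(frame_request_rate_ms(20))  # 50.0, as in the example below
print(playback_fps(50))           # 20.0 frames per second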
Video files can be stored in a wide variety of formats, with many possible codecs. For a video to play on the local machine (either inside ArcGlobe or within a video player such as Windows Media Player), the required codec must reside on that machine. Depending on the codecs installed on your machine, the supported video file formats are:
- MPG (.mpg, .mpeg, .mp4)
- AVI (.avi)
- WMV (.wmv)
Considerations
The following are some tips to keep in mind when creating video source files:
- If you specify multiple video files inside the VideoFilesSource group, all the videos must have the same image resolution.
- A simple test to see if your video can be displayed in ArcGlobe is to preview the source file in Windows Media Player. If the video does not display there, it will not display as a video layer in ArcGlobe.
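A related check is to compare the image resolution of each video before grouping them in one VideoFilesSource. The following sketch is illustrative only; it assumes the opencv-python package is installed, uses placeholder file names, and reports the resolution that OpenCV can read rather than confirming that ArcGlobe can play the files:
# Illustrative check that several video files share the same image
# resolution before they are listed in a single VideoFilesSource group.
# File names are placeholders; assumes the opencv-python package.
import cv2

videos = [r".\Security_BldF_10.00AM.avi", r".\Security_BldF_10.05AM.avi"]

sizes = set()
for path in videos:
    capture = cv2.VideoCapture(path)
    width = int(capture.get(cv2.CAP_PROP_FRAME_WIDTH))
    height = int(capture.get(cv2.CAP_PROP_FRAME_HEIGHT))
    capture.release()
    sizes.add((width, height))
    print(path, width, height)

print("Same resolution" if len(sizes) == 1 else "Resolutions differ")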
Example
The following is an example video source layer made from two AVI source files, playing at 20 frames per second:
<VideoSource FrameSourceType="File">
<VideoFilesSource>
<VideoFilePath>.\Security_BldF_10.00AM.avi</VideoFilePath>
<VideoFilePath>.\Security_BldF_10.05AM.avi</VideoFilePath>
</VideoFilesSource>
<FrameRequestRate>50</FrameRequestRate>
</VideoSource>
Image folder source
This option is designed for video formats that are not natively supported, such as the QuickTime (.mov) file format, by consuming them after they have been converted into a folder of ordered images. It also supports consuming the folder of frames that can be output from the ArcGIS animation framework. The AGV file will identify this video source by containing the connection information inside these XML tags: <ImageSource FrameSourceType="Folder"> and </ImageSource>, as shown in the example XML text below.
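For example, a QuickTime file can be exported into such a folder of sequentially numbered images before it is referenced by the AGV file. The following sketch shows one possible way to do this; it is illustrative only, assumes the opencv-python package is installed and that the local machine has a codec that can decode the source file, and uses placeholder paths:
# Illustrative export of a video file into a folder of sequentially numbered
# JPEG images (Frame1.jpg, Frame2.jpg, ...) that an image folder video source
# can reference. Paths are placeholders; assumes the opencv-python package.
import os
import cv2

source_video = r"C:\Project1\Camera1.mov"
output_folder = r"C:\Project1\Camera1_Frames"
os.makedirs(output_folder, exist_ok=True)

capture = cv2.VideoCapture(source_video)
index = 1
while True:
    ok, frame = capture.read()
    if not ok:  # no more frames
        break
    # File names follow the Frame#.jpg convention described below.
    cv2.imwrite(os.path.join(output_folder, "Frame%d.jpg" % index), frame)
    index += 1
capture.release()

print("Wrote %d frames" % (index - 1))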
The primary data source tag for this video layer type is <ImageFolderPath>, which defines the path to the folder containing the image files.
You will also need to define the following:
- The file name formatting in the <NameFormat> tag
- This value is a string and provides a template for the image frame naming convention so the correct image files are accessed in the correct order.
- The numeric portion of the image file name is represented with a hash character (#), with the prefix and suffix of the file name surrounding it. For example, if the folder contains images named Frame1.jpg, Frame2.jpg, and so on, a <NameFormat> value of Frame#.jpg should be used.
- The first and last frame index values in the <FirstIndex> and <LastIndex> tags
- These two values are whole numbers and specify the range of image files to be read.
- Example: If you have 300 sequential image frames in the folder, and the first file is named Frame1.jpg, the values required would be 1 and 300, respectively.
- These values can also be used to play a subset of a large folder of frames.
- The image size values in the <Width> and <Height> tags
- These two values are in pixels and represent the resolution of the video.
- Only images with this specified resolution will be displayed as frames in the video.
- The tags are read when the video layer is added to ArcGlobe or when the layer is manually reread by right-clicking the layer in the table of contents and clicking Refresh.
- Note that higher resolution image frames are more expensive to render and may result in a lower maximum frame rate.
- The video frame refresh rate in the <FrameRequestRate> tag
- This value is in milliseconds and represents the length of time ArcGlobe should wait before requesting the next frame of the video. For example, a value of 100 would mean that ArcGlobe would request a frame every 0.1 seconds, thereby playing the video at a speed of 10 frames per second.
- Note that this is the "play speed" of the video, which can be different from the real-world time units that each frame represents.
Example
The following is an example video source layer made from a folder of frames, playing at 10 frames per second:
<ImageSource FrameSourceType="Folder">
<ImageFolderPath>\\server1\VideoOverlays\Analysis1</ImageFolderPath>
<Frames>
<NameFormat>Frame#.jpg</NameFormat>
<FirstIndex>1</FirstIndex>
<LastIndex>300</LastIndex>
<Width>600</Width>
<Height>480</Height>
</Frames>
<FrameRequestRate>100</FrameRequestRate>
</ImageSource>
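Before adding the layer, the tag values in an example like this can be checked against the actual contents of the frame folder. The following sketch is illustrative only; it assumes the opencv-python package is installed and mirrors the values used in the example above:
# Illustrative check that a folder of frames matches the values written into
# an image folder video source. Assumes the opencv-python package; the folder
# path and tag values mirror the example above.
import os
import cv2

folder = r"\\server1\VideoOverlays\Analysis1"
name_format = "Frame#.jpg"           # <NameFormat>
first_index, last_index = 1, 300     # <FirstIndex>, <LastIndex>
width, height = 600, 480             # <Width>, <Height>

problems = []
for i in range(first_index, last_index + 1):
    path = os.path.join(folder, name_format.replace("#", str(i)))
    image = cv2.imread(path)
    if image is None:
        problems.append("missing or unreadable: " + path)
    elif image.shape[1] != width or image.shape[0] != height:
        problems.append("unexpected size %dx%d: %s" % (image.shape[1], image.shape[0], path))

print("OK" if not problems else "\n".join(problems))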
Custom source
This option is designed for advanced users who can write code to create a custom video layer, such as a live feed from a moving vehicle. ArcGlobe will request video frames from the custom DLL at the defined intervals rather than accessing a specific file or folder of frames on disk. The AGV file will identify this video source by containing the connection information inside these XML tags: <VideoSource FrameSourceType="DLLServer"> and </VideoSource>, as shown in the example XML text below.
The primary data source tag for this video layer type is <Location>, which defines the full path to the DLL that will respond to ArcGlobe requests for video frame images and georeferencing information.
You will also need to define the following:
- The real-world refresh rate (in milliseconds) in the <FrameRequestRate> tag
- This value is in milliseconds and represents the length of time ArcGlobe should wait before requesting the next frame of the video from the custom DLL. For example, a value of 6000 would mean that ArcGlobe would request a frame every 6 seconds, thereby playing the video at a speed of 10 frames per minute.
- Note that this is the "play speed" of the video, which can be different from the real-world time units that each frame represents.
- As many optional <Parameter> elements as needed by your custom DLL
- Example: Your custom DLL can handle requests for many videos, with each AGV file using a <Parameter> element to distinguish which video feed the particular layer is connected to.
Example
The following is an example video source layer made from a custom DLL server, playing at 10 frames per minute:
<VideoSource FrameSourceType="DLLServer">
<Location>C:\Program Files\App1\bin\MyVideoFrameProvider.dll</Location>
<FrameRequestRate>6000</FrameRequestRate>
<Parameter>VideoSource 1</Parameter>
</VideoSource>