The processing of a video input/output path is described by two sets of parameters:
Video parameters describe how to interpret and generate the signal as it arrives and leaves, as discussed in this chapter
Image parameters describe how to write/read the resulting bits to/from the device (see Chapter 7, “Image Buffer Parameters”)
Not all parameters may be supported on a particular video jack or path. Some parameters may be adjusted on both a path and a jack, or may be adjusted on just one or the other. Use mlGetCapabilities to obtain a list of parameters supported by a jack or path. In addition, not all values may be supported on a particular parameter. Use mlPvGetCapabilities to obtain a list of the values supported by the parameter.
|Note: This chapter assumes a working knowledge of digital video concepts. Readers unfamiliar with terms such as video timing, 422, or CbYCr should consult a text devoted to this subject. A good resource is A Technical Introduction to Digital Video by Charles Poynton, published by John Wiley & Sons, 1996 (ISBN 0-471-12253-X, hardcover).|
Progressive sampling is frame-based (for example, from film)
Interlaced sampling is field-based
Imagine an automatic film advance camera that can take 60 pictures per second, with which you take a series of pictures of a moving ball. Figure 6-1 shows 10 pictures from that sequence (different colors emphasize the different positions of the ball in time). The delay between pictures is 1/60th of a second, so this sequence lasts 1/6th of a second.
Pairs of sampled fields are superimposed on each other (interlaced) to create the video frame. In the video frame, the two fields appear coincident to the eye even though they were sampled consecutively. This effect is aided by the persistence of the phosphors on the display screen, which hold the impression of the first set of scanned lines while the second set is displayed. (The field sequence becomes visible if, for example, you videotape a computer monitor display.)
Most video signals in use today, including several high-definition video formats, are field-based (interlaced) rather than frame-based (progressive). In ML, the value of the video timing parameter ML_VIDEO_TIMING_INT32 defines the specific video standard, and each standard is defined as progressive or interlaced.
For example, suppose you shoot the moving ball with an NTSC video camera. NTSC video has 60 fields per second, so you might think that the video camera would record the same series of pictures as shown in Figure 6-1, but it does not. The video camera does record 60 images per second, but rather than a filmstrip of 10 complete images, each image consists of only half of the scan lines of the complete picture at a given time, as shown in Figure 6-2.
Note how the scan lines alternate between odd-numbered and even-numbered images: each image holds only the odd lines or only the even lines of the complete picture.
Queries the incoming genlock signal for an output path. Not all devices may be able to sense genlock timing, but those that do will support this parameter. Common values match those for ML_VIDEO_TIMING listed in “Video Parameter Descriptions”, plus the following:
There is no signal present
The timing of the genlock cannot be determined
Describes the genlock source timing. Only accepted on output paths. Each genlock source is specified as an output timing on the path and corresponds to the same timings as available with ML_VIDEO_TIMING_INT32.
Output will use an internal colorbar generator
Output will use the default input signal as a pass-through
No output signal with legal synchronization values will be generated
The device repeats the last field. For progressive signals or interleaved formats, this is the same as ML_VIDEO_REPEAT_FRAME.
The device repeats the last two fields. This output capability is device dependent; query the allowable settings by calling mlPvGetCapabilities on the ML_VIDEO_OUTPUT_REPEAT_INT32 parameter.
The device does nothing, usually resulting in black output.
Used to query the incoming signal on an input path. Not all devices may be able to sense timing, but those that do will support this parameter. Common values match those for ML_VIDEO_TIMING listed in “Video Parameter Descriptions”, plus the following:
There is no signal present
The timing of the input signal cannot be determined
Sets the timing on an input or output video path. Not all timings may be supported on all devices. On devices that can auto-detect, the timing may be read-only on input. (Details of supported timings may be obtained by calling mlPvGetCapabilities on this parameter.) Figure B-1 and Figure B-2 illustrate details of the 601 standard.
|Note: See Appendix B, “Common Video Standards” for diagrams of common video standards.|
The format is as follows:
|xxxx|
Total number of lines.
|yyyy x zzzz|
Width by height of the active video region (high definition).
The frame rate, followed by one of the following: p (progressive) or i (interlaced).
MLpv message[3];
message[0].param = ML_VIDEO_TIMING_INT32;
message[0].value.int32 = ML_TIMING_1125_1920x1080_5994p;
message[1].param = ML_VIDEO_COLORSPACE_INT32;
message[1].value.int32 = ML_COLORSPACE_CbYCr_709_HEAD;
message[2].param = ML_END;
mlSetControls(device, message);