The portion of the video signal that contains the chrominance and luminance information; that is, all video lines not occurring in the vertical blanking interval. See also chrominance, composite video, horizontal blanking, luminance, and video waveform.
One of several types of digital video artifact appearing as jagged edges. Aliasing results when an image is sampled that contains frequency components above the Nyquist limit for the sampling rate. See also Nyquist limit.
See alpha value.
Overlaying one image on another so that some of the underlying image may or may not be visible. See also key.
A bank of memory that stores alpha values; the values are 8 bits per pixel.
A register that stores an alpha value.
The component of a pixel that specifies the pixel's opacity, translucency, or transparency. The alpha component is typically output as a separate component signal.
Filtering or blending lines of video to smooth the appearance of jagged edges in order to reduce the visibility of aliasing.
Average Picture Level, with respect to blanking, during active picture time, expressed as a percentage of the difference between the blanking and reference white levels. See also blanking level.
In video systems, an unnatural or artificial effect that occurs when the system reproduces an image; examples are aliasing, pixellation, and contouring.
The ratio of the width to the height of an electronic image. For example, the standard aspect ratio for television is 4:3.
The portion of the horizontal pedestal that follows the horizontal synchronizing pulse. In a composite signal, the color burst is located on the back porch, but is absent on a YUV or GBR signal. See also blanking level, video waveform.
A component videotape format developed by Sony® that uses a Y/R-Y/B-Y video signal and 1/2-inch tape.
Advanced form (Superior Performance) of Betacam using special metal tape and offering longer recording time (90 minutes instead of 30 minutes) and superior performance.
A region of memory that contains the pixels representing an image. The pixels are arranged in the sequence in which they are normally scanned to display the image.
One of a group of memory arrays for storing an image in bitmap format on a workstation. The workstation reads the bitplanes in parallel to re-create the image in real time.
Active video signal that has only black in it. The black portion of the video signal, containing color burst. See also color burst.
In the active video portion of the video waveform, the voltage level that defines black. See also horizontal blanking and video waveform.
The signal level at the beginning and end of the horizontal and vertical blanking intervals, typically representing zero output (0 IRE). See also video waveform and IRE units.
To combine proportional amounts of a 3D graphic over a clip frame by frame, pixel by pixel, with the alpha determining how they are combined. See also key, frame, and alpha.
In the horizontal blanking part of the video signal, the portion between the end of the horizontal sync pulse and the beginning of the color burst. See also horizontal blanking and video waveform.
Vertical synchronizing pulses in the center of the vertical interval. These pulses are long enough to be distinguished from other pulses in the signal; they are the part of the signal actually detected by vertical sync separators.
In PAL signals, a four-field burst blanking sequence used to ensure that burst phase is the same at the end of each vertical interval.
See color burst.
The ability of the output subcarrier to be locked to the input subcarrier, or of the output to be genlocked to an input burst.
In the RS-170A standard, burst phase is at field 1, line 10; in the European PAL standards, it is at field 1, line 1. Both define a continuous burst waveform to be in phase with the leading edge of sync at these points in the video timing. See also vertical blanking interval and video waveform.
One of the color difference signals used on the NTSC and PAL systems, obtained by subtracting luminance (Y) from the blue camera signal (B). This signal drives the horizontal axis of a vectorscope. Color mixture is close to blue; phase is 180 degrees opposite of color sync burst; bandwidth is 0.0 to 0.5 MHz. See also luminance, R-Y signal, Y signal, and Y/R-Y/B-Y.
Chrominance; the color portion of the signal. For example, the Y/C video format used for S-VHS has separate Y (luminance) and C (chrominance) signals. See also chrominance.
Component Analog Video; a generic term for all analog component video formats, which keep luminance and chrominance information separate. D1 is a digital version of this signal. See also component video.
Type C, or one-inch reel-to-reel videotape machine; an analog composite recording format still used in some broadcast and postproduction applications.
The digital interface standard developed by the CCIR (Comité Consultatif International des Radiocommunications, International Radio Consultative Committee) based on component color encoding, in which the luminance and chrominance (color difference) sampling frequencies are related in the ratio 4:2:2: four samples of luminance (spread across four pixels), two samples of CR color difference, and two samples of CB color difference. The standard, which is also referred to as 4:2:2, sets parameters for both 525-line and 625-line systems.
Overlaying one video source on another by choosing a key color. For example, if chroma keying is on blue, video source A might show through video source B everywhere the color blue appears in video source B. A common example is the TV weather reporter standing in front of the satellite weather map. The weather reporter, wearing any color but blue, stands in front of a blue background; keying on blue shows the satellite picture everywhere blue appears. Because there is no blue on the weatherperson, he or she appears to be standing in front of the weather map.
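The per-pixel decision described above can be sketched as follows; the function name, key color, and tolerance are illustrative assumptions, not part of any standard:

```python
# Hedged sketch of blue-screen chroma keying on 8-bit RGB pixels.
# The key color and tolerance values here are illustrative assumptions.
def chroma_key(fg_pixel, bg_pixel, key=(0, 0, 255), tolerance=60):
    # Euclidean distance in RGB space between the foreground pixel
    # and the key color decides which source shows through.
    distance = sum((a - b) ** 2 for a, b in zip(fg_pixel, key)) ** 0.5
    return bg_pixel if distance < tolerance else fg_pixel

# A red-jacketed reporter pixel is kept; a blue-backdrop pixel is
# replaced by the weather-map pixel behind it.
reporter = chroma_key((200, 30, 40), (90, 90, 90))
backdrop = chroma_key((10, 20, 250), (90, 90, 90))
```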
A 3.58 MHz (NTSC) or 4.43 MHz (PAL) subcarrier signal for color in television. SECAM uses two frequency-modulated color subcarriers transmitted on alternate horizontal lines; SCR is 4.406 MHz and SCB is 4.250 MHz.
In an image reproduction system, a separate signal that contains the color information. Black, white, and all shades of gray have no chrominance and contain only the luminance (brightness) portion of the signal. However, all colors have both chrominance and luminance.
Chrominance is derived from the I and Q signals in the NTSC television system and the U and V signals in the PAL television system. See also luminance.
Also called the chroma, or C, signal. The high-frequency portion of the video signal: the 3.58 MHz (NTSC) or 4.43 MHz (PAL) color subcarrier with quadrature modulation by the I (R-Y) and Q (B-Y) color video signals. The amplitude of the C signal is saturation; the phase angle is hue. See also color subcarrier, hue, and saturation.
In the context of the Video Library, an application that has connected to the video daemon to perform video requests.
Segment of video, audio, or both. An image is a clip that is one frame long.
A test pattern used by video engineers to determine the quality of a video signal, developed by the Society of Television and Motion Picture Engineers (SMPTE). The test pattern consists of equal-width bars representing black, white, red, green, blue, and combinations of two of the three RGB values: yellow, cyan, and magenta. These colors are usually shown at 75% of their pure values. Figure Gl-1 diagrams the color bars.
Also called burst and burst flag. The segment of the horizontal blanking portion of the video signal that is used as a reference for decoding color information in the active video part of the signal. The color burst is required for synchronizing the phase of the 3.58 MHz oscillator in the television receiver for correct hues in the chrominance signal.
In composite video, the image color is determined by the phase relationship of the color subcarrier to the color burst. The color burst sync is 8 to 11 cycles of the 3.58 MHz color subcarrier transmitted on the back porch of every horizontal sync pulse; the hue of the color sync phase is yellow-green.
Figure Gl-2 diagrams the relationship of the color burst and the chrominance signal. See also color subcarrier and video waveform.
Signals used by color television systems to convey color information so that the signals go to zero when the picture contains no color; for example, unmodulated R-Y and B-Y, I and Q, U, and V.
In NTSC and S-Video, a two-frame sequence that must elapse before the same relationship between line pairs of video and frame sync repeats itself. In PAL, the color-frame sequence consists of four frames.
A color component encoding format defined by three color components, such as R, G, and B or Y, U, and V.
A portion of the active portion of a composite video signal that carries color information, referenced to the color burst. The color subcarrier's amplitude determines saturation; its phase angle determines hue. Hue and saturation are derived with respect to the color burst. Its frequency is defined as 3.58 MHz in NTSC and 4.43 MHz in PAL. See also color burst.
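The amplitude/phase relationship can be illustrated numerically; this is a simplified sketch of the decoder math, not an actual demodulator:

```python
import math

# Sketch: recover hue (phase angle relative to the burst) and
# saturation (amplitude) from the two quadrature components of
# the modulated color subcarrier.
def subcarrier_to_hue_sat(u, v):
    saturation = math.hypot(u, v)            # peak amplitude
    hue = math.degrees(math.atan2(v, u))     # phase angle in degrees
    return hue, saturation

# Equal quadrature components put the phase angle at 45 degrees.
hue, sat = subcarrier_to_hue_sat(1.0, 1.0)
```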
Opposite hue and phase angle from a primary color. Cyan, magenta, and yellow are complementary colors for red, green, and blue, respectively.
Process that improves the accuracy of extracting color and brightness portions of the signal from a composite video source.
A color encoding method for the three color signals—R, G, and B; Y, I, and Q; or Y, U, and V—that make up a color image. See also RGB, YIQ, and YUV.
A video signal in which luminance and chrominance are sent as separate components, for example:
RGB (basic signals generated from a camera)
YIQ (used by the NTSC broadcasting standard)
Y/R-Y/B-Y (used by Betacam and M-II recording formats and SECAM broadcasting standard)
YUV (subset of Y/R-Y/B-Y used by the PAL broadcasting standard)
Separating these components yields a signal with a higher color bandwidth than that of composite video.
Figure Gl-3 depicts video signals for one horizontal scan of a color-bar test pattern. The RGB signals change in relation to the individual colors in the test pattern. When a secondary color is generated, a combination of the RGB signals occurs. Since only the primary and secondary colors are being displayed at 100% saturation, the R, G, and B waveforms are simply on or off. For more complex patterns of color, the individual R, G, and B signals would be varying amplitudes in the percentages needed to express that particular color.
See also composite video, RGB, YUV, Y/R-Y/B-Y, and YIQ.
Combining graphics with another image.
A color encoding method or a video signal that contains all of the color, brightness, and synchronizing information in one signal. The chief composite television standard signals are NTSC, PAL, and SECAM. See also NTSC, PAL, and SECAM.
Also known as cross-color, hanging dots, dot crawl; moving colors on stationary objects. This undesirable artifact is caused by high bandwidth luminance information being misinterpreted as color information. Hanging dots are a byproduct of the comb filters (used to help separate the color and brightness information) found in most modern television receivers. This artifact can be reduced or eliminated by using S-Video or a component video format.
A type of transition in which one video clip is faded down while another is faded up.
Digital recording technique for component video; also known as CCIR 601, 4:2:2. D1 is the best choice for high-end production work where many generations of video are needed. D1 can be an 8-bit or 10-bit signal. See also CCIR 601.
Digital recording technique for composite video. As with analog composite, the luminance and chrominance information is sent as one signal. A D2 VTR offers higher resolution and can make multiple generation copies without noticeable quality loss, because it samples an analog composite video signal at four times the subcarrier (using linear quantization), representing the samples as 8-bit digital words. D2 is not compatible with D1.
Developed by Panasonic, a 1/2-inch tape version of D2. More often called DX.
Hardware or software that converts, or decodes, a composite video signal into the various components of the signal. For example, to grab a frame of composite video from a VHS tapedeck and store it as an RGB file, it would have to be decoded first. Several Silicon Graphics video options have on-board decoders.
Approximating a signal value on a chroma-limited display device by producing a matrix of color values that fool human perception into believing that the signal value is being reproduced accurately. For example, dithering is used to display a true-color image on a display capable of rendering only 256 unique colors, such as IndigoVideo images on a Starter Graphics display.
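A minimal sketch of one common technique, 2x2 ordered (Bayer) dithering of a grayscale value to black and white; the matrix and threshold mapping are textbook values, not specific to any particular display:

```python
# 2x2 Bayer threshold ranks; each pixel position gets a different
# threshold, so mid-tones become a spatial mix of black and white.
BAYER_2X2 = [[0, 2], [3, 1]]

def dither_pixel(value, x, y):
    # Map the rank at this position to a threshold in 0..255.
    threshold = (BAYER_2X2[y % 2][x % 2] + 0.5) / 4 * 255
    return 255 if value > threshold else 0

# Mid-gray (128) over one 2x2 tile: two white and two black pixels,
# which average back to roughly the original gray.
tile = [dither_pixel(128, x, y) for y in range(2) for x in range(2)]
```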
In the context of the Video Library, a target or consumer of video signals.
The process in which data is examined, created, and modified. In video, the part of the postproduction process in which the finished videotape is derived from raw video footage. Animation is a subset of editing.
Device that combines the R, G, and B primary color video signals into hue and saturation for the C portion of a composite signal. Several Silicon Graphics video options have on-board encoders.
Pulse of one half the width of the horizontal sync pulse, transmitted at twice the rate of the horizontal sync pulse, during the portions of the vertical blanking interval immediately before and after the vertical sync pulse. The equalizing pulse makes the vertical deflection start at the same time in each interval, and also keeps the horizontal sweep circuits in step during the portions of the vertical blanking interval immediately before and after the vertical sync pulse.
Exceptional or noteworthy condition produced during video processing, such as loss of sync, dropping of frames or fields, and synchronization with other applications.
A term applied to usage of the video data stream and controls on a pathway. A pathway in exclusive-use mode is available for writing of controls only to the client that requested the exclusive use, yet any application may read the controls on that pathway.
To modify the opacity and/or volume of a clip. A faded-up clip is unaffected; a clip faded down to 50% has 50% less opacity or volume; a fully faded-down clip is completely transparent or turned off.
One of two (or more) equal parts of information in which a frame is divided in interlace scanning. A vertical scan of a frame carrying only its odd-numbered or its even-numbered lines. The odd field and even field make up the complete frame. See also frame and interlace.
A filter that corrects flicker by averaging pixel values across successive fields. See also flicker.
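The averaging step can be sketched per scan line as follows (8-bit grayscale lines assumed; the function name is illustrative):

```python
# Sketch: average corresponding lines of two successive fields so a
# one-field-deep detail no longer blinks at the field rate.
def flicker_filter_line(field_a_line, field_b_line):
    return [(a + b) // 2 for a, b in zip(field_a_line, field_b_line)]

# A bright detail present in only one field is softened rather than
# flashing on and off.
smoothed = flicker_filter_line([0, 255, 128], [255, 255, 128])
```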
The blanking signals at the end of each field, used to make the vertical retrace invisible. Also called vertical blanking; see vertical blanking and vertical blanking interval.
To process a clip with spatial or frequency domain methods. Each pixel is modified by using information from neighboring (or all) pixels of the image. Filter functions include blur (low-pass) and crisp (high-pass).
The effect caused by a one-pixel-deep line in a high-resolution graphics frame that is output to a low-resolution monitor, because the line is in only one of the alternating fields that make up the frame. This effect can be filtered out by field averaging. See also field and frame.
The result of a complete scanning of one image. In television, the odd field (all the odd lines of the frame) and the even field (all the even lines of the frame) make up the frame. In motion video, the image is scanned repeatedly, making a series of frames.
A condition on the digitized video signal where the digitizing is stopped and the contents of the signal appear frozen on the display or in the buffer. Sometimes used to capture the video data for processing or storage.
Signal cycles per second.
Placing of the harmonic frequencies of the C signal midway between harmonics of the horizontal scanning frequency Fh. Accomplished by making the color subcarrier frequency exactly 3.579545 MHz. This frequency is an odd multiple of Fh/2.
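The arithmetic behind this choice can be checked directly; 455 is the standard odd multiple relating the NTSC subcarrier to half the line rate:

```python
# The NTSC color subcarrier is defined exactly as 315/88 MHz
# (~3.579545 MHz), which is the 455th harmonic of half the
# horizontal line rate Fh.
F_SC = 315e6 / 88          # color subcarrier frequency, Hz
F_H = 2 * F_SC / 455       # horizontal line rate, ~15734.27 Hz

multiple = F_SC / (F_H / 2)   # 455, an odd integer
```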
The portion of the video signal between the end of active video and the falling edge of sync. See also back porch, horizontal blanking, and video waveform.
Color mixture close to green, with a bandwidth 0.0 MHz to 0.5 MHz. Usually formed by combining B-Y and R-Y video signals.
Correction of gray-scale inconsistency. The brightness characteristic of a CRT is not linear with respect to voltage; the voltage-to-intensity characteristic is usually close to a power of 2.2. If left uncorrected, the resulting display has too much contrast and detail in black regions is not reproduced.
To correct this inconsistency, a correction factor using the 2.2 root of the input signal is included, so that equal steps of brightness or intensity at the input are reproduced with equal steps of intensity at the display.
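As a numeric sketch of the correction (a normalized 0.0-1.0 signal is assumed):

```python
# Pre-correct the signal with the 1/2.2 root so that the CRT's
# roughly power-of-2.2 response yields a linear overall result.
def gamma_correct(signal, gamma=2.2):
    return signal ** (1.0 / gamma)

def crt_response(voltage, gamma=2.2):
    # Idealized CRT voltage-to-intensity characteristic.
    return voltage ** gamma

# Corrected then displayed: the round trip returns the original level.
restored = crt_response(gamma_correct(0.5))
```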
Synchronizing with another video signal serving as a master timing source. The master timing source can be a composite video signal, a video signal with no active video (only sync information), or, for video studio, a device called house sync. When there is no master sync available, VideoFramer, for example, can be set to “free run” (or “stand-alone”) mode, so that it becomes the master timing device to which other devices sync. See also line lock.
Monochrome or black-and-white, as in a monitor that does not display color.
Number of complete horizontal lines, including trace and retrace, scanned per second.
High-definition television. Though there is more than one proposal for a broadcast standard for HDTV, most currently available equipment is based on the 1125/60 standard, that is, 1125 lines of video, with a refresh rate of 60Hz, 2:1 interlacing (same as NTSC and PAL), and aspect ratio of 16:9 (1920 x 1035 viewable resolution), trilevel sync, and 30 MHz RGB and luminance bandwidth.
An 8mm recording format developed by Sony; accepts composite and S-Video signals.
The period when the electron beam is turned off, beginning when each scan line finishes its horizontal path (scan) across the screen (see Figure Gl-4).
Also known as the horizontal retrace interval, the period when a scanning process is moving from the end of one horizontal line to the start of the next line. This portion of the signal is used to carry information other than video information. See also video waveform.
The portion of the horizontal blanking part of the video signal composed of the sync pulse together with the front porch and breezeway; that is, horizontal blanking minus the color burst. See also video waveform.
The lowest portion of the horizontal blanking part of the video signal, it provides a pulse for synchronizing video input with output. Also known as h sync. See also horizontal blanking and video waveform.
Hue-saturation-value; see hue-saturation-intensity.
The designation of a color in the spectrum, such as cyan, blue, magenta. Sometimes called tint on NTSC television receivers. The varying phase angles in the 3.58 MHz (NTSC) or 4.43 MHz (PAL) C signal indicate the different hues in the picture information.
A tristimulus color system based on the parameters of hue, saturation, and intensity (luminance). Also referred to as HSI or HSV.
Color video signal transmitted as amplitude modulation of the 3.58 MHz C signal (NTSC). The hue axis is orange and cyan. This signal is the only color video signal with a bandwidth of 0 to 1.3 MHz.
Manipulating an image by changing its color, brightness, shape, or size.
A technique that uses more than one vertical scan to reproduce a complete image. In television, the 2:1 interlace used yields two vertical scans (fields) per frame: the first field consists of the odd lines of the frame, the other of the even lines. See also field and frame.
A scale for measuring analog video signal levels, normally starting at the bottom of the horizontal sync pulse and extending to the top of peak white. Blanking level is 0 IRE units and peak white level is 100 IRE units (714 mV). An IRE unit equals 7.14 mV (+100 IRE to -40 IRE = 1 V). IRE stands for Institute of Radio Engineers, a forerunner of the IEEE.
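The unit conversion follows directly from the 140 IRE = 1 V span; a small sketch with NTSC-style levels assumed:

```python
# 140 IRE units (-40 at the sync tip to +100 at peak white) span 1 V,
# so one IRE unit is 1000/140, about 7.14 mV.
def ire_to_mv(ire):
    return ire * 1000.0 / 140.0

white = ire_to_mv(100)    # peak white, about 714 mV
setup = ire_to_mv(7.5)    # NTSC setup (pedestal) level
```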
Combining proportional amounts of two frames, pixel by pixel, with optional opacity. This process resembles taking two panes of glass with images on them and placing one pane on top of the other. The opacity of the top pane determines the parts of the bottom pane that show. Usually, keying is a real-time continuous process, as in the “over the shoulder” graphics in TV news programs. The alpha component of each pixel, which defines its opacity, determines how the images are combined. Combining images based on the alpha component is often called alpha keying or luma keying. See also compositing and mixing.
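The per-pixel combination can be sketched with a standard "over"-style blend; alpha is normalized to 0.0-1.0 and the names are illustrative:

```python
# Blend a foreground RGB pixel over a background pixel using the
# foreground's alpha: alpha 1.0 shows only fg, 0.0 shows only bg.
def key_pixel(fg, bg, alpha):
    return tuple(round(alpha * f + (1 - alpha) * b)
                 for f, b in zip(fg, bg))

# A half-opaque white graphic over black yields mid gray.
blended = key_pixel((255, 255, 255), (0, 0, 0), 0.5)
```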
The portion of the video waveform after active video, between the sync threshold and the sync pulse. See also video waveform.
The result of a single pass of the sensor from left to right across the image.
The blanking signal at the end of each horizontal scanning line, used to make the horizontal retrace invisible. Also called horizontal blanking.
The number of horizontal scans per second, normally 15,734.26 times per second for NTSC color systems. The line frequency for the PAL 625/50 system is 15,625 times per second.
Input timing that is derived from the horizontal sync signal, also implying that the system clock (the clock being used to sample the incoming video) is an integer multiple of the horizontal frequency and that it is locked in phase to the horizontal sync signal. See also video waveform.
The process of combining a group of signals through addition or subtraction; for example, RGB signals into luminance and chrominance signals.
Video being delivered at a nominal frame rate appropriate to the format.
The video signal that describes the amount of light in each pixel. Luminance is a weighted sum of the R, G, and B signals. See also chrominance and Y signal.
Numerical lookup of pixel data that modifies each pixel without using neighboring pixels. This large category of video editing functions includes clip/gain, solarization, and histogram equalization.
A second-generation recording format based on a version of the Y/R-Y/B-Y video signal. Developed by Panasonic, MII is also marketed by other video manufacturers. Though similar to Betacam, it is nonetheless incompatible.
The process of converting analog color signals from one tristimulus format to another, for example, RGB to YUV. See also tristimulus color system.
In video editing, combining two clips frame by frame, pixel by pixel. Usually, a linear interpolation between the pixels in each clip is used, with which one can, for example, perform a cross-fade. Other operations include averaging, adding, differencing, maximum (non-additive mix), minimum, and equivalence (white where equal, else black). See also compositing and keying.
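The linear interpolation used for a cross-fade can be sketched per pixel; t is the fade position, an assumption of this illustration:

```python
# One output pixel of a linear cross-fade: at t=0 only clip A shows,
# at t=1 only clip B; in between they are proportionally mixed.
def mix_pixel(a, b, t):
    return round((1 - t) * a + t * b)

# A quarter of the way through the transition.
frame = [mix_pixel(a, b, 0.25) for a, b in zip([200, 100], [0, 40])]
```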
A test pattern consisting of sets of vertical lines with closer and closer spacing; used for testing horizontal resolution of a video system.
A color television standard or timing format encoding all of the color, brightness, and synchronizing information in one signal. Used in North America, most of South America, and most of the Far East, this standard is named after the National Television Systems Committee, the standardizing body that created this system in the U.S. in 1953. NTSC employs a total of 525 horizontal lines per frame, with two fields per frame of 262.5 lines each. Each field refreshes at 60Hz (actually 59.94Hz).
The highest frequency of input signal that can be correctly sampled without aliasing. The Nyquist limit is equal to half of the sampling frequency.
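A small numeric sketch, using the 13.5 MHz CCIR 601 luminance sampling rate as the example:

```python
# The Nyquist limit is half the sampling rate; a component above it
# folds (aliases) back into the 0..Nyquist band.
def nyquist_limit(sample_rate):
    return sample_rate / 2

def aliased_frequency(f_in, sample_rate):
    # Fold the input frequency back into the 0..Nyquist band.
    f = f_in % sample_rate
    return sample_rate - f if f > sample_rate / 2 else f

limit = nyquist_limit(13.5e6)            # 6.75 MHz
folded = aliased_frequency(8e6, 13.5e6)  # an 8 MHz input folds down
```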
In the context of a video signal, the relative coordinates from the upper left corner of the video image where signal sampling begins.
To scan a little beyond the display raster area of the monitor so that the edges of the raster are not visible. Television is overscanned; computer displays are underscanned.
A color television standard or timing format developed in West Germany and used by most other countries in Europe, including the United Kingdom but excluding France, as well as Australia and parts of the Far East. PAL employs a total of 625 horizontal lines per frame, with two fields per frame of 312.5 lines each. Each field refreshes at 50Hz. PAL encodes color differently from NTSC. PAL stands for Phase Alternation Line or Phase Alternated by Line, by which this system attempts to correct some of the color inaccuracies in NTSC. See also NTSC and SECAM.
In the Video Library, a connection of sources and drains that provide useful processing of video signals. Pathways have controls and video streams. Pathways can be locked for exclusive use, and are the target of events generated during video processing. See also exclusive use and event.
See setup; see also video waveform.
Picture element; the smallest addressable spatial element of the computer graphics screen. A digital image address, or the smallest reproducible element in analog video. A pixel can be monochrome, gray-scale, or color, and can have an alpha component to determine opacity or transparency. Pixels are referred to as having a color component and an alpha component, even if the color component is gray-scale or monochrome.
A two-dimensional piece of memory, any number of bits deep. See also bitmap.
The processes that occur before release of the finished video product, including editing, painting (2D graphics), production, and 3D graphics production.
Red, green, and blue. Opposite voltage polarities are the complementary colors cyan, magenta, and yellow.
The color video signal that modulates 3.58 MHz C signal in quadrature with the I signal. Hues are green and magenta. Bandwidth is 0.0 MHz to 0.5 MHz. See also C signal, I signal, YC, and YIQ.
The magnitude of the error introduced in a signal when the actual signal is between levels, resulting from subdividing a video signal into distinct increments, such as levels from 0 to 255.
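The bound on that error can be sketched directly; a 0.0-1.0 signal quantized to 256 levels is assumed:

```python
# Quantize to 256 evenly spaced levels; the worst-case quantization
# error is half of one step.
LEVELS = 256
STEP = 1 / (LEVELS - 1)

def quantize(x):
    return round(x * (LEVELS - 1)) / (LEVELS - 1)

x = 0.5 + STEP / 3               # a value between two levels
error = abs(quantize(x) - x)     # nonzero, but never exceeds STEP / 2
```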
The scanning pattern for television display; a series of horizontal lines, usually left to right, top to bottom. In NTSC and PAL systems, the first and last lines are half lines.
A logical or arithmetic operation on a pixel value.
The process of causing two frames to coincide exactly. In component video cameras or displays, the process of causing the three color images to coincide exactly, so that no color fringes are visible.
Number of horizontal lines in a television display standard; the higher the number, the greater a system's ability to reproduce fine detail.
Red, green, blue; the basic component set used by graphics systems and some video cameras, in which a separate signal is used for each primary color.
The technical specification for NTSC color television. Often (incorrectly) used to refer to an RGB signal that is being sent at NTSC composite timings, for example, a Silicon Graphics computer set to output 640 x 480. The timing would be correct to display on a television, but the signal would still be split into red, green and blue components. This component signal would have to go through an encoder to yield a composite signal (RS-170A format) suitable for display on a television receiver.
A color difference signal obtained by subtracting the luminance signal from the red camera signal. It is plotted on the 90 to 270 degree axis of a vector diagram. The R-Y signal drives the vertical axis of a vectorscope. The color mixture is close to red. Phase is in quadrature with B-Y; bandwidth is 0.0 MHz to 0.5 MHz. See also luminance, B-Y (B minus Y) signal, Y/R-Y/B-Y, and vectorscope.
To read the value of a signal at evenly spaced points in time; to convert representational data to sampled data (that is, synthesizing and rendering).
Number of samples per second.
Color intensity; zero saturation is white (no color) and maximum saturation is the deepest or most intense color possible for that hue. Different saturation values are varying peak-to-peak amplitudes in the 3.58 MHz modulated C signal. In signal terms, saturation is determined by the ratio between luminance level and chrominance amplitude. See also hue.
To change the size of an image.
To convert an image to an electrical signal by moving a sensing point across the image, usually left to right, top to bottom.
Séquentiel Couleur à Mémoire, the color television system developed in France and used there as well as in eastern Europe, the Near East and Mideast, and parts of Africa and the Caribbean.
The difference between the blackest level displayed on the receiver and the blanking level (see Figure Gl-6). A black level that is elevated to 7.5 IRE instead of being left at 0.0 IRE, the same as the lowest level for active video. Because the video level is known, this part of the signal is used for black-level clamping circuit operation. Setup is typically used in the NTSC video format and is typically not used in the PAL video format; it was originally introduced to simplify the design of early television receivers, which had trouble distinguishing between video black levels and horizontal blanking. Also called pedestal.
Figure Gl-6 shows waveform displays of a signal with and without setup. See also video waveform.
An artifact usually caused by mid-frequency distortions in an analog system that results in the vertical edges of the picture spreading horizontally.
A signal specified by the Society of Motion Picture and Television Engineers for facilitating videotape editing; this signal uniquely identifies each frame of the video signal. Program originators use vertical blanking interval lines 12 through 14 to store a code identifying program material, time, frame number, and other production information (see Figure Gl-7).
In the context of the Video Library, a provider of video input signals.
A portion of a video signal that carries a specific signal, such as color. See color subcarrier.
A unit derived from a pixel by using a filter for sizing and positioning.
Video format in which the Y (luminance) and C (chrominance) portions of the signal are kept separate. Also known as YC.
The part of the television video signal that ensures that the display scanning is synchronized with the broadcast scanning. See also video waveform.
A vertical or horizontal pulse (or both) that determines the display timing of a video signal. Composite sync has both horizontal and vertical sync pulses, as well as equalization pulses. The equalization pulses are part of the interlacing process.
The lowest part of the horizontal blanking interval, used for synchronization. See also video waveform.
To perform time shifting so that things line up.
Applying images to three-dimensional objects to give additional realism to displayed renderings.
To send a signal through a transmission line accurately, the impedance at the end of the line must match the impedance of the source and of the line itself. Amplitude errors, frequency-response problems, and pulse distortions and reflections (ghosting) occur on a line without proper termination. Video is a 75-ohm system; therefore, a 75-ohm terminator of 0.5% to 0.25% accuracy must be installed at the end of the signal path.
In a digital circuit, the signal level that is specified as the division point between levels used to represent different digital values; for example, the sync threshold is the level at which the leading edge of sync begins. See also video waveform.
Analog artifacts caused by nonuniform motion of videotape or of the tape head drum. Time-base errors usually cause horizontal display problems, such as horizontal jitter.
See SMPTE time code.
Frame-by-frame alignment of all video inputs to one sync pulse, so that all frames start at the same time. This alignment is necessary because cable length differences cause unequal delays. See time-base errors.
A device that converts a component video signal to a different component video signal, for example, RGB to Y/R-Y/B-Y, or D1 to RGB.
A microphone, video camera, or other device that can convert sounds or images to electrical signals.
The geometric perspective transformation of 3-D graphics models and planar images.
A system of transmitting and reproducing images that uses three color signals, for example, RGB, YIQ, and YUV.
One of the chrominance signals of the PAL color television system, along with V. Sometimes referred to as B-Y, but U becomes B-Y only after a weighting factor of 0.493 is applied. The weighting is required to reduce peak modulation in the composite signal.
Sony trademark of its 3/4-inch composite videotape format. SP U-Matic is an improved version using metal tape.
To scan a television screen so that the edges of the raster are visible. See also overscan.
One of the chrominance signals of the PAL color television system, along with U. Sometimes referred to as R-Y, but V becomes R-Y only after a weighting factor of 0.877 is applied. The weighting is required to reduce peak modulation in the composite signal.
A specialized oscilloscope that demodulates the video signal and presents a display of R-Y versus B-Y for NTSC (V and U for PAL). Video engineers use vectorscopes to measure the amplitude (gain) and phase angle (vector) of the primary (red, green, and blue) and the secondary (yellow, cyan, and magenta) color components of a television signal.
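The amplitude and phase angle a vectorscope displays can be computed directly from the two color-difference components. The following sketch (an illustration for this glossary entry, not part of any standard tool) treats B-Y as the horizontal axis and R-Y as the vertical axis:

```python
import math

def vector(b_minus_y, r_minus_y):
    """Amplitude (gain) and phase angle (degrees) of a chrominance
    sample, as plotted on a vectorscope: B-Y horizontal, R-Y vertical."""
    amplitude = math.hypot(b_minus_y, r_minus_y)
    phase_deg = math.degrees(math.atan2(r_minus_y, b_minus_y))
    return amplitude, phase_deg
```

A pure B-Y signal plots at 0 degrees; a pure R-Y signal of the same amplitude plots at 90 degrees.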
The portion of the video signal that is blanked so that the vertical retrace of the beam is not visible.
The blanking portion at the beginning of each field. It contains the equalizing pulses, the vertical sync pulses, and vertical interval test signals (VITS). Also the period when a scanning process is moving from the lowest horizontal line back to the top horizontal line.
Video signal amplitude.
The electrical signal produced by a scanning image sensor.
Table Gl-12 lists major videotape formats.
U-Matic (SP) cassette, 3/4-inch
Type C reel-to-reel, 1-inch composite
Type B (Europe), composite
Type MII (component)
D1 525/625 (YUV)
D2 525 (NTSC)
D2 625 (PAL)
The main components of the video waveform are the active video portion and the horizontal blanking portion. Certain video waveforms carry information during the horizontal blanking interval.
Figure Gl-8 and Figure Gl-9 diagram a typical red or blue signal, which carries no information during the horizontal blanking interval, and a typical Y or green-plus-sync signal, which carries a sync pulse.
Figure Gl-10 and Figure Gl-11 show the video waveform and its components for composite video in more detail. The figures show the composite video waveform with and without setup, respectively.
Figure Gl-10 shows a composite video signal with setup.
Figure Gl-11 shows a composite video signal without setup.
In the active video portion of the video waveform, the 1.0-volt (100 IRE) level. See also video waveform.
Luminance, corresponding to the brightness of an image. See also luminance and Y/R-Y/B-Y.
A color space (color component encoding format) based on YIQ or YUV. Y is luminance, but the two chroma signals (I and Q or U and V) are combined into a composite chroma called C, resulting in a two-wire signal. C is derived from I and Q as follows:
C = I cos(2π fsc t) + Q sin(2π fsc t)
where fsc is the subcarrier frequency. YC-358 is the NTSC version of this luminance/chrominance format; YC-443 is the PAL version. Both are referred to as S-Video formats.
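The quadrature modulation above can be sketched in Python. This is an illustrative example only; the NTSC subcarrier frequency of approximately 3.579545 MHz is supplied as an assumed default:

```python
import math

FSC_NTSC = 3.579545e6  # NTSC color subcarrier frequency, Hz (assumed default)

def chroma(i, q, t, fsc=FSC_NTSC):
    """Instantaneous chroma C at time t (seconds), per
    C = I cos(2*pi*fsc*t) + Q sin(2*pi*fsc*t)."""
    return i * math.cos(2 * math.pi * fsc * t) + q * math.sin(2 * math.pi * fsc * t)
```

At t = 0 the sine term vanishes, so C reduces to the I component alone; a quarter subcarrier period later, C carries only the Q component.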
A color space (color component encoding format) used in decoding, in which Y is the luminance signal and I and Q are the chrominance signals. The two chrominance signals I and Q (in-phase and quadrature, respectively) are two-phase amplitude-modulated; the I component modulates the subcarrier at an angle of 0 degrees and the Q component modulates it at 90 degrees. The color burst is at 33 degrees relative to the Q signal.
The amplitude of the color subcarrier represents the saturation values of the image; the phase of the color subcarrier represents the hue value of the image.
Y = 0.299R + 0.587G + 0.114B
I = 0.596R - 0.275G - 0.321B
Q = 0.212R - 0.523G + 0.311B
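The matrix above translates directly into code. This sketch (an illustration, not part of the glossary) assumes RGB values normalized to the range 0.0 to 1.0:

```python
def rgb_to_yiq(r, g, b):
    """Convert normalized RGB to YIQ using the matrix coefficients
    given in the glossary entry above."""
    y = 0.299 * r + 0.587 * g + 0.114 * b
    i = 0.596 * r - 0.275 * g - 0.321 * b
    q = 0.212 * r - 0.523 * g + 0.311 * b
    return y, i, q
```

Pure white (1, 1, 1) yields full luminance and zero chrominance, since each chrominance row of the matrix sums to zero.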
A name for the YUV color component encoding format that summarizes how the chrominance components are derived. Y is the luminance signal and R-Y and B-Y are the chrominance signals. R-Y (red minus Y) and B-Y (blue minus Y) are the color differences or chrominance components. The color difference signals R-Y and B-Y are derived as follows:
Y = 0.299R + 0.587G + 0.114B
R-Y = R - Y
B-Y = B - Y
Y/R-Y/B-Y has many variations, just as NTSC and PAL do. All component and composite color encoding formats are derived from RGB without scan standards being changed. The matrix (amount of red, green, and blue) values and scale (amplitude) factors can differ from one component format to another (YUV, Y/R-Y/B-Y, SMPTE Y/R-Y/B-Y).
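The derivation of the color-difference components can be sketched as follows. This is an illustrative example assuming normalized RGB input, not part of the glossary itself:

```python
def rgb_to_color_difference(r, g, b):
    """Derive (Y, R-Y, B-Y) from normalized RGB, per the luminance
    formula Y = 0.299R + 0.587G + 0.114B."""
    y = 0.299 * r + 0.587 * g + 0.114 * b
    return y, r - y, b - y
```

Any neutral gray (r == g == b) produces zero for both color-difference components, since Y then equals the common input value.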
A color space (color component encoding format) used by the PAL video standard, in which Y is the luminance signal and U and V are the chrominance signals. The two chrominance signals U and V are two-phase amplitude-modulated. The U component modulates the subcarrier at an angle of 0 degree, but the V component modulates it at 90 degrees or 180 degrees on alternate lines. The color burst is also line-alternated at +135 and -135 degrees relative to the U signal. The YUV matrix multiplier derives colors from RGB via the following formula:
Y = 0.299R + 0.587G + 0.114B
CR = R - Y
CB = B - Y
In this formula, Y represents luminance, and the color-difference signals are derived from it: CR denotes the red difference (V), and CB denotes the blue difference (U). V corresponds to CR; U corresponds to CB. The U and V signals are carried on the same bandwidth. This system is sometimes referred to as Y/R-Y/B-Y.
The name for this color encoding method is YUV, despite the fact that the order of the signals according to the formula is YVU.
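Combining this entry with the U and V entries earlier in the glossary (which give the weighting factors 0.493 and 0.877), the YUV derivation can be sketched as follows. This is an illustration assuming normalized RGB input, not a definitive encoder:

```python
def rgb_to_yuv(r, g, b):
    """Derive (Y, U, V) from normalized RGB, applying the weighting
    factors from the U and V glossary entries:
    U = 0.493*(B - Y), V = 0.877*(R - Y)."""
    y = 0.299 * r + 0.587 * g + 0.114 * b
    u = 0.493 * (b - y)
    v = 0.877 * (r - y)
    return y, u, v
```

As with Y/R-Y/B-Y, a neutral gray input yields zero chrominance; the weighting factors only rescale the color differences to limit peak modulation in the composite signal.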