Analog television
from Wikipedia

Early monochrome analog receiver with large dials for volume control and channel selection, and smaller ones for fine-tuning, brightness, contrast, and horizontal and vertical hold adjustments.

Analog television (or analogue television), the original television technology, uses analog signals to transmit video and audio.[1] In an analog television broadcast, brightness, color, and sound are represented by the amplitude, phase, and frequency of the signal.

The strength of an analog signal varies over a continuous range of possible values, meaning that electronic noise and interference may be introduced. Thus, a moderately weak signal becomes snowy and subject to interference. In contrast, picture quality from a digital television (DTV) signal remains good until the signal level drops below a certain threshold (the "digital cliff"), where reception is either no longer possible or becomes intermittent.

Analog television may be wireless (as in terrestrial and satellite television) or distributed over a cable network (cable television).

All broadcast television systems traditionally used analog signals. Starting after the year 2000, motivated by the lower bandwidth requirements of compressed digital signals, a digital television transition has been underway in most of the world, with different deadlines for the cessation of analog broadcasts. Countries that still primarily use analog systems are mostly in Africa, Asia, and South America.

Development


The earliest systems of analog television were mechanical television systems that used spinning discs with patterns of holes punched into them to scan an image. A similar disc reconstructed the image at the receiver. Synchronization of the receiver disc's rotation was handled through sync pulses broadcast with the image information. Camera systems used similar spinning discs and required intensely bright illumination of the subject for the light detector to work. The reproduced images from these mechanical systems were dim, very low resolution, and flickered severely.

Analog television did not begin in earnest as an industry until the development of the cathode ray tube (CRT), which uses a focused electron beam to trace lines across a phosphor coated surface. The electron beam could be swept across the screen much faster than any mechanical disc system, allowing for more closely spaced scan lines and much higher image resolution. Also, far less maintenance was required of an all-electronic system compared to a mechanical spinning disc system. All-electronic systems became popular with households after World War II.

Standards

Analog television system by nation

Broadcasters of analog television encode their signal using different systems. The official systems of transmission were defined by the ITU in 1961 as: A, B, C, D, E, F, G, H, I, K, K1, L, M and N.[2] These systems determine the number of scan lines, frame rate, channel width, video bandwidth, video-audio separation, and so on. A color encoding scheme (NTSC, PAL, or SECAM) could be added to the base monochrome signal.[3] The combined signal is then modulated onto a very high frequency (VHF) or ultra high frequency (UHF) carrier wave. Each frame of a television image is composed of scan lines drawn on the screen. The lines are of varying brightness; the whole set of lines is drawn quickly enough that the human eye perceives it as one image. The process repeats and the next sequential frame is displayed, allowing the depiction of motion. The analog television signal contains timing and synchronization information so that the receiver can reconstruct a two-dimensional moving image from a one-dimensional time-varying signal.

The first commercial television systems were black-and-white; color television began in the 1950s.[4]

A practical television system needs to take luminance, chrominance (in a color system), synchronization (horizontal and vertical), and audio signals, and broadcast them over a radio transmission. The transmission system must include a means of television channel selection.

Analog broadcast television systems come in a variety of frame rates and resolutions. Further differences exist in the frequency and modulation of the audio carrier. The monochrome combinations still existing in the 1950s were standardized by the International Telecommunication Union (ITU) as capital letters A through N. When color television was introduced, the chrominance information was added to the monochrome signals in a way that black and white televisions ignore. In this way backward compatibility was achieved.

There are three standards for the way the additional color information can be encoded and transmitted. The first was the American NTSC system. The European and Australian PAL and the French and former Soviet Union SECAM standards were developed later and attempt to cure certain defects of the NTSC system. PAL's color encoding is similar to that of NTSC. SECAM, though, uses a different modulation approach from either. PAL had a late evolution called PALplus, allowing widescreen broadcasts while remaining fully compatible with existing PAL equipment.

In principle, all three color encoding systems can be used with any scan line/frame rate combination. Therefore, in order to describe a given signal completely, it is necessary to quote the color system plus the broadcast standard as a capital letter. For example, the United States, Canada, Mexico and South Korea used (or use) NTSC-M,[a] Japan used NTSC-J,[b] the UK used PAL-I,[c] France used SECAM-L,[d] much of Western Europe and Australia used (or use) PAL-B/G,[e] most of Eastern Europe uses SECAM-D/K or PAL-D/K and so on.

Not all of the possible combinations exist. NTSC is only used with system M, even though there were experiments with NTSC-A (405 line) in the UK and NTSC-N (625 line) in parts of South America. PAL is used with a variety of 625-line standards (B, G, D, K, I, N) but also with the North American 525-line standard, accordingly named PAL-M. Likewise, SECAM is used with a variety of 625-line standards.

For this reason, many people refer to any 625/25 type signal as PAL and to any 525/30 signal as NTSC, even when referring to digital signals; for example, on DVD-Video, which does not contain any analog color encoding, and thus no PAL or NTSC signals at all.

Although a number of different broadcast television systems are in use worldwide, the same principles of operation apply.[5]

Displaying an image

Raster scanning is performed from left-to-right and top-to-bottom. Once the screen has been scanned, the beam returns to the beginning of the first line.
Close up image of analog color screen

A CRT television displays an image by scanning a beam of electrons across the screen in a pattern of horizontal lines known as a raster. At the end of each line, the beam returns to the start of the next line; at the end of the last line, the beam returns to the beginning of the first line at the top of the screen. As it passes each point, the intensity of the beam is varied, varying the luminance of that point. A color television system is similar except there are three beams that scan together and an additional signal known as chrominance controls the color of the spot.

When analog television was developed, no affordable technology for storing video signals existed; the luminance signal had to be generated and transmitted at the same time as it was displayed on the CRT. It was therefore essential to keep the raster scanning in the camera (or other device for producing the signal) in exact synchronization with the scanning in the television.

The physics of the CRT require that a finite time interval be allowed for the spot to move back to the start of the next line (horizontal retrace) or the start of the screen (vertical retrace). The timing of the luminance signal must allow for this.

The human eye has a characteristic called phi phenomenon. Quickly displaying successive scan images creates the illusion of smooth motion. Flickering of the image can be partially solved using a long persistence phosphor coating on the CRT so that successive images fade slowly. However, slow phosphor has the negative side effect of causing image smearing and blurring when rapid on-screen motion occurs.

The maximum frame rate depends on the bandwidth of the electronics and the transmission system, and the number of horizontal scan lines in the image. A frame rate of 25 or 30 hertz is a satisfactory compromise, while the process of interlacing two video fields of the picture per frame is used to build the image. This process doubles the apparent number of video frames per second and further reduces flicker and other defects in transmission.
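The timing relationships above can be checked with a little arithmetic. A minimal Python sketch, using the standard System M (NTSC) and 625-line figures:

```python
# Horizontal line rate = lines per frame x frame rate;
# interlacing doubles the apparent refresh rate (two fields per frame).
ntsc_lines, ntsc_fps = 525, 30000 / 1001   # System M color frame rate, ~29.97 Hz
pal_lines, pal_fps = 625, 25.0

ntsc_line_rate = ntsc_lines * ntsc_fps     # ~15,734 lines per second
pal_line_rate = pal_lines * pal_fps        # 15,625 lines per second

ntsc_field_rate = 2 * ntsc_fps             # ~59.94 fields per second
pal_field_rate = 2 * pal_fps               # 50 fields per second

print(round(ntsc_line_rate), int(pal_line_rate), round(ntsc_field_rate, 2), pal_field_rate)
```

The line rates computed here are the horizontal scan frequencies that the receiver's sweep circuits must track.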

Receiving signals


The television system for each country will specify a number of television channels within the UHF or VHF frequency ranges. A channel actually consists of two signals: the picture information is transmitted using amplitude modulation on one carrier frequency, and the sound is transmitted by frequency modulation on a carrier at a fixed offset (typically 4.5 to 6 MHz) from the picture carrier.

The channel frequencies chosen represent a compromise between allowing enough bandwidth for video (and hence satisfactory picture resolution), and allowing enough channels to be packed into the available frequency band. In practice, a technique called vestigial sideband is used to reduce the channel spacing, which would be nearly twice the video bandwidth if pure AM were used.
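As an illustration with System M numbers (assumed here; systems B/G, I, and others use different figures), plain double-sideband AM would not even fit the allocated channel, while vestigial sideband does:

```python
# System M channel layout (figures assumed here for illustration):
video_bw_mhz = 4.2        # video baseband bandwidth
vestige_mhz = 1.25        # picture carrier offset above the lower channel edge
sound_offset_mhz = 4.5    # sound carrier offset above the picture carrier
guard_mhz = 0.25          # remaining space up to the upper channel edge

# Pure AM would transmit both sidebands in full:
dsb_width_mhz = 2 * video_bw_mhz
print(dsb_width_mhz)      # 8.4 -- more than a 6 MHz channel

# Vestigial sideband keeps the full upper sideband and only a vestige of the
# lower one, so video plus FM sound fit inside 6 MHz:
channel_mhz = vestige_mhz + sound_offset_mhz + guard_mhz
print(channel_mhz)        # 6.0
```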

Signal reception is invariably done via a superheterodyne receiver: the first stage is a tuner which selects a television channel and frequency-shifts it to a fixed intermediate frequency (IF). The IF stages amplify the signal from the microvolt range to fractions of a volt.

Extracting the sound


At this point, the IF signal consists of a video carrier signal at one frequency and the sound carrier at a fixed offset in frequency. A demodulator recovers the video signal. Also at the output of the same demodulator is a new frequency-modulated sound carrier at the offset frequency. In some sets made before 1948, this was filtered out, and the sound IF of about 22 MHz was sent to an FM demodulator to recover the basic sound signal. In newer sets, this new carrier at the offset frequency was allowed to remain as intercarrier sound, and it was sent to an FM demodulator to recover the basic sound signal. One particular advantage of intercarrier sound is that when the front panel fine-tuning knob is adjusted, the sound carrier frequency does not change with the tuning, but stays at the above-mentioned offset frequency. Consequently, it is easier to tune the picture without losing the sound.

The FM sound carrier is then demodulated, amplified, and used to drive a loudspeaker. Until the advent of the NICAM and MTS systems, television sound transmissions were monophonic.

Structure of a video signal


The video carrier is demodulated to give a composite video signal[f] containing luminance, chrominance and synchronization signals.[6] The result is identical to the composite video format used by analog video devices such as VCRs or CCTV cameras. To ensure good linearity and thus fidelity, consistent with affordable manufacturing costs of transmitters and receivers, the video carrier is never modulated to the extent that it is shut off altogether. When intercarrier sound was introduced in 1948, not completely shutting off the carrier had the side effect of allowing it to be implemented economically.

Diagram showing video signal amplitude against time.
NTSC composite video signal (analog)
A waterfall display showing a 20 ms long interlaced PAL frame with high FFT resolution

Each line of the displayed image is transmitted using a signal as shown above. The same basic format (with minor differences mainly related to timing and the encoding of color) is used for PAL, NTSC, and SECAM television systems. A monochrome signal is identical to a color one, with the exception that the elements shown in color in the diagram (the colorburst, and the chrominance signal) are not present.

Portion of a PAL video signal. From left to right: end of a video scan line, front porch, horizontal sync pulse, back porch with colorburst, and beginning of next line

The front porch is a brief (about 1.5 μs) period inserted between the end of each transmitted line of picture and the leading edge of the next line's sync pulse. Its purpose was to allow voltage levels to stabilise in older televisions, preventing interference between picture lines. The front porch is the first component of the horizontal blanking interval, which also contains the horizontal sync pulse and the back porch.[7][8][9]

The back porch is the portion of each scan line between the end (rising edge) of the horizontal sync pulse and the start of active video. It is used to restore the black level (300 mV) reference in analog video. In signal processing terms, it compensates for the fall time and settling time following the sync pulse.[7][8]

In color television systems such as PAL and NTSC, this period also includes the colorburst signal. In the SECAM system, it contains the reference subcarrier for each consecutive color difference signal in order to set the zero-color reference.

In some professional systems, particularly satellite links between locations, the digital audio is embedded within the line sync pulses of the video signal, to save the cost of renting a second channel. The name for this proprietary system is Sound-in-Syncs.

Monochrome video signal extraction


The luminance component of a composite video signal varies between 0 V and approximately 0.7 V above the black level. In the NTSC system, there is a blanking signal level used during the front porch and back porch, and a black signal level 75 mV above it; in PAL and SECAM these are identical.

In a monochrome receiver, the luminance signal is amplified to drive the control grid in the electron gun of the CRT. This changes the intensity of the electron beam and therefore the brightness of the spot being scanned. Brightness and contrast controls determine the DC shift and amplification, respectively.

Color video signal extraction

Color bar generator test signal

U and V signals


A color signal conveys picture information for each of the red, green, and blue components of an image. However, these are not simply transmitted as three separate signals, because such a signal would not be compatible with monochrome receivers, an important consideration when color broadcasting was first introduced. It would also occupy three times the bandwidth of existing television, requiring a decrease in the number of television channels available.

Instead, the RGB signals are converted into YUV form, where the Y signal represents the luminance of the colors in the image. Because rendering colors as correctly weighted shades of gray is the goal of monochrome film and television systems alike, the Y signal is ideal for transmission as the luminance signal. This ensures a monochrome receiver will display a correct picture in black and white, where a given color is reproduced by a shade of gray that correctly reflects how light or dark the original color is.

The U and V signals are color difference signals. The U signal is the difference between the B signal and the Y signal, also known as B minus Y (B-Y), and the V signal is the difference between the R signal and the Y signal, also known as R minus Y (R-Y). The U signal then represents how purplish-blue or its complementary color, yellowish-green, the color is, and the V signal how purplish-red or its complementary, greenish-cyan, it is. The advantage of this scheme is that the U and V signals are zero when the picture has no color content. Since the human eye is more sensitive to detail in luminance than in color, the U and V signals can be transmitted with reduced bandwidth with acceptable results.
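A minimal sketch of this conversion, using the classic luma weights and the PAL-style U/V scaling factors (values taken from the standard YUV definitions; real encoders do this with analog matrixing circuits, not code):

```python
def rgb_to_yuv(r, g, b):
    """Convert gamma-corrected R'G'B' components (0..1) to Y, U, V."""
    y = 0.299 * r + 0.587 * g + 0.114 * b   # luminance: what a mono set shows
    u = 0.492 * (b - y)                      # scaled B-Y color difference
    v = 0.877 * (r - y)                      # scaled R-Y color difference
    return y, u, v

# For any neutral gray (R = G = B) the color-difference signals vanish,
# which is why the chrominance disappears entirely on black-and-white scenes:
y, u, v = rgb_to_yuv(0.5, 0.5, 0.5)
print(y, u, v)  # 0.5 0.0 0.0 (to floating-point rounding)
```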

In the receiver, a single demodulator can extract an additive combination of U plus V. An example is the X demodulator used in the X/Z demodulation system. In that same system, a second demodulator, the Z demodulator, also extracts an additive combination of U plus V, but in a different ratio. The X and Z color difference signals are further matrixed into three color difference signals, (R-Y), (B-Y), and (G-Y). The combinations of usually two, but sometimes three demodulators were:

  1. (I) / (Q), (as used in the 1954 RCA CTC-2 and the 1985 RCA "Colortrak" series, and the 1954 Arvin, and some professional color monitors in the 1990s),
  2. (R-Y) / (Q), as used in the 1955 RCA 21-inch color receiver,
  3. (R-Y) / (B-Y), used in the first color receiver on the market (Westinghouse, not RCA),
  4. (R-Y) / (G-Y), (as used in the RCA Victor CTC-4 chassis),
  5. (R-Y) / (B-Y) / (G-Y),
  6. (X) / (Z), as used in many receivers of the late '50s and throughout the '60s.

In the end, further matrixing of the above color-difference combinations yielded the three color-difference signals (R-Y), (B-Y), and (G-Y).

The R, G, and B signals in the receiver needed for the display device (CRT, Plasma display, or LCD display) are electronically derived by matrixing as follows: R is the additive combination of (R-Y) with Y, G is the additive combination of (G-Y) with Y, and B is the additive combination of (B-Y) with Y. All of this is accomplished electronically. It can be seen that in the combining process, the low-resolution portion of the Y signals cancel out, leaving R, G, and B signals able to render a low-resolution image in full color. However, the higher resolution portions of the Y signals do not cancel out, and so are equally present in R, G, and B, producing the higher-resolution image detail in monochrome, although it appears to the human eye as a full-color and full-resolution picture.
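The matrixing described above can be sketched as follows (a minimal illustration assuming the PAL-style scaling factors; in a real receiver this is done electronically):

```python
def yuv_to_rgb(y, u, v):
    """Matrix Y and the color-difference signals back into R'G'B'."""
    r = y + v / 0.877                 # (R-Y) + Y
    b = y + u / 0.492                 # (B-Y) + Y
    # (G-Y) is never transmitted; it follows from the luma weights, since
    # 0.299(R-Y) + 0.587(G-Y) + 0.114(B-Y) = 0:
    g = y - (0.299 * (r - y) + 0.114 * (b - y)) / 0.587
    return r, g, b

# With zero chrominance, every output channel equals Y -- a gray picture:
print(yuv_to_rgb(0.5, 0.0, 0.0))  # (0.5, 0.5, 0.5)
```

Note how the full-bandwidth Y contributes equally to all three outputs, which is why fine detail survives in monochrome even though U and V are band-limited.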

NTSC and PAL systems

Color signals mixed with the video signal (two horizontal lines in sequence)

In the NTSC and PAL color systems, U and V are transmitted by using quadrature amplitude modulation of a subcarrier. This kind of modulation applies two independent signals to one subcarrier, with the idea that both signals will be recovered independently at the receiving end. For NTSC, the subcarrier is at 3.58 MHz.[g] For the PAL system it is at 4.43 MHz.[h] The subcarrier itself is not included in the modulated signal (suppressed carrier); it is the subcarrier sidebands that carry the U and V information. The usual reason for using suppressed carrier is that it saves on transmitter power. In this application a more important advantage is that the color signal disappears entirely in black and white scenes. The subcarrier is within the bandwidth of the main luminance signal and consequently can cause undesirable artifacts on the picture, all the more noticeable in black and white receivers.

A small sample of the subcarrier, the colorburst, is included in the horizontal blanking portion, which is not visible on the screen. This is necessary to give the receiver a phase reference for the modulated signal. Under quadrature amplitude modulation the modulated chrominance signal changes phase as compared to its subcarrier and also changes amplitude. The chrominance amplitude (when considered together with the Y signal) represents the approximate saturation of a color, and the chrominance phase against the subcarrier reference approximately represents the hue of the color. For particular test colors found in the test color bar pattern, exact amplitudes and phases are sometimes defined for test and troubleshooting purposes only.

Due to the nature of the quadrature amplitude modulation process that created the chrominance signal, at certain times, the signal represents only the U signal, and 70 nanoseconds (NTSC) later, it represents only the V signal. About 70 nanoseconds later still, -U, and another 70 nanoseconds, -V. So to extract U, a synchronous demodulator is utilized, which uses the subcarrier to briefly gate the chroma every 280 nanoseconds, so that the output is only a train of discrete pulses, each having an amplitude that is the same as the original U signal at the corresponding time. In effect, these pulses are discrete-time analog samples of the U signal. The pulses are then low-pass filtered so that the original analog continuous-time U signal is recovered. For V, a 90-degree shifted subcarrier briefly gates the chroma signal every 280 nanoseconds, and the rest of the process is identical to that used for the U signal.
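A numerical sketch of this synchronous demodulation (deliberately simplified: U and V are held constant, and a plain average stands in for the receiver's low-pass filter; the sample rate and duration are arbitrary assumptions):

```python
import math

FSC = 3.579545e6   # NTSC color subcarrier frequency, Hz
FS = 100e6         # simulation sample rate (assumption, well above FSC)
N = 50000          # 0.5 ms of signal

U, V = 0.3, -0.2   # pretend color-difference values, held constant

t = [n / FS for n in range(N)]

# Quadrature amplitude modulation: two signals on one suppressed subcarrier.
chroma = [U * math.sin(2 * math.pi * FSC * x) + V * math.cos(2 * math.pi * FSC * x)
          for x in t]

# Synchronous demodulation: multiply by the regenerated subcarrier (0 and 90
# degrees) and average; the averaging plays the role of the low-pass filter.
u_rec = 2 * sum(c * math.sin(2 * math.pi * FSC * x) for c, x in zip(chroma, t)) / N
v_rec = 2 * sum(c * math.cos(2 * math.pi * FSC * x) for c, x in zip(chroma, t)) / N

print(round(u_rec, 2), round(v_rec, 2))  # 0.3 -0.2
```

Gating with the in-phase subcarrier rejects V entirely because sine and cosine average to zero against each other, which is the whole point of quadrature modulation.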

Gating at any time other than those mentioned above will yield an additive mixture of any two of U, V, -U, or -V. One of these off-axis (that is, off the U and V axes) gating methods is called I/Q demodulation. Another, much more popular, off-axis scheme was the X/Z demodulation system; further matrixing recovered the original U and V signals, and X/Z was in fact the most popular demodulator scheme throughout the 1960s.

The above process uses the subcarrier. But as previously mentioned, it was deleted before transmission, and only the chroma is transmitted. Therefore, the receiver must reconstitute the subcarrier. For this purpose, a short burst of the subcarrier, known as the colorburst, is transmitted during the back porch (re-trace blanking period) of each scan line. A subcarrier oscillator in the receiver locks onto this signal (see phase-locked loop) to achieve a phase reference, resulting in the oscillator producing the reconstituted subcarrier.[i]

Test card showing "Hanover bars" (color banding phase effect) in the PAL-S (simple) signal mode of transmission.

NTSC uses this process unmodified. Unfortunately, this often results in poor color reproduction due to phase errors in the received signal, sometimes caused by multipath but mostly by poor implementation at the studio end. With the advent of solid-state receivers, cable TV, and digital studio equipment for conversion to an over-the-air analog signal, these NTSC problems have been largely fixed, leaving operator error at the studio end as the sole color rendition weakness of the NTSC system. In any case, the PAL D (delay) system mostly corrects these kinds of errors by reversing the phase of the signal on each successive line and averaging the results over pairs of lines. This is achieved by the use of a 1H (where H = horizontal scan frequency) duration delay line.[j] Phase shift errors between successive lines are therefore canceled out, and the wanted signal amplitude is increased when the two in-phase signals are recombined.

NTSC is more spectrum-efficient than PAL, giving more picture detail for a given bandwidth. This is because sophisticated comb filters in receivers are more effective with NTSC's four-field color sequence than with PAL's eight-field sequence. However, in the end, the larger channel width of most PAL systems in Europe still gives them the edge in transmitting more picture detail.

SECAM system


In the SECAM television system, U and V are transmitted on alternate lines, using simple frequency modulation of two different color subcarriers.

In some analog color CRT displays, starting in 1956, the brightness control signal (luminance) is fed to the cathode connections of the electron guns, and the color difference signals (chrominance signals) are fed to the control grid connections. This simple CRT matrix mixing technique was replaced in later solid-state designs by signal processing that returned to the original matrixing method used in the 1954 and 1955 color TV receivers.

Synchronization


Synchronizing pulses added to the video signal at the end of every scan line and video frame ensure that the sweep oscillators in the receiver remain locked in step with the transmitted signal so that the image can be reconstructed on the receiver screen.[7][8][10]

A sync separator circuit detects the sync voltage levels and sorts the pulses into horizontal and vertical sync.

Horizontal synchronization


The horizontal sync pulse separates the scan lines. The horizontal sync signal is a single short pulse that indicates the start of every line. The rest of the scan line follows, with the signal ranging from 0.3 V (black) to 1 V (white), until the next horizontal or vertical synchronization pulse.

The format of the horizontal sync pulse varies. In the 525-line NTSC system it is a 4.85 μs pulse at 0 V. In the 625-line PAL system the pulse is 4.7 μs at 0 V. This is lower than the amplitude of any video signal (blacker than black) so it can be detected by the level-sensitive sync separator circuit of the receiver.
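Because sync sits "blacker than black", a sync separator can work purely on signal level. A toy sketch of the idea (the threshold value is an assumption for illustration; real sets used a biased tube or transistor stage, not code):

```python
# Composite video spans 0 V (sync tip) to 1 V (peak white), with blanking and
# black around 0.3 V, so any level well below black must be sync.
SYNC_THRESHOLD = 0.15  # volts: midway between sync tip (0 V) and blanking (0.3 V)

def separate_sync(samples):
    """Return True for each voltage sample that belongs to a sync pulse."""
    return [v < SYNC_THRESHOLD for v in samples]

# Toy line: sync tip, back porch at blanking level, then picture content.
line = [0.0, 0.0, 0.3, 0.3, 0.55, 0.8, 1.0, 0.4]
print(separate_sync(line))
# [True, True, False, False, False, False, False, False]
```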

Two timing intervals are defined – the front porch between the end of the displayed video and the start of the sync pulse, and the back porch after the sync pulse and before the displayed video. These and the sync pulse itself are called the horizontal blanking (or retrace) interval and represent the time that the electron beam in the CRT is returning to the start of the next display line.

Vertical synchronization


Vertical synchronization separates the video fields. In PAL and NTSC, the vertical sync pulse occurs within the vertical blanking interval. The vertical sync pulses are made by prolonging the length of horizontal sync pulses through almost the entire length of the scan line.

The vertical sync signal is a series of much longer pulses, indicating the start of a new field. The sync pulses occupy the whole line interval of a number of lines at the beginning and end of a scan; no picture information is transmitted during vertical retrace. The pulse sequence is designed to allow horizontal sync to continue during vertical retrace; it also indicates whether each field represents even or odd lines in interlaced systems (depending on whether it begins at the start of a horizontal line, or midway through).

The format of such a signal in 525-line NTSC and 625-line PAL is:

  • pre-equalizing pulses (6 to start scanning odd lines, 5 to start scanning even lines)
  • long-sync pulses (5 pulses)
  • post-equalizing pulses (5 to start scanning odd lines, 4 to start scanning even lines)

Each pre- or post-equalizing pulse consists of half a scan line of black signal: 2 μs at 0 V, followed by 30 μs at 0.3 V. Each long sync pulse consists of an equalizing pulse with timings inverted: 30 μs at 0 V, followed by 2 μs at 0.3 V.
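These figures are half-line timing in disguise; a quick check using the 625-line system, where a full scan line lasts 64 μs (the 2 μs and 30 μs values quoted above are rounded):

```python
line_us = 64.0                 # 625-line system: 64 microseconds per scan line
eq_pulse_us = 2.0 + 30.0       # equalizing pulse: 2 us at 0 V + 30 us at 0.3 V

# Each equalizing or long-sync pulse occupies exactly half a line, so the
# pulse rate doubles during the vertical interval. That keeps horizontal sync
# locked whether a field begins at the start or the middle of a line.
print(eq_pulse_us, line_us / 2)   # 32.0 32.0
```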

In video production and computer graphics, changes to the image are often performed during the vertical blanking interval to avoid visible discontinuity of the image. If the image in the framebuffer is updated with a new image while the display is being refreshed, the display shows a mishmash of both frames, producing page tearing partway down the image.

Horizontal and vertical hold


The sweep (or deflection) oscillators were designed to run without a signal from the television station (or VCR, computer, or other composite video source). This allows the television receiver to display a raster, so that an image can be presented during antenna placement. With sufficient signal strength, the receiver's sync separator circuit would split timebase pulses from the incoming video and use them to reset the horizontal and vertical oscillators at the appropriate time to synchronize with the signal from the station.

The free-running oscillation of the horizontal circuit is especially critical, as the horizontal deflection circuits typically power the flyback transformer (which provides acceleration potential for the CRT) as well as the filaments for the high voltage rectifier tube and sometimes the filament(s) of the CRT itself. Without the operation of the horizontal oscillator and output stages in these television receivers, there would be no illumination of the CRT's face.

The lack of precision timing components in early equipment meant that the timebase circuits occasionally needed manual adjustment. If their free-run frequencies were too far from the actual line and field rates, the circuits would not be able to follow the incoming sync signals. Loss of horizontal synchronization usually resulted in an unwatchable picture; loss of vertical synchronization would produce an image rolling up or down the screen.

Older analog television receivers often provide manual controls to adjust horizontal and vertical timing. The adjustment takes the form of horizontal hold and vertical hold controls, usually on the front panel along with other common controls. These adjust the free-run frequencies of the corresponding timebase oscillators.

A slowly rolling vertical picture demonstrates that the vertical oscillator is nearly synchronized with the television station but is not locking to it, often due to a weak signal or a failure of the sync separator stage to reset the oscillator.

Horizontal sync errors cause the image to be torn diagonally and repeated across the screen as if it were wrapped around a screw or a barber's pole; the greater the error, the more copies of the image will be seen at once wrapped around the barber pole.

By the early 1980s the efficacy of the synchronization circuits, plus the inherent stability of the sets' oscillators, had been improved to the point where these controls were no longer necessary. Integrated circuits that eliminated the horizontal hold control began to appear as early as 1969.[11]

The final generations of analog television receivers used IC-based designs where the receiver's timebases were derived from accurate crystal oscillators. With these sets, adjustment of the free-running frequency of either sweep oscillator was unnecessary and unavailable.

Horizontal and vertical hold controls were rarely used in CRT-based computer monitors, as the quality and consistency of components were quite high by the advent of the computer age, but they might be found on some composite monitors used with 1970s–80s home or personal computers.

Other technical information


Components of a television system

block diagram of a television receiver showing tuner, intermediate frequency amplifier. A demodulator separates sound from video. Video is directed to the CRT and to the synchronizing circuits.
Block diagram for a typical analog monochrome television receiver

The tuner, with the aid of an antenna, isolates the television signals received over the air. There are two types of tuners in analog television: VHF and UHF. The VHF tuner selects the VHF television frequency, consisting of a 4 MHz video bandwidth and about 100 kHz of audio bandwidth. It then amplifies the signal and converts it to a 45.75 MHz intermediate frequency (IF) amplitude-modulated video carrier and a 41.25 MHz IF frequency-modulated audio carrier.
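The 4.5 MHz intercarrier sound frequency mentioned earlier falls directly out of these two IF values; the difference is fixed by the transmission standard, not by the tuning:

```python
video_if_mhz = 45.75   # amplitude-modulated video IF carrier
audio_if_mhz = 41.25   # frequency-modulated audio IF carrier

# Beating the two carriers together in the video demodulator produces the
# intercarrier sound signal at their difference frequency (System M: 4.5 MHz).
intercarrier_mhz = video_if_mhz - audio_if_mhz
print(intercarrier_mhz)  # 4.5
```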

The IF amplifiers are centered at 44 MHz for optimal frequency transference of the audio and video carriers.[k] Like radio, television has automatic gain control (AGC). This controls the gain of the IF amplifier stages and the tuner.

The video amplifier and output stage are implemented using a pentode or a power transistor. The filter and demodulator separate the 45.75 MHz video from the 41.25 MHz audio; a simple diode then detects the video signal. After the video detector, the video is amplified and sent to the sync separator and then to the picture tube.

The audio signal goes to a 4.5 MHz amplifier, which prepares it for the 4.5 MHz detector via a 4.5 MHz IF transformer. In television there are two ways of detecting FM signals. One is the ratio detector, which is simple but very hard to align. The other is the quadrature detector, invented in 1954; the first tube designed for this purpose was the 6BN6 type. It is easy to align and simple in circuitry, and the design was so good that it is still used today in integrated-circuit form. After the detector, the signal goes to the audio amplifier.

Image synchronization is achieved by transmitting negative-going pulses.[l] The horizontal sync signal is a single short pulse that indicates the start of every line. Two timing intervals are defined – the front porch between the end of the displayed video and the start of the sync pulse, and the back porch after the sync pulse and before the displayed video. These, together with the sync pulse itself, make up the horizontal blanking (or retrace) interval and represent the time that the electron beam in the CRT takes to return to the start of the next display line.

The vertical sync signal is a series of much longer pulses indicating the start of a new field. The vertical sync pulses occupy the whole line interval of several lines at the beginning and end of a scan; no picture information is transmitted during vertical retrace. The pulse sequence is designed to allow horizontal sync to continue during vertical retrace.[m]
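The porch and sync intervals described above can be totalled in a few lines. The microsecond figures are the commonly cited NTSC nominal values, stated here as assumptions since exact tolerances vary between standards:

```python
# NTSC-style horizontal line budget in microseconds (nominal values,
# assumed for illustration; PAL uses a 64 us line with similar porches).
LINE_US = 63.5          # total horizontal line period
FRONT_PORCH_US = 1.5    # end of displayed video to start of sync
SYNC_US = 4.7           # horizontal sync pulse itself
BACK_PORCH_US = 4.7     # end of sync to start of displayed video

blanking = FRONT_PORCH_US + SYNC_US + BACK_PORCH_US
active = LINE_US - blanking
print(blanking, round(active, 1))  # 10.9 52.6
```

So roughly 17% of every line is given over to retrace and stabilization rather than picture.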

A sync separator circuit detects the sync voltage levels and extracts and conditions signals that the horizontal and vertical oscillators can use to keep in sync with the video. It also forms the AGC voltage.

The horizontal and vertical oscillators, driven by the sync separator, form the raster on the CRT. There are many ways to build these oscillators. The earliest, the thyratron oscillator, is prone to frequency drift but produces a sawtooth wave so linear that no linearity control is needed; it was designed for electrostatically deflected CRTs but also found some use with electromagnetic deflection. The blocking oscillator, which uses a transformer to create the sawtooth, came next, but was used only briefly and never became popular. The multivibrator was probably the most successful: although it needs more adjustment than the others, it is simple and effective, and it remained in use from the early 1950s onward.

Two deflection amplifiers are needed. The vertical amplifier directly drives the yoke; since it operates at 50 or 60 Hz and drives an electromagnet, it resembles an audio amplifier. Because of the rapid deflection required, the horizontal output stage uses a high-power flyback transformer driven by a high-power tube or transistor. Additional windings on the flyback transformer typically power other parts of the set.

Portion of a PAL videosignal. From left to right: end of a video line, front porch, horizontal sync pulse, back porch with colorburst, and beginning of next line
Beginning of the frame, showing several scan lines; the terminal part of the vertical sync pulse is at the left
PAL video signal frames. Left to right: frame with scan lines (overlapping together, horizontal sync pulses show as the doubled straight horizontal lines), vertical blanking interval with vertical sync (shows as brightness increase of the bottom part of the signal in almost the leftmost part of the vertical blanking interval), entire frame, another VBI with VSYNC, beginning of the third frame
Analyzing a PAL signal and decoding the 20 ms frame and 64 μs lines

Loss of horizontal synchronization usually results in a scrambled and unwatchable picture; loss of vertical synchronization produces an image rolling up or down the screen.

Timebase circuits

[edit]

In an analog receiver with a CRT display, sync pulses are fed to horizontal and vertical timebase circuits (commonly called sweep circuits in the United States), each consisting of an oscillator and an amplifier. These generate modified sawtooth and parabola current waveforms to scan the electron beam; the engineered waveform shapes compensate for the varying distance between the electron-beam source and the screen surface. The oscillators are designed to free-run at frequencies very close to the field and line rates, but the sync pulses reset them at the beginning of each scan line or field, synchronizing the beam sweep with the originating signal. The output waveforms from the timebase amplifiers are fed to the horizontal and vertical deflection coils wrapped around the CRT. These coils produce magnetic fields proportional to the changing current, which deflect the electron beam across the screen.
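The free-run-and-reset behaviour can be sketched numerically. The simulation step and the deliberately slow free-run period below are arbitrary illustrative assumptions, not a model of any real circuit:

```python
# Sketch of a timebase oscillator: a sawtooth that free-runs slightly
# slower than the line rate, but is retraced early by each sync pulse.
DT = 1e-6                # simulation step, 1 us (assumed)
FREE_RUN = 66e-6         # free-running period, deliberately a bit slow
SYNC_PERIOD = 63.5e-6    # incoming line-rate sync (NTSC-like)
SYNC_WIDTH = 4.7e-6      # sync pulse duration

ramp, phase = [], 0.0
for i in range(int(10 * SYNC_PERIOD / DT)):
    t = i * DT
    in_sync = (t % SYNC_PERIOD) < SYNC_WIDTH
    phase += DT / FREE_RUN
    if in_sync or phase >= 1.0:   # sync forces retrace before the free-run peak
        phase = 0.0
    ramp.append(phase)

# The ramp repeats at the sync rate and never reaches full amplitude,
# because a sync pulse always arrives before the free-run period elapses.
print(0.7 < max(ramp) < 1.0)  # True
```

This is why the oscillators are tuned close to, but slightly below, the incoming rates: the sync pulse can only trigger an early retrace, so the loop locks to the transmitted timing.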

In the 1950s, the power for these circuits was derived directly from the mains supply. A simple circuit consisted of a series voltage-dropping resistor and a rectifier. This avoided the cost of a large mains-frequency (50 or 60 Hz) high-voltage transformer, but it was inefficient and produced a lot of heat.

In the 1960s, semiconductor technology was introduced into timebase circuits. During the late 1960s in the UK, synchronous (with the scan line rate) power generation was introduced into solid state receiver designs.[12]

In the UK, these simple 50 Hz power circuits were discontinued as thyristor-based switching circuits were introduced. The design change was prompted by electricity-supply contamination from EMI, and by supply-loading problems caused by drawing energy only from the positive half-cycle of the mains waveform.[13]

CRT flyback power supply

[edit]

Most of the receiver's circuitry (at least in transistor- or IC-based designs) operates from a comparatively low-voltage DC power supply. However, the anode connection for a CRT requires a very high voltage (typically 10–30 kV) for correct operation.

This voltage is not produced directly by the main power-supply circuitry; instead, the receiver makes use of the horizontal-scanning circuitry. Direct current (DC) is switched through the line output transformer, and alternating current (AC) is induced into the scan coils. At the end of each horizontal scan line, the magnetic field built up in both the transformer and the scan coils holds stored electromagnetic energy, which can be captured as the field collapses. The brief reverse-flow current (lasting about 10% of the line scan time) from the line output transformer and the horizontal scan coils is returned to the primary winding of the flyback transformer through a rectifier that blocks this counter-electromotive force. A small capacitor connected across the scan-switching device tunes the circuit inductances to resonate at a much higher frequency, which lengthens the flyback time beyond the extremely rapid decay that would otherwise result. One of the secondary windings on the flyback transformer then feeds this brief high-voltage pulse to a Cockcroft–Walton voltage multiplier, producing the required high-voltage supply. A flyback converter is a power-supply circuit operating on similar principles.
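The multiplier-stage arithmetic can be illustrated with the idealized, unloaded Cockcroft–Walton relation, output ≈ 2 × stages × peak input. The 6 kV pulse amplitude below is an assumed illustrative figure, not taken from the article:

```python
import math

# Idealized, unloaded Cockcroft-Walton multiplier: each stage contributes
# twice the peak input voltage. Real outputs sag under beam-current load.
def cw_output_kv(v_peak_kv, stages):
    return 2 * stages * v_peak_kv

def stages_for(v_peak_kv, target_kv):
    return math.ceil(target_kv / (2 * v_peak_kv))

pulse_kv = 6.0                       # assumed flyback pulse amplitude
n = stages_for(pulse_kv, 25.0)       # stages needed for a ~25 kV anode
print(n, cw_output_kv(pulse_kv, n))  # 3 36.0 (unloaded ideal)
```

The unloaded figure overshoots the target deliberately: under beam-current load and with stray capacitance, the practical output of a multiplier is well below the ideal value.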

A typical modern design incorporates the flyback transformer and rectifier circuitry into a single unit with a captive output lead, known as a diode split line output transformer or an Integrated High Voltage Transformer (IHVT),[14] so that all high-voltage parts are enclosed. Earlier designs used a separate line output transformer and a well-insulated high-voltage multiplier unit. The high frequency (15 kHz or so) of the horizontal scanning allows reasonably small components to be used.

Transition to digital

[edit]

In many countries, over-the-air broadcast television of analog audio and analog video signals has been discontinued to allow the re-use of the television broadcast radio spectrum for other services.

The first country to make a wholesale switch to digital over-the-air (terrestrial television) broadcasting was Luxembourg in 2006, followed later in 2006 by the Netherlands.[15] The Digital television transition in the United States for high-powered transmission was completed on 12 June 2009, the date that the Federal Communications Commission (FCC) set. Almost two million households could no longer watch television because they had not prepared for the transition. The switchover had been delayed by the DTV Delay Act.[16] While the majority of the viewers of over-the-air broadcast television in the U.S. watch full-power stations (which number about 1800), there are three other categories of television stations in the U.S.: low-power broadcasting stations, class A stations, and television translator stations. These were given later deadlines.

In Japan, the switch to digital began in northeastern Ishikawa Prefecture on 24 July 2010 and ended in 43 of the country's 47 prefectures (including the rest of Ishikawa) on 24 July 2011, but in Fukushima, Iwate, and Miyagi prefectures, the conversion was delayed to 31 March 2012, due to complications from the 2011 Tōhoku earthquake and tsunami and its related nuclear accidents.[17]

In Canada, most of the larger cities turned off analog broadcasts on 31 August 2011.[18]

China turned off analog broadcasting between 2015 and 2021.[19][needs update]

Brazil switched to digital television on 2 December 2007 in São Paulo and planned to end analog broadcasting nationwide by 30 June 2016. However, the Ministry of Communications announced in 2012 that the deadline would be delayed.[20] As of 2024, Brazil is in the process of implementing its next-generation digital television system, known as TV 3.0.[21][22] In July 2024, the ATSC 3.0 standard was officially selected for the country's next-generation digital television system.[21] The transition to TV 3.0 began in 2025, with initial deployments planned for key cities such as São Paulo, Rio de Janeiro, and Brasília.[23][needs update]

In Malaysia, the Malaysian Communications and Multimedia Commission advertised for tender bids to be submitted in the third quarter of 2009 for the 470 through 742 MHz UHF allocation, to enable Malaysia's broadcast system to move into DTV. The new broadcast band allocation would result in Malaysia's having to build an infrastructure for all broadcasters, using a single digital terrestrial television broadcast channel.[citation needed] Large portions of Malaysia are covered by television broadcasts from Singapore, Thailand, Brunei, and Indonesia (from Borneo and Batam). Starting from 1 November 2019, all regions in Malaysia were no longer using the analog system after the states of Sabah and Sarawak finally turned it off on 31 October 2019.[24]

In Singapore, digital television under DVB-T2 began on 16 December 2013. The switchover was delayed many times until analog TV was switched off at midnight on 2 January 2019.[25]

In the Philippines, the National Telecommunications Commission required all broadcasting companies to end analog broadcasting on 31 December 2015 at 11:59 p.m. Due to delay of the release of the implementing rules and regulations for digital television broadcast, the target date was moved to 2020. Full digital broadcast was expected in 2021 and all of the analog TV services were to be shut down by the end of 2023.[26] However, in February 2023, the NTC postponed the ASO/DTV transition to 2025 due to many provincial television stations not being ready to start their digital TV transmissions.[27]

In the Russian Federation, the Russian Television and Radio Broadcasting Network (RTRS) disabled analog broadcasting of federal channels in five stages, shutting down broadcasting in multiple federal subjects at each stage. The first region to have analog broadcasting disabled was Tver Oblast on 3 December 2018, and the switchover was completed on 14 October 2019.[28] During the transition, DVB-T2 receivers and monetary compensation for the purchase of terrestrial or satellite digital TV reception equipment were provided to disabled people, World War II veterans, certain categories of retirees, and households with per-member income below the living wage.[29]

See also

[edit]

Notes

[edit]

References

[edit]
from Grokipedia
Analog television is a broadcasting technology that transmits audio and video signals using continuous analog waveforms, modulating them onto radio frequency carriers for over-the-air, cable, or satellite distribution. In this system, the video signal combines luminance (brightness) and chrominance (color) information into a composite waveform, typically amplitude-modulated for picture elements and scanned line by line to form images on a cathode-ray tube or similar display. Audio is separately frequency-modulated on a subcarrier within the same channel bandwidth, usually 6 MHz in North America or 8 MHz in Europe. The primary analog television standards varied by region to accommodate different electrical grids and preferences, with the International Telecommunication Union (ITU) recognizing NTSC, PAL, and SECAM as the main color encoding methods. NTSC (National Television System Committee), adopted by the U.S. Federal Communications Commission (FCC) in 1953, uses 525 scan lines per frame at approximately 30 frames per second (precisely 29.97 Hz) and employs quadrature modulation of a color subcarrier, serving North America, Japan, and parts of South America. PAL (Phase Alternating Line), introduced in 1967 and standardized in Germany, the United Kingdom, and much of Western Europe and Asia, features 625 lines at 25 frames per second, with the color phase inverting each line to reduce hue errors. SECAM (Séquentiel Couleur à Mémoire), developed in France in 1967 and used in France and former Soviet states, also employs 625 lines at 25 frames per second but transmits color information sequentially per line using frequency modulation for stability in transmission. Analog television originated in the 1920s with mechanical scanning experiments but matured in the 1930s–1940s with electronic systems, beginning with black-and-white broadcasts under FCC regulations established in 1941 for 525-line monochrome in the U.S. Color transmission followed in the 1950s, with NTSC enabling compatible color in existing monochrome receivers, while PAL and SECAM addressed perceived deficiencies in NTSC's color fidelity.
By the mid-20th century, analog TV dominated global broadcasting, delivering entertainment, news, and education to billions, though susceptible to noise, interference, and signal degradation over distance. The shift to digital television, offering better quality, efficiency, and spectrum reuse, began in the 1990s; the FCC mandated the end of full-power analog broadcasts in the U.S. by June 12, 2009, with transitions in other countries occurring throughout the 2000s and 2010s, and some extending into the 2020s as of 2025.

History and Development

Early Mechanical Systems

The foundations of analog television were laid in the late 19th century through mechanical scanning techniques, which relied on physical devices to capture and reproduce images line by line. In 1884, German engineering student Paul Gottlieb Nipkow patented the "Elektrisches Teleskop," a pioneering electromechanical system featuring a rotating disk perforated with a spiral pattern of holes, now known as the Nipkow disk. This device worked by sequentially scanning an image with light passing through the holes onto a selenium photocell to convert it into electrical signals, while a similar disk at the receiver reconstructed the image using light modulated by the signals. Although Nipkow's invention was theoretical and never constructed as a functional prototype during his lifetime, it introduced the core concept of raster scanning essential to early television systems. Building on Nipkow's ideas, Scottish inventor John Logie Baird advanced mechanical television into practical demonstrations in the mid-1920s. In late 1925, Baird transmitted the first recognizable moving silhouette of a human face over a short distance using a 30-line system operating at low frame rates. This was followed by the world's first public demonstration of a working television system on January 26, 1926, at his laboratory in London, where Baird displayed live moving images of a ventriloquist's dummy to members of the Royal Institution. His setup employed a perforated disk spinning at approximately 750 rpm to achieve the 30-line resolution, with photoelectric cells for light detection and neon lamps for image display, marking a significant milestone in proving the feasibility of televised imagery. Baird's system later evolved to support resolutions up to 240 lines by 1928, including early experiments with color and long-distance transmission. Despite these breakthroughs, early mechanical systems faced inherent limitations that constrained their viability for widespread use.
Resolutions were typically low, ranging from 30 to 240 lines, producing fuzzy, low-detail images confined to small viewing areas of just a few inches. The reliance on mechanical components, such as rapidly spinning disks, introduced flickering from insufficient frame rates—often as low as 5 to 12.5 frames per second—and susceptibility to mechanical wear, misalignment, and vibration, which degraded picture quality and image stability over time. Additionally, these systems demanded intense illumination for scanning, wasting much of the light and overheating subjects. These shortcomings, particularly the inability to achieve higher resolutions without prohibitive mechanical complexity, ultimately spurred the transition to fully electronic scanning methods in the late 1920s and 1930s, enabling clearer, larger, and more reliable broadcasts.

Electronic Advancements

The transition from mechanical to electronic television systems in the 1920s and 1930s marked a pivotal advancement, enabling higher resolution and more reliable image capture and display through vacuum-tube technology. In 1923, Vladimir Zworykin, working at Westinghouse Electric, invented the iconoscope, an early electronic camera tube that used a photoemissive mosaic to convert optical images into electrical signals by scanning with an electron beam. This device overcame the limitations of mechanical scanning disks by allowing fully electronic operation. Zworykin also developed the kinescope in 1929, a cathode-ray tube (CRT) receiver that displayed images by directing an electron beam onto a phosphorescent screen, forming the basis for modern television displays. These inventions laid the groundwork for all-electronic systems, surpassing the low-resolution and flickering outputs of mechanical precursors like Nipkow-disk scanners. Building on these foundations, Philo Farnsworth achieved the first fully electronic television transmission on September 7, 1927, using his image dissector tube—a camera device that dissected images into electronic signals without mechanical parts—to send a simple line image across a room in San Francisco. This demonstration proved the viability of electronic scanning for both transmission and reception. Experimental broadcasts followed, including Bell Laboratories' long-distance demonstration in April 1927, which transmitted images from Washington, D.C., to New York over telephone lines, though it relied on a hybrid mechanical-electronic setup. By 1936, the BBC launched the world's first regular high-definition public television service from Alexandra Palace in London, operating at 405 lines of resolution and using electronic Emitron cameras developed by EMI, which broadcast live programming to a small audience of receivers. World War II significantly paused civilian television development, as resources and technology were redirected toward military applications like radar and electronic guidance systems.
In the United States, the Federal Communications Commission (FCC) had approved a 525-line monochrome standard in 1941, based on recommendations from the National Television System Committee (NTSC), enabling commercial broadcasting to begin that year with six stations operational. Post-war, a boom ensued in the late 1940s, with receiver production surging as manufacturing scaled up; sets typically featured basic superheterodyne tuners with RF and intermediate-frequency (IF) amplifiers using vacuum tubes for signal gain, followed by video detectors and simple CRTs—often 7- to 12-inch screens with low-voltage deflection circuits—to render images. These early designs prioritized affordability and reliability, using off-the-shelf components from radio technology to amplify weak broadcast signals and synchronize electron-beam scanning with incoming sync pulses.

Global Standardization Efforts

The International Consultative Committee for Radio (CCIR), established in 1927 at the International Radiotelegraph Conference in Washington, D.C., under the auspices of the International Telegraph Union (predecessor to the ITU), was tasked with conducting technical studies on radiocommunications to facilitate international coordination and standardization. The CCIR played a pivotal role in harmonizing analog television parameters, including frequency allocations and scanning line counts, through its recommendations and reports that informed global regulatory conferences, aiming to enable cross-border program exchange and minimize interference. By the mid-20th century, the CCIR's work focused on broadcasting-service (television) study groups, producing guidelines that influenced national adoptions while accommodating regional variations in power grids and spectrum availability. In the United States, the Federal Communications Commission (FCC) approved the first commercial analog television standard in 1941, specifying a 525-line resolution at 60 fields per second (30 frames per second with interlacing), which became the foundation for North American broadcasting and was later designated as System M. This monochrome standard was developed by the National Television System Committee (NTSC) to balance image quality with available spectrum in the VHF band (channels 2-13). Across Europe, the CCIR coordinated the adoption of a 625-line/50 Hz standard, formalized at the European VHF/UHF Broadcasting Conference in Stockholm in 1961, which allocated channels in both VHF (Bands I-III) and UHF (Bands IV-V) to support higher resolution suited to the continent's 50 Hz electrical systems and denser population centers. The introduction of color further highlighted the CCIR's standardization efforts, with the FCC approving an enhanced NTSC color system on December 17, 1953, which added a color subcarrier to the existing monochrome framework while ensuring backward compatibility for black-and-white receivers.
This color standard influenced global variants, such as PAL in Western Europe (adopted in the 1960s, using 625 lines) and SECAM in France and the Soviet Union, both building on CCIR-recommended parameters for modulation and frequency spacing to adapt to local needs. Standardization faced significant challenges, particularly in allocating VHF (54-216 MHz) and UHF (470-890 MHz) frequency bands to avoid co-channel interference across borders, as varying national priorities led to fragmented channel plans that complicated equipment manufacturing and international transmissions. Compatibility issues arose from divergent line counts and field rates—such as the US 60 Hz versus Europe's 50 Hz—hindering seamless program interchange without conversion, prompting CCIR reports to advocate for hybrid solutions like frequency converters for diplomatic and cultural exchanges. These efforts, while not fully resolving regional disparities, laid the groundwork for the ITU's ongoing Radiocommunication Sector (ITU-R) after the CCIR's restructuring in 1992.

Signal Fundamentals

Composite Video Signal Structure

The composite video signal in analog television combines luminance information with synchronization components to form a single waveform suitable for transmission and display. This structure ensures precise timing for scanning the image line by line and frame by frame on cathode-ray tube (CRT) receivers. The signal operates at baseband levels, typically ranging from 0 V (black level) to about 1 V (white level), with sync pulses extending below black to -0.3 V. Key components include the active video line, which carries the luminance (brightness) data for the visible picture, spanning approximately 52 μs and representing the scanned portion of each horizontal line. Horizontal blanking intervals follow, consisting of a front porch (about 1.5 μs to stabilize the signal), a horizontal sync pulse to trigger the receiver's horizontal deflection, and a back porch (around 5.8 μs, including the color burst in color systems but primarily for sync recovery here). The vertical blanking interval, lasting approximately 21 lines per field in NTSC (totaling about 42 lines per frame), conceals the vertical retrace and includes equalizing pulses followed by serrated vertical sync pulses to initiate each field. Timing parameters vary by standard but follow established norms for compatibility. In NTSC (525 lines, 2:1 interlaced), the horizontal line duration is 63.5 μs, corresponding to a line frequency of 15.734 kHz, while the vertical frame rate is 29.97 Hz (approximately 30 Hz). For PAL (625 lines, 2:1 interlaced), the horizontal line duration is 64 μs (line frequency 15.625 kHz), and the frame rate is exactly 25 Hz. The horizontal sync pulse width is approximately the same in both systems, given by τ_hs ≈ 4.7 μs, where τ_hs denotes the horizontal sync duration, ensuring reliable deflection triggering.
Vertical sync consists of broad serrated pulses, with 4.7 μs serrations at the half-line rate, extending over three horizontal line periods during the blanking interval to synchronize fields without visible artifacts. The video bandwidth is typically 4.2 MHz for NTSC to accommodate luminance details up to about 4 MHz, while PAL systems support 5-6 MHz for enhanced horizontal resolution. These specifications, derived from EIA RS-170 for NTSC and ITU-R BT.470 for PAL, allow the composite signal to fit within allocated broadcast channels while maintaining image fidelity. Synchronization pulses, integral to this structure, are addressed in greater detail separately but enable precise receiver locking to the transmitted timing.
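The line frequencies quoted above follow directly from lines-per-frame times frame rate; interlacing does not change the product, since every line is still scanned once per frame:

```python
# Horizontal line frequency = lines per frame x frame rate.
ntsc_line_hz = 525 * (30000 / 1001)   # NTSC frame rate is 30000/1001 ~ 29.97 Hz
pal_line_hz = 625 * 25                # PAL frame rate is exactly 25 Hz

print(round(ntsc_line_hz, 2))         # 15734.27 Hz
print(pal_line_hz)                    # 15625 Hz
print(round(1e6 / ntsc_line_hz, 2))   # 63.56 us line period
print(1e6 / pal_line_hz)              # 64.0 us line period
```

The reciprocal of each line frequency reproduces the 63.5 μs and 64 μs line durations given in the text.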

Audio Carrier and Modulation

In analog television broadcasting, the audio signal is transmitted on a dedicated carrier frequency that is offset from the video carrier to allow separation in the receiver. This carrier is frequency modulated (FM) by the baseband audio waveform, which typically spans a frequency range of 50 Hz to 15 kHz to ensure compatibility with high-quality audio reproduction. The specific frequency offset varies by standard. In the NTSC system used primarily in North America, the audio carrier is located 4.5 MHz above the video carrier, fitting within the 6 MHz channel bandwidth. In PAL systems, common in Europe and much of Asia, the offset is 5.5 MHz for the widely adopted B/G variant, accommodating the broader 7-8 MHz channel allocation. The FM modulation employs a peak deviation of ±25 kHz in NTSC and ±50 kHz in PAL, which provides sufficient dynamic range and noise immunity while conserving spectrum; this deviation is achieved after pre-emphasis with a 75 μs time constant in NTSC or 50 μs in PAL to enhance high-frequency response. Receivers extract the audio using intercarrier sound detection, a technique in which the video and audio intermediate-frequency (IF) carriers—centered around 45.75 MHz in NTSC or 38.9 MHz in PAL—mix nonlinearly to generate the sound directly at the intercarrier frequency (e.g., 4.5 MHz or 5.5 MHz). This method simplifies circuitry by eliminating the need for precise alignment of separate audio and video tuners, as the offset remains constant regardless of channel frequency. To support stereo and secondary audio, extensions were introduced without altering the core mono signal. In the United States, multichannel television sound (MTS) using BTSC encoding adds a left-minus-right (L–R) subcarrier and pilot tone within the main audio FM band, while the Second Audio Program (SAP) provides a separate mono channel (e.g., for alternate languages) as an FM subcarrier at 5 times the horizontal line frequency (approximately 78.67 kHz), limited to ±15 kHz deviation to avoid interference.
In PAL regions, the NICAM (Near Instantaneous Companded Audio Multiplex) system delivers digital stereo or dual-mono audio on an additional carrier above the primary audio carrier (e.g., at 5.85 MHz above the vision carrier in B/G systems, or 6.552 MHz in System I), using 728 kbit/s encoding for near-CD-quality sound while maintaining compatibility with mono receivers.
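Carson's rule gives a quick estimate of how much RF the FM sound carrier occupies for the deviations quoted above; it is a back-of-envelope approximation, not a specification value:

```python
# Carson's rule: FM bandwidth ~ 2 x (peak deviation + highest audio frequency).
def carson_bw_khz(deviation_khz, max_audio_khz):
    return 2 * (deviation_khz + max_audio_khz)

print(carson_bw_khz(25, 15))  # NTSC sound: 80 kHz
print(carson_bw_khz(50, 15))  # PAL B/G sound: 130 kHz
```

Either figure is a tiny fraction of the 6-8 MHz channel, which is why the sound carrier fits comfortably alongside the vestigial-sideband video.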

Synchronization Pulses

Synchronization pulses in analog television are critical for maintaining image stability by providing timing references that align the receiver's scanning electron beam with the transmitted signal. These pulses ensure precise control over both horizontal and vertical scanning, preventing drift that could cause image distortion or rolling. Inserted within the signal during blanking intervals, they remain invisible to the viewer while enabling the cathode-ray tube (CRT) to retrace accurately without producing visible artifacts. The horizontal synchronization signal is a negative-going pulse designed to trigger the reset of the line scan at the end of each horizontal line. In the NTSC standard, this pulse has a duration of 4.7 μs and is positioned within the horizontal blanking interval, which spans approximately 10.9 μs and includes front and back porches to stabilize the signal. The pulse's sharp transitions are essential for reliable detection, with rise and fall times specified to be less than 0.4 μs (measured from 10% to 90% points) to minimize timing errors in the receiver's horizontal oscillator. Vertical synchronization employs a more complex structure to reset the frame scan, incorporating equalizing and serrated pulses to accommodate interlaced scanning and maintain horizontal timing during retrace. In NTSC, it consists of six pre-equalizing pulses (each about half the width of a horizontal sync pulse and occurring at twice the horizontal rate), followed by six serrated vertical sync pulses (broad pulses notched at the half-line rate to keep the horizontal oscillator locked), and six post-equalizing pulses. This arrangement ensures the vertical deflection circuitry returns to the top of the raster precisely, with the core vertical sync duration equivalent to three horizontal line periods (approximately 190.5 μs). The pulses are confined to the vertical blanking interval, typically spanning 21 lines per field, to avoid interference with the active picture area.
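The durations above reduce to simple multiples of the horizontal period H, here taken at the nominal 63.5 μs figure used elsewhere in the text:

```python
# Vertical sync arithmetic in terms of the horizontal line period H.
H_US = 63.5                 # nominal NTSC horizontal period, microseconds

print(3 * H_US)             # 190.5 -> core serrated vertical sync spans 3H
print(H_US / 2)             # 31.75 -> equalizing pulses come every half line
print(round(21 * H_US, 1))  # 1333.5 -> ~21-line vertical blanking per field
```

Expressing everything in units of H is how the standards themselves specify these intervals, which keeps the receiver's single line-rate clock sufficient for all sync timing.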

Transmission and Reception

Broadcasting Methods

Analog television signals were broadcast using amplitude modulation (AM) for the video component and frequency modulation (FM) for the audio component, enabling the transmission of moving images and accompanying sound within the very high frequency (VHF, 54–216 MHz) and ultra high frequency (UHF, 470–806 MHz) bands in North American standards. This dual-modulation approach allowed the video signal, which included luminance and chrominance information along with synchronization pulses, to be carried on a main carrier while the audio was offset by a subcarrier typically 4.5 MHz higher in the United States or 5.5–6.5 MHz in other regions. The vestigial sideband variant of AM was employed for video to optimize bandwidth usage while minimizing interference. Channel allocations for analog television were standardized regionally to accommodate these modulated signals, with the FCC assigning 6 MHz per channel across VHF channels 2–13 and UHF channels 14–69, providing a total of 68 channels before the digital transition. In Europe and many other regions adopting PAL or SECAM standards, channels were allocated 8 MHz of bandwidth to support higher resolution or guard bands, often spanning VHF Bands I–III and UHF Bands IV–V. These allocations ensured compatibility with existing frequency plans while allowing for interlaced scanning rates of 525 or 625 lines per frame. The modulated composite signal structure formed the basis for these channel assignments, fitting the video bandwidth of approximately 4–5 MHz and the audio within the remaining channel spectrum. Over-the-air (OTA) broadcasting dominated early analog television distribution, utilizing high-powered transmission towers—often exceeding 300 meters in height—to radiate signals omnidirectionally or directionally over wide areas, serving large populations. Typical transmitter output power ranged from 1 kW for low-power fillers to 100 kW for full-service stations, with effective radiated power (ERP) enhanced by antenna gain factors of 10–50 dB to achieve coverage radii of 50–100 km depending on terrain and antenna height.
In contrast, cable distribution emerged in the late 1940s as a complementary method, initially relaying OTA signals via community antennas and amplifiers, but evolving to use coaxial cables for direct, interference-free delivery of multiple channels to subscribers, with early systems like the 1948 Astoria installation in Oregon marking the shift toward wired networks. Satellite broadcasting provided another method for analog television distribution, particularly from the 1960s onward, using geostationary satellites to relay signals over long distances, including to remote or international locations. Early systems employed C-band frequencies (around 4–6 GHz) for transponding broadcast feeds to affiliates, while later direct-to-home services in the 1980s and 1990s used Ku-band (12–18 GHz), allowing smaller consumer dishes, though susceptible to weather fading and requiring downconverters.
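The 6 MHz channel plan maps channel numbers to carrier frequencies arithmetically. The 1.25 MHz video-carrier offset used below is the standard North American figure, stated here as an assumption since the text gives only band edges:

```python
# North American UHF plan: channel 14 starts at 470 MHz, channels are
# 6 MHz wide, the VSB video carrier sits 1.25 MHz above the lower edge,
# and the FM audio carrier sits 4.5 MHz above the video carrier.
def uhf_carriers_mhz(channel):
    lower_edge = 470 + (channel - 14) * 6
    video = lower_edge + 1.25
    audio = video + 4.5
    return video, audio

print(uhf_carriers_mhz(14))  # (471.25, 475.75)
print(uhf_carriers_mhz(69))  # (801.25, 805.75)
```

Channel 69's upper edge lands at 806 MHz, matching the top of the UHF allocation quoted above.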

Antenna and Tuner Operation

In analog television reception, the antenna serves as the initial interface for capturing radio frequency (RF) signals broadcast from transmission towers. For very high frequency (VHF) channels, typically in the 54–216 MHz range, directional Yagi-Uda antennas are commonly employed due to their high gain and ability to focus reception toward specific directions, improving signal quality in areas with interference or distance from the transmitter. In contrast, ultra high frequency (UHF) channels, spanning 470–806 MHz in North American standards, often utilize log-periodic dipole array (LPDA) antennas, which provide broadband performance across a wide frequency range with consistent impedance and reduced size compared to single-band designs. For indoor or portable applications, simple dipole antennas, such as the "rabbit ears" design, offer omnidirectional reception suitable for close-proximity signals, though they exhibit lower gain and are more susceptible to multipath distortion. The tuner, integral to the television receiver, employs a superheterodyne architecture to select and process the desired channel's RF signal. This design, invented by Edwin Armstrong in 1918, mixes the incoming RF with a locally generated oscillator signal to produce a fixed intermediate frequency (IF) for subsequent amplification, enhancing selectivity and sensitivity. In the NTSC standard, the video carrier is converted to a standard IF of 45.75 MHz, while the audio carrier shifts to 41.25 MHz, allowing standardized filtering and amplification stages to isolate the channel from adjacent interference. Channel selection within the tuner is achieved by adjusting the local oscillator to align the desired RF channel with the fixed IF. Early mechanical tuners used variable capacitors, mechanically rotated via a dial to tune resonant circuits in the RF and oscillator stages, enabling precise alignment across VHF bands.
Later solid-state designs incorporated varactor diodes—voltage-variable capacitors—controlled electronically by a tuning voltage, which replaced mechanical components for more reliable and compact operation, particularly in UHF tuners where capacitance variation must cover a broader range. To manage varying signal strengths from different broadcast sources or environmental conditions, the tuner incorporates automatic gain control (AGC). This feedback mechanism detects the IF signal level and dynamically adjusts the gain of the RF and IF amplifiers to maintain a consistent output, preventing overload from strong signals or amplification of noise in weak ones. In analog TV receivers, AGC typically derives its control voltage from the video detector, ensuring a stable video level across input signal strengths spanning 50–100 dB.
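The superheterodyne arithmetic above can be sketched numerically. The IF values are those quoted in the text; the channel 2 carrier frequency used as the example is the standard North American assignment, and the function names are illustrative, not from any receiver's documentation.

```python
# Sketch of superheterodyne channel selection in an NTSC tuner.
VIDEO_IF = 45.75e6   # video intermediate frequency, Hz
AUDIO_IF = 41.25e6   # audio intermediate frequency, Hz

def local_oscillator(video_carrier_hz):
    """LO runs above the incoming RF so that LO - RF = IF (high-side injection)."""
    return video_carrier_hz + VIDEO_IF

def mixed_if(rf_hz, lo_hz):
    """The mixer output of interest is the difference frequency."""
    return lo_hz - rf_hz

# NTSC channel 2: video carrier at 55.25 MHz, audio carrier 4.5 MHz higher.
video_rf = 55.25e6
audio_rf = video_rf + 4.5e6
lo = local_oscillator(video_rf)
print(lo / 1e6)                      # 101.0 (MHz)
print(mixed_if(video_rf, lo) / 1e6)  # 45.75 (MHz)
print(mixed_if(audio_rf, lo) / 1e6)  # 41.25 (MHz)
```

Note that a single LO setting converts both carriers at once: because the audio carrier sits 4.5 MHz above the video carrier, it lands 4.5 MHz below the video IF, exactly at 41.25 MHz.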

Demodulation Process

The demodulation process in analog television receivers extracts the video and audio signals from the intermediate frequency (IF) signal generated by the tuner. In a superheterodyne architecture, the IF is produced by mixing the incoming radio frequency (RF) signal with a local oscillator (LO), tuned such that the IF equals the difference between the LO and RF frequencies, typically around 45.75 MHz for video in NTSC systems. This fixed IF facilitates standardized amplification and filtering before demodulation. The video component, amplitude-modulated (AM) onto the carrier with vestigial sideband (VSB) filtering to reduce bandwidth while preserving low-frequency content, undergoes envelope or synchronous detection for baseband recovery. IF filtering attenuates the upper sideband beyond a vestigial portion (about 0.75 MHz in NTSC), enabling the full lower sideband and carrier to pass, which saves spectrum compared to double-sideband AM without introducing quadrature distortion upon simple detection. A diode-based envelope detector rectifies the IF signal and applies low-pass filtering to yield the baseband video signal, where the output is directly proportional to the modulation depth of the carrier. Alternatively, synchronous detection using a phase-locked carrier reference provides higher fidelity by multiplying the IF with a regenerated carrier, suppressing noise and distortion in challenging reception conditions. The audio carrier, offset by 4.5 MHz from the video carrier in NTSC and frequency-modulated (FM) with a deviation of ±25 kHz, is separated via IF bandpass filtering and demodulated independently. Common circuits include the ratio detector, which uses two diodes and a tuned transformer to produce an output proportional to the ratio of diode amplitudes, offering inherent AM suppression for robust performance in TV sound IF stages around 4.5 MHz. Discriminator circuits, such as the Foster-Seeley type, achieve similar FM-to-baseband conversion by detecting phase differences in a balanced configuration, converting frequency shifts to voltage variations for audio recovery.
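The synchronous detection described above can be sketched numerically: multiplying the modulated carrier by a phase-locked reference shifts the baseband to DC, and averaging over whole carrier cycles removes the double-frequency term. The carrier frequency, sample rate, and test waveform below are arbitrary toy values, not broadcast figures.

```python
import math

FC = 100.0     # toy carrier frequency (arbitrary units)
FS = 10000.0   # toy sample rate
N = 2000

def baseband(t):
    """Slowly varying video-like test signal in the range 0..1."""
    return 0.5 + 0.4 * math.sin(2 * math.pi * 2.0 * t)

# Amplitude-modulated carrier: baseband times cosine carrier.
signal = [baseband(n / FS) * math.cos(2 * math.pi * FC * n / FS) for n in range(N)]

def sync_detect(x, n0, cycles=10):
    """Multiply by the regenerated carrier and average over an integer number
    of carrier periods; cos^2 averages to 1/2, hence the factor of 2."""
    span = int(cycles * FS / FC)
    acc = 0.0
    for n in range(n0, n0 + span):
        acc += x[n] * math.cos(2 * math.pi * FC * n / FS)
    return 2.0 * acc / span

recovered = sync_detect(signal, 500)   # approximates baseband near window center
```

The recovered value approximates the baseband at the center of the averaging window; a shorter window tracks fast video detail better at the cost of less carrier rejection, mirroring the bandwidth trade-off in a real detector's low-pass filter.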

Video Processing

Monochrome Signal Extraction

In analog television systems, the monochrome signal, denoted as Y, is generated at the transmitter by combining the gamma-corrected red (R'), green (G'), and blue (B') components using the weighted formula Y = 0.299R' + 0.587G' + 0.114B'. These coefficients approximate the human visual system's sensitivity to color wavelengths, emphasizing green's dominant role in perceived brightness while minimizing bandwidth requirements for transmission. This formation ensures compatibility with monochrome displays, where the Y signal alone reproduces the image in grayscale. In the receiver, the demodulated composite video signal—encompassing the luminance and synchronization components—is processed to extract the usable monochrome signal. DC restoration is essential to reestablish the absolute voltage levels of the Y signal, which are often lost due to AC coupling in the intermediate frequency (IF) and video amplifier stages. Clamping circuits achieve this by sampling the signal during the back porch of the horizontal blanking interval, where the voltage corresponds to the black level, and adjusting the DC bias to maintain proper dynamic range for display. This process prevents image distortion, such as shifts in brightness or contrast, by ensuring the sync tip remains at approximately -0.3 V and white peaks at +0.7 V relative to ground. Following DC restoration, bandpass filtering isolates the luminance signal within its typical bandwidth of 0 to 4–6 MHz, depending on the standard, thereby separating it from low-frequency synchronization pulses and any residual high-frequency components from the audio carrier (around 4.5–5.5 MHz). This filtering, often implemented as a low-pass response with a cutoff near the video bandwidth limit combined with high-pass elements to attenuate sync-related DC wander, ensures clean extraction without interference that could degrade image sharpness.
In monochrome receivers, such separation maintains signal integrity across the full luminance spectrum, avoiding crosstalk from adjacent channels or modulation artifacts. Noise reduction in the extracted monochrome signal primarily relies on inherent filtering stages, with basic de-emphasis applied in some transmission contexts to counteract high-frequency amplification during VHF/UHF transmission, though it is more commonly associated with audio subcarriers. Where applicable, this involves a gentle roll-off of higher video frequencies at the receiver to restore a flat response, improving the signal-to-noise ratio without significantly impacting perceived image quality.
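The luminance weighting above can be expressed directly in code; the coefficients are the ones quoted in the text, and the comparison illustrates why green dominates perceived brightness.

```python
def luma(r, g, b):
    """NTSC luminance (Y) from gamma-corrected primaries in the range 0..1."""
    return 0.299 * r + 0.587 * g + 0.114 * b

# The weights sum to 1, so pure white maps to full luminance,
# and pure green reads far brighter than pure blue.
print(luma(1, 1, 1))
print(luma(0, 1, 0), luma(1, 0, 0), luma(0, 0, 1))
```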

Color Signal Encoding and Decoding

In analog television systems, the color information is conveyed through the chrominance (C) signal, which is modulated onto a high-frequency subcarrier and combined with the luminance (Y) signal to form a composite signal compatible with monochrome receivers. The subcarrier frequency is chosen to be sufficiently high to separate it from the luminance bandwidth, typically 3.58 MHz in NTSC and 4.43 MHz in PAL systems, allowing the chrominance to occupy the upper portion of the video spectrum without significantly interfering with black-and-white viewing. The signal employs quadrature amplitude modulation (QAM), where two components—I (in-phase) and Q (quadrature), or equivalent U and V in other systems—are simultaneously modulated onto the subcarrier using carriers that are 90 degrees out of phase. The I signal modulates the in-phase (cosine) component, representing primarily orange-to-cyan hues, while the Q signal modulates the quadrature (sine) component, capturing magenta-to-green variations. This dual modulation is expressed mathematically as C(t) = I(t)cos(ωt) + Q(t)sin(ωt), where ω = 2πf_sc and f_sc is the subcarrier frequency (conventions may include a 33° phase offset). In NTSC, the I component has a bandwidth of approximately 1.3 MHz to support finer hue detail, while the Q component is limited to about 0.5 MHz, reflecting lower perceptual sensitivity to those color directions and aiding signal compression within the 6 MHz channel. Decoding the signal at the receiver involves synchronous detection, where the composite signal is multiplied by locally generated carriers phase-locked to the transmitted subcarrier, followed by low-pass filtering to extract the I and Q signals. This process recovers the color-difference signals, which are then matrixed with the luminance to reconstruct the full RGB image, ensuring backward compatibility, as monochrome sets simply low-pass filter the composite to ignore the high-frequency chrominance.
Phase accuracy is critical, as errors can cause hue shifts, often mitigated by burst signals embedded in the back porch of the horizontal blanking interval for subcarrier synchronization.
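The QAM encoding and synchronous decoding described above can be sketched as a round trip. Sampling at exactly four times the subcarrier frequency is a convenience chosen for this sketch (it makes each carrier cycle span whole samples), not a property of any broadcast system.

```python
import math

F_SC = 3.579545e6   # NTSC color subcarrier, Hz
FS = 4 * F_SC       # sample at 4x subcarrier for exact whole-cycle averaging

def encode(i, q, n):
    """Composite chrominance sample: I on the cosine axis, Q on the sine axis."""
    w = 2 * math.pi * F_SC / FS
    return i * math.cos(w * n) + q * math.sin(w * n)

def decode(samples):
    """Synchronous detection: multiply by phase-locked cos/sin references and
    average; the factor of 2 undoes the 1/2 average of cos^2 and sin^2."""
    w = 2 * math.pi * F_SC / FS
    n_samp = len(samples)
    i = 2 * sum(s * math.cos(w * n) for n, s in enumerate(samples)) / n_samp
    q = 2 * sum(s * math.sin(w * n) for n, s in enumerate(samples)) / n_samp
    return i, q

chroma = [encode(0.3, -0.2, n) for n in range(400)]  # 100 subcarrier cycles
i_out, q_out = decode(chroma)
```

Because the cosine and sine references are orthogonal over whole cycles, each detector recovers its own component and rejects the other, which is precisely why the two signals can share one subcarrier.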

YIQ and Similar Color Spaces

The YIQ color space was developed for the NTSC analog television standard to separate luminance from chrominance information, enabling efficient transmission while maintaining compatibility with monochrome receivers. In this model, the Y component represents luminance, capturing the brightness information in a manner perceptually weighted to match human vision, calculated as a weighted sum of the gamma-corrected red (R'), green (G'), and blue (B') primaries. The I (in-phase) component encodes color differences along the orange-cyan axis, while the Q (quadrature) component encodes differences along the green-magenta axis, allowing these signals to be modulated onto a subcarrier without significantly interfering with the luminance signal. The transformation from RGB to YIQ involves deriving I and Q from the color-difference signals (R' - Y') and (B' - Y'), using the following equations: I = 0.74(R' - Y') - 0.27(B' - Y') and Q = 0.48(R' - Y') + 0.41(B' - Y'). These coefficients optimize the encoding for the NTSC primaries and ensure that the axes align with directions of minimal perceived color error. To ensure compatibility with existing black-and-white televisions, the I and Q signals are transmitted with reduced bandwidth—I limited to approximately 1.3 MHz and Q to 0.5 MHz—compared to the full 4.2 MHz bandwidth of the Y signal, exploiting the human visual system's lower acuity for color details versus luminance. This bandwidth limitation allows monochrome receivers to extract only the Y component, ignoring the narrower subcarrier, thus displaying a functional image without color artifacts. Similar color spaces were adopted in other analog standards, with YUV used in PAL and SECAM-derived systems, where the U and V components represent a rotation of the I and Q axes by approximately 33 degrees, adjusting the color-difference directions to suit those transmission formats while preserving the luminance-chrominance separation.
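The color-difference equations above can be checked against the familiar direct RGB-to-YIQ coefficients; for pure red, the standard matrix gives I ≈ 0.596 and Q ≈ 0.21, which the derivation reproduces.

```python
def yiq(r, g, b):
    """YIQ from gamma-corrected primaries, via the color-difference form
    given in the text: I and Q are linear combinations of (R'-Y') and (B'-Y')."""
    y = 0.299 * r + 0.587 * g + 0.114 * b
    i = 0.74 * (r - y) - 0.27 * (b - y)
    q = 0.48 * (r - y) + 0.41 * (b - y)
    return y, i, q

y, i, q = yiq(1, 0, 0)   # pure red
```

Expanding the arithmetic: for red, R' - Y' = 0.701 and B' - Y' = -0.299, so I = 0.74(0.701) + 0.27(0.299) ≈ 0.599 and Q = 0.48(0.701) - 0.41(0.299) ≈ 0.214, matching the direct matrix within rounding.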

Display Mechanisms

Cathode Ray Tube (CRT) Operation

The cathode ray tube (CRT), the primary display mechanism in analog television receivers until the late 1990s, functions by accelerating a focused stream of electrons onto a phosphor-coated screen within an evacuated glass envelope. At the rear of the tube, the electron gun generates this beam through thermionic emission, where a cathode—typically an oxide-coated filament heated to approximately 800–1000 °C—emits electrons as thermal agitation provides sufficient energy to overcome the material's work function, typically around 1–2 eV. This process, first systematically studied in vacuum tubes, releases electrons at low velocities into the gun's vacuum environment. Within the electron gun, the emitted electrons pass through a control grid that modulates beam intensity in proportion to the video signal's amplitude, followed by one or more focusing electrodes that form electrostatic lenses to converge the diverging electrons into a fine beam, often 0.2–0.5 mm in diameter at the screen. An accelerating anode, maintained at high positive potential relative to the cathode, propels the electrons forward at speeds up to 30% of the speed of light. Focusing is critical for resolution, with the lens voltage typically adjusted to 20–25% of the final anode voltage to minimize spherical aberration and astigmatism in the beam path. Deflection coils, positioned externally around the tube neck, provide magnetic steering: separate horizontal and vertical yoke coils generate perpendicular fields to scan the beam across the screen, with sawtooth currents from the timebase circuits driving these coils for raster formation.
The deflection angle θ for small angles in magnetic deflection is approximated by θ ≈ B·l·√(e/(2mV)), where B is the magnetic field strength (in teslas), l is the effective length of the deflection region (in meters), V is the beam acceleration voltage (in volts), and e and m are the electron charge and mass; this relation derives from the Lorentz force balancing centripetal acceleration in the curved beam path, neglecting relativistic effects at typical operating voltages. Electrons exit the gun and traverse the tube's narrow neck, gaining kinetic energy from a final high-voltage anode operated at 10–30 kV, which accelerates them to energies sufficient for efficient phosphor excitation while limiting X-ray production to safe levels below 0.5 mR/h at the tube surface. This voltage range balances brightness—proportional to beam current and energy—with practical power supply constraints in consumer television sets, where higher voltages like 25 kV were common for 19–25 inch screens to achieve luminance up to 300 cd/m². Upon reaching the wide front faceplate, the high-velocity electrons impact the phosphor layer, transferring energy via inelastic collisions and exciting atomic electrons in the phosphor lattice. The screen, a thin aluminized phosphor coating on the inner glass surface, converts electron kinetic energy into visible light through cathodoluminescence. In monochrome analog televisions, the white-emitting P4 phosphor (a zinc sulfide blend) predominates, with a decay time short enough to avoid motion blur yet providing sufficient persistence to reduce flicker at the 60 Hz field rate. Color CRTs employ the P22 phosphor suite, a triad of materials—red (Y₂O₂S:Eu³⁺, decay ~1.5 ms to 10%), green (ZnS:Cu,Al, decay ~6 ms), and blue (ZnS:Ag)—deposited in a dot or stripe pattern behind a shadow mask, enabling RGB excitation for full-color reproduction; these decay times match NTSC/PAL field rates, ensuring smooth motion rendition without visible trailing in dynamic scenes.
The aluminum backing reflects forward-emitted light toward the viewer while conducting away accumulated charge to maintain a uniform screen potential for beam landing.
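As a numeric illustration of the small-angle magnetic deflection relation θ ≈ B·l·√(e/(2mV)), the sketch below evaluates it for an assumed 1 mT field acting over a 5 cm yoke region at 25 kV acceleration; these component values are illustrative, not figures from the text.

```python
import math

E_CHARGE = 1.602e-19    # electron charge, C
M_ELECTRON = 9.109e-31  # electron mass, kg

def deflection_angle(b_tesla, l_m, v_volts):
    """Small-angle deflection: theta ~ l/r with gyroradius r = m*v/(e*B)
    and beam speed v = sqrt(2*e*V/m), giving theta = B*l*sqrt(e/(2*m*V))."""
    return b_tesla * l_m * math.sqrt(E_CHARGE / (2 * M_ELECTRON * v_volts))

theta = deflection_angle(1e-3, 0.05, 25e3)  # ~0.09 rad, a few degrees
print(math.degrees(theta))
```

Note the inverse square-root dependence on V: doubling the anode voltage stiffens the beam and shrinks the deflection, which is why wide-angle tubes need proportionally stronger yoke currents.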

Scanning and Persistence

In analog television systems, the image is formed on the screen through raster scanning, where an electron beam sweeps across the display in a series of horizontal lines to build the picture progressively. This process relies on interlaced scanning to reduce bandwidth requirements while maintaining an effective 60 Hz refresh rate; each complete frame is divided into two fields—one odd and one even—with the odd field illuminating lines 1, 3, 5, and so on, while the even field covers lines 2, 4, 6, etc. In the NTSC standard, each field comprises 262.5 scan lines, yielding a total of 525 lines per frame at a rate of 30 frames per second (or precisely 29.97). The electron beam's movement is controlled by deflection circuits that generate sawtooth waveforms applied to electromagnetic coils surrounding the cathode ray tube (CRT). Horizontally, the beam scans from left to right at a frequency of 15.734 kHz, with a rapid flyback retrace during the horizontal blanking interval to return to the start of the next line. Vertically, a slower sawtooth waveform at 60 Hz (precisely 59.94 Hz) deflects the beam downward, advancing one line per horizontal cycle and completing one field every 1/60 second before interlacing the next field to form a full frame. To create the illusion of a continuous image despite the intermittent scanning, the CRT's phosphor coating exhibits persistence, where excited phosphors continue emitting light after beam impact. Typical decay times for television phosphors range from 1 to 10 milliseconds, allowing the glow from one scan to overlap with the next and minimizing flicker perceptible to the human eye, which has its own persistence of vision of around 1/16 second. Analog television broadcasts adhere to a standard aspect ratio of 4:3, defining the proportional relationship between the width and height of the visible image area. Receivers incorporate overscan margins, typically 3.5% to 5% on each side, to hide edge distortions and ensure critical content remains visible regardless of minor geometric variations in the display.
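The timing figures above are all consequences of two constants; the arithmetic below derives the line rate and line period from the NTSC line count and exact frame rate quoted in the text.

```python
# NTSC interlaced scanning arithmetic.
LINES_PER_FRAME = 525
FRAME_RATE = 30000 / 1001        # exactly 29.97002997... frames per second
FIELD_RATE = 2 * FRAME_RATE      # two interlaced fields per frame (~59.94 Hz)

line_rate = LINES_PER_FRAME * FRAME_RATE   # horizontal scan frequency, ~15.734 kHz
lines_per_field = LINES_PER_FRAME / 2      # 262.5: the half line forces interlace
line_period_us = 1e6 / line_rate           # ~63.6 microseconds per scan line
```

The half line per field is what makes interlacing work: starting every other field mid-line offsets its scan lines vertically between those of the previous field.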

Alternative Analog Displays

While the cathode ray tube (CRT) dominated analog television displays, a few alternative technologies were prototyped and occasionally deployed, primarily for specialized applications due to their technical constraints. Plasma displays emerged as early prototypes in the 1960s, with significant advancements in the early 1970s involving gas discharge mechanisms to form pixels. These panels featured arrays of microscopic cells filled with inert gases like neon, argon, and xenon, where applied voltages ionized the gas into plasma, producing ultraviolet radiation that excited phosphor coatings to emit red, green, or blue light for color images. Researchers at the University of Illinois and companies like Owens-Illinois demonstrated color prototypes by incorporating phosphors into open-cell structures around 1972, enabling rudimentary video display from analog signals. However, these early systems were limited to small sizes (typically under 10 inches diagonally) and high power consumption, making them unsuitable for mainstream sets. In the 1980s, passive matrix liquid-crystal displays (LCDs) with analog inputs found niche use in portable televisions, offering a flat, lightweight alternative to CRTs. These displays operated via a simple grid of row and column electrodes that selectively applied voltage to twist nematic liquid crystals, modulating polarized light to create grayscale or color images; early color versions used twisted nematic structures enhanced by super-twisted nematic (STN) configurations for better contrast. Devices like the 1983 Casio TV-10 and Sinclair's 1984 pocket TVs featured 2- to 3-inch passive matrix screens that directly processed NTSC or PAL analog signals, often relying on reflective modes for daylight viewing but incorporating early backlighting with cold cathode fluorescent lamps (CCFL) in some models for low-light conditions. Despite their portability, passive matrix LCDs exhibited low contrast, slow pixel response (around 100-200 ms), and limited resolution (typically 100-200 lines), restricting them to small, low-end consumer products.
The Eidophor oil-film projector represented a specialized large-screen solution for analog television, particularly in theaters and auditoriums from the 1950s onward. Developed by Swiss physicist Fritz Fischer between 1939 and 1943, the system used a rotating drum coated with a thin (0.4-micrometer) film of glycerin-based oil, scanned by a low-velocity (5-10 kV) electron beam in a raster pattern synchronized to the incoming analog video signal. The beam deposited electrostatic charges that deformed the oil surface into hills and valleys proportional to signal intensity, altering the film's reflectivity; schlieren optics then projected these modulated deformations onto screens up to 20 feet wide using a high-intensity arc lamp. Commercialized by Gretag in 1958, Eidophor projectors achieved brightness levels of 10-20 foot-lamberts and supported color via three-gun variants, but required constant oil replenishment and precise maintenance to prevent contamination. These alternative displays struggled with widespread adoption primarily due to prohibitive costs and inferior performance compared to CRTs. Plasma prototypes demanded expensive vacuum sealing and high-voltage drivers, with production costs exceeding $1,000 per small panel in the early 1970s, while offering resolutions below 300 lines—far short of CRTs' 400-500 lines. Passive matrix LCDs were cheaper to fabricate (under $100 for portables) but suffered from ghosting in dynamic video and narrow viewing angles (around 120 degrees), limiting appeal beyond niche portables. The Eidophor, priced at $50,000–$100,000 per unit, excelled in projection but its mechanical complexity and need for skilled operation confined it to professional venues. Overall, these factors ensured CRTs' dominance in consumer analog television until the late 1990s.

Color Television Standards

NTSC System Details

The NTSC color television standard, formally approved by the Federal Communications Commission (FCC) on December 17, 1953, specifies a frame composed of 525 interlaced scanning lines, with two fields per frame scanned at 59.94 fields per second to achieve an effective 29.97 frames per second, ensuring compatibility with existing monochrome broadcasts. This configuration supports a vertical resolution of approximately 480 visible lines, balancing picture detail with bandwidth constraints of the era. The color subcarrier is precisely defined at 3.579545 MHz, with a tolerance of ±0.0003%, to interleave chrominance information with the luminance signal without significant interference. A key element of the color encoding is the color burst, a signal consisting of 8 to 9 cycles of the unmodulated 3.579545 MHz subcarrier, positioned on the back porch of each horizontal blanking interval following the synchronizing pulse. This burst provides receivers with a phase and frequency reference for demodulating the chrominance components, enabling accurate recovery of hue and saturation information modulated onto the subcarrier using quadrature amplitude modulation. The system employs the YIQ color space, where luminance (Y) is separated from chrominance (I and Q) to allow compatibility with black-and-white sets. One notable challenge in NTSC reception stems from its sensitivity to phase errors in the color subcarrier, which can manifest as hue shifts across the picture, often requiring manual tint controls on consumer receivers to adjust the phase reference relative to the color burst. Such errors, potentially arising from transmission distortions or oscillator drift, alter the perceived colors, with shifts in the I-Q phase plane directly impacting red-cyan and green-magenta hues. As the dominant analog color standard in North America, parts of South America, and parts of Asia following its 1953 adoption, NTSC remained in widespread use for over five decades until analog broadcast shutdowns transitioned to digital formats.
In the United States, full-power NTSC transmissions ceased on June 12, 2009, while low-power stations followed by July 13, 2021; Japan completed its nationwide analog termination on July 24, 2011, with select regions extending to March 2012 due to disaster recovery needs.

PAL System Details

The PAL (Phase Alternating Line) system is an analog color television encoding standard developed to provide stable color reproduction by alternating the phase of the chrominance signal on successive scan lines, thereby mitigating transmission-induced phase errors that affect hue accuracy. It builds on the monochrome 625-line format common in Europe, delivering 625 total scan lines per frame at a field rate of 50 Hz (25 frames per second interlaced), which aligns with the 50 Hz mains frequency to reduce flicker. The color subcarrier frequency is precisely 4.433619 MHz, modulated using quadrature amplitude modulation (QAM) to carry the chrominance information alongside the luminance signal. A key feature of PAL is the line-by-line phase switching of the U and V color-difference signals derived from the RGB primaries, where the V signal (related to red-blue differences) is inverted by 180 degrees on every other line while the U signal (related to yellow-blue differences) maintains its phase. This alternation ensures that any differential phase errors introduced during transmission—such as those from cable or multipath interference—are averaged out in the receiver by combining the current line with the delayed previous line, resulting in corrected hue without the need for user-adjustable tint controls common in other systems. To enable this averaging and separate the intertwined luminance (Y) and chrominance (C) components in the composite signal, PAL decoders employ a 64 μs delay line—equivalent to one horizontal line period—as part of a comb filter. The delay line stores the previous line's signal, allowing the receiver to add or subtract it from the current line based on the phase inversion, which notches out the subcarrier frequency from the luminance path and extracts stable chrominance for demodulation into U and V. This comb filtering approach enhances color fidelity and reduces cross-luminance artifacts like dot crawl.
PAL was first commercially adopted in 1967, with the United Kingdom's BBC2 launching regular color broadcasts on July 1 and West Germany following on August 25 of that year. The standard quickly spread across much of the continent and beyond, becoming the dominant analog color system in over 100 countries due to its robustness. A notable variant is PAL-M, introduced in Brazil in 1972, which adapts PAL's phase-alternating color encoding to the 525-line, 60 Hz (59.94 fields per second) frame structure originally used for monochrome compatibility, using a subcarrier of approximately 3.575612 MHz.
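PAL's phase-error cancellation can be modeled with complex chroma vectors: a transmission path adds a constant phase error to every line, the receiver re-inverts V on alternate lines, and averaging adjacent lines cancels the hue (phase) error at the cost of a small saturation (amplitude) loss. The sketch below is a simplified model of that averaging, not a full decoder.

```python
import cmath
import math

def pal_average(hue_deg, path_error_deg):
    """Average one normal and one V-inverted line, both shifted by the same
    path phase error; returns the recovered (hue, saturation)."""
    hue = math.radians(hue_deg)
    err = math.radians(path_error_deg)
    line_a = cmath.exp(1j * (hue + err))    # normal line, with path error
    line_b = cmath.exp(1j * (-hue + err))   # V-inverted line (hue negated)
    corrected_b = line_b.conjugate()        # receiver undoes the inversion
    avg = (line_a + corrected_b) / 2
    return math.degrees(cmath.phase(avg)), abs(avg)

hue_out, sat_out = pal_average(100.0, 10.0)
```

With a 10 degree path error, the recovered hue is exactly 100 degrees while the saturation drops only to cos(10°) ≈ 0.985, which is why PAL trades an invisible saturation loss for NTSC's visible tint shift.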

SECAM System Details

The SECAM (Séquentiel Couleur À Mémoire) system is an analog color television standard characterized by 625 scanning lines per frame and 50 fields per second, operating in an interlaced mode to deliver 25 frames per second. This configuration aligns with the 625/50 format widely used in European broadcasting, providing a vertical resolution suitable for the era's transmission bandwidth constraints. In SECAM, color information is encoded using frequency modulation (FM) of two subcarrier frequencies that alternate line by line: one carrying the Dr (red-luminance) difference signal at a nominal 4.406 MHz, and the other carrying the Db (blue-luminance) difference signal at 4.250 MHz. The FM approach encodes the color-difference values in the frequency deviation of these subcarriers, eliminating the need for the color burst phase reference required in NTSC and PAL systems. This sequential transmission carries only one color-difference signal per line, relying on the receiver's one-line memory to reconstruct the full color image. Decoding in SECAM receivers involves frequency demodulation of the received subcarrier to recover the color-difference signals, followed by a one-line (64 µs) delay line to store the previous line's signal while the current line is processed. Alternate-line switching then combines the delayed and current signals to provide continuous Db and Dr components for matrixing with the luminance (Y) signal, forming the complete RGB output without cross-color artifacts common in other standards. The system is based on the YUV color model, adapted to YDbDr components for its FM encoding. SECAM was first adopted for broadcast in France on October 1, 1967, marking the initial implementation of color television there. It subsequently gained traction in Eastern Europe, including the Soviet Union and countries like East Germany, Poland, and Hungary, due to political alignments during the Cold War era. The system's incompatibility with NTSC and PAL standards necessitated converters for cross-format viewing or recording.

Supporting Technical Elements

Timebase Circuits and Synchronization

In analog television receivers, timebase circuits are essential for generating stable horizontal and vertical scanning signals derived from the incoming composite signal's synchronization pulses. These circuits ensure that the electron beam in the display traces the raster correctly by locking the receiver's internal oscillators to the transmitted timing information. The process begins with the sync separator, which isolates the horizontal and vertical sync pulses from the video content. The sync separator typically employs a clipper circuit to clamp the sync tips to a reference voltage, separating the negative-going sync pulses from the positive video signal. This clipped signal then passes through filtering stages: a differentiator (high-pass filter) for horizontal sync pulses, producing short pulses at the leading edges of the frequent horizontal syncs (occurring at approximately 15.734 kHz in NTSC systems), and an integrator (low-pass filter) for vertical sync pulses, which attenuates the shorter horizontal pulses while preserving the longer vertical ones (at about 60 Hz). This separation prevents interference between horizontal and vertical timing, directing the respective pulses to their oscillators. For horizontal synchronization, an automatic frequency control (AFC) circuit processes the separated horizontal sync pulses along with feedback from the horizontal oscillator, often a relaxation or blocking oscillator running near 15.734 kHz. The AFC generates a DC correction voltage by comparing the phase of incoming sync pulses with the oscillator's output (via the flyback pulse), adjusting the oscillator to maintain lock and minimize picture tearing or bending. A user-adjustable horizontal hold control, typically a potentiometer, allows fine-tuning of the oscillator's free-running frequency when the incoming signal is weak or noisy, ensuring stable locking. Vertical synchronization relies on an integrator circuit that further processes the vertical sync pulses to trigger a vertical oscillator, usually a sawtooth generator operating at 60 Hz.
The integrator smooths the pulses into a ramp-like signal suitable for driving the oscillator, with the vertical hold control—another user adjustment via a potentiometer—modifying the oscillator's free-run rate to compensate for marginal signals, preventing vertical rolling of the picture. In early designs, these integrators used RC networks for timing stability. Later advancements in analog receivers incorporated phase-locked loops (PLLs) for enhanced precision in timebase synchronization, particularly for horizontal locking. A PLL consists of a phase detector, low-pass filter, and voltage-controlled oscillator (VCO), where the phase detector compares sync pulse timing with VCO output, producing an error voltage that the filter smooths to adjust the VCO frequency and phase dynamically. This approach offered superior noise immunity and reduced pull-in time compared to traditional AFC, becoming common in mid-1970s sets for stable performance under varying signal conditions.
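The integrator's ability to tell vertical from horizontal sync can be sketched with a discrete one-pole RC model: short horizontal pulses barely charge it, while the long vertical sync block drives it well past any reasonable threshold. The time constant and pulse widths below are illustrative round numbers in the spirit of the NTSC figures, not exact broadcast values (vertical sync serrations are ignored).

```python
def rc_integrate(samples, dt, rc):
    """Discrete one-pole RC low-pass (integrator); returns the output trace."""
    y, out = 0.0, []
    for x in samples:
        y += (x - y) * dt / rc   # single Euler step of dy/dt = (x - y)/RC
        out.append(y)
    return out

DT = 1e-6    # 1 us per sample
RC = 50e-6   # integrator time constant

h_pulse = [1.0] * 5 + [0.0] * 59   # ~5 us horizontal sync in a ~64 us line
v_pulse = [1.0] * 190              # ~3-line-long vertical sync block

peak_h = max(rc_integrate(h_pulse, DT, RC))   # stays low
peak_v = max(rc_integrate(v_pulse, DT, RC))   # charges nearly to 1
```

A comparator threshold anywhere between the two peaks (say 0.5) fires only during vertical sync, which is exactly the role of the integrator-plus-trigger in the receiver.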

Power Supply and Flyback

In analog cathode ray tube (CRT) televisions, the power supply incorporates a flyback transformer that integrates horizontal deflection with high-voltage generation for the CRT anode. The horizontal oscillator drives current through the horizontal deflection yoke during the active scan line, building magnetic energy. During the horizontal retrace, or flyback, period—when the electron beam returns to the start of the next line—this stored energy is redirected to the primary winding of the flyback transformer, inducing high-voltage pulses in its secondary winding at frequencies around 15.625 kHz for PAL systems or 15.734 kHz for NTSC. This dual-function design efficiently utilizes the timebase circuitry to produce the necessary anode voltage without a separate high-voltage oscillator. The high-voltage output from the flyback transformer consists of sharp pulses that require rectification to supply stable DC to the CRT anode. Half-wave rectification is employed, where a high-voltage diode—often integrated into the flyback assembly—conducts only during the positive pulse peaks, filtering the output through a smoothing capacitance to yield approximately 25 kV DC. This configuration minimizes component count while handling the peak inverse voltages exceeding 30 kV inherent to the flyback pulses. Voltage regulation ensures consistent anode potential amid fluctuations in line voltage, beam current, or temperature. A feedback loop samples the high voltage via a resistive divider network and feeds it to an error amplifier, which compares it against a reference voltage. The error amplifier then modulates the horizontal output transistor's drive or the low-voltage B+ supply to the horizontal circuit, stabilizing the flyback-generated HV within 1-2% tolerance. This closed-loop control prevents picture distortion such as blooming or breathing effects. Safety features mitigate risks from the high voltages involved.
Bleeder resistors, high-value components paralleled across HV storage capacitors, provide a discharge path when the television is powered off, reducing stored energy to safe levels within seconds to minutes and averting shock hazards during servicing. X-ray protection circuits monitor the anode voltage; if it exceeds regulatory limits (e.g., due to a failing regulator), they trigger shutdown of the horizontal deflection oscillator, halting HV generation and beam acceleration to limit unintended X-ray emissions from the CRT.
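The flyback's energy transfer follows from two textbook relations: the energy stored in the yoke/primary inductance during scan, E = ½LI², and the resonant peak voltage when the scan current is interrupted into the retrace tuning capacitance, V = I·√(L/C). The component values below are assumed illustrative figures, not taken from the text.

```python
import math

L = 5e-3      # assumed primary/yoke inductance, henries
I_PEAK = 4.0  # assumed peak scan current, amperes
C = 1e-9      # assumed effective retrace tuning capacitance, farads

energy_j = 0.5 * L * I_PEAK ** 2    # E = 1/2 * L * I^2, ~0.04 J per scan cycle
v_peak = I_PEAK * math.sqrt(L / C)  # LC resonance peak, primary-side volts

print(energy_j, v_peak)
```

With these values the primary-side pulse already reaches several kilovolts; the transformer's secondary turns ratio then steps it up toward the ~25 kV anode level, which is why no separate high-voltage oscillator is needed.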

Overall System Components

Analog television receivers consist of several integrated hardware blocks that process incoming radio frequency (RF) signals to produce video and audio output. The RF section, including the tuner, selects the desired channel from the antenna input and converts it to a fixed intermediate frequency (IF), typically 45.75 MHz for video in NTSC systems, while rejecting adjacent channels to prevent interference. The IF amplifier then boosts this signal to overcome noise and ensure adequate strength for subsequent stages, operating over the full 6 MHz channel bandwidth. Following demodulation, the video amplifier processes the video signal, amplifying its luminance and chrominance components before driving the display, while the audio amplifier handles the separated sound carrier at a 4.5 MHz offset, typically using FM detection for reproduction through speakers. The deflection circuits, including horizontal and vertical oscillators and amplifiers, generate scanning waveforms that steer the electron beam via the deflection coils around the neck of the cathode ray tube (CRT), synchronizing with extracted sync pulses to rasterize the image across 525 lines at 30 frames per second.

Early analog television sets predominantly relied on vacuum tubes for amplification and signal processing across all major blocks, offering robustness in high-voltage CRT deflection but requiring significant power and generating substantial heat. The introduction of transistors marked a pivotal shift: the first fully transistorized portable television, Sony's TV8-301, was released in 1960, enabling smaller, more efficient designs without the fragility of tubes. By the 1970s, solid-state transistors had largely supplanted vacuum tubes in consumer receivers, reducing size, power consumption, and cost while improving portability and longevity, as manufacturers such as RCA adopted hybrid and then all-transistor architectures.
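The superheterodyne frequency plan described above is simple arithmetic, sketched here for NTSC with high-side local-oscillator injection (channel 3 is used as a worked example):

```python
# Superheterodyne frequency arithmetic in an NTSC receiver: the tuner's
# local oscillator runs above the channel so the video carrier lands at
# the standard 45.75 MHz IF.

VIDEO_IF = 45.75      # NTSC video IF, MHz
SOUND_OFFSET = 4.5    # sound carrier sits 4.5 MHz above the video carrier

def lo_frequency(video_carrier_mhz):
    """High-side injection: LO frequency = RF video carrier + video IF."""
    return video_carrier_mhz + VIDEO_IF

def sound_if():
    """High-side injection inverts the spectrum, so after mixing the
    sound carrier appears 4.5 MHz *below* the video IF."""
    return VIDEO_IF - SOUND_OFFSET

ch3_video = 61.25     # channel 3 video carrier, MHz
print(f"LO for channel 3: {lo_frequency(ch3_video):.2f} MHz")  # 107.00 MHz
print(f"sound IF: {sound_if():.2f} MHz")                        # 41.25 MHz
```

The 4.5 MHz spacing between the video and sound carriers is preserved through the mixer, which is what allows intercarrier receivers to recover the FM sound by beating the two IF carriers together.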
Receiver cabinets were engineered with metallic enclosures and internal shielding to mitigate radio frequency interference (RFI), which could degrade signal quality from external sources like nearby transmitters or from internal coupling between sections. Chassis designs often incorporated grounded metal partitions around the RF tuner and IF stages to contain electromagnetic emissions, with early models using metal cabinets and later ones adding foil linings or conductive gaskets at seams for better shielding, complying with FCC regulations on radiated interference. This shielding was essential in densely populated urban environments, where unshielded sets could pick up or emit unwanted signals affecting both the receiver and nearby devices.

Testing and alignment of analog receivers involved precise procedures to optimize performance, particularly for the IF amplifier and color circuits. For the IF amplifier, technicians used a sweep generator and an oscilloscope to align the traps that suppress adjacent-channel carriers, then adjusted the response curve for flat gain across the video bandwidth, ensuring sharp images without ringing or smearing. Color alignment required test patterns such as SMPTE bars: purity magnets and convergence rings on the CRT neck were adjusted to align the red, green, and blue electron beams, minimizing color fringing, and the tint and color gain controls were then fine-tuned using a vectorscope to match the 3.58 MHz subcarrier phase reference from the color burst. These procedures, often documented in service manuals, were performed during manufacture or repair to maintain compliance with broadcast standards.
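The reason subcarrier phase alignment matters can be shown in a few lines: NTSC encodes hue in the phase of the 3.58 MHz chroma vector relative to the color burst, so any decoder phase error rotates every color by the same angle. The amplitude and phase values below are illustrative, not taken from a standard test signal:

```python
import math

# Sketch: NTSC hue is the phase of the chroma vector relative to the
# colour burst. Demodulation projects that vector onto two axes to
# recover the colour-difference components; a decoder phase error
# rotates the result, shifting every hue uniformly (a "tint" error).

def demodulate(amplitude, phase_deg, decoder_error_deg=0.0):
    """Return (B-Y, R-Y) components of a chroma vector; a nonzero
    decoder_error_deg models misalignment against the burst reference."""
    phi = math.radians(phase_deg - decoder_error_deg)
    return amplitude * math.cos(phi), amplitude * math.sin(phi)

# An example saturated colour at 100 degrees burst-relative phase:
by, ry = demodulate(0.6, 100.0)
by_err, ry_err = demodulate(0.6, 100.0, decoder_error_deg=10.0)
print(f"correct decode:     B-Y={by:+.3f}, R-Y={ry:+.3f}")
print(f"10 deg tint error:  B-Y={by_err:+.3f}, R-Y={ry_err:+.3f}")
```

On a vectorscope this error appears as the whole constellation of color-bar dots rotated away from their graticule boxes, which is exactly what the tint adjustment corrects.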

Legacy and Transition

Analog Broadcast Shutdowns

The shutdown of analog television broadcasts worldwide was driven primarily by the need to reallocate spectrum for digital services, including mobile broadband and public safety communications, allowing governments to auction frequencies for economic benefit while improving broadcast efficiency. Regulatory bodies mandated transitions to digital standards, such as ATSC in the United States and DVB-T in Europe, to facilitate this shift. In the United States, the Federal Communications Commission (FCC) required full-power stations to cease analog transmissions on June 12, 2009, following the Digital Television Transition and Public Safety Act of 2005, which aimed to recover 108 MHz of spectrum for advanced wireless services. The United Kingdom's switchover, overseen by Ofcom, began on October 17, 2007, in Whitehaven and concluded on October 24, 2012, in Northern Ireland, enabling the release of 800 MHz band frequencies for 4G mobile networks. In Brazil, the National Telecommunications Agency (Anatel) extended the analog shutdown deadline to June 30, 2025, to ensure complete digital coverage across municipalities.

These transitions faced significant challenges, particularly in ensuring access for vulnerable populations. In the US, rural viewers encountered signal disruptions as many low-power translators failed to upgrade due to high costs, leaving some areas without service post-shutdown. To mitigate this, the National Telecommunications and Information Administration (NTIA) administered a coupon program subsidizing digital-to-analog converter boxes at $40 each, with over 64 million coupons requested by consumers relying on over-the-air signals. Similar issues arose in the UK, where rural signal reception proved problematic, prompting an assistance scheme that provided free equipment and support to over 1.8 million low-income and elderly households. Brazil's implementation highlighted economic barriers, including the need for subsidies to distribute converters in remote regions where digital coverage lagged.
As of November 2025, analog broadcasting persists in several developing regions due to incomplete digital rollouts and funding constraints. South Africa's analogue switch-off, for instance, was extended to March 2025 but was halted by court rulings, with ongoing debates as of November 2025 advocating postponement to the end of 2025 or later; elsewhere, initial analog sign-offs began in November 2025, with full shutdown planned within 12 months (by late 2026) and provincial areas to follow later. Some countries continue analog broadcasts alongside digital trials, and analog signals persist in certain rural areas. These exceptions reflect ongoing efforts to balance spectrum reallocation with universal access in less developed markets.

Digital Migration Impacts

The transition from analog to digital television significantly enhanced broadcast quality and efficiency, primarily through higher resolution capabilities that allowed for high-definition (HD) imagery far surpassing the standard-definition limits of analog systems. Digital standards like ATSC enabled resolutions up to 1920×1080, providing sharper images and 16:9 aspect ratios compared to analog's 4:3 format, while supporting multichannel CD-quality audio. Compression techniques, such as MPEG-2, further improved efficiency by reducing data requirements without perceptible quality loss, allowing a single 6 MHz channel to carry multiple streams. Multicasting emerged as a key benefit, enabling broadcasters to transmit several standard-definition channels or a mix of HD and data services simultaneously within the same allocation previously dedicated to one analog channel.

The shift underscored longstanding drawbacks of analog television, particularly its vulnerability to noise and interference, which degraded signal quality into visible "snow" or static on screens. Analog signals were highly susceptible to multipath propagation, causing ghosting in which reflected signals created duplicate, offset images, a problem exacerbated in urban areas with tall structures. Additionally, analog was inherently limited to one channel per 6 MHz of bandwidth, constraining the number of available stations and services without expanding spectrum usage. These limitations became more evident as digital systems demonstrated near-perfect reception within range, free from cumulative degradation.

During the hybrid transition periods, particularly leading up to the analog shutdown, converter boxes and set-top devices incorporated analog pass-through features to ensure compatibility with legacy equipment. These devices converted ATSC digital signals to analog for older televisions while allowing any remaining analog broadcasts to bypass conversion and be directly tuned, minimizing disruptions for consumers reliant on analog sets.
This approach facilitated a phased rollout, with broadcasters simulcasting in both formats until full digital adoption. Economically, the migration unlocked substantial revenues through spectrum auctions of reclaimed analog frequencies, funding the transition and related infrastructure. The 2005 Digital Television Transition and Public Safety Act directed auction proceeds into a dedicated fund, estimated to generate $10 billion, with portions allocated for converter box subsidies and public safety enhancements. The 2016-2017 incentive auction alone raised $19.8 billion, reimbursing broadcasters for relocation costs and supporting the repacking of stations into fewer channels to free spectrum for wireless broadband.
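The multicasting advantage comes down to a bitrate budget: an ATSC channel delivers about 19.39 Mbit/s in the same 6 MHz once occupied by one analog program. The per-stream rates below are assumed typical MPEG-2 figures, not values fixed by the standard:

```python
# Sketch of the multicast bitrate budget in one 6 MHz ATSC channel.
# Per-programme rates are assumptions about typical MPEG-2 encodings.

ATSC_PAYLOAD_MBPS = 19.39   # 8-VSB payload of a 6 MHz ATSC channel

def multicast_capacity(payload_mbps, hd_mbps=12.0, sd_mbps=3.5):
    """How many SD services fit alongside one HD service, given
    assumed per-stream bitrates."""
    remaining = payload_mbps - hd_mbps
    return int(remaining // sd_mbps)

n_sd = multicast_capacity(ATSC_PAYLOAD_MBPS)
print(f"SD services beside one 12 Mbit/s HD service: {n_sd}")
# 19.39 - 12.0 = 7.39 Mbit/s remaining -> two 3.5 Mbit/s SD services
```

Dropping the HD service frees the whole payload for five or more SD programs, which is how a single digital allocation replaces several analog channels.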

Preservation Efforts

Preservation efforts for analog television encompass a range of initiatives aimed at safeguarding both the hardware and content from an era dominated by analog broadcasting technologies. Institutions such as the Paley Center for Media, formerly known as the Museum of Television and Radio, play a central role by maintaining an extensive archive of over 160,000 television and radio programs, including analog broadcasts, to ensure their accessibility for research and public viewing. Similarly, the Early Television Museum in Hilliard, Ohio, focuses on collecting and restoring early analog television sets from the 1920s through the color era, demonstrating operational examples to educate visitors on the technology's historical significance. The MZTV Museum of Television in Toronto further contributes by preserving and exhibiting vintage receiving instruments, emphasizing the mechanical and electronic components unique to analog systems.

Cathode ray tube (CRT) restoration projects represent a specialized aspect of hardware preservation, addressing the degradation of phosphor coatings and high-voltage components in vintage sets. Organizations like the Early Television Foundation undertake meticulous restorations, replacing electrolytic capacitors and realigning electron guns to return non-functional CRT-based televisions to operational condition, thereby preventing the loss of functional artifacts. In the UK, the Lincolnshire Aviation Heritage Centre has restored analog broadcast equipment, including CRT monitors used in historical contexts, to maintain their integrity for display and study.

Digitization efforts are crucial for archiving analog content stored on vulnerable videotape media, which suffer from magnetic degradation over time. The U.S. National Archives and Records Administration (NARA) recommends migrating these formats to digital files using professional playback decks to capture the original signal with minimal loss, ensuring long-term preservation of broadcasts such as historical footage and educational programs.
The Canadian Conservation Institute advocates analog-to-digital conversion as the optimal method for VHS preservation, producing uncompressed files that retain the full fidelity of the original recordings while allowing for future migrations. Projects such as Michigan State University's media preservation initiative provide guidelines for handling multiple analog tape formats, facilitating broader access to preserved content through digital platforms.

Hobbyist communities actively support the repair and maintenance of analog television hardware, fostering knowledge transfer through hands-on workshops and shared resources. The Antique Wireless Association, based in New York, organizes events and publications dedicated to restoring vintage TV sets and building analog signal generators to test and calibrate equipment. Similarly, the North East Vintage Electronics Club (NEVEC) hosts swap meets and technical discussions on repairing analog tuners and flyback transformers, helping enthusiasts source rare parts like vacuum tubes essential for operational authenticity.

Legal challenges in preserving old analog broadcasts revolve around copyright and public domain status, which can restrict duplication and redistribution. Under U.S. copyright law, pre-1972 recordings embedded in TV broadcasts often fall under state protections rather than federal law, complicating archival efforts unless works enter the public domain 95 years after publication. Unpublished analog recordings may require permissions for digital dissemination, yet statutory provisions allow limited preservation copying by libraries to avert physical deterioration. These issues underscore the need for advocacy, as seen in reports from the Council on Library and Information Resources, which recommend legislative reforms to facilitate broader access to public domain-era broadcasts without infringing on rights holders.

References
