Scan line

A scan line (also scanline) is one line, or row, in a raster scanning pattern, such as a line of video on a cathode-ray tube (CRT) display of a television set or computer monitor.[1]
On CRT screens the horizontal scan lines are visually discernible, even when viewed from a distance, as alternating colored lines and black lines, especially when a progressive scan signal with below maximum vertical resolution is displayed.[2] This is sometimes used today as a visual effect in computer graphics.[3]
The term is used, by analogy, for a single row of pixels in a raster graphics image.[4] Scan lines are important in representations of image data, because many image file formats have special rules for data at the end of a scan line. For example, there may be a rule that each scan line starts on a particular boundary (such as a byte or word; see for example BMP file format). This means that even otherwise compatible raster data may need to be analyzed at the level of scan lines in order to convert between formats.
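For illustration, here is a minimal sketch of one such end-of-scan-line rule, assuming the uncompressed 24-bit BMP case in which each stored scan line is padded so that it starts on a 4-byte boundary; the function name and example width are hypothetical, not taken from the text above.

```python
def bmp_row_stride(width_px: int, bits_per_pixel: int = 24) -> int:
    """Bytes per stored scan line in an uncompressed BMP, padded to a 4-byte boundary."""
    raw_bytes = (width_px * bits_per_pixel + 7) // 8  # unpadded bytes in one row
    return (raw_bytes + 3) // 4 * 4                   # round up to a multiple of 4

# Example: a 101-pixel-wide, 24-bit image needs 303 raw bytes per scan line,
# but each stored scan line occupies 304 bytes because of the padding rule.
print(bmp_row_stride(101))  # -> 304
```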
References

1. Keith Jack and Vladimir Tsatsulin (2002). Dictionary of Video and Television Technology. Newnes. p. 242. ISBN 978-1-878707-99-4.
2. Wesley Fenlon (15 January 2014). "In Search of Scanlines: The Best CRT Monitor for Retro Gaming". Tested. Archived from the original on 24 August 2020. Retrieved 20 February 2014.
3. Gabriel B. (December 2012). "Freeware Friday: Maldita Castilla". Blistered Thumbs. Archived from the original on 2014-02-27.
4. Robin Stuart Ferguson (2001). Practical Algorithms for 3D Computer Graphics. A K Peters, Ltd. p. 104. ISBN 978-1-56881-154-3.
Fundamentals
Definition
A scan line, in the context of raster-based imaging and display systems, is a single horizontal row of pixels or luminous points generated sequentially from left to right during a raster scan, representing one line of a video frame or digital image.[4] This row is typically referenced by an integer vertical coordinate (y-value) within the frame buffer, where pixel values such as color or intensity are stored and processed to form part of the overall visual output.[4] In video systems, a scan line corresponds to a narrow horizontal strip of the optical image that is scanned to produce an electrical signal encoding brightness variations along that line.[2]

Unlike a full frame, which comprises the complete set of horizontal rows aggregated vertically to create the entire two-dimensional image, a scan line constitutes only one of those rows.[4] For instance, in the 1080p high-definition standard, the frame consists of exactly 1080 scan lines, each contributing to the vertical resolution of the display.[5] This distinction highlights how raster scanning, the overarching process of sequentially traversing these lines from top to bottom, builds the frame one scan line at a time.[2] When multiple scan lines are rendered and combined vertically, they aggregate into a coherent complete image, with each line's pixel data refreshed in sequence to maintain visual continuity on the display.[4] This layered accumulation ensures that the discrete horizontal elements coalesce into a seamless raster representation, fundamental to both analog video signals and digital frame buffers.[2]
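A minimal sketch of this frame-buffer view, assuming a simple row-major NumPy array as the buffer; the array shape, variable names, and helper function are illustrative rather than drawn from any particular API. Each scan line is one row of the buffer, indexed by its y coordinate.

```python
import numpy as np

HEIGHT, WIDTH = 1080, 1920          # a 1080p frame: 1080 scan lines of 1920 pixels
frame = np.zeros((HEIGHT, WIDTH, 3), dtype=np.uint8)  # row-major RGB frame buffer

def scan_line(buffer: np.ndarray, y: int) -> np.ndarray:
    """Return the single horizontal row of pixels at vertical coordinate y."""
    return buffer[y]

# The full frame is just the vertical aggregation of its 1080 scan lines.
line_200 = scan_line(frame, 200)
print(line_200.shape)  # -> (1920, 3): one scan line of 1920 RGB pixels
```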
Raster Scanning Principle

In raster scanning, the process begins with horizontal deflection: a scanning source, such as an electron beam or light source, moves systematically from left to right across the display surface to trace a single scan line.[6] Upon reaching the end of the line, the source undergoes a brief retrace period before vertical deflection repositions it downward to the start of the next line, repeating this sequence to cover the entire surface from top to bottom.[7] This sequential tracing ensures uniform coverage, forming a grid-like pattern essential for image reproduction.

Synchronization is maintained through horizontal and vertical sync pulses, which precisely time the deflections to align the scanning source's position with the intended image data.[8] Horizontal sync pulses signal the completion of each line and initiate the next, while vertical sync pulses coordinate the return to the top after all lines are traced, preventing misalignment or distortion in the overall pattern.[9]

As the source traces each scan line, it modulates its intensity or color based on input signals, exciting phosphors or illuminating pixels sequentially to build the two-dimensional image point by point.[6] The cumulative effect across all lines reconstructs the full visual content, with persistence in the display medium sustaining the image until the next frame. The scan pattern resembles a series of parallel horizontal lines progressing downward, akin to reading text on a page, where each line represents the basic unit of the raster output.[7]
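The traversal order can be sketched as a pair of nested loops, with the outer loop stepping down one scan line at a time and the inner loop sweeping left to right; the sampling function passed in below is a hypothetical placeholder standing in for the modulating input signal, not part of any standard.

```python
def raster_scan(width: int, height: int, sample):
    """Visit every pixel in raster order: left to right within a scan line,
    scan lines from top to bottom, like reading text on a page."""
    frame = [[0] * width for _ in range(height)]
    for y in range(height):              # vertical deflection: move to the next scan line
        for x in range(width):           # horizontal deflection: sweep one line
            frame[y][x] = sample(x, y)   # modulate intensity at this point
        # (horizontal retrace would occur here before the next line begins)
    return frame

# Example with a hypothetical intensity function: a simple horizontal gradient.
image = raster_scan(8, 4, lambda x, y: x * 32)
```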
Historical Development

Origins in Television
The concept of scan lines originated in the early development of television technology during the 1920s and 1930s, building on mechanical scanning precursors that laid the groundwork for electronic raster scanning. In 1884, German inventor Paul Nipkow patented the Nipkow disk, a rotating disk with spiral apertures designed to mechanically scan an image by sequentially exposing portions of it to light-sensitive elements, enabling the first theoretical electromechanical television system for image dissection and reconstruction.[10] This mechanical approach influenced early experimental systems but was limited by its low resolution and mechanical complexity.

Pioneers like Vladimir Zworykin advanced to fully electronic methods; in 1923, Zworykin, working at Westinghouse, invented the iconoscope, an electronic camera tube that used a photoemissive mosaic to capture and scan images line by line via an electron beam, marking a shift from mechanical to electronic raster scanning.[11] Similarly, American inventor Philo Farnsworth demonstrated the first fully electronic television transmission on September 7, 1927, using his image dissector tube to scan and transmit a simple line image, eliminating moving parts and enabling higher-fidelity real-time imaging.[12]

Key milestones in the adoption of scan lines for broadcast television marked the transition from experimental mechanical systems to standardized electronic ones. The British Broadcasting Corporation (BBC) conducted early mechanical television trials using John Logie Baird's 30-line system, with regular experimental broadcasts beginning on September 30, 1929, from Long Acre in London; these transmissions used a Nipkow-style disk to scan 30 horizontal lines per frame at 12.5 frames per second, providing rudimentary moving images to a small audience of enthusiasts.[13] By 1936, the BBC transitioned to electronic scanning with the launch of the world's first regular high-definition public television service on November 2 from Alexandra Palace, employing a 405-line system developed by Marconi-EMI; this used cathode-ray tubes for both scanning the image at the transmitter and displaying it at the receiver, achieving sharper resolution with 405 horizontal scan lines interlaced at 50 fields per second.[14] These developments prioritized raster scanning, in which an electron beam traces horizontal lines across the screen from top to bottom to build the image progressively.

Post-World War II standardization further entrenched scan lines as the core of television imaging, with international agreements establishing compatible electronic systems to facilitate global broadcasting. In the United States, the National Television System Committee (NTSC) finalized standards in 1941 for a 525-line system operating at approximately 30 frames per second, which saw widespread commercial adoption starting in 1946 after wartime suspension, using interlaced scanning to reduce bandwidth while maintaining visual persistence.[15] In Europe, post-war efforts led to the 625-line standard, later adopted by the PAL and SECAM color systems introduced in 1967, which scanned 625 lines at 25 frames per second to balance resolution and transmission efficiency across diverse infrastructures.
The initial purpose of scan lines in these systems was to enable real-time image transmission by breaking down visual information into sequential horizontal lines, leveraging the human eye's persistence of vision to reconstruct a coherent, flicker-free moving picture from rapidly refreshed scans.[16]

Adoption in Computing
The adoption of scan line concepts in computing emerged in the late 1960s, marking a shift from post-World War II vector displays, such as those in the Whirlwind computer (1951) and the SAGE system (1958), to raster scan systems that drew inspiration from television's sequential line scanning for more efficient image rendering.[17][18] Early raster experiments at Bell Labs, led by A. Michael Noll under Peter B. Denes, utilized a Honeywell DDP-224 computer to generate scanned displays, starting with 252 scan lines aligned to the Picturephone standard and later expanding to 525 lines to match the U.S. NTSC television format.[19] This software-driven approach to scan conversion represented a pivotal adaptation, enabling the simulation of filled areas and shading that vector systems struggled to achieve economically.[20]

By the 1970s, raster scan integration accelerated with the rise of home computers, where affordability drove the use of consumer television sets as displays. The Altair 8800 (1975), one of the first commercially successful microcomputers, lacked built-in video output but supported add-on interfaces like Don Lancaster's TV Typewriter (1974), a low-cost kit that generated raster video signals for standard TV monitors, displaying 16 lines of 32 characters via character-based raster scanning.[21] These interfaces leveraged the NTSC standard's 525 scan lines, allowing hobbyists to repurpose readily available televisions for computing output without specialized hardware.[22] This compatibility became a hallmark of early personal computing, as seen in subsequent systems like the Apple I (1976) and Apple II (1977), which output directly to NTSC-compatible TVs for raster-based text and graphics.

The transition to raster scanning profoundly influenced computer graphics by facilitating the development of bitmap displays, where memory directly mapped to screen pixels along scan lines. The Xerox Alto (1973), an influential prototype personal computer, featured one of the first commercial bitmap raster displays with 875 total scan lines (808 visible) at 1024x808 resolution, enabling interactive windows, icons, and bitmapped fonts through scan line addressing.[23] This shift from vector-drawn lines to pixel grids supported complex, filled imagery and laid the groundwork for modern graphical user interfaces, as raster systems proved more scalable for real-time rendering in resource-constrained environments.[24]

Technical Implementation
In Cathode-Ray Tube Displays
In cathode-ray tube (CRT) displays, scan lines are generated through precise control of an electron beam emitted by an electron gun at the rear of the tube. The beam is accelerated toward a phosphor-coated screen and deflected horizontally and vertically by electromagnetic deflection coils, known as the deflection yoke, which surround the tube's neck. The horizontal deflection coils produce a rapidly varying magnetic field that sweeps the beam from left to right across the screen at high frequency, tracing out each horizontal scan line, while the vertical deflection coils move the beam downward to the next line more slowly.[25][26] For standards like NTSC, the horizontal scan frequency is approximately 15.734 kHz, enabling the beam to complete thousands of lines per second to form a visible image.

During the vertical retrace, when the beam returns to the top of the screen after completing all lines in a frame, vertical flyback blanking is applied by suppressing the electron gun's emission or reducing the beam intensity, preventing visible retrace lines from appearing on the screen. This blanking ensures a clean transition between frames without artifacts from the non-image-bearing return path.[27][28]

The number of scan lines directly determines the vertical resolution in CRT displays; for NTSC, there are 525 total scan lines per frame, but only about 480 are active and visible, establishing a vertical resolution of 480 lines. In low-resolution modes, such as early computer graphics with fewer effective lines, individual scan lines become more visible to the viewer due to the wider spacing relative to the screen height and the beam's finite thickness. Additionally, interline flicker can occur as a visible artifact when differences in brightness between adjacent scan lines cause perceived shimmering in static horizontal features.[27][29][30]

In color CRTs, a shadow mask, a thin metal sheet with precisely punched apertures positioned behind the phosphor screen, ensures proper alignment by directing electrons from separate red, green, and blue electron guns to their corresponding phosphor dots or stripes along each scan line. Misalignment of the shadow mask relative to the beams or phosphors can result in color fringing or purity errors, where incorrect colors appear along scan lines, degrading image quality. The mask intercepts a significant portion of the electron beam (up to 80%), which inherently reduces brightness but maintains color accuracy during horizontal scanning.[31]

The horizontal scan frequency is derived from the total number of scan lines per frame and the frame rate. For a general CRT system, f_H = N × f_V, where N is the number of scan lines per frame and f_V is the frame rate in Hz. To arrive at this, note that each frame requires the beam to trace N horizontal lines, and frames are produced at rate f_V, so the number of lines traced per second (i.e., f_H) is their product. For NTSC, with N = 525 and f_V ≈ 29.97 Hz, f_H = 525 × 29.97 ≈ 15,734 Hz, confirming the standard value. This relationship holds for both progressive and interlaced modes when N accounts for the full frame.[27]

In progressive scanning modes, all scan lines are drawn sequentially in each frame, which can make line visibility more uniform compared to other methods.[30]
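The derivation above can be checked with a short sketch of the relationship f_H = N × f_V; the function name is hypothetical, and the NTSC and PAL figures are supplied here only as worked examples.

```python
def horizontal_scan_frequency(lines_per_frame: int, frame_rate_hz: float) -> float:
    """f_H = N * f_V: each of the N scan lines is traced once per frame."""
    return lines_per_frame * frame_rate_hz

# NTSC: 525 total lines per frame at ~29.97 frames per second.
print(round(horizontal_scan_frequency(525, 30000 / 1001)))  # -> 15734 Hz
# PAL: 625 total lines per frame at 25 frames per second.
print(horizontal_scan_frequency(625, 25.0))                 # -> 15625.0 Hz
```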
Progressive and Interlaced Scanning

Progressive scanning draws all scan lines of a video frame sequentially from top to bottom in a single pass, producing a complete image at the specified frame rate, such as 1080p at 60 frames per second.[32] This method provides full vertical resolution and detail in every frame, enhancing motion clarity by eliminating temporal offsets between lines, and the total scanning rate is the frame rate multiplied by the number of scan lines.[33] For instance, a 1080p system at 60 Hz processes 1080 lines × 60 frames/second = 64,800 lines per second, supporting smooth playback without artifacts.[32]

In contrast, interlaced scanning alternates between odd-numbered and even-numbered scan lines across two separate fields to form each complete frame, as seen in formats like 1080i.[34] Each field contains half the total lines, transmitted at twice the frame rate to reduce flicker, historically enabling bandwidth savings in television broadcasting by delivering only partial images more frequently.[33] However, this approach can introduce drawbacks such as combing artifacts during motion, where stationary objects appear sharp but moving elements show jagged edges due to the half-line temporal displacement between fields.[34]

The relationship between field and frame rates in interlaced systems is field rate = frame rate × 2, so a 30 frames per second interlaced video produces 60 fields per second.[32] This structure halves bandwidth requirements compared to progressive scanning at an equivalent full-refresh rate, as interlacing achieves the same flicker-reducing field rate (e.g., 60 Hz) while scanning only half the lines per field; for a 525-line system like early NTSC, progressive scanning at 60 full frames per second would require 525 × 60 = 31,500 lines per second, whereas interlaced scanning uses 262.5 lines per field × 60 fields per second = 15,750 lines per second, effectively reducing the scanning bandwidth by 50%.[34][33]
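The bandwidth comparison above can be sketched as two small helper functions; the names and the early-NTSC figures are illustrative, assuming a nominal 60 Hz refresh in both cases.

```python
def lines_per_second_progressive(total_lines: int, frame_rate: float) -> float:
    """Progressive: every line of every frame is scanned."""
    return total_lines * frame_rate

def lines_per_second_interlaced(total_lines: int, field_rate: float) -> float:
    """Interlaced: each field carries half the lines, at twice the frame rate."""
    return (total_lines / 2) * field_rate

# Early 525-line NTSC-style timing with a 60 Hz refresh either way.
print(lines_per_second_progressive(525, 60.0))  # -> 31500.0 lines/s
print(lines_per_second_interlaced(525, 60.0))   # -> 15750.0 lines/s (half the rate)
```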
Applications

In Video Standards
In video standards, scan lines form the foundational structure for analog television broadcasting, defining the vertical resolution and timing of image display across major global systems. The NTSC (National Television System Committee) standard, adopted in the United States and parts of the Americas, utilizes 525 total scan lines per frame, delivered as two interlaced fields of 262.5 lines each at 60 fields per second, resulting in approximately 30 frames per second.[35] This interlaced approach alternates odd and even lines to reduce bandwidth while maintaining perceived motion smoothness. Similarly, the PAL (Phase Alternating Line) standard, prevalent in Europe, Australia, and much of Asia and Africa, employs 625 scan lines per frame in two interlaced fields of 312.5 lines each at 50 fields per second, yielding 25 frames per second.[36] The SECAM (Séquentiel Couleur À Mémoire) system, a variant used primarily in France, Eastern Europe, and former Soviet states, also features 625 scan lines per frame with 50 interlaced fields per second, though it differs in color encoding by sequentially transmitting chrominance components rather than simultaneously as in PAL.[37]

Key signal components synchronize the scanning process in these standards. The horizontal sync pulse, a negative-going signal marking the start of each scan line, has a duration of approximately 4.7 μs in both NTSC and PAL systems, ensuring precise horizontal retracing of the electron beam in cathode-ray tube displays.[38] Following the sync pulse, during the back porch of the horizontal blanking interval, a color burst, a short reference signal of 8 to 10 cycles at the chrominance subcarrier frequency (3.579545 MHz for NTSC, 4.433619 MHz for PAL), is inserted to synchronize color demodulation and phase at the receiver.[38] In SECAM, the color burst is absent due to its sequential color transmission, relying instead on frequency modulation of the chrominance signals across alternate lines.[37]

The evolution from analog to digital video standards has preserved the scan line concept while enhancing resolution and flexibility. In the United States, the ATSC (Advanced Television Systems Committee) standard for digital terrestrial television, implemented since 2009, supports high-definition formats such as 1080 progressive scan lines at 30 or 60 frames per second, alongside legacy-compatible 480 interlaced lines, enabling sharper imagery without interlacing artifacts in progressive modes.[39] Globally, digital transitions like Europe's DVB-T mirror this shift, standardizing higher scan line counts for broadcast compatibility.[40]

| Standard | Total Scan Lines | Fields/Second | Scan Type | Primary Regions | Example Resolution Variant |
|---|---|---|---|---|---|
| NTSC | 525 | 60 | Interlaced | Americas, Japan | 480i (active lines) |
| PAL | 625 | 50 | Interlaced | Europe, Asia, Africa | 576i (active lines) |
| SECAM | 625 | 50 | Interlaced | France, Eastern Europe | 576i (active lines) |
| ATSC (HD) | 720 or 1080 | Varies (e.g., 60 frames/s progressive, 60 fields/s interlaced) | Progressive or Interlaced | United States | 720p, 1080i |
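A small sketch tying the analog rows of the table to the per-line timing discussed above; the dataclass is illustrative, the NTSC field rate is taken as the exact 60000/1001 Hz rather than the nominal 60 in the table, and the frame rate is assumed to be the field rate divided by two for these interlaced systems.

```python
from dataclasses import dataclass

@dataclass
class AnalogStandard:
    name: str
    total_lines: int      # scan lines per frame
    field_rate_hz: float  # interlaced fields per second

    @property
    def frame_rate_hz(self) -> float:
        return self.field_rate_hz / 2           # two fields per interlaced frame

    @property
    def line_frequency_hz(self) -> float:
        return self.total_lines * self.frame_rate_hz

    @property
    def line_period_us(self) -> float:
        return 1e6 / self.line_frequency_hz     # duration of one scan line in µs

for std in (AnalogStandard("NTSC", 525, 60000 / 1001),
            AnalogStandard("PAL", 625, 50.0),
            AnalogStandard("SECAM", 625, 50.0)):
    print(f"{std.name}: {std.line_frequency_hz:.0f} lines/s, "
          f"{std.line_period_us:.1f} µs per scan line")
```

The resulting line periods (roughly 64 µs) put the 4.7 µs horizontal sync pulse mentioned above in context: it occupies only a small fraction of each scan line.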
