High-definition television
from Wikipedia

High-definition television (HDTV) describes a television or video system which provides a substantially higher image resolution than the previous generation of technologies. The term has been used since at least 1933;[1] in more recent times, it refers to the generation following standard-definition television (SDTV). It is the standard video format used in most broadcasts: terrestrial broadcast television, cable television, and satellite television.

Formats

HDTV may be transmitted in various formats:

  • 720p (1280 × 720p): 921,600 pixels
  • 1080i (1920 × 1080i) interlaced scan: 1,036,800 pixels (≈1.04 Mpx).
  • 1080p (1920 × 1080p) progressive scan: 2,073,600 pixels (≈2.07 Mpx).
    • Some countries also use a non-standard CTA[clarification needed] resolution, such as 1440 × 1080i: 777,600 pixels (≈0.78 Mpx) per field or 1,555,200 pixels (≈1.56 Mpx) per frame

When transmitted at two megapixels per frame, HDTV provides about five times as many pixels as SD (standard-definition television). The increased resolution provides for a clearer, more detailed picture. In addition, progressive scan and higher frame rates result in a picture with less flicker and better rendering of fast motion.[2] Modern HDTV began broadcasting in 1989 in Japan, under the MUSE/Hi-Vision analog system.[3] HDTV was widely adopted worldwide in the late 2000s.[4]
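The pixel counts above are simple products of the frame dimensions; a quick sketch (illustrative values only) confirms the rough five-to-one ratio over standard definition:

```python
# Pixel counts for the common HDTV frame sizes, compared with
# standard-definition frames (720 x 576 for 625-line systems,
# 720 x 480 for 525-line systems).
def pixels(width, height):
    """Total pixels in one full frame."""
    return width * height

hd_1080 = pixels(1920, 1080)   # 2,073,600 (~2.07 Mpx)
hd_720 = pixels(1280, 720)     # 921,600
sd_pal = pixels(720, 576)      # 414,720 (625-line SD)
sd_ntsc = pixels(720, 480)     # 345,600 (525-line SD)

# 1080-line HD carries roughly five to six times the pixels of SD:
print(round(hd_1080 / sd_pal, 1))   # 5.0
print(round(hd_1080 / sd_ntsc, 1))  # 6.0
```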

Standards

Map of digital terrestrial television broadcasting standards by country

All modern high-definition broadcasts utilize digital television standards. The major digital television broadcast standards used for terrestrial, cable, satellite, and mobile devices are:

  • DVB, originating in Europe and also used in much of Asia, Africa, and Australia
  • ATSC, used in much of North America
  • DTMB, used in China and some neighboring countries
  • ISDB, used in two incompatible variations in Japan and South America
  • DMB, used by mobile devices in South Korea

These standards use a variety of video codecs, some of which are also used for internet video.

History

The term high definition once described a series of television systems first announced in 1933[1] and launched starting in August 1936;[5] however, these systems were only high definition when compared to earlier systems that were based on mechanical systems with as few as 30 lines of resolution. The ongoing competition between companies and nations to create true HDTV spanned the entire 20th century, as each new system became higher definition than the last. In the early 21st century, this race has continued with 4K, 5K and 8K systems.

The British high-definition TV service started trials in August 1936 and a regular service on 2 November 1936 using both the (mechanical) Baird 240 line sequential scan (later referred to as progressive) and the (electronic) Marconi-EMI 405 line interlaced systems. The Baird system was discontinued in February 1937.[6] In 1938 France followed with its own 441-line system, variants of which were also used by a number of other countries. The US NTSC 525-line system joined in 1941. In 1949 France introduced an even higher-resolution standard at 819 lines, a system that would have been high definition even by modern standards, if it had not required such bandwidth for a color version, which prevented the addition of other TV channels (nevertheless, it remained in use until 1983). All of these systems used interlacing and a 4:3 aspect ratio except the 240-line system which was progressive (actually described at the time by the technically correct term sequential) and the 405-line system which started as 5:4 and later changed to 4:3. The 405-line system adopted the (at that time) revolutionary idea of interlaced scanning to overcome the flicker problem of the 240-line with its 25 Hz frame rate. The 240-line system could have doubled its frame rate but this would have meant that the transmitted signal would have doubled in bandwidth, an unacceptable option as the video baseband bandwidth was required to be not more than 3 MHz.

Color broadcasts started at similar line counts, first with the US NTSC color system in 1953, which was compatible with the earlier monochrome systems and therefore had the same 525 lines per frame. European standards did not follow until the 1960s, when the PAL and SECAM color systems were added to the monochrome 625-line broadcasts.

NHK (the Japan Broadcasting Corporation) began research to "unlock the fundamental mechanism of video and sound interactions with the five human senses" in 1964, after the Tokyo Olympics. NHK set out to create an HDTV system that scored much higher in subjective tests than NTSC, which had previously been dubbed HDTV. This new system, NHK Color, created in 1972, included 1125 lines, a 5:3 (1.67:1) aspect ratio and a 60 Hz refresh rate. The Society of Motion Picture and Television Engineers (SMPTE), headed by Charles Ginsburg, became the testing and study authority for HDTV technology in the international theater. SMPTE would test HDTV systems from different companies from every conceivable perspective, but the problem of combining the different formats plagued the technology for many years.

There were four major HDTV systems tested by SMPTE in the late 1970s, and in 1979 an SMPTE study group released A Study of High Definition Television Systems:

  • EIA monochrome: 4:3 aspect ratio, 1023 lines, 60 Hz
  • NHK color: 5:3 aspect ratio, 1125 lines, 60 Hz
  • NHK monochrome: 4:3 aspect ratio, 2125 lines, 50 Hz
  • BBC colour: 8:3 aspect ratio, 1501 lines, 60 Hz[7]

Since the formal adoption of Digital Video Broadcasting's (DVB) widescreen HDTV transmission modes in the mid-to-late 2000s, the 525-line NTSC (and PAL-M) systems, as well as the European 625-line PAL and SECAM systems, have been regarded as standard-definition television systems.

Analog systems

Early HDTV broadcasting used analog technology that was later converted to digital television with video compression.

In 1949, France started its transmissions with an 819 lines system (with 737 active lines). The system was monochrome only and was used only on VHF for the first French TV channel. It was discontinued in 1983.

In 1958, the Soviet Union developed Transformator (Russian: Трансформатор, meaning Transformer), the first high-resolution television system, capable of producing an image composed of 1,125 lines and aimed at providing teleconferencing for military command. It was a research project and the system was never deployed by either the military or consumer broadcasting.[8]

In 1986, the European Community proposed HD-MAC, an analog HDTV system with 1,152 lines. A public demonstration took place at the 1992 Summer Olympics in Barcelona. However, HD-MAC was scrapped in 1993 and the DVB project was formed, which would oversee development of a digital HDTV standard.[9]

Japan

In 1979, the Japanese public broadcaster NHK first developed consumer high-definition television with a 5:3 display aspect ratio.[10] The standard was known as Hi-Vision and used a system called MUSE (multiple sub-Nyquist sampling encoding) for encoding the signal. It required about twice the bandwidth of the existing NTSC system but provided about four times the resolution (1035i/1125 lines). In 1981, the MUSE system was demonstrated for the first time in the United States, using the same 5:3 aspect ratio as the Japanese system.[11] Upon visiting a demonstration of MUSE in Washington, US President Ronald Reagan was impressed and officially declared it "a matter of national interest" to introduce HDTV to the US.[12] NHK taped the 1984 Summer Olympics with a Hi-Vision camera, weighing 40 kg.[13]

Satellite test broadcasts started on June 4, 1989, the first daily high-definition programs in the world,[14] with regular testing starting on November 25, 1991, or "Hi-Vision Day", a date (11/25) chosen to echo the system's 1,125-line resolution.[15] Regular broadcasting on channel BS-9 commenced on November 25, 1994, featuring commercial and NHK programming.

Several systems were proposed as the new standard for the US, including the Japanese MUSE system, but all were rejected by the Federal Communications Commission (FCC) because of their higher bandwidth requirements. At this time, the number of television channels was growing rapidly and bandwidth was already a problem. A new standard had to be more efficient, needing less bandwidth for HDTV than the existing NTSC.

Decrease of analog HD systems

The limited standardization of analog HDTV in the 1990s did not lead to global HDTV adoption as technical and economic constraints at the time did not permit HDTV to use bandwidths greater than normal television. Early HDTV commercial experiments, such as NHK's MUSE, required over four times the bandwidth of a standard-definition broadcast. Despite efforts made to reduce analog HDTV to about twice the bandwidth of SDTV, these television formats were still distributable only by satellite. In Europe too, the HD-MAC standard was considered not technically viable.[16][17]

In addition, recording and reproducing an HDTV signal was a significant technical challenge in the early years of HDTV (Sony HDVS). Japan remained the only country with successful public broadcasting of analog HDTV, with seven broadcasters sharing a single channel.[citation needed]

However, the Hi-Vision/MUSE system also faced commercial problems when it launched on November 25, 1991. Only 2,000 HDTV sets had been sold by that day, far short of the optimistic estimate of 1.32 million. Hi-Vision sets were very expensive, up to US$30,000 each, which contributed to low consumer adoption.[18] A Hi-Vision VCR from NEC released at Christmas time retailed for US$115,000. In addition, the United States saw Hi-Vision/MUSE as an outdated system and had already made it clear that it would develop an all-digital system.[19] Experts considered the commercial Hi-Vision system of 1992 already eclipsed by the digital technology under development in the U.S. since 1990, an American victory over Japan in the contest for technological dominance.[20] By mid-1993, receiver prices were still as high as 1.5 million yen (US$15,000).[21]

On February 23, 1994, a top broadcasting administrator in Japan admitted the failure of its analog-based HDTV system, saying the U.S. digital format was more likely to become a worldwide standard.[22] The announcement drew angry protests from broadcasters and electronics companies that had invested heavily in the analog system, and he retracted his statement the next day, saying that the government would continue to promote Hi-Vision/MUSE.[23] That year, NHK started development of digital television in an attempt to catch up with America and Europe, which resulted in the ISDB format.[24] Japan started digital satellite and HDTV broadcasting in December 2000.[13]

Rise of digital compression

High-definition digital television was not possible with uncompressed video, which requires a bandwidth exceeding 1 Gbit/s for studio-quality HD digital video.[25][26] Digital HDTV was made possible by the development of discrete cosine transform (DCT) video compression.[27][25] DCT coding is a lossy image compression technique that was first proposed by Nasir Ahmed in 1972,[28] and was later adapted into a motion-compensated DCT algorithm for video coding standards such as the H.26x formats from 1988 onwards and the MPEG formats from 1993 onwards.[29][30] Motion-compensated DCT compression significantly reduces the amount of bandwidth required for a digital TV signal.[25][31] By 1991, it had achieved data compression ratios from 8:1 to 14:1 for near-studio-quality HDTV transmission, down to 70–140 Mbit/s.[25] Between 1988 and 1991, DCT video compression was widely adopted as the video coding standard for HDTV implementations, enabling the development of practical digital HDTV.[25][27][32] Dynamic random-access memory (DRAM) was also adopted as framebuffer memory, with the DRAM industry's growing manufacturing capacity and falling prices proving important to the commercialization of HDTV.[32]
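The bandwidth figures in this paragraph can be sanity-checked with back-of-the-envelope arithmetic. The sketch below assumes an 8-bit 4:2:2 studio signal at 30 frames per second (16 bits per pixel on average), which is one common convention rather than a definitive parameter set:

```python
# Rough uncompressed bitrate for a 1920 x 1080, 30 frame/s studio feed,
# assuming 4:2:2 chroma subsampling at 8 bits per sample
# (8 bits luma + chroma averaging 8 bits = 16 bits per pixel).
BITS_PER_PIXEL = 16
uncompressed_bps = 1920 * 1080 * 30 * BITS_PER_PIXEL
print(uncompressed_bps / 1e9)  # ~0.995, i.e. on the order of 1 Gbit/s

# Applying the 8:1 to 14:1 DCT compression ratios cited above lands
# in the 70-140 Mbit/s range reported for 1991-era systems.
print(round(uncompressed_bps / 8 / 1e6))   # ~124 Mbit/s
print(round(uncompressed_bps / 14 / 1e6))  # ~71 Mbit/s
```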

Since 1972, the International Telecommunication Union's radiocommunication sector (ITU-R) had been working on a global recommendation for analog HDTV. These recommendations, however, did not fit in the broadcasting bands that could reach home users. The standardization of MPEG-1 in 1993 led to the acceptance of Recommendation ITU-R BT.709.[33] In anticipation of these standards, the DVB organization was formed, an alliance of broadcasters, consumer electronics manufacturers and regulatory bodies. The DVB develops and agrees upon specifications which are formally standardised by ETSI.[34]

DVB created first the standard for DVB-S digital satellite TV, DVB-C digital cable TV and DVB-T digital terrestrial TV. These broadcasting systems can be used for both SDTV and HDTV. In the US the Grand Alliance proposed ATSC as the new standard for SDTV and HDTV. Both ATSC and DVB were based on the MPEG-2 standard, although DVB systems may also be used to transmit video using the newer and more efficient H.264/MPEG-4 AVC compression standards. Common for all DVB standards is the use of highly efficient modulation techniques for further reducing bandwidth, and foremost for reducing receiver-hardware and antenna requirements.[citation needed]

In 1983, the International Telecommunication Union's radiocommunication sector (ITU-R) set up a working party (IWP11/6) with the aim of setting a single international HDTV standard. One of the thornier issues concerned a suitable frame/field refresh rate, the world having already split into two camps, 25/50 Hz and 30/60 Hz, largely because of differences in mains frequency. The IWP11/6 working party considered many views and throughout the 1980s served to encourage development in a number of digital video processing areas, not least conversion between the two main frame/field rates using motion vectors, which led to further developments in other areas. While a comprehensive HDTV standard was not in the end established, agreement on the aspect ratio was achieved.[citation needed]

Initially the existing 5:3 aspect ratio had been the main candidate but, due to the influence of widescreen cinema, the aspect ratio 16:9 (1.78) eventually emerged as being a reasonable compromise between 5:3 (1.67) and the common 1.85 widescreen cinema format. An aspect ratio of 16:9 was duly agreed upon at the first meeting of the IWP11/6 working party at the BBC's Research and Development establishment in Kingswood Warren. The resulting ITU-R Recommendation ITU-R BT.709-2 ("Rec. 709") includes the 16:9 aspect ratio, a specified colorimetry, and the scan modes 1080i (1,080 actively interlaced lines of resolution) and 1080p (1,080 progressively scanned lines). The British Freeview HD trials used MBAFF, which contains both progressive and interlaced content in the same encoding.[citation needed]

It also includes the alternative 1440 × 1152 HD-MAC scan format. (According to some reports, a mooted 750-line (720p) format of 720 progressively scanned lines was viewed by some at the ITU as an enhanced television format rather than a true HDTV format,[35] and so was not included, although 1920 × 1080i and 1280 × 720p systems for a range of frame and field rates were defined by several US SMPTE standards.)[citation needed]

Inaugural HDTV broadcast in the United States

HDTV technology was introduced in the United States in the early 1990s and made official in 1993 by the Digital HDTV Grand Alliance, a group of television, electronic equipment, and communications companies consisting of AT&T Bell Labs, General Instrument, Philips, Sarnoff, Thomson, Zenith and the Massachusetts Institute of Technology. Field testing of HDTV at 199 sites in the United States was completed on August 14, 1994.[36]

The first public HDTV broadcast in the United States occurred on July 23, 1996, when the Raleigh, North Carolina television station WRAL-HD began broadcasting from the existing tower of WRAL-TV southeast of Raleigh, narrowly beating the HD Model Station in Washington, D.C., which began broadcasting on July 31, 1996, under the callsign WHD-TV from the facilities of the NBC owned-and-operated station WRC-TV.[37][38][39]

On April 3, 1997, the FCC allocated 6 MHz of spectrum to every broadcaster for digital programming. All network-affiliated broadcasters were required to transmit digital broadcasts in the top ten markets by May 1999, reaching nearly 30 percent of American households. By November 1999, the network affiliates were expected to transmit in 20 additional markets, bringing DTV availability to nearly 50 percent of American households. All other commercial stations were expected to start transmitting DTV by May 2002, and non-commercial stations to begin digital transmission by May 2003.[40]

The American Advanced Television Systems Committee (ATSC) HDTV system had its public launch on October 29, 1998, during the live coverage of astronaut John Glenn's return mission to space on board the Space Shuttle Discovery.[41] The signal was transmitted coast-to-coast, and was seen by the public in science centers, and other public theaters specially equipped to receive and display the broadcast.[41][42]

European HDTV broadcasts

Between 1988 and 1991, several European organizations were working on discrete cosine transform (DCT) based digital video coding standards for both SDTV and HDTV. The EU 256 project by the CMTT and ETSI, along with research by Italian broadcaster RAI, developed a DCT video codec that enabled near-studio-quality HDTV transmission at about 70–140 Mbit/s.[25][43] The first HDTV transmissions in Europe, albeit not direct-to-home, began in 1990, when RAI broadcast the 1990 FIFA World Cup using several experimental HDTV technologies, including the digital DCT-based EU 256 codec,[25] the mixed analog-digital HD-MAC technology, and the analog MUSE technology. The matches were shown in 8 cinemas in Italy, where the tournament was played, and 2 in Spain. The connection with Spain was made via the Olympus satellite link from Rome to Barcelona and then with a fiber optic connection from Barcelona to Madrid.[44][45] After some HDTV transmissions in Europe, the standard was abandoned in 1993, to be replaced by a digital format from DVB.[46]

The first regular broadcasts began on January 1, 2004, when the Belgian company Euro1080 launched the HD1 channel with the traditional Vienna New Year's Concert. Test transmissions had been active since the IBC exhibition in September 2003, but the New Year's Day broadcast marked the official launch of the HD1 channel, and the official start of direct-to-home HDTV in Europe.[47]

Euro1080, a division of the now-defunct Belgian TV services company Alfacam, broadcast HDTV channels to break the pan-European stalemate of "no HD broadcasts mean no HD TVs bought means no HD broadcasts ..." and kick-start HDTV interest in Europe.[48] The HD1 channel was initially free-to-air and mainly comprised sporting, dramatic, musical and other cultural events broadcast with a multi-lingual soundtrack on a rolling schedule of four or five hours per day.[citation needed]

These first European HDTV broadcasts used the 1080i format with MPEG-2 compression on a DVB-S signal from SES's Astra 1H satellite. Euro1080 transmissions later changed to MPEG-4/AVC compression on a DVB-S2 signal in line with subsequent broadcast channels in Europe.[citation needed]

Despite delays in some countries,[49] the number of European HD channels and viewers has risen steadily since the first HDTV broadcasts, with SES's annual Satellite Monitor market survey for 2010 reporting more than 200 commercial channels broadcasting in HD from Astra satellites, 185 million HD capable TVs sold in Europe (£60 million in 2010 alone), and 20 million households (27% of all European digital satellite TV homes) watching HD satellite broadcasts (16 million via Astra satellites).[50]

In December 2009, the United Kingdom became the first European country to deploy high-definition content using the new DVB-T2 transmission standard, as specified in the Digital TV Group (DTG) D-book, on digital terrestrial television.[51]

The Freeview HD service contains 13 HD channels (as of April 2016) and was rolled out region by region across the UK in accordance with the digital switchover process, finally being completed in October 2012. However, Freeview HD was not the first HDTV service over digital terrestrial television in Europe; Italy's RAI started broadcasting in 1080i on April 24, 2008, using the DVB-T transmission standard.[citation needed]

In October 2008, France deployed five high-definition channels using the DVB-T transmission standard on digital terrestrial distribution.[52]

Notation

HDTV broadcast systems are identified with three major parameters:

  • Frame size in pixels is defined as number of horizontal pixels × number of vertical pixels, for example 1280 × 720 or 1920 × 1080. Often the number of horizontal pixels is implied from context and is omitted, as in the case of 720p and 1080p.
  • Scanning system is identified with the letter p for progressive scanning or i for interlaced scanning.
  • Frame rate is identified as number of video frames per second. For interlaced systems, the number of frames per second should be specified, but it is not uncommon to see the field rate incorrectly used instead.

If all three parameters are used, they are specified in the following form: [frame size][scanning system][frame or field rate] or [frame size]/[frame or field rate][scanning system].[53] Often, frame size or frame rate can be dropped if its value is implied from context. In this case, the remaining numeric parameter is specified first, followed by the scanning system.[citation needed]

For example, 1920×1080p25 identifies progressive scanning format with 25 frames per second, each frame being 1,920 pixels wide and 1,080 pixels high. The 1080i25 or 1080i50 notation identifies interlaced scanning format with 25 frames (50 fields) per second, each frame being 1,920 pixels wide and 1,080 pixels high. The 1080i30 or 1080i60 notation identifies interlaced scanning format with 30 frames (60 fields) per second, each frame being 1,920 pixels wide and 1,080 pixels high. The 720p60 notation identifies progressive scanning format with 60 frames per second, each frame being 720 pixels high; 1,280 pixels horizontally are implied.[citation needed]
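The notation rules above can be captured in a few lines of code. This is purely an illustrative parser, not part of any broadcast standard, and the function name is made up for the example:

```python
import re

# Parse HDTV notation of the form [height][scan][rate], e.g. "1080i25"
# or "720p60". The frame size may also appear as WxH, e.g. "1920x1080p25".
# Illustrative only; real-world notation has more variants.
PATTERN = re.compile(r"^(?:(\d+)[x\u00d7])?(\d+)([pi])(\d+(?:\.\d+)?)?$")

def parse_notation(text):
    m = PATTERN.match(text)
    if not m:
        raise ValueError(f"not HDTV notation: {text!r}")
    width, height, scan, rate = m.groups()
    return {
        "width": int(width) if width else None,  # often implied from context
        "height": int(height),
        "scan": "progressive" if scan == "p" else "interlaced",
        "rate": float(rate) if rate else None,   # frame or field rate
    }

print(parse_notation("1080i25"))
print(parse_notation("1920x1080p25"))
print(parse_notation("720p60"))
```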

Systems using 50 Hz support three scanning rates: 50i, 25p and 50p, while 60 Hz systems support a much wider set of frame rates: 59.94i, 60i, 23.976p, 24p, 29.97p, 30p, 59.94p and 60p. In the days of standard-definition television, the fractional rates were often rounded up to whole numbers, e.g. 23.976p was often called 24p, and 59.94i was often called 60i. Because 60 Hz high-definition television supports both the fractional and the slightly different integer rates, strict use of the notation is required to avoid ambiguity. Nevertheless, 29.97p/59.94i is almost universally called 60i, and likewise 23.976p is called 24p.[citation needed]
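The fractional rates in 60 Hz systems are not arbitrary: each is the corresponding integer rate divided by 1.001, a legacy of NTSC color timing. A small sketch makes the correspondence explicit:

```python
from fractions import Fraction

# NTSC-legacy "fractional" rates are exact rationals:
# integer_rate / 1.001, i.e. integer_rate * 1000/1001.
def fractional(integer_rate):
    return Fraction(integer_rate * 1000, 1001)

for nominal in (24, 30, 60):
    exact = fractional(nominal)
    print(nominal, exact, round(float(exact), 3))
# 24 -> 24000/1001 ~ 23.976
# 30 -> 30000/1001 ~ 29.970
# 60 -> 60000/1001 ~ 59.940
```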

For the commercial naming of a product, the frame rate is often dropped and is implied from context (e.g., a 1080i television set). A frame rate can also be specified without a resolution. For example, 24p means 24 progressive scan frames per second, and 50i means 25 interlaced frames per second.[54]

There is no single standard for HDTV color support. Colors are typically broadcast using a YUV color space (10 bits per channel) but, depending on the underlying image-generating technology of the receiver, are subsequently converted to an RGB color space using standardized algorithms. When transmitted directly over the Internet, the colors are typically pre-converted to 8-bit RGB channels for additional storage savings, on the assumption that the material will be viewed only on an sRGB computer screen. As an added benefit to the original broadcasters, the losses of the pre-conversion essentially make these files unsuitable for professional TV re-broadcasting.[citation needed]

Most HDTV systems support resolutions and frame rates defined either in ATSC table 3 or in the EBU specification. The most common are noted below.[citation needed]

Display resolutions

Each video format below is listed with the resolutions commonly used to display it, the total pixel count, and the image and pixel aspect ratios (W:H).

Native display resolutions:

  • 720p (HD ready), broadcast format 1280 × 720:
    • 1024 × 768 (XGA): 786,432 pixels (0.8 Mpx); image aspect ratio 4:3 (1.33:1); pixel aspect ratio 1:1 (1.00:1). Typically a PC resolution (XGA); also a native resolution on many entry-level plasma displays with non-square pixels.
    • 1280 × 720: 921,600 pixels (0.9 Mpx); image 16:9 (1.78:1); pixel 1:1. Standard HDTV resolution and a typical PC resolution (WXGA), frequently used by high-end video projectors; also used for 750-line video, as defined in SMPTE 296M, ATSC A/53, ITU-R BT.1543.
    • 1366 × 768 (WXGA): 1,049,088 pixels (1.0 Mpx); image 683:384 (approx. 16:9); pixel 1:1. A typical PC resolution (WXGA); also used by many HD ready TV displays based on LCD technology.
  • 1080p / 1080i (Full HD), broadcast format 1920 × 1080:
    • 1920 × 1080: 2,073,600 pixels (2.1 Mpx); image 16:9; pixel 1:1. Standard HDTV resolution, used by Full HD and HD ready 1080p TV displays such as high-end LCD, plasma and rear-projection TVs, and a typical PC resolution (lower than WUXGA); also used for 1125-line video, as defined in SMPTE 274M, ATSC A/53, ITU-R BT.709.

Screen resolutions used within the broadcast formats:

  • 720p (HD ready), broadcast format 1280 × 720:
    • 1248 × 702 (clean aperture): 876,096 pixels (0.9 Mpx); image 16:9; pixel 1:1. Used for 750-line video with faster artifact/overscan compensation, as defined in SMPTE 296M.
  • 1080i (Full HD), broadcast format 1920 × 1080:
    • 1440 × 1080 (HDCAM/HDV): 1,555,200 pixels (1.6 Mpx); image 16:9; pixel 4:3. Used for anamorphic 1125-line video in the HDCAM and HDV formats introduced by Sony and defined (also as a luminance subsampling matrix) in SMPTE D11.
  • 1080p (Full HD), broadcast format 1920 × 1080:
    • 1888 × 1062 (clean aperture): 2,005,056 pixels (2.0 Mpx); image 16:9; pixel 1:1. Used for 1125-line video with faster artifact/overscan compensation, as defined in SMPTE 274M.

At a minimum, HDTV has twice the linear resolution of standard-definition television (SDTV), thus showing greater detail than either analog television or regular DVD. The technical standards for broadcasting HDTV also handle the 16:9 aspect ratio images without using letterboxing or anamorphic stretching, thus increasing the effective image resolution.

A very high-resolution source may require more bandwidth than available in order to be transmitted without loss of fidelity. The lossy compression that is used in all digital HDTV storage and transmission systems will distort the received picture when compared to the uncompressed source.

Standard frame or field rates

ATSC and DVB define the following frame rates for use with the various broadcast standards:[55][56]

  • 23.976 Hz (film-looking frame rate compatible with NTSC clock speed standards)
  • 24 Hz (international film and ATSC high-definition material)
  • 25 Hz (PAL film, DVB standard-definition and high-definition material)
  • 29.97 Hz (NTSC film and standard-definition material)
  • 30 Hz (NTSC film, ATSC high-definition material)
  • 50 Hz (DVB high-definition material)
  • 59.94 Hz (ATSC high-definition material)
  • 60 Hz (ATSC high-definition material)

The optimum format for a broadcast depends upon the type of videographic recording medium used and the image's characteristics. For best fidelity to the source, the transmitted field ratio, lines, and frame rate should match those of the source.

PAL, SECAM and NTSC frame rates technically apply only to analog standard-definition television, not to digital or high definition broadcasts. However, with the rollout of digital broadcasting, and later HDTV broadcasting, countries retained their heritage systems. HDTV in former PAL and SECAM countries operates at a frame rate of 25/50 Hz, while HDTV in former NTSC countries operates at 30/60 Hz.[57]

Types of media

High-definition image sources include terrestrial broadcast, direct broadcast satellite, digital cable, IPTV, Blu-ray video disc (BD), and internet downloads.

In the US, residents within the line of sight of television station broadcast antennas can receive free, over-the-air programming with a television set with an ATSC tuner via a TV aerial. Laws prohibit homeowners' associations and city governments from banning the installation of antennas.[citation needed]

Standard 35mm photographic film used for cinema projection has a much higher image resolution than HDTV systems, and is exposed and projected at a rate of 24 frames per second (frame/s). To be shown on standard television, in PAL-system countries, cinema film is scanned at the TV rate of 25 frame/s, causing a speedup of 4.1 percent, which is generally considered acceptable. In NTSC-system countries, the TV scan rate of 30 frame/s would cause a perceptible speedup if the same were attempted, and the necessary correction is performed by a technique called 3:2 pulldown: Over each successive pair of film frames, one is held for three video fields (1/20 of a second) and the next is held for two video fields (1/30 of a second), giving a total time for the two frames of 1/12 of a second and thus achieving the correct average film frame rate.
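The 3:2 pulldown described above can be sketched in code; the field arithmetic shows how 24 film frames per second map exactly onto 60 video fields per second:

```python
# 3:2 pulldown: alternate film frames are held for 3 and 2 video fields.
# At 60 fields/s, each field lasts 1/60 s, so a 3-field frame lasts 1/20 s
# and a 2-field frame 1/30 s; a pair of frames therefore spans 1/12 s,
# preserving the 24 frame/s average rate.
from fractions import Fraction

FIELD = Fraction(1, 60)  # duration of one video field, in seconds

def pulldown_fields(n_film_frames):
    """Video fields consumed by n film frames under the 3,2,3,2,... pattern."""
    return sum(3 if i % 2 == 0 else 2 for i in range(n_film_frames))

# One second of film (24 frames) fills exactly one second of video (60 fields):
fields = pulldown_fields(24)
print(fields)          # 60
print(fields * FIELD)  # 1
```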

Non-cinematic HDTV video recordings intended for broadcast are typically recorded in either 720p or 1080i format, as determined by the broadcaster. 720p is commonly used for Internet distribution of high-definition video, because most computer monitors operate in progressive-scan mode. 720p also imposes less strenuous storage and decoding requirements than either 1080i or 1080p. 1080p/24, 1080i/30, 1080i/25, and 720p/30 are most often used on Blu-ray Disc.

Recording and compression

HDTV can be recorded to D-VHS (Digital-VHS or Data-VHS), to W-VHS (analog only), to an HDTV-capable digital video recorder (for example, DirecTV's high-definition DVR, Sky HD's set-top box, Dish Network's ViP 622 or ViP 722 receivers, which provide HD on the primary TV and SD on a secondary TV without a second box, or TiVo's Series 3 or HD recorders), or to an HDTV-ready HTPC. Some cable boxes can receive or record two or more broadcasts at a time in HDTV format, and HDTV programming, some included in the monthly cable subscription price and some for an additional fee, can be played back with the cable company's on-demand feature.[citation needed]

The massive amount of data storage required to archive uncompressed streams meant that inexpensive uncompressed storage options were not available to the consumer. In 2008, the Hauppauge 1212 Personal Video Recorder was introduced. This device accepts HD content through component video inputs and stores the content in MPEG-2 format in a .ts file or in a Blu-ray-compatible format .m2ts file on the hard drive or DVD burner of a computer connected to the PVR through a USB 2.0 interface. More recent systems are able to record a broadcast high definition program in its 'as broadcast' format or transcode to a format more compatible with Blu-ray.[citation needed]

Analog tape recorders with bandwidth capable of recording analog HD signals, such as W-VHS recorders, are no longer produced for the consumer market and are both expensive and scarce in the secondary market.[citation needed]

Set-top box tuners with external storage support may be able to record the digital video stream directly, including additional data such as teletext. The set-top box need not support the audio and video encoding used; the recorded file will be playable on any device that does. However, encrypted content requires a receiver with the appropriate CAS setup.[citation needed]

In the United States, as part of the FCC's plug-and-play agreement, cable companies are required to provide customers who rent HD set-top boxes with a set-top box with "functional" FireWire (IEEE 1394) on request. None of the direct-broadcast satellite providers has offered this feature on any of its supported boxes, though some cable TV companies have; as of July 2004, satellite boxes were not included in the FCC mandate. This content is protected by an encryption scheme known as 5C.[58] The encryption can prevent duplication of content or limit the number of copies permitted, effectively denying most if not all fair use of the content.[citation needed]

from Grokipedia
High-definition television (HDTV) is a television broadcasting system that provides image resolutions approximately twice that of conventional standard-definition television in both the horizontal and vertical dimensions, typically featuring a 16:9 aspect ratio for a wider, more immersive viewing experience. This enhanced resolution, ranging from 720p (1280×720 pixels) to 1080p (1920×1080 pixels) in progressive scan formats, delivers sharper images with greater detail and color fidelity than earlier analog systems limited to about 480 visible lines. HDTV standards support multiple frame rates, such as 24, 30, or 60 frames per second, enabling compatibility with both film and video content, while progressive scanning reduces artifacts like flicker.

The development of HDTV originated in Japan, where the Japan Broadcasting Corporation (NHK) initiated research in 1964 to create a high-resolution system that would provide a stronger sense of visual reality, beginning with psychophysical experiments on human perception of image quality. By 1974, NHK had produced the first prototype HDTV system capable of displaying 1,125 scanning lines, far surpassing the roughly 480 visible lines of standard television. Key advancements continued through the 1970s and early 1980s, including the production of specialized equipment such as high-resolution cameras, displays, and video tape recorders by 1982, and the introduction of bandwidth-reduction technologies such as the analog Multiple Sub-Nyquist Sampling Encoding (MUSE) system in 1983. These efforts led to a provisional Japanese standard of 1,125 lines, a 60 Hz field rate, 2:1 interlacing, and a 5:3 aspect ratio. The world's first HDTV satellite broadcast occurred in June 1989, marking the system's practical debut and influencing global standards.

In the United States, HDTV development accelerated in the late 1980s amid concerns over international competition, prompting the Federal Communications Commission (FCC) to form the Advisory Committee on Advanced Television Service (ACATS) in 1987 to evaluate proposals.
After demonstrations of NHK's analog HDTV system in the United States, the focus shifted in the early 1990s to digital transmission for efficiency, culminating in the Grand Alliance's 1993 proposal that combined technologies from multiple companies. The FCC adopted the digital ATSC (Advanced Television Systems Committee) standard in December 1996, which supported HDTV alongside standard definition and used 8-VSB modulation for over-the-air broadcasting. Commercial HDTV sets became available in the U.S. in 1998 from major manufacturers, initially priced over $7,000, with the full transition from analog to digital broadcasting mandated by 2009.

HDTV's global standardization was formalized by the International Telecommunication Union (ITU) in 2000 through Recommendation BT.709, defining parameters for high-definition studio production that aligned with ATSC and other regional systems such as Europe's DVB and Japan's ISDB. By the early 2010s, HDTV had become the dominant broadcast format in the United States and many other countries, paving the way for further advancements like 4K ultra-high-definition television, while legacy support for standard definition ensured continuity during the transition.

Overview

Definition and key characteristics

High-definition television (HDTV) is a television system that provides image resolutions of at least 720p (1280 × 720 pixels, progressive scan) or 1080i (1920 × 1080 pixels, interlaced scan), resulting in pixel counts ranging from approximately 0.92 million to over 2 million per frame, significantly higher than standard-definition television (SDTV) formats such as 480i (about 0.35 million pixels) or 576i (about 0.41 million pixels). This increase in resolution allows for substantially greater image detail and fidelity compared to earlier analog systems. Key characteristics of HDTV include sharper imagery due to the elevated pixel count, enhanced motion detail from higher line counts, a wider aspect ratio (typically 16:9) to better match cinematic formats, and support for both progressive scanning (which refreshes the entire frame at once for smoother playback) and interlaced scanning (which alternates lines for compatibility with legacy systems). These features collectively enable a more immersive viewing experience by expanding the horizontal viewing angle to around 30 degrees, compared to about 11 degrees for conventional television.

Perceptually, HDTV delivers benefits such as reduced visibility of scan lines, greater clarity in fine detail, and improved color reproduction, which minimize artifacts like flicker present in SDTV. For instance, in sports broadcasts the higher resolution captures fluid motion with less blurring, allowing viewers to discern player movements and ball trajectories more precisely; in movies, the 16:9 aspect ratio preserves the original composition without cropping or letterboxing; and in news programming, it enhances text legibility and facial detail for a more engaging and informative presentation. Higher resolutions like 4K ultra-high definition extend these HDTV principles by further boosting pixel density for even greater sharpness on larger screens.

Comparison to standard-definition television

High-definition television (HDTV) offers significantly greater image detail than standard-definition television (SDTV), primarily due to its higher resolution. For instance, the common 720p HDTV format uses 1280 × 720 pixels, totaling approximately 921,600 pixels per frame, while SDTV's 480i format employs 720 × 480 pixels, about 345,600 pixels: HDTV thus carries roughly 2.7 times as many pixels, yielding enhanced sharpness and finer textures. Higher HDTV resolutions, such as 1080p at 1920 × 1080 pixels, provide up to 6 times more pixels, though the effective gain is closer to 2-4 times when interlaced delivery is taken into account, enabling clearer rendering of small details like text or distant objects that appear blurred or blocky in SDTV.

In terms of bandwidth requirements for broadcast transmission, HDTV demands substantially more throughput than SDTV, increasing operational costs for providers. MPEG-2 compressed HDTV streams typically require 15-22 Mbps to maintain high quality, compared with 4-8 Mbps for SDTV, reflecting the larger bitrate needed to support HDTV's increased pixel count without excessive compression artifacts. This disparity affects satellite, cable, and over-the-air distribution, where HDTV channels often consume 3-5 times the bandwidth of SDTV equivalents, necessitating more efficient compression or dedicated spectrum allocation to avoid signal degradation.

Infrastructure for HDTV deployment contrasts sharply with SDTV's legacy compatibility, requiring specialized hardware that drives higher upfront investment. HDTV systems rely on displays optimized for the 16:9 aspect ratio and HD-capable receivers or set-top boxes to decode and render the signal, whereas SDTV integrates seamlessly with traditional 4:3 cathode-ray tube (CRT) televisions and analog tuners without additional equipment. The shift to wider screens accommodates cinematic content more naturally but renders older 4:3 devices incompatible without letterboxing or cropping, complicating transitions in homes and broadcast facilities.

From a viewer perspective, HDTV delivers a markedly superior experience by minimizing common SDTV limitations such as visible scan lines, pixelation during fast-motion scenes, and lower overall clarity on larger screens. The increased resolution reduces motion artifacts like blurring or jagged edges in dynamic content such as sports or action sequences, while providing a greater sense of depth and realism that is less fatiguing over extended viewing. In contrast, SDTV's lower pixel density often results in softer images and noticeable compression noise, particularly when upscaled on modern displays, underscoring HDTV's role in elevating perceived quality for entertainment and information consumption.
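The pixel-count comparison above is simple arithmetic; a short sketch (illustrative only, with the format names taken from the figures quoted in this section) makes the ratios explicit:

```python
# Pixel counts for the formats quoted above; ratios relative to 480i SDTV.
formats = {
    "480i (SDTV)": (720, 480),
    "720p": (1280, 720),
    "1080p": (1920, 1080),
}

pixels = {name: w * h for name, (w, h) in formats.items()}
sd = pixels["480i (SDTV)"]

for name, count in pixels.items():
    # e.g. 720p: 921,600 pixels (2.7x SDTV); 1080p: 2,073,600 (6.0x)
    print(f"{name}: {count:,} pixels ({count / sd:.1f}x SDTV)")
```

Running this reproduces the figures in the text: 921,600 pixels for 720p (about 2.7 times SDTV's 345,600) and 2,073,600 for 1080p (6 times).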

Technical Specifications

Resolution and aspect ratios

High-definition television (HDTV) is characterized by resolutions that significantly exceed those of standard-definition television, providing sharper images through increased pixel counts. The most common HDTV resolutions include 720p, defined as 1280 horizontal pixels by 720 vertical pixels in progressive format according to SMPTE ST 296:2012, which establishes the active image area and sampling structure for this format. Similarly, 1080i uses 1920 × 1080 pixels in interlaced scan, while 1080p employs the same pixel dimensions in progressive scan, both adhering to SMPTE ST 274:2008 for the active format with square pixels. These resolutions deliver approximately 0.92 million pixels for 720p and 2.07 million pixels for the 1080 formats, enabling finer detail rendition than standard definition's roughly 0.35-0.41 million pixels.

The predominant aspect ratio in HDTV is 16:9 widescreen, as specified in ITU-R Recommendation BT.709 for parameter values in high-definition production and emission standards, in contrast with the legacy 4:3 ratio used by earlier television systems. The 16:9 ratio, equivalent to 1.78:1, aligns closely with modern cinematic formats, while 4:3 (1.33:1) content requires adaptation when displayed on HDTV screens. For instance, converting 4:3 content for a 16:9 display involves pillarboxing, in which black bars are added to the sides: the image height scales to match the display's 9 units, yielding a width of (4/3) × 9 = 12 units against the display's 16 units, leaving 2 units (12.5% of the width) of pillarboxing per side. Conversely, displaying 16:9 content on a 4:3 display uses letterboxing: scaling the width to 4 units produces a height of (9/16) × 4 = 2.25 units against 3 units, with 0.375 units (12.5% of the height) of letterboxing top and bottom.

Higher horizontal pixel counts in HDTV, such as the 1920 pixels of the 1080 formats versus 720 in standard definition, allow greater detail across the wider 16:9 frame, enhancing cinematic immersion for content in aspect ratios like 1.85:1 or 2.39:1, which require minimal letterboxing on HDTV displays. This increased horizontal resolution captures more of the expansive compositions typical of theatrical releases, providing a viewing experience that approximates cinema without excessive black bars. Variations such as 1440 × 1080 exist for specific applications, including the HDV format, in which rectangular pixels with a 1.333:1 pixel aspect ratio are used to achieve an effective 16:9 display after scaling, commonly in consumer camcorders for efficient encoding. Higher frame rates can further enhance perceived resolution during motion, complementing these static pixel structures.
| Resolution | Pixel Dimensions | Scan Type | Aspect Ratio | Standard Reference |
|---|---|---|---|---|
| 720p | 1280 × 720 | Progressive | 16:9 | SMPTE ST 296:2012 |
| 1080i | 1920 × 1080 | Interlaced | 16:9 | SMPTE ST 274:2008, ITU-R BT.709 |
| 1080p | 1920 × 1080 | Progressive | 16:9 | SMPTE ST 274:2008, ITU-R BT.709 |
| 1440×1080 | 1440 × 1080 | Varies | Effective 16:9 (with 1.333:1 PAR) | HDV format specification |

Frame rates and scanning methods

High-definition television (HDTV) employs various frame rates to capture and display motion, tailored to content origins and regional broadcast standards. The standard frame rate for cinematic film content is 24 progressive frames per second (24p), which preserves the original motion cadence of movies without introducing artifacts during playback. In regions historically aligned with NTSC standards, such as North America, common rates include 30 progressive frames per second (30p) or 60 interlaced fields per second (60i), where 60i effectively delivers 30 frames per second by alternating odd and even scan lines. For smoother motion in dynamic scenes, 60 progressive frames per second (60p) is used, particularly in formats like 720p60. In PAL regions, such as Europe, frame rates are based on a 50 Hz system, with 25 progressive frames per second (25p) or 50 interlaced fields per second (50i) as standards, the latter corresponding to 25 frames per second. The 50p rate provides enhanced motion fluidity at 50 progressive frames per second. These rates align with local power-line frequencies and legacy analog systems, ensuring compatibility while supporting HDTV resolutions like 1080i50 or 720p50.

Scanning methods in HDTV determine how images are refreshed on displays. Progressive scanning (denoted "p") renders the entire frame in each refresh cycle, delivering full vertical resolution; this minimizes motion artifacts and suits computer monitors and modern flat-panel TVs, as in 720p60 and other progressive formats. Interlaced scanning (denoted "i") displays alternate fields, odd lines in one field and even lines in the next, to achieve the nominal vertical resolution, effectively doubling the refresh rate while halving bandwidth requirements compared to progressive scanning at the same frame rate. A key advantage of interlaced scanning is reduced flicker in broadcast environments, though it can introduce combing artifacts during fast motion if not properly de-interlaced.

In interlaced systems, the field rate is twice the frame rate for a given notation:

Field rate = 2 × Frame rate (interlaced)

For instance, the 1080i60 format transmits 60 fields per second, comprising 30 complete frames per second, as each frame is split into two fields. This structure allowed efficient use of limited bandwidth in early digital HDTV transmissions, such as the 19.39 Mbps payload of ATSC 8-VSB modulation, by carrying only half a frame of data per refresh, unlike 1080p60, which requires a full frame per refresh. Similarly, 1080i50 yields 50 fields per second, or 25 frames per second, in 50 Hz systems. Regional differences between 50 Hz and 60 Hz standards can lead to judder (perceived stuttering in motion) when converting content across systems, for example adapting 24p film to 50 Hz broadcast via 2:2 pulldown with a 4% speed-up, unlike the 3:2 pulldown used for 60 Hz systems. This mismatch arises from the fundamental 50/60 Hz divergence rooted in regional power grids, affecting international content distribution and requiring advanced conversion algorithms to mitigate visual inconsistencies.
| Region | Common Frame Rates | Example Formats | Notes |
|---|---|---|---|
| NTSC (60 Hz) | 24p, 30p/60i, 60p | 1080i60, 720p60 | Uses 59.94 Hz variants for legacy compatibility; interlacing saves ~50% bandwidth. |
| PAL (50 Hz) | 25p/50i, 50p | 1080i50, 720p50 | Aligned to the 50 Hz grid; progressive preferred for new content. |
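The field/frame relationship and the 3:2 pulldown mapping described above can be sketched in a few lines (the function names are illustrative, not from any standard):

```python
def fields_per_second(frame_rate: float, interlaced: bool) -> float:
    # For interlaced formats the field rate is twice the frame rate;
    # progressive formats refresh one full frame per cycle.
    return 2 * frame_rate if interlaced else frame_rate

# 1080i60 -> 30 frames/s carried as 60 fields/s; 1080i50 -> 25 frames/s.
assert fields_per_second(30, interlaced=True) == 60
assert fields_per_second(25, interlaced=True) == 50

def pulldown_32(n_film_frames: int) -> int:
    """3:2 pulldown: film frames are emitted alternately as 3 fields and
    2 fields, so 2 film frames fill 5 fields (24 * 5/2 = 60 fields/s)."""
    return sum(3 if i % 2 == 0 else 2 for i in range(n_film_frames))

assert pulldown_32(24) == 60  # one second of 24p film fills 60 fields
```

The assertions mirror the worked examples in the text: 1080i60 carries 30 frames as 60 fields, and 3:2 pulldown stretches 24 film frames evenly across 60 fields.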

Color and audio standards

High-definition television employs the YCbCr color space for encoding, separating luma (Y) from chroma (Cb and Cr) components to optimize bandwidth while preserving color information. This model, standardized in ITU-R Recommendation BT.709, defines the transformation from RGB primaries to YCbCr using specific coefficients, such as Y = 0.2126R + 0.7152G + 0.0722B for luma, enabling efficient handling of high-resolution signals. Unlike the analog color encoding of standard-definition television, which relies on different primaries (e.g., NTSC red at x=0.67, y=0.33), BT.709 specifies primaries of red at x=0.64, y=0.33; green at x=0.30, y=0.60; and blue at x=0.15, y=0.06, with a D65 white point, yielding a color gamut that better reproduces vibrant colors and reduces limitations in hue representation.

To enhance gradient smoothness and minimize visible artifacts, HDTV color standards recommend 10-bit depth over the baseline 8-bit, providing 1024 levels per channel rather than 256, which significantly reduces color banding in smooth transitions like skies or shadows. In 8-bit encoding, quantization steps can create perceptible banding in low-contrast areas; the 10-bit allocation, with black at code value 64 and peak white at 940, allows finer gradation, supporting professional production and display accuracy as specified in BT.709. This bit depth complements HDTV resolutions, where higher pixel counts benefit from expanded color precision to avoid contouring.

For audio, the baseline standard in digital HDTV broadcasts, particularly under the ATSC system, is Dolby Digital (AC-3), which delivers 5.1-channel surround sound with discrete left, center, right, surround left/right, and low-frequency effects channels at bit rates up to 640 kbps. The format uses perceptual coding to compress audio efficiently while maintaining fidelity, ensuring compatibility across broadcasters and receivers in systems like ATSC A/52.

As HDTV evolved, audio standards advanced toward immersive formats such as Dolby Atmos, incorporating height channels (e.g., a 5.1.4 configuration) and object-based rendering, now supported in broadcasts via codecs like AC-4 to enable dynamic sound placement in three dimensions. HDTV's higher data rates also introduce lip-sync challenges: video processing delays, such as those added by LCD displays or encoding cycles of up to 40 ms per frame, can cause audio to precede video by 50-100 ms, disrupting viewer perception. These issues stem from the greater computational demands of HD signals compared with SD, requiring tools such as automatic delay compensation in receivers or broadcast test signals to align audio and video timing effectively.
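The BT.709 luma equation and the 10-bit narrow-range code values quoted above can be combined into a small sketch. The linear mapping 64 + 876·Y is an assumption consistent with the black (64) and peak-white (940) codes mentioned in the text, not a full implementation of the standard:

```python
# BT.709 luma coefficients, as given in the text.
KR, KG, KB = 0.2126, 0.7152, 0.0722

def luma(r: float, g: float, b: float) -> float:
    """Y' from gamma-corrected R'G'B' components in [0, 1]."""
    return KR * r + KG * g + KB * b

def quantize_10bit(y: float) -> int:
    """Map Y' in [0, 1] onto 10-bit narrow-range codes: black at 64,
    peak white at 940 (assumed linear mapping between those anchors)."""
    return round(64 + 876 * y)

print(quantize_10bit(luma(0, 0, 0)))  # black -> 64
print(quantize_10bit(luma(1, 1, 1)))  # white -> 940
```

Note that the three coefficients sum to 1.0, so full-scale white maps exactly to code 940 and black to 64, matching the values in the text.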

Broadcast and Display Standards

Analog HD systems

Analog high-definition television systems emerged in the 1980s as experimental efforts to deliver superior image quality using analog transmission methods, predating the widespread adoption of digital technologies. These systems aimed to increase resolution and aspect ratios beyond standard-definition television (SDTV) while maintaining compatibility with existing broadcast infrastructure where possible. Japan's NHK led the development with its Hi-Vision format, which featured 1,125 scanning lines in an interlaced 60 Hz configuration and a 16:9 aspect ratio, providing roughly twice the vertical resolution of SDTV's 525 lines. To address the challenge of transmitting high-resolution signals over limited bandwidth, NHK introduced the Multiple sub-Nyquist Sampling Encoding (MUSE) system in 1987, which compressed the original analog component signal, requiring approximately 30 MHz of bandwidth for luminance and chrominance, down to about 8.1 MHz for satellite broadcasting. This compression relied on techniques such as dot interlacing and sub-Nyquist sampling to preserve much of the detail without fully digitizing the signal.

Unlike SDTV's 6 MHz NTSC channel bandwidth, analog HD systems demanded significantly wider transmission spectra, making them suitable primarily for satellite or microwave links rather than terrestrial VHF/UHF bands. As analog signals, they were also inherently susceptible to noise and interference, which degraded quality over distance or in adverse conditions, with the expanded bandwidth amplifying vulnerability to environmental factors. The high cost of analog HD infrastructure posed a major limitation: transmitters, cameras, and receivers were expensive, and early Hi-Vision sets retailed for up to $30,000, restricting deployment to pilot programs rather than mass adoption.

These economic barriers, combined with technical complexity in compression and noise management, confined analog HD to experimental use and contributed to the eventual shift toward digital systems for greater efficiency and robustness. Key trials included NHK's satellite broadcasts starting in June 1989, which delivered about one hour of daily Hi-Vision programming, and the U.S. Advanced Compatible Television (ACTV) system developed by the David Sarnoff Research Center, which underwent FCC laboratory tests in 1991 as part of the Advisory Committee on Advanced Television Service (ACATS) evaluations. ACTV sought NTSC compatibility through analog enhancements but ultimately lost favor to digital alternatives.

Digital HD standards

Digital high-definition television standards primarily revolve around protocols for efficient transmission and display of HD content over terrestrial, satellite, and cable networks. In the United States, the Advanced Television Systems Committee (ATSC) standard, specified in document A/53, employs 8-VSB modulation for over-the-air broadcasting. This modulation scheme operates within a 6 MHz channel and supports a maximum payload of 19.39 Mbps via an MPEG-2 transport stream, enabling delivery of multiple HD programs alongside standard-definition content and data services.

In Europe, the Digital Video Broadcasting (DVB) family of standards, particularly DVB-T for terrestrial transmission as defined in ETSI EN 300 744, uses MPEG-2 compression to accommodate HDTV formats including 1080i and 720p. The system supports variable bitrates up to 31.67 Mbps in an 8 MHz channel using 64-QAM modulation, convolutional coding with rates from 1/2 to 7/8, and hierarchical modulation options for layered services. Similarly, Japan's Integrated Services Digital Broadcasting (ISDB) standard, known as ISDB-T for terrestrial use, relies on MPEG-2 video compression to transmit 1080i and 720p HD signals within a flexible segmented structure that allows simultaneous delivery of high- and low-definition layers.

These standards incorporate metadata protocols to enhance navigation and accessibility in HD broadcasts. For instance, ATSC 1.0 employs the Program and System Information Protocol (PSIP) under A/65 to deliver electronic program guides (EPGs), channel mappings, and rating information, while closed captions are encoded per CEA-708 and embedded in the video user data. DVB systems use Service Information (SI), as outlined in ETSI EN 300 468, to provide EPG data and service descriptors, including HD-specific attributes such as aspect ratios and audio configurations. ISDB similarly integrates metadata for program guides and captions within its MPEG-2 transport stream framework.

To ensure robust transmission over noisy channels, all major digital HD standards implement Reed-Solomon error correction as an outer code. ATSC A/53 uses a Reed-Solomon (207, 187, t=10) code that corrects up to 10 byte errors per 207-byte block, combined with trellis coding for inner error protection. DVB-T applies a Reed-Solomon (204, 188, t=8) code to handle up to 8 erroneous bytes per packet, while ISDB-T uses the same (204, 188, t=8) scheme with time interleaving for enhanced reliability in mobile reception. This coding maintains HD video quality by mitigating bit errors from multipath interference and fading.
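For a systematic Reed-Solomon code RS(n, k), the number of correctable byte errors per block is t = (n − k) / 2, since each error costs two parity symbols to locate and correct. A short sketch checks the parameters quoted above:

```python
# Outer-code parameters quoted in the text: (n, k) per system.
rs_codes = {
    "ATSC A/53": (207, 187),
    "DVB-T": (204, 188),
    "ISDB-T": (204, 188),
}

for system, (n, k) in rs_codes.items():
    parity = n - k           # parity bytes appended per block
    t = parity // 2          # correctable byte errors: t = (n - k) / 2
    print(f"{system}: RS({n},{k}) -> {parity} parity bytes, corrects {t} byte errors")
```

This reproduces the figures in the text: 20 parity bytes and t=10 for ATSC, 16 parity bytes and t=8 for DVB-T and ISDB-T.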

International variations and adoption

In the United States, the transition to high-definition television was driven by the Digital Television Transition and Public Safety Act of 2005, which required full-power broadcasters to cease analog transmissions and fully adopt the ATSC digital standard, including its HD capabilities, by June 12, 2009. This shift enabled widespread HD delivery via over-the-air signals, with approximately 85% of the U.S. population having access to at least one local HD over-the-air station by 2009, facilitated by the deployment of digital tuners in televisions and set-top boxes.

Europe's adoption of HD standards emphasized the DVB-T2 transmission system for efficient HD delivery, particularly after the shift to the more compression-efficient MPEG-4 AVC (H.264) codec, which delayed widespread rollout until the early 2010s. The United Kingdom initiated trials in 2009 and achieved full HD coverage for major population centers by 2012, while other countries followed with national implementations around 2010-2011 to accommodate the codec transition and minimize compatibility issues with existing infrastructure.

In Asia, Japan pioneered early HD adoption with the ISDB-T standard, launching full HD terrestrial broadcasting on December 1, 2003, initially in the Tokyo, Osaka, and Nagoya areas, one of the world's first comprehensive digital HD networks. South Korea followed with a nationwide HD launch in 2005, using an ATSC-based system to provide HD content across major broadcasters such as KBS and SBS, building on earlier tests from 2001. China implemented the DTMB standard in a hybrid analog-digital framework, formally adopted in August 2006, allowing simultaneous transmission during the transition phase, with initial HD trials in 2005 and progressive rollout to urban areas by 2008 ahead of the full analog switchover.

Globally, by 2020 digital television had reached high household penetration in most countries, with HD capability widely available through broadcast and other delivery methods, driven by regulatory mandates and technological maturation; challenges such as affordability persisted in developing regions, where subsidies were needed for low-income viewers. Programs like the U.S. National Telecommunications and Information Administration's converter-box coupon initiative exemplified such efforts, distributing over 64 million coupons to ease the transition for households with analog-only sets.

Historical Development

Early experiments and analog era

The development of high-definition television (HDTV) originated in laboratory experiments driven by the need for improved image quality to match advancing display technologies and viewer expectations. In Japan, the public broadcaster NHK initiated research into HDTV following the 1964 Tokyo Olympics, recognizing the limitations of standard-definition systems for large-screen viewing. Under the leadership of Takashi Fujio at NHK's Science and Technical Research Laboratories, the team conducted psychophysical studies on image perception, emphasizing the benefits of higher resolution and widescreen formats in enhancing realism. These efforts culminated in a formal research program by 1970, leading to the proposal of a 1,125-line analog system in 1972, which doubled the vertical resolution of Japan's prevailing standard while adopting a 5:3 aspect ratio for a wider field of view. Fujio's pioneering work on widescreen analog HDTV laid the groundwork for future standards, earning him recognition from organizations such as the Society of Motion Picture and Television Engineers for his contributions to high-resolution imaging.

In Britain, the BBC had earlier established a foundation for high-definition broadcasting with its 405-line monochrome system, launched in 1936 as the world's first regular "high-definition" service, which provided superior detail compared to prior mechanical systems. In subsequent decades the BBC conducted trials to enhance this system, including color experiments on the 405-line standard, though these ultimately informed the transition to 625 lines rather than a direct path to modern HDTV. Meanwhile, NHK's 1970s research advanced into prototype development, focusing on analog transmission to achieve twice the horizontal and vertical resolution of conventional TV without digital compression.

The 1980s marked key milestones in analog HDTV prototyping. Japan's Hi-Vision system, NHK's refined 1,125-line analog format, was demonstrated publicly at the 1985 Tsukuba Expo, showcasing widescreen broadcasts as a preview of future home entertainment. In the United States, the Federal Communications Commission (FCC) responded to these advances by forming the Advisory Committee on Advanced Television Service (ACATS) in 1987, holding hearings to evaluate analog HDTV proposals against enhancements to existing standard-definition systems amid concerns over spectrum allocation and compatibility. Analog HDTV faced significant challenges, primarily its high bandwidth requirements, up to 20-25 MHz for uncompressed 1,125-line signals compared with 4-6 MHz for standard definition, which strained terrestrial and cable infrastructure and prompted proposals in both Japan and the U.S. to transmit HD alongside standard-definition signals on separate channels or via satellite. To address this, NHK developed the Multiple Sub-Nyquist Sampling Encoding (MUSE) compression technique, enabling practical satellite transmission. A pivotal achievement came in 1988, when NHK conducted the first experimental analog HDTV broadcasts using Hi-Vision, transmitting coverage of the Seoul Olympics via satellite and optical fiber to demonstrate feasibility for nationwide rollout. These trials underscored the potential of analog HDTV for immersive viewing while highlighting ongoing hurdles in bandwidth efficiency and receiver affordability, setting the stage for further refinement before commercial adoption.

Digital transition and compression advancements

The shift to digital high-definition television in the 1990s marked a pivotal advance over analog systems, enabling efficient transmission of HD content within existing broadcast bandwidth constraints. In the United States, the Grand Alliance, formed in 1993 at the direction of the FCC by major broadcasters, equipment manufacturers, and research institutions, including AT&T, General Instrument, MIT, Philips, the David Sarnoff Research Center, Thomson, and Zenith, collaborated on a unified digital standard to succeed the competing analog HDTV proposals. The consortium's work produced the Advanced Television Systems Committee (ATSC) standard, which emphasized digital encoding to support high-resolution video delivery while preserving spectrum efficiency.

Central to this transition was the adoption of MPEG-2 compression, a video coding standard that dramatically reduced the data requirements of HD signals. Uncompressed 1080-line HD video generates approximately 1.5 Gbps of data, but MPEG-2 enabled compression ratios up to 50:1, allowing transmission at around 20 Mbps within a standard 6 MHz terrestrial channel. This breakthrough, developed through international collaboration under the Moving Picture Experts Group, allowed HD to share digital multiplexes with multiple standard-definition channels, making widespread deployment feasible without additional spectrum.

Between 1993 and 1996, extensive laboratory and field trials validated the ATSC system's performance, culminating in real-world demonstrations. A notable example was the experimental HDTV operation of WRAL-TV, a CBS affiliate in Raleigh, North Carolina, which became the first U.S. station to broadcast a digital HD signal on July 23, 1996, during the Atlanta Summer Olympics, including live coverage of Olympic events that showcased the technology's viability for major programming.

Globally, similar digital initiatives advanced HDTV. In Europe, the DVB Project, established by broadcasters and manufacturers, finalized its terrestrial specification (DVB-T) in December 1995, enabling MPEG-2-based digital HD transmission across diverse networks including cable and satellite. In Japan, the Integrated Services Digital Broadcasting-Terrestrial (ISDB-T) standard was standardized in May 1999 by the Association of Radio Industries and Businesses (ARIB), incorporating advanced modulation and compression to support robust HD fixed and mobile reception.

Key milestones in global broadcasting

In the United States, the digital transition culminated on June 12, 2009, when the FCC mandated that all full-power stations cease analog operations and switch exclusively to digital broadcasting. This nationwide switchover eliminated analog signals, making over-the-air (OTA) high-definition television universally available to households equipped with digital tuners or converter boxes, thereby achieving 100% potential HD coverage across the country and accelerating the replacement of older analog sets. The move also freed up spectrum for other uses and paved the way for enhanced picture quality and multicasting capabilities in HD format.

Europe's rollout of HDTV during the 2005-2015 period involved phased transitions across member states, with the United Kingdom leading early efforts through the launch of HD broadcasts on May 11, 2006, initially available via satellite and cable platforms as part of a trial to test public reception. By 2009, Freeview HD had begun technical transmissions using the DVB-T2 standard, enabling terrestrial HD delivery, and the UK's full digital switchover was completed by October 2012, by which point digital terrestrial television had become the predominant platform for HD channels nationwide, covering over 98% of households. Similar progress occurred elsewhere in Europe, with other countries adopting HD-capable multiplexes by 2012 to support HD content during analog shutdowns.

In the Asia-Pacific region, Australia implemented an HDTV quota requiring commercial broadcasters to air at least 1,040 hours of high-definition programming annually starting in 2003; 2008 marked a further policy shift when the Australian Broadcasting Corporation rebranded its HD simulcast as ABC HD and expanded multichannel offerings, mandating HD compatibility to align with growing digital infrastructure. This built on the nation's digital terrestrial launch in 2001 and supported the eventual analog switch-off in 2013.

In India, public broadcaster Doordarshan initiated a digital terrestrial television (DTT) pilot in 16 cities in 2016 to test nationwide feasibility, though implementation faced delays from infrastructure challenges and spectrum-allocation issues, limiting initial reach to select urban areas. By the mid-2010s, global HDTV adoption had surged, driven by falling prices and broader content availability. Major events like the 2008 Beijing Summer Olympics exemplified this momentum: the Games marked the first time all competition coverage was produced and broadcast entirely in high definition, reaching over 4 billion viewers worldwide and showcasing HD's potential for immersive sports viewing. Advances in video compression technologies, such as MPEG-4 AVC, were instrumental in enabling these large-scale HD transmissions over limited bandwidth.

Formats and Notation

Resolution notations and terminology

High-definition television resolutions are commonly denoted using a shorthand that specifies the vertical resolution and scanning method, such as 720p or 1080i. The numeric value represents the approximate number of horizontal lines of vertical resolution, while "p" indicates progressive scanning, in which all lines are updated in each frame for smoother motion, and "i" denotes interlaced scanning, where odd and even lines are alternately refreshed in separate fields to reduce bandwidth. These notations originated from standards bodies such as the ITU and SMPTE, which define HDTV parameters for broadcast compatibility. Terminology surrounding HDTV resolutions includes labels like "HD ready" and "Full HD," which guide consumer expectations for display capabilities. "HD ready," a certification developed by DigitalEurope and adopted in 2005, refers to televisions or monitors that can accept and display high-definition signals with at least 720p or 1080i resolution via digital inputs like HDMI, ensuring basic compatibility with HDTV broadcasts without native upscaling requirements. In contrast, "Full HD" typically denotes devices supporting 1920 × 1080 resolution, providing the full pixel grid for progressive HD content. Marketing terms such as "True HD" have been used by industry groups to describe 1080p-capable displays meeting specific refresh rates such as 60 Hz, emphasizing native support for high-fidelity HDTV signals. The notation system evolved from analog broadcasting conventions to digital pixel-based specifications. Early analog HDTV experiments, such as Japan's Hi-Vision system developed in the 1980s, relied on line counts like 1125 total scanning lines (with about 1035 active) to measure vertical resolution in analog signals transmitted via MUSE encoding.
As digital television standards emerged in the 1990s, notations shifted to precise pixel dimensions under ITU-R Recommendation BT.709, standardizing HDTV at 1920 × 1080 for 1080 formats and 1280 × 720 for 720p, focusing on active picture area rather than total lines including blanking intervals. A common misconception is that notations like 720p and 1080i refer to total pixel counts rather than lines of vertical resolution, leading some to dismiss 720p as inferior or non-HD. In reality, both 720p and 1080i/1080p denote vertical resolution, with horizontal pixels determined by the 16:9 aspect ratio (yielding 1280 for 720p and 1920 for 1080), and broadcast standards classify 720p as full HDTV due to its doubled vertical resolution over standard-definition formats like 480i. Another misunderstanding arises from comparing line counts directly without considering scanning types, as 720p delivers perceived detail in motion comparable to 1080i but differs in flicker and bandwidth efficiency. Notations may also append frame rates, such as 1080p60, to indicate 60 progressive frames per second.
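The arithmetic behind these notations is simple to verify. The sketch below (a hypothetical helper, for illustration only) derives the horizontal pixel count and per-frame total from the vertical line count and the 16:9 aspect ratio, reproducing the figures quoted for 720p and 1080-line formats.

```python
# Pixel-count arithmetic behind HD resolution notations.
# Horizontal resolution follows from the vertical line count
# and the 16:9 aspect ratio of HDTV.

def hd_pixels(vertical_lines, aspect=(16, 9)):
    """Return (width, total pixels per frame) for a given line count."""
    width = vertical_lines * aspect[0] // aspect[1]
    return width, width * vertical_lines

print(hd_pixels(720))   # (1280, 921600)
print(hd_pixels(1080))  # (1920, 2073600)
```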

Frame rates and media types

High-definition television (HDTV) supports a variety of frame rates tailored to different production origins and display standards, enabling smoother motion rendering compared to standard-definition formats. Common frame rates include 24 frames per second (24p) for cinematic content, 50 interlaced fields per second (50i, equivalent to 25 frames per second) for European PAL-based broadcasts, and 60 progressive frames per second (60p) for North American NTSC-derived systems, all at resolutions such as 1920 × 1080. These rates contrast with DVD's limitation to standard-definition interlaced formats (576i at 25 frames or 480i at 30 frames per second), which cap motion fluidity and resolution. Optical disc media like Blu-ray Disc provide the primary physical format for HDTV playback, with single-layer discs holding 25 GB and dual-layer up to 50 GB to accommodate lightly compressed HD video at these frame rates. Broadcast signals transmit HDTV via digital terrestrial, satellite, or cable systems, supporting 1080i50 or 1080p60 depending on regional standards, while wired connections use HDMI interfaces—version 1.3 and later providing enough bandwidth for 1080p60 transmission without compression artifacts. Playback compatibility issues arise from frame rate mismatches between source media and display refresh rates, often requiring conversion techniques; for instance, 24p film content on 60 Hz televisions employs 3:2 pulldown, repeating frames in a 3:2 field pattern to simulate 30 fps and avoid judder, though it can introduce visible artifacts like combing on interlaced displays. The format war between HD DVD (supporting up to 30 GB dual-layer) and Blu-ray was resolved in 2008 when major studios and retailers such as Warner Bros. and Walmart shifted support to Blu-ray, establishing it as the dominant HD optical medium due to its higher capacity and broader adoption.
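The 3:2 pulldown cadence mentioned above can be sketched in a few lines. This is a minimal illustration (function name and frame labels are invented for the example): each film frame is alternately held for three then two video fields, so four 24p frames fill the ten fields that 60i video needs in the same interval.

```python
# Minimal sketch of 3:2 pulldown: 24 fps film frames are mapped onto
# 60i video fields by holding each frame for 3 fields, then 2, alternately.

def pulldown_3_2(frames):
    fields = []
    for i, frame in enumerate(frames):
        repeat = 3 if i % 2 == 0 else 2
        fields.extend([frame] * repeat)
    return fields

# Four film frames become ten fields (24 frames/s -> 60 fields/s).
print(pulldown_3_2(["A", "B", "C", "D"]))
# ['A', 'A', 'A', 'B', 'B', 'C', 'C', 'C', 'D', 'D']
```

The uneven hold times are what produce the judder and combing artifacts noted above: motion advances after every third field, then every second.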

Compatibility and upscaling

High-definition television (HDTV) systems are designed to accept content that may not natively match the display's resolution or frame rate, requiring upscaling techniques to convert standard-definition (SD) signals to HD formats. Spatial upscaling primarily involves interpolation methods to increase the pixel count of lower-resolution images, such as bilinear or bicubic interpolation, which estimate new pixel values based on neighboring pixels to fill in the gaps when scaling SD content (typically 480i or 576i) to HD resolutions like 720p or 1080i. These methods aim to preserve image sharpness while minimizing artifacts, though more advanced approaches like Lanczos filtering can reduce blurring by using a sinc-based kernel for better edge preservation during the upscale process. Temporal upscaling addresses frame rate mismatches, such as converting 24 frames per second (fps) content to 60 fps for broadcast, using motion-compensated interpolation to generate intermediate frames. This technique analyzes motion vectors between frames to predict and synthesize new frames, ensuring smoother playback on HDTV displays without judder, particularly important for international content where frame rates vary (e.g., 25 fps PAL material shown on 60 Hz displays). However, excessive temporal interpolation can lead to over-processing, altering the intended cinematic feel of original material. Backward compatibility in HDTV ensures seamless reception of legacy signals through integrated tuners and interface protocols. In North America, ATSC tuners embedded in HDTV sets allow reception of both SD and HD signals within the ATSC 1.0 framework, where SD programs are transmitted as MPEG-2 streams at lower bitrates alongside HD, enabling older content to be displayed without additional hardware. Similarly, HDMI interfaces use Extended Display Identification Data (EDID) for a handshake process, where the display communicates its supported resolutions and formats to the source device, ensuring compatible HD transmission without signal mismatch.
The CEA-861 standard, now maintained as CTA-861, defines the protocols for uncompressed HD video and audio transmission over digital interfaces like HDMI, specifying video timings, colorimetry, and aspect ratios to guarantee interoperability across devices such as set-top boxes and DTVs. This standard ensures that HD signals are formatted correctly for backward-compatible delivery, supporting resolutions from standard definition up to high-definition formats while maintaining compatibility with earlier DVI systems through shared electrical signaling. Despite these mechanisms, upscaling can introduce visual artifacts that degrade perceived quality. The "soap opera effect" arises from aggressive motion interpolation in temporal upscaling, creating artificially smooth motion that makes 24 fps content resemble 60 fps video, often resulting in an unnaturally hyper-realistic appearance due to reduced motion blur. Aliasing artifacts, such as jagged edges or moiré patterns, occur in spatial upscaling when insufficient anti-aliasing filtering is applied during pixel interpolation, particularly evident in fine details like text or repeating patterns when SD content is stretched to HD without proper resampling. These issues highlight the trade-offs in processing non-native content, where overzealous algorithms can introduce new distortions while attempting to enhance clarity.
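The bilinear spatial upscaling described above can be illustrated with a toy example. The sketch below (pure Python, for clarity only; real scalers operate on full frames with optimized kernels) maps each output pixel back into the source grid and takes a weighted average of its four nearest neighbors.

```python
def bilinear_upscale(img, new_h, new_w):
    """Upscale a 2D grayscale image using bilinear interpolation."""
    h, w = len(img), len(img[0])
    out = [[0.0] * new_w for _ in range(new_h)]
    for y in range(new_h):
        for x in range(new_w):
            # Map output coordinates back into the source grid.
            sy = y * (h - 1) / (new_h - 1) if new_h > 1 else 0
            sx = x * (w - 1) / (new_w - 1) if new_w > 1 else 0
            y0, x0 = int(sy), int(sx)
            y1, x1 = min(y0 + 1, h - 1), min(x0 + 1, w - 1)
            fy, fx = sy - y0, sx - x0
            # Weighted average of the four neighboring source pixels.
            top = img[y0][x0] * (1 - fx) + img[y0][x1] * fx
            bot = img[y1][x0] * (1 - fx) + img[y1][x1] * fx
            out[y][x] = top * (1 - fy) + bot * fy
    return out

# Scaling a 2x2 gradient to 3x3: corner values are preserved,
# and the new center pixel is the average of its neighbors.
print(bilinear_upscale([[0, 100], [100, 200]], 3, 3))
```

The same averaging that fills in missing pixels is also what softens edges, which is why sharper kernels such as Lanczos are preferred when edge preservation matters.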

Recording and Transmission

Compression techniques

High-definition television (HDTV) relies on video compression to transmit large volumes of data over limited bandwidth channels, typically reducing the raw bitrate of 720p or 1080i signals from over 100 Mbps to manageable levels for broadcast. The foundational standard for HDTV compression is MPEG-2, which employs a hybrid approach exploiting spatial and temporal redundancies to achieve efficient encoding. In MPEG-2, intra-frame compression addresses spatial redundancy within a single frame by dividing the image into 8×8 blocks and applying the discrete cosine transform (DCT) to convert spatial data into frequency coefficients, concentrating energy in lower frequencies for easier discard. These coefficients are then quantized to remove less perceptible high-frequency details, using a process approximated by Q = round(C / step_size), where C represents the DCT coefficient and step_size controls the quantization coarseness, introducing loss but enabling bitrate reduction. Inter-frame compression exploits temporal redundancy across frames via motion compensation, predicting the current frame from reference frames (previous or future) by estimating block motion vectors, then applying DCT and quantization to the residual differences. This MPEG-2 framework, standardized in the 1990s, supports HDTV bitrates of 15-20 Mbps for 1080i signals in broadcast applications, balancing quality and transmission constraints. Subsequent advancements improved efficiency: H.264/AVC, introduced in 2003, enhanced motion compensation, intra-prediction, and context-adaptive entropy coding, achieving comparable quality at 8-12 Mbps for HD content through better handling of block boundaries and variable block sizes. Further evolution came with HEVC/H.265 in 2013, which uses larger coding tree units (up to 64×64 pixels), advanced intra-prediction modes, and improved motion vector prediction, delivering similar HD quality at 4-8 Mbps—roughly 50% bitrate savings over H.264.
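The quantization step Q = round(C / step_size) is easy to demonstrate directly. In the sketch below (coefficient values are hypothetical, chosen only to illustrate the effect), small high-frequency coefficients collapse to zero and the reconstruction is visibly lossy:

```python
# Illustrative quantization step from the MPEG-style pipeline:
# Q = round(C / step_size); reconstruction is C' = Q * step_size.
# Larger step sizes discard more detail but cut the bitrate.

def quantize(coeffs, step_size):
    return [round(c / step_size) for c in coeffs]

def dequantize(quantized, step_size):
    return [q * step_size for q in quantized]

coeffs = [310.0, -47.0, 9.0, 3.0, -1.0]  # hypothetical DCT coefficients
q = quantize(coeffs, 16)
print(q)                   # [19, -3, 1, 0, 0] -- small values vanish
print(dequantize(q, 16))   # [304, -48, 16, 0, 0] -- lossy reconstruction
```

Zero runs produced this way are what downstream entropy coding compresses so effectively, which is why coarser quantization yields lower bitrates at the cost of blocking artifacts.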
Despite these gains, compression introduces artifacts such as blocking, where visible discontinuities appear at block edges due to coarse quantization of DCT coefficients, and mosquito noise, a high-frequency ringing around sharp edges or moving objects caused by quantization loss in the inverse DCT. Motion compensation mitigates these by reducing residual data through accurate prediction, minimizing the quantized information needed and thus lowering artifact prominence, though deblocking filters in later standards like H.264 further refine edges. Efficiency in these codecs stems from strategic bitrate allocation across frame types: I-frames are fully encoded intra-frame, consuming the most bits (e.g., 150,000-300,000 bits per frame in typical sequences) as complete references; P-frames use forward prediction from prior I- or P-frames, encoding only differences for 40-60% fewer bits; and B-frames leverage bi-directional prediction, achieving the smallest size (20-40% of I-frame bits) by interpolating between surrounding frames, enabling overall compression ratios of 3-5 times over intra-only methods.
| Codec | Year | Typical HD Bitrate (Mbps) | Efficiency Gain Over Prior |
| --- | --- | --- | --- |
| MPEG-2 | 1990s | 15-20 | Baseline for digital HDTV |
| H.264/AVC | 2003 | 8-12 | ~50% reduction vs. MPEG-2 |
| HEVC/H.265 | 2013 | 4-8 | ~50% reduction vs. H.264 |

Storage and recording media

High-definition television content requires substantial storage due to its higher resolution and data rates compared to standard-definition video. Optical media emerged as a primary format for distributing and archiving HDTV material, with Blu-ray Disc becoming the dominant standard after outcompeting HD DVD in the mid-2000s format war. Blu-ray Discs are available in single-layer (BD-25) variants with 25 GB capacity and dual-layer (BD-50) versions holding 50 GB, both optimized for HDTV playback. A 50 GB dual-layer disc can accommodate up to 9 hours of compressed high-definition video, enabling full-length movies with high-quality audio and extras. Extended capacities reach 100 GB on triple-layer discs, though these are less common for standard HDTV and more associated with enhanced features. In contrast, HD DVD, which offered up to 30 GB per disc, failed commercially by 2008 due to limited studio support and Blu-ray's superior capacity and alliances with major hardware manufacturers like Sony. For personal recording and time-shifting, digital video recorders (DVRs) and personal video recorders (PVRs) utilize hard disk drives to store extensive libraries of content. These devices typically hold 100 or more hours of HD material on a 1 TB drive, depending on compression settings. Storage efficiency varies by bitrate; for instance, HD recordings at 5 Mbps consume about 2.25 GB per hour, while higher-quality streams at 9-13.5 Mbps use roughly 4-6 GB per hour. Flash-based storage, such as USB drives, provides portable options for archiving HDTV files, often in container formats like MKV (Matroska) or TS (MPEG transport stream). MKV supports multiple video codecs and subtitle tracks, making it versatile for HD storage and playback across devices. TS files, commonly used on Blu-ray discs, package compressed MPEG-2 or H.264 video streams for reliable HD archival. Cloud services like Apple iCloud further enable remote storage of HD videos, with plans offering 200 GB to 2 TB of space suitable for thousands of high-resolution files shared across devices.
Compression techniques significantly reduce HDTV storage needs; for example, a 1080p video at 60 frames per second with a 20 Mbps bitrate results in approximately 9 GB per hour, far less than the terabytes required for uncompressed equivalents. This is calculated as (bitrate in Mbps × duration in seconds) / 8, converted to gigabytes.
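The storage formula above can be expressed as a one-line helper (using decimal units, 1 GB = 1000 MB, as is conventional for storage media):

```python
def gb_per_hour(bitrate_mbps):
    """Storage for one hour of video: (Mbps x 3600 s) / 8 -> MB -> GB."""
    return bitrate_mbps * 3600 / 8 / 1000

print(gb_per_hour(20))  # 9.0  GB/hour for 1080p60 at 20 Mbps
print(gb_per_hour(5))   # 2.25 GB/hour for a 5 Mbps HD recording
```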

Delivery methods in broadcasting and streaming

High-definition television (HDTV) content is delivered to consumers through a variety of broadcasting and streaming methods, each optimized for different infrastructure and user needs. Traditional over-the-air (OTA), cable, and satellite broadcasting rely on standardized modulation and multiplexing techniques to transmit HD signals efficiently over radio frequency (RF) channels. In terrestrial broadcasting, the ATSC 1.0 standard enables OTA HDTV transmission using 8-level vestigial sideband (8-VSB) modulation within 6 MHz channels, supporting resolutions up to 1080i or 720p at typical frame rates. This standard, adopted by the FCC in 1996 and fully transitioned to by 2009, allows broadcasters to multiplex multiple SD and HD programs, with a single 19.39 Mbps transport stream accommodating one HD channel alongside ancillary data. For cable systems, quadrature amplitude modulation (QAM), particularly 256-QAM, is employed to deliver HD content over coaxial networks in 6 MHz channels, achieving a multiplex bitrate of 38.8 Mbps that supports approximately two HD programs per channel. Satellite delivery, such as via the DVB-S2 standard, uses higher-order phase-shift keying modulation (e.g., 8PSK) to broadcast HD multiplexes to subscribers, often at bitrates exceeding 30 Mbps per transponder for regional coverage. Streaming services have transformed HDTV delivery by leveraging Internet Protocol (IP) networks for on-demand access. Netflix pioneered widespread HD streaming in 2008, with broader rollout across devices by 2010, utilizing adaptive bitrate (ABR) streaming to dynamically adjust quality based on available bandwidth, typically requiring 5 Mbps for standard HD (720p) and up to 15 Mbps for full 1080p HD. ABR encodes content into multiple bitrate variants, allowing seamless switching to prevent buffering while maintaining HD fidelity over variable connections.
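The core of ABR selection can be sketched as choosing the highest encoded variant that fits the measured bandwidth with a safety margin. The ladder and margin below are illustrative values, not any service's actual tiers:

```python
# Hedged sketch of adaptive bitrate (ABR) variant selection.
# Ladder entries are (vertical resolution, encoded bitrate in Mbps).
LADDER = [(480, 2.5), (720, 5.0), (1080, 15.0)]

def pick_variant(bandwidth_mbps, margin=0.8):
    """Pick the highest variant whose bitrate fits the usable bandwidth."""
    usable = bandwidth_mbps * margin
    best = LADDER[0]  # always fall back to the lowest tier
    for res, rate in LADDER:
        if rate <= usable:
            best = (res, rate)
    return best

print(pick_variant(8.0))   # (720, 5.0) -- 8 * 0.8 = 6.4 Mbps, too low for 1080p
print(pick_variant(25.0))  # (1080, 15.0)
```

A real player re-runs this decision continuously as throughput estimates change, which is what produces the mid-stream quality switches viewers occasionally notice.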
IP-based delivery extends HDTV to local networks and beyond, with technologies such as HDBaseT enabling uncompressed or lightly compressed HD transmission over Category 6 cabling up to 100 meters, supporting 1080p at 60 Hz without additional latency. For live HD streaming, protocols such as the Real Time Streaming Protocol (RTSP) facilitate control and delivery of real-time video over IP, commonly used in surveillance and broadcast applications to stream H.264-encoded HD feeds with low overhead. Hybrid models integrate broadcast and IP elements for enhanced flexibility. The ATSC 3.0 standard, approved by the FCC in November 2017, introduces IP-based packet delivery within its OFDM physical layer, allowing broadcasters to overlay internet-derived content onto OTA signals for interactive HD experiences while maintaining backward compatibility through simulcasting. As of November 2025, ATSC 3.0 adoption is ongoing, with the FCC proposing a phased transition that would end simulcast requirements by February 2028 for stations in the top 55 markets. This enables features such as personalized content and mobile reception, bridging traditional broadcasting with streaming efficiencies.

Ultra-high-definition television

Ultra-high-definition television (UHDTV) builds upon HDTV by providing resolutions that quadruple the pixel count, enabling sharper images and greater detail, particularly on larger screens. The core 4K UHD format, also known as Quad Full HD, delivers a resolution of 3840 × 2160 pixels, while 8K UHD extends this to 7680 × 4320 pixels, offering immense detail for applications like professional production and future-proofing consumer displays. These formats maintain the 16:9 aspect ratio standard in HDTV but demand more advanced processing and display technologies to realize their potential. Key standards for UHDTV, designated as UHD-1 under ITU-R BT.2020 recommendations, incorporate a wide color gamut covering about 75% of visible colors, far exceeding HDTV's Rec. 709. HDR enhancements such as HDR10, which uses static metadata for improved brightness and contrast up to 10,000 nits, and Dolby Vision, employing dynamic metadata for scene-by-scene optimization, are essential for leveraging UHD's capabilities. The introduction of 4K Ultra HD Blu-ray in February 2016 marked a milestone in physical media, supporting 4K resolution, HDR, and high-bitrate audio on discs with up to 100 GB capacity. Transmission of UHD content requires substantially higher bandwidth than HDTV due to the increased data volume; compressed 4K streams typically need 25-50 Mbps for smooth playback, compared to 10-20 Mbps for HDTV. This disparity arises from the fourfold pixel increase, necessitating efficient codecs like HEVC (H.265) to manage data rates effectively. Adoption of UHDTV accelerated in the mid-2010s with pioneering broadcasts, such as select matches from the 2014 FIFA World Cup transmitted in 4K to demonstrate the format's viability in live sports. By 2020, 4K UHD televisions had permeated the consumer market, with global sales exceeding 100 million units annually as prices dropped and content availability grew.

Integration with streaming and smart devices

High-definition television (HDTV) has become deeply integrated with streaming services, enabling seamless delivery of HD content over the internet. YouTube introduced support for 720p HD video uploads and playback in December 2008, marking an early milestone in online HD streaming, followed by 1080p support in November 2009. Amazon began offering video streaming in 2008 through Amazon Video on Demand, with full HD availability integrated into Amazon Instant Video by 2011 for Prime members, allowing unlimited access to thousands of HD titles. By 2014, both platforms expanded to include 4K ultra-high-definition (UHD) and high dynamic range (HDR) content, enhancing HDTV viewing with higher resolutions and improved color depth, though HD remains the foundational standard for most streamed media. Smart TVs have further embedded HDTV into connected ecosystems through platforms like LG's webOS, launched in 2014, and Google's Android TV, also debuting in 2014, both of which natively support HD streaming apps from services such as Netflix and YouTube. These operating systems facilitate easy access to HD libraries via built-in app stores, with features like personalized recommendations and seamless navigation. Voice control, powered by assistants such as Google Assistant on Android TV and LG ThinQ AI on webOS, allows users to search for and launch specific HD content—such as requesting a 1080p episode—without manual input, improving accessibility for HDTV consumption. By 2025, approximately 68% of U.S. internet households own at least one smart TV capable of HD streaming, reflecting widespread adoption driven by these integrated platforms. Enhanced connectivity options have optimized HDTV streaming on smart devices, with Wi-Fi 6—standardized in 2019—providing key benefits like reduced latency and higher throughput to prevent buffering during HD playback, even in multi-device homes.
Casting technologies complement this by enabling wireless transmission of HD content from mobile devices to HDTVs; Google's Chromecast, introduced in 2013, supports mirroring and app-based casting for services like Netflix, while Apple's AirPlay allows similar HD video projection from iOS devices to compatible smart TVs. These integrations ensure HDTV content flows effortlessly across ecosystems, from smartphones to large screens, without compromising quality.

Emerging technologies and challenges

Recent advancements in display technologies have significantly enhanced the contrast and image quality of high-definition televisions. MicroLED displays, which utilize arrays of microscopic inorganic LEDs, offer superior contrast ratios and peak brightness levels exceeding 5,000 nits compared to traditional LCD panels, while also providing longer operational lifespans without the risk of burn-in. OLED technologies have continued to evolve, with 2025 models incorporating improved self-emissive pixels for deeper blacks and wider viewing angles in HD content. Complementing these, quantum dot enhancements, such as those in Samsung displays, boost color accuracy and achieve up to 100% color volume in LCD-based HDTVs, enabling more vibrant HD visuals without increasing power draw. AI-driven upscaling has emerged as a key technique for improving HD viewing experiences, particularly for legacy content. NVIDIA's AI upscaling on devices like the Shield TV uses models similar to DLSS in gaming to intelligently reconstruct lower-resolution videos to 4K resolution, reducing artifacts and enhancing sharpness in real time. This approach outperforms traditional interpolation methods by predicting and filling in missing details based on trained datasets of high-quality footage. Despite these innovations, several challenges persist in advancing beyond standard HD resolutions. Energy consumption remains a major hurdle for 8K displays, which require approximately twice the power of 4K models due to their increased pixel density, leading to EU regulations that effectively restricted 8K TV sales starting in 2023 unless manufacturers implement power-saving modes like pixel value reduction. Additionally, content scarcity hampers adoption of higher resolutions, as native 8K programming remains limited on major streaming platforms and broadcasters in 2025, with most HD and 4K material relying on upscaling to fill the gap.
Looking ahead, the ongoing rollout of ATSC 3.0, with FCC proposals for a phased mandatory transition beginning in top markets by 2028, is expected to transform interactive HD broadcasting in the coming years, enabling features like on-demand replays and personalized content on over-the-air signals from more than 125 U.S. stations reaching 75% of viewers. On October 28, 2025, the FCC adopted a notice proposing to phase out ATSC 1.0 simulcasting requirements, with stations in the top 55 markets transitioning fully by February 2028 and nationwide by 2030. Emerging display fabrication technologies are also expected to further integrate with HD panels, potentially achieving ultra-high-resolution patterning for enhanced efficiency and color fidelity. Accessibility issues in HD television persist, particularly around subtitling and global disparities. Subtitles for the deaf and hard of hearing (SDH) in HD formats, compatible with Blu-ray and streaming, include non-speech elements like sound effects to ensure full comprehension, as mandated by updated FCC rules for easy access on TVs and apps. The global digital divide exacerbates unequal HD access: an estimated 2.6 billion people lacked reliable internet access for streaming HD content in 2023, disproportionately affecting low-income and rural regions in developing countries.
