Loudness war
The loudness war (or loudness race) is a trend of increasing audio levels in recorded music, which reduces audio fidelity and—according to many critics—listener enjoyment. Increasing loudness was first reported as early as the 1940s, with respect to mastering practices for 7-inch singles. The maximum peak level of analog recordings such as these is limited by varying specifications of electronic equipment along the chain from source to listener, including vinyl and cassette players. The issue garnered renewed attention starting in the 1990s with the introduction of digital signal processing capable of producing further loudness increases.
With the advent of the compact disc (CD), music is encoded to a digital format with a clearly defined maximum peak amplitude. Once the maximum amplitude of a CD is reached, loudness can be increased still further through signal processing techniques such as dynamic range compression and equalization. Engineers can apply an increasingly high ratio of compression to a recording until it peaks more frequently at the maximum amplitude, a technique colloquially known as brickwalling. In extreme cases, efforts to increase loudness can result in clipping and other audible distortion. Modern recordings that use extreme dynamic range compression and other measures to increase loudness therefore can sacrifice sound quality to loudness. The competitive escalation of loudness has led music fans and members of the musical press to refer to the affected albums as "victims of the loudness war".
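The effect of heavy compression followed by makeup gain can be illustrated with a minimal sketch. The function below is a hypothetical, simplified static compressor (real mastering compressors add attack/release smoothing): samples above a threshold are attenuated by the ratio, makeup gain raises the overall level, and anything pushed past digital full scale is hard-clipped at 0 dBFS.

```python
import numpy as np

def compress(signal, threshold_db=-20.0, ratio=4.0, makeup_db=12.0):
    """Simplified static compressor sketch: attenuate samples above the
    threshold by the ratio, then apply makeup gain and clip at full scale."""
    threshold = 10 ** (threshold_db / 20)
    makeup = 10 ** (makeup_db / 20)
    mag = np.abs(signal)
    gain = np.ones_like(mag)
    over = mag > threshold
    # Above the threshold, output level grows only 1/ratio as fast as input.
    gain[over] = (threshold * (mag[over] / threshold) ** (1 / ratio)) / mag[over]
    out = signal * gain * makeup
    # A digital medium cannot exceed 0 dBFS: anything beyond is clipped.
    return np.clip(out, -1.0, 1.0)

# A quiet sine wave with one loud transient peak.
t = np.linspace(0, 1, 1000)
x = 0.05 * np.sin(2 * np.pi * 5 * t)
x[500] = 0.9
y = compress(x)
```

The result is louder on average but with a much smaller peak-to-average (crest) ratio, which is exactly the trade described above: perceived loudness goes up, dynamic range goes down.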
The practice of focusing on loudness in audio mastering can be traced back to the introduction of the compact disc; however, it also existed to some extent when the vinyl phonograph record was the primary released recording medium and when 7-inch singles were played on jukebox machines in clubs and bars. The so-called wall of sound (not to be confused with the Phil Spector Wall of Sound) formula preceded the loudness war, but achieved its goal using a variety of techniques, such as instrument doubling and reverberation, as well as compression.
Jukeboxes became popular in the 1940s and were often set to a predetermined level by the owner, so any record that was mastered louder than the others would stand out. Similarly, starting in the 1950s, producers would request louder 7-inch singles so that songs would stand out when auditioned by program directors for radio stations. In particular, many Motown records pushed the limits of how loud records could be made; according to one of their engineers, they were "notorious for cutting some of the hottest 45s in the industry." In the 1960s and 1970s, compilation albums of hits by multiple different artists became popular, and if artists and producers found their song was quieter than others on the compilation, they would insist that their song be remastered to be competitive.
Because of the limitations of the vinyl format, the ability to manipulate loudness was also limited. Attempts to achieve extreme loudness could render the medium unplayable: one example was the "hot" master of Led Zeppelin II by mastering engineer Bob Ludwig, which caused some cartridges to mistrack; the album was recalled and reissued with lower compression levels. Digital media such as CDs remove these restrictions, and as a result increasing loudness has been a more severe issue in the CD era. Modern computer-based digital audio effects processing gives mastering engineers more direct control over the loudness of a song: for example, a brick-wall limiter can look ahead at an upcoming signal and reduce the level in advance to keep it under a ceiling.
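The look-ahead idea can be sketched as follows. This is an illustrative, heavily simplified limiter (the function name and parameters are made up for this example; production limiters smooth the gain envelope rather than computing it per sample): for each sample, it scans a short window of upcoming samples and pre-reduces the gain so the coming peak never exceeds the ceiling.

```python
import numpy as np

def lookahead_limit(signal, ceiling=0.891, lookahead=32):
    """Brick-wall limiter sketch with look-ahead.

    ceiling=0.891 is roughly -1 dBFS. For each sample, the peak over the
    next `lookahead` samples is inspected, and gain is reduced in advance
    so that peak lands exactly at the ceiling instead of clipping.
    """
    n = len(signal)
    out = np.empty(n)
    for i in range(n):
        window_peak = np.max(np.abs(signal[i:i + lookahead + 1]))
        gain = min(1.0, ceiling / window_peak) if window_peak > 0 else 1.0
        out[i] = signal[i] * gain
    return out

# A steady signal with a burst that would exceed the ceiling.
x = np.concatenate([0.2 * np.ones(100), np.ones(10), 0.2 * np.ones(100)])
y = lookahead_limit(x)
```

Because the limiter sees the burst before it arrives, it turns the level down ahead of time instead of reacting late and letting the first samples of the peak through, which is what distinguishes look-ahead limiting from a conventional compressor.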
Since CDs were not the primary medium for popular music until the late 1980s, there was little motivation for competitive loudness practices before then. The common practice when mastering music for CD was to place the highest peak of a recording at, or close to, digital full scale, and to read digital levels along the lines of the more familiar analog VU meters. On such a meter, a certain point (usually 14 dB below the disc's maximum amplitude) was treated like the saturation point (signified as 0 dB) of analog recording, with several dB of the CD's recording level reserved for amplitude exceeding that point (often referred to as the red zone, signified by a red bar in the meter display), because digital media cannot exceed 0 decibels relative to full scale (dBFS). The average RMS level of a typical rock song during most of the decade was around −16.8 dBFS.
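RMS figures like −16.8 dBFS are straightforward to compute. The helper below (a hypothetical name for this example) takes a signal normalized so that full scale is ±1.0 and reports its RMS level in dBFS; a full-scale sine wave, whose RMS is 1/√2 of its peak, comes out at about −3 dBFS.

```python
import numpy as np

def rms_dbfs(signal):
    """RMS level of a full-scale-normalized (+/-1.0) signal, in dBFS."""
    rms = np.sqrt(np.mean(np.square(signal)))
    return 20 * np.log10(rms)

# One second of a 440 Hz full-scale sine at 48 kHz: RMS = 1/sqrt(2),
# i.e. about -3.01 dBFS.
t = np.linspace(0, 1, 48000, endpoint=False)
sine = np.sin(2 * np.pi * 440 * t)
level = rms_dbfs(sine)
```

By this measure, a −16.8 dBFS master leaves roughly 17 dB between the average level and the digital ceiling; the loudness war progressively closed that gap.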
By the early 1990s, mastering engineers had learned how to optimize for the CD medium, though the loudness war had not yet begun in earnest. Soon, however, CDs with louder levels began to surface, and levels became more and more likely to bump up against the digital limit, resulting in recordings where the peaks on an average rock or beat-heavy pop CD hovered near 0 dBFS but only occasionally reached it.
The concept of making music releases hotter began to appeal to people within the industry, in part because of how noticeably louder some releases had become, and in part because the industry believed that customers preferred louder-sounding CDs, even though that may not have been true. Engineers, musicians, and labels each developed their own ideas of how CDs could be made louder. In 1994, the first digital brick-wall limiter with look-ahead (the Waves L1) was mass-produced; this feature has since been commonly incorporated in digital mastering limiters and maximizers. While the increase in CD loudness was gradual throughout the 1990s, some opted to push the format to the limit, such as on Oasis's widely popular album (What's the Story) Morning Glory?, whose RMS level averaged −8 dBFS on many of its tracks, a rare occurrence at the time of its release in 1995. Red Hot Chili Peppers' Californication (1999) represented another milestone, with prominent clipping occurring throughout the album.