LTE (telecommunication)
from Wikipedia

In telecommunications, long-term evolution (LTE) is a standard for wireless broadband communication for cellular mobile devices and data terminals. It is considered to be a "transitional" 4G technology,[1] and is therefore also referred to as 3.95G as a step above 3G.[2]

LTE is based on the 2G GSM/EDGE and 3G UMTS/HSPA standards. It improves on those standards' capacity and speed by using a different radio interface and core network improvements.[3][4] LTE is the upgrade path for carriers with both GSM/UMTS networks and CDMA2000 networks. LTE has been succeeded by LTE Advanced, which is officially defined as a "true" 4G technology[5] and also named "LTE+".

Terminology


The standard is developed by the 3GPP (3rd Generation Partnership Project) and is specified in its Release 8 document series, with minor enhancements described in Release 9. LTE is also called 3.95G and has been marketed as 4G LTE and Advanced 4G;[citation needed] but the original version, specified in the 3GPP Release 8 and 9 document series, did not meet the technical criteria of a 4G wireless service. The requirements were set forth by the ITU-R organisation in the IMT Advanced specification; but, because of market pressure and the significant advances that WiMAX, Evolved High Speed Packet Access, and LTE bring to the original 3G technologies, ITU-R later decided that LTE and the aforementioned technologies can be called 4G technologies.[6] The LTE Advanced standard formally satisfies the ITU-R requirements for being considered IMT-Advanced.[7] To differentiate LTE Advanced and WiMAX-Advanced from current[when?] 4G technologies, the ITU has defined the former as "True 4G".[8][5]

Overview

Images: the 3GPP LTE logo; an Android phone showing an LTE connection; LTE and 4G+ modems; LTE and VoLTE logos on a Samsung Galaxy S21+ running One UI 6.1; and the HTC ThunderBolt, the second commercially available LTE smartphone.

LTE stands for Long-Term Evolution[9] and is a registered trademark owned by ETSI (European Telecommunications Standards Institute) for the wireless data communications technology that is a development of the GSM/UMTS standards. However, other nations and companies do play an active role in the LTE project. The goal of LTE was to increase the capacity and speed of wireless data networks using new DSP (digital signal processing) techniques and modulations that were developed around the turn of the millennium. A further goal was the redesign and simplification of the network architecture to an IP-based system with significantly reduced transfer latency compared with the 3G architecture. The LTE wireless interface is incompatible with 2G and 3G networks, so it must be operated on a separate radio spectrum.

An early proposal in 1998 suggested replacing CDMA with the COFDM radio access technique and studied its terrestrial use in the L band at 1428 MHz. LTE itself was first proposed in 2004 by Japan's NTT Docomo, and studies on the standard officially commenced in 2005.[10] In May 2007, the LTE/SAE Trial Initiative (LSTI) alliance was founded as a global collaboration between vendors and operators with the goal of verifying and promoting the new standard in order to ensure the global introduction of the technology as quickly as possible.[11][12]

The LTE standard was finalized in December 2008, and the first publicly available LTE service was launched by TeliaSonera in Oslo and Stockholm on December 14, 2009, as a data connection with a USB modem. LTE services were also launched by major North American carriers, with the Samsung SCH-r900 being the world's first LTE mobile phone, available from September 21, 2010,[13][14] and the Samsung Galaxy Indulge being the world's first LTE smartphone, available from February 10, 2011,[15][16] both offered by MetroPCS; the HTC ThunderBolt, offered by Verizon from March 17, was the second LTE smartphone to be sold commercially.[17][18] In Canada, Rogers Wireless was the first to launch an LTE network, on July 7, 2011, offering the Sierra Wireless AirCard 313U USB mobile broadband modem, known as the "LTE Rocket stick", followed closely by mobile devices from both HTC and Samsung.[19] Initially, CDMA operators planned to upgrade to the rival standards UMB and WiMAX, but major CDMA operators (such as Verizon, Sprint and MetroPCS in the United States, Bell and Telus in Canada, au by KDDI in Japan, SK Telecom in South Korea and China Telecom/China Unicom in China) announced instead that they intended to migrate to LTE. The next version of LTE is LTE Advanced, which was standardized in March 2011;[20] services commenced in 2013.[21] A further evolution, known as LTE Advanced Pro, was approved in 2015.[22]

The LTE specification provides downlink peak rates of 300 Mbit/s, uplink peak rates of 75 Mbit/s and QoS provisions permitting a transfer latency of less than 5 ms in the radio access network. LTE has the ability to manage fast-moving mobiles and supports multi-cast and broadcast streams. LTE supports scalable carrier bandwidths, from 1.4 MHz to 20 MHz, and supports both frequency division duplexing (FDD) and time-division duplexing (TDD). The IP-based network architecture, called the Evolved Packet Core (EPC) and designed to replace the GPRS Core Network, supports seamless handovers for both voice and data to cell towers with older network technology such as GSM, UMTS and CDMA2000.[23] The simpler architecture results in lower operating costs (for example, each E-UTRA cell will support up to four times the data and voice capacity supported by HSPA[24]).
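As a rough, back-of-the-envelope illustration of how these figures relate, the sketch below scales the quoted 20 MHz peak rates down to the other standardized channel bandwidths. It assumes peak rate grows roughly linearly with bandwidth, which ignores control-channel and guard overhead, so it is not a substitute for the 3GPP figures.

```python
# Illustrative sketch: scale the quoted 20 MHz peak rates to narrower LTE
# channel bandwidths. Assumes rates scale roughly linearly with bandwidth,
# which ignores control-channel and guard overhead, so treat the output as
# a back-of-the-envelope estimate only.

STANDARD_BANDWIDTHS_MHZ = [1.4, 3, 5, 10, 15, 20]   # standardized LTE channel widths
PEAK_DL_20MHZ_MBPS = 300.0                           # downlink peak quoted for 20 MHz
PEAK_UL_20MHZ_MBPS = 75.0                            # uplink peak quoted for 20 MHz

def estimate_peak_rates(bandwidth_mhz: float) -> tuple[float, float]:
    """Return rough (downlink, uplink) peak rates in Mbit/s for one carrier."""
    if bandwidth_mhz not in STANDARD_BANDWIDTHS_MHZ:
        raise ValueError(f"{bandwidth_mhz} MHz is not a standardized LTE bandwidth")
    scale = bandwidth_mhz / 20.0
    return PEAK_DL_20MHZ_MBPS * scale, PEAK_UL_20MHZ_MBPS * scale

if __name__ == "__main__":
    for bw in STANDARD_BANDWIDTHS_MHZ:
        dl, ul = estimate_peak_rates(bw)
        print(f"{bw:>4} MHz carrier: ~{dl:6.1f} Mbit/s down / ~{ul:5.1f} Mbit/s up")
```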

Because LTE frequencies and bands differ from country to country, only multi-band phones can use LTE in all countries where it is supported.

History


3GPP standard development timeline

  • In 2004, NTT Docomo of Japan proposes LTE as the international standard.[25]
  • In September 2006, Siemens Networks (today Nokia Networks) showed in collaboration with Nomor Research the first live emulation of an LTE network to the media and investors. As live applications, two users streaming an HDTV video in the downlink and playing an interactive game in the uplink were demonstrated.[26]
  • In February 2007, Ericsson demonstrated LTE for the first time, with bit rates up to 144 Mbit/s.[27]
  • In September 2007, NTT Docomo demonstrated LTE data rates of 200 Mbit/s with power levels below 100 mW during the test.[28]
  • In November 2007, Infineon presented the world's first RF transceiver named SMARTi LTE supporting LTE functionality in a single-chip RF silicon processed in CMOS[29][30]
  • In early 2008, LTE test equipment began shipping from several vendors and, at the Mobile World Congress 2008 in Barcelona, Ericsson demonstrated the world's first end-to-end mobile call enabled by LTE on a small handheld device.[31] Motorola demonstrated an LTE RAN (Radio Access Network) standard compliant eNodeB and LTE chipset at the same event.
  • At the February 2008 Mobile World Congress:
    • Motorola demonstrated how LTE can accelerate the delivery of personal media experience with HD video demo streaming, HD video blogging, Online gaming, and VoIP over LTE running a RAN standard-compliant LTE network & LTE chipset.[32]
    • Ericsson EMP (later ST-Ericsson) demonstrated the world's first end-to-end LTE call on a handheld device.[31] Ericsson also demonstrated LTE FDD and TDD modes on the same base station platform.
    • Freescale Semiconductor demonstrated streaming HD video with peak data rates of 96 Mbit/s downlink and 86 Mbit/s uplink.[33]
    • NXP Semiconductors (later part of ST-Ericsson) demonstrated a multi-mode LTE modem as the basis for a software-defined radio system for use in cellphones.[34]
    • picoChip and Mimoon demonstrated a base station reference design. This runs on a common hardware platform (multi-mode / software-defined radio) with their WiMAX architecture.[35]
  • In April 2008, Motorola demonstrated the first EV-DO to LTE hand-off – handing over a streaming video from LTE to a commercial EV-DO network and back to LTE.[36]
  • In April 2008, LG Electronics and Nortel demonstrated LTE data rates of 50 Mbit/s while travelling at 110 km/h (68 mph).[37]
  • In November 2008, Motorola demonstrated the industry's first over-the-air LTE session in 700 MHz spectrum.[38]
  • Researchers at Nokia Siemens Networks and Heinrich Hertz Institut have demonstrated LTE with 100 Mbit/s Uplink transfer speeds.[39]
  • At the February 2009 Mobile World Congress:
    • Infineon demonstrated a single-chip 65 nm CMOS RF transceiver providing 2G/3G/LTE functionality[40]
    • Launch of ng Connect program, a multi-industry consortium founded by Alcatel-Lucent to identify and develop wireless broadband applications.[41]
    • Motorola provided LTE drive tour on the streets of Barcelona to demonstrate LTE system performance in a real-life metropolitan RF environment[42]
  • In July 2009, Nujira demonstrated efficiencies of more than 60% for an 880 MHz LTE Power Amplifier[43]
  • In August 2009, Nortel and LG Electronics demonstrated the first successful handoff between CDMA and LTE networks in a standards-compliant manner[44]
  • In August 2009, Alcatel-Lucent received FCC certification for LTE base stations for the 700 MHz spectrum band.[45]
  • In September 2009, Nokia Siemens Networks demonstrated the world's first LTE call on standards-compliant commercial software.[46]
  • In October 2009, Ericsson and Samsung demonstrated interoperability between the first ever commercial LTE device and the live network in Stockholm, Sweden.[47]
  • In October 2009, Alcatel-Lucent's Bell Labs, Deutsche Telekom Innovation Laboratories, the Fraunhofer Heinrich-Hertz Institut, and antenna supplier Kathrein conducted live field tests of a technology called Coordinated Multipoint Transmission (CoMP) aimed at increasing the data transmission speeds of LTE and 3G networks.[48]
  • In November 2009, Alcatel-Lucent completed first live LTE call using 800 MHz spectrum band set aside as part of the European Digital Dividend (EDD).[49]
  • In November 2009, Nokia Siemens Networks and LG completed first end-to-end interoperability testing of LTE.[50]
  • On December 14, 2009, the first commercial LTE deployment was in the Scandinavian capitals Stockholm and Oslo by the Swedish-Finnish network operator TeliaSonera and its Norwegian brand NetCom (Norway). TeliaSonera incorrectly branded the network "4G". The modem devices on offer were manufactured by Samsung (dongle GT-B3710), and the network infrastructure with SingleRAN technology was created by Huawei (in Oslo)[51] and Ericsson (in Stockholm). TeliaSonera planned to roll out LTE nationwide across Sweden, Norway and Finland.[52] TeliaSonera used a spectral bandwidth of 10 MHz (out of the maximum 20 MHz) and single-input single-output transmission. The deployment should have provided physical-layer net bit rates of up to 50 Mbit/s in the downlink and 25 Mbit/s in the uplink. Introductory tests showed a TCP goodput of 42.8 Mbit/s downlink and 5.3 Mbit/s uplink in Stockholm.[53]
  • In December 2009, ST-Ericsson and Ericsson were the first to achieve LTE and HSPA mobility with a multimode device.[54]
  • In January 2010, Alcatel-Lucent and LG completed a live handoff of an end-to-end data call between LTE and CDMA networks.[55]
  • In February 2010, Nokia Siemens Networks and Movistar tested LTE at Mobile World Congress 2010 in Barcelona, Spain, with both indoor and outdoor demonstrations.[56]
  • In May 2010, Mobile TeleSystems (MTS) and Huawei showed an indoor LTE network at "Sviaz-Expocomm 2010" in Moscow, Russia.[57] MTS expected to start a trial LTE service in Moscow by the beginning of 2011. Earlier, MTS had received a license to build an LTE network in Uzbekistan and intended to commence a test LTE network in Ukraine in partnership with Alcatel-Lucent.
  • At the Shanghai Expo 2010 in May 2010, Motorola demonstrated a live LTE network in conjunction with China Mobile. This included video streams and a drive test system using TD-LTE.[58]
  • In December 2010, DirecTV teamed up with Verizon Wireless for a test of high-speed LTE wireless technology in a few homes in Pennsylvania, designed to deliver an integrated Internet and TV bundle. Verizon Wireless said it launched LTE wireless service (for data, not voice) on Sunday, December 5, in 38 markets where more than 110 million Americans live.[59]
  • On May 6, 2011, Sri Lanka Telecom Mobitel demonstrated 4G LTE for the first time in South Asia, achieving a data rate of 96 Mbit/s in Sri Lanka.[60]

Carrier adoption timeline


Most carriers supporting GSM or HSUPA networks can be expected to upgrade their networks to LTE at some stage. A complete list of commercial contracts is available.[61]

  • August 2009: Telefónica selected six countries to field-test LTE in the succeeding months: Spain, the United Kingdom, Germany and the Czech Republic in Europe, and Brazil and Argentina in Latin America.[62]
  • On November 24, 2009, Telecom Italia announced the first outdoor pre-commercial trial in the world, deployed in Torino and fully integrated into the 2G/3G network then in service.[63]
  • On December 14, 2009, the world's first publicly available LTE service was opened by TeliaSonera in the two Scandinavian capitals Stockholm and Oslo.
  • On May 28, 2010, Russian operator Scartel announced the launch of an LTE network in Kazan by the end of 2010.[64]
  • On October 6, 2010, Canadian provider Rogers Communications Inc announced that Ottawa, Canada's national capital, would be the site of LTE trials. Rogers said it will expand on this testing and move to a comprehensive technical trial of LTE on both low- and high-band frequencies across the Ottawa area.[65]
  • On May 6, 2011, Sri Lanka Telecom Mobitel successfully demonstrated 4G LTE for the first time in South Asia, achieving a data rate of 96 Mbit/s in Sri Lanka.[66]
  • On May 7, 2011, Sri Lankan Mobile Operator Dialog Axiata PLC switched on the first pilot 4G LTE Network in South Asia with vendor partner Huawei and demonstrated a download data speed up to 127 Mbit/s.[67]
  • On February 9, 2012, Telus Mobility launched its LTE service, initially in metropolitan areas including Vancouver, Calgary, Edmonton, Toronto and the Greater Toronto Area, Kitchener, Waterloo, Hamilton, Guelph, Belleville, Ottawa, Montreal, Québec City, Halifax and Yellowknife.[68]
  • Telus Mobility has announced that it will adopt LTE as its 4G wireless standard.[69]
  • Cox Communications built its first tower for its wireless LTE network build-out.[70] Wireless services launched in late 2009.
  • In March 2019, the Global Mobile Suppliers Association reported that there were 717 operators with commercially launched LTE networks (broadband fixed wireless access and/or mobile).[71]

The following is a list of the top 10 countries/territories by 4G LTE coverage as measured by OpenSignal.com in February/March 2019.[72][73]

Rank Country/Territory Penetration
1  South Korea 97.5%
2  Japan 96.3%
3  Norway 95.5%
4  Hong Kong 94.1%
5  United States 93.0%
6  Netherlands 92.8%
7  Taiwan 92.8%
8  Hungary 91.4%
9  Sweden 91.1%
10  India 90.9%

For the complete list of all the countries/territories, see list of countries by 4G LTE penetration.

LTE-TDD and LTE-FDD


Long-Term Evolution Time-Division Duplex (LTE-TDD), also referred to as TDD LTE, is a 4G telecommunications technology and standard co-developed by an international coalition of companies, including China Mobile, Datang Telecom, Huawei, ZTE, Nokia Solutions and Networks, Qualcomm, Samsung, and ST-Ericsson. It is one of the two mobile data transmission technologies of the Long-Term Evolution (LTE) technology standard, the other being Long-Term Evolution Frequency-Division Duplex (LTE-FDD). While some companies refer to LTE-TDD as "TD-LTE" for familiarity with TD-SCDMA, there is no reference to that abbreviation anywhere in the 3GPP specifications.[74][75][76]

There are two major differences between LTE-TDD and LTE-FDD: how data is uploaded and downloaded, and what frequency spectra the networks are deployed in. While LTE-FDD uses paired frequencies to upload and download data,[77] LTE-TDD uses a single frequency, alternating between uploading and downloading data over time.[78][79] The ratio between uploads and downloads on an LTE-TDD network can be changed dynamically, depending on whether more data needs to be sent or received.[80] LTE-TDD and LTE-FDD also operate on different frequency bands,[81] with LTE-TDD working better at higher frequencies, and LTE-FDD working better at lower frequencies.[82] Frequencies used for LTE-TDD range from 1850 MHz to 3800 MHz, with several different bands being used.[83] The LTE-TDD spectrum is generally cheaper to access and has less traffic.[81] Further, the bands for LTE-TDD overlap with those used for WiMAX, which can easily be upgraded to support LTE-TDD.[81]
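A minimal sketch of the duplexing difference described above, using hypothetical spectrum figures (2 × 20 MHz paired for FDD versus a 40 MHz unpaired block for TDD) and treating capacity as simply proportional to the spectrum or time share given to each direction:

```python
# Illustrative comparison of FDD and TDD capacity splitting. The numbers are
# hypothetical: 2 x 20 MHz paired spectrum for FDD versus a single 40 MHz
# unpaired block for TDD, with capacity assumed proportional to the share of
# spectrum (FDD) or time (TDD) given to each direction.

def fdd_split(paired_mhz: float) -> tuple[float, float]:
    """FDD: one carrier per direction, so the split is fixed."""
    return paired_mhz, paired_mhz

def tdd_split(unpaired_mhz: float, dl_fraction: float) -> tuple[float, float]:
    """TDD: one carrier shared in time; dl_fraction of the time goes to downlink."""
    return unpaired_mhz * dl_fraction, unpaired_mhz * (1.0 - dl_fraction)

if __name__ == "__main__":
    print("FDD 2x20 MHz      -> DL/UL resource:", fdd_split(20.0))
    print("TDD 40 MHz, 50/50 -> DL/UL resource:", tdd_split(40.0, 0.5))
    print("TDD 40 MHz, 75/25 -> DL/UL resource:", tdd_split(40.0, 0.75))  # data-heavy downlink
```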

Despite the differences in how the two types of LTE handle data transmission, LTE-TDD and LTE-FDD share 90 percent of their core technology, making it possible for the same chipsets and networks to use both versions of LTE.[81][84] A number of companies produce dual-mode chips or mobile devices, including Samsung and Qualcomm,[85][86] while operators CMHK and Hi3G Access have developed dual-mode networks in Hong Kong and Sweden, respectively.[87]

History of LTE-TDD


The creation of LTE-TDD involved a coalition of international companies that worked to develop and test the technology.[88] China Mobile was an early proponent of LTE-TDD,[81][89] along with other companies like Datang Telecom[88] and Huawei, which worked to deploy LTE-TDD networks and later developed technology allowing LTE-TDD equipment to operate in white spaces (frequency spectra between broadcast TV stations).[75][90] Intel also participated in the development, setting up an LTE-TDD interoperability lab with Huawei in China,[91] as did ST-Ericsson,[81] Nokia,[81] and Nokia Siemens (now Nokia Solutions and Networks),[75] which developed LTE-TDD base stations that increased capacity by 80 percent and coverage by 40 percent.[92] Qualcomm also participated, developing the world's first multi-mode chip, combining both LTE-TDD and LTE-FDD, along with HSPA and EV-DO.[86] Accelleran, a Belgian company, has also worked to build small cells for LTE-TDD networks.[93]

Trials of LTE-TDD technology began as early as 2010, with Reliance Industries and Ericsson India conducting field tests of LTE-TDD in India, achieving 80 megabit-per second download speeds and 20 megabit-per-second upload speeds.[94] By 2011, China Mobile began trials of the technology in six cities.[75]

Although initially seen as a technology utilized by only a few countries, including China and India,[95] by 2011 international interest in LTE-TDD had expanded, especially in Asia, in part due to LTE-TDD's lower cost of deployment compared to LTE-FDD.[75] By the middle of that year, 26 networks around the world were conducting trials of the technology.[76] The Global TD-LTE Initiative (GTI) was also started in 2011, with founding partners China Mobile, Bharti Airtel, SoftBank Mobile, Vodafone, Clearwire, Aero2 and E-Plus.[96] In September 2011, Huawei announced it would partner with Polish mobile provider Aero2 to develop a combined LTE-TDD and LTE-FDD network in Poland,[97] and by April 2012, ZTE Corporation had worked to deploy trial or commercial LTE-TDD networks for 33 operators in 19 countries.[87] In late 2012, Qualcomm worked extensively to deploy a commercial LTE-TDD network in India, and partnered with Bharti Airtel and Huawei to develop the first multi-mode LTE-TDD smartphone for India.[86]

In Japan, SoftBank Mobile launched LTE-TDD services in February 2012 under the name Advanced eXtended Global Platform (AXGP), and marketed as SoftBank 4G (ja). The AXGP band was previously used for Willcom's PHS service, and after PHS was discontinued in 2010 the PHS band was re-purposed for AXGP service.[98][99]

In the U.S., Clearwire planned to implement LTE-TDD, with chip-maker Qualcomm agreeing to support Clearwire's frequencies on its multi-mode LTE chipsets.[100] With Sprint's acquisition of Clearwire in 2013,[77][101] the carrier began using these frequencies for LTE service on networks built by Samsung, Alcatel-Lucent, and Nokia.[102][103]

As of March 2013, 156 commercial 4G LTE networks existed, including 142 LTE-FDD networks and 14 LTE-TDD networks.[88] As of November 2013, the South Korean government planned to allow a fourth wireless carrier in 2014, which would provide LTE-TDD services,[79] and in December 2013, LTE-TDD licenses were granted to China's three mobile operators, allowing commercial deployment of 4G LTE services.[104]

In January 2014, Nokia Solutions and Networks indicated that it had completed a series of tests of voice over LTE (VoLTE) calls on China Mobile's TD-LTE network.[105] The next month, Nokia Solutions and Networks and Sprint announced that they had demonstrated throughput speeds of 2.6 gigabits per second using an LTE-TDD network, surpassing the previous record of 1.6 gigabits per second.[106]

Features


Much of the LTE standard addresses the upgrading of 3G UMTS to what will eventually be 4G mobile communications technology. A large amount of the work is aimed at simplifying the architecture of the system, as it transitions from the existing UMTS circuit + packet switching combined network to an all-IP flat architecture system. E-UTRA is the air interface of LTE. Its main features are:

  • Peak download rates up to 299.6 Mbit/s and upload rates up to 75.4 Mbit/s depending on the user equipment category (with 4×4 antennas using 20 MHz of spectrum). Five different terminal classes have been defined from a voice-centric class up to a high-end terminal that supports the peak data rates. All terminals will be able to process 20 MHz bandwidth.
  • Low data transfer latencies (sub-5 ms latency for small IP packets in optimal conditions), lower latencies for handover and connection setup time than with previous radio access technologies.
  • Improved support for mobility exemplified by support for terminals moving at up to 350 km/h (220 mph) or 500 km/h (310 mph) depending on the frequency
  • Orthogonal frequency-division multiple access for the downlink, Single-carrier FDMA for the uplink to conserve power.
  • Support for both FDD and TDD communication systems as well as half-duplex FDD with the same radio access technology.
  • Support for all frequency bands currently used by IMT systems by ITU-R.
  • Increased spectrum flexibility: 1.4 MHz, 3 MHz, 5 MHz, 10 MHz, 15 MHz and 20 MHz wide cells are standardized. (W-CDMA, by contrast, supports only 5 MHz slices, which complicated roll-outs in countries where 5 MHz allocations were frequently already in use by legacy standards such as 2G GSM and cdmaOne.)
  • Support for cell sizes from tens of metres radius (femto and picocells) up to 100 km (62 miles) radius macrocells. In the lower frequency bands to be used in rural areas, 5 km (3.1 miles) is the optimal cell size, 30 km (19 miles) having reasonable performance, and up to 100 km cell sizes supported with acceptable performance. In the city and urban areas, higher frequency bands (such as 2.6 GHz in the EU) are used to support high-speed mobile broadband. In this case, cell sizes may be 1 km (0.62 miles) or even less.
  • Support of at least 200 active data clients (connected users) in every 5 MHz cell.[107]
  • Simplified architecture: The network side of E-UTRAN is composed only of eNode Bs.
  • Support for inter-operation and co-existence with legacy standards (e.g., GSM/EDGE, UMTS and CDMA2000). Users can start a call or transfer of data in an area using an LTE standard, and, should coverage be unavailable, continue the operation without any action on their part using GSM/GPRS or W-CDMA-based UMTS or even 3GPP2 networks such as cdmaOne or CDMA2000.
  • Uplink and downlink Carrier aggregation.
  • Packet-switched radio interface.
  • Support for MBSFN (multicast-broadcast single-frequency network). This feature can deliver services such as Mobile TV using the LTE infrastructure and is a competitor to DVB-H-based TV broadcast; only LTE-compatible devices can receive the signal.

Voice calls

LTE CSFB to GSM/UMTS network interconnects

The LTE standard supports only packet switching with its all-IP network. Voice calls in GSM, UMTS, and CDMA2000 are circuit switched, so with the adoption of LTE, carriers will have to re-engineer their voice call network.[108] Four different approaches sprang up:

Voice over LTE (VoLTE)
Circuit-switched fallback (CSFB)
In this approach, LTE just provides data services, and when a voice call is to be initiated or received, it will fall back to the circuit-switched domain. When using this solution, operators just need to upgrade the MSC instead of deploying the IMS, and therefore, can provide services quickly. However, the disadvantage is a longer call setup delay.
Simultaneous voice and LTE (SVLTE)
In this approach, the handset works simultaneously in the LTE and circuit-switched modes, with the LTE mode providing data services and the circuit-switched mode providing the voice service. This is a solution solely based on the handset, which does not have special requirements on the network and does not require the deployment of IMS either. The disadvantage of this solution is that the phone can become expensive with high power consumption.
Single Radio Voice Call Continuity (SRVCC)

One additional approach that is not initiated by operators is the usage of over-the-top content (OTT) services, using applications like Skype and Google Talk to provide LTE voice service.[109]

Most major backers of LTE preferred and promoted VoLTE from the beginning. The lack of software support in initial LTE devices, as well as core network devices, however, led to a number of carriers promoting VoLGA (Voice over LTE Generic Access) as an interim solution.[110] The idea was to use the same principles as GAN (Generic Access Network, also known as UMA or Unlicensed Mobile Access), which defines the protocols through which a mobile handset can perform voice calls over a customer's private Internet connection, usually over wireless LAN. VoLGA however never gained much support, because VoLTE (IMS) promises much more flexible services, albeit at the cost of having to upgrade the entire voice call infrastructure. VoLTE may require Single Radio Voice Call Continuity (SRVCC) to be able to smoothly perform a handover to a 2G or 3G network in case of poor LTE signal quality.[111]

While the industry has standardized on VoLTE, early LTE deployments required carriers to introduce circuit-switched fallback as a stopgap measure. When placing or receiving a voice call on a non-VoLTE-enabled network or device, LTE handsets will fall back to old 2G or 3G networks for the duration of the call.
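The fallback behaviour described above can be summarized as a simple decision rule. The sketch below is illustrative only; the capability flags and returned labels are hypothetical and do not correspond to any carrier or 3GPP-defined interface.

```python
# Minimal sketch of the voice-call routing decision described above. The
# device/network capability flags and the returned labels are illustrative,
# not an actual carrier or 3GPP-defined API.

from dataclasses import dataclass

@dataclass
class CallContext:
    device_supports_volte: bool
    network_supports_volte: bool   # IMS deployed and VoLTE enabled on this cell
    legacy_2g_3g_available: bool   # GSM/UMTS/CDMA coverage for fallback

def route_voice_call(ctx: CallContext) -> str:
    """Pick how a voice call is carried for an LTE-attached handset."""
    if ctx.device_supports_volte and ctx.network_supports_volte:
        return "VoLTE: packet-switched call over the LTE/IMS network"
    if ctx.legacy_2g_3g_available:
        # Circuit-switched fallback: drop to 2G/3G for the duration of the call.
        return "CSFB: fall back to a circuit-switched 2G/3G network"
    return "No voice service available"

if __name__ == "__main__":
    print(route_voice_call(CallContext(True, True, True)))
    print(route_voice_call(CallContext(True, False, True)))
```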

Enhanced voice quality


To ensure compatibility, 3GPP demands at least the AMR-NB codec (narrowband), but the recommended speech codec for VoLTE is Adaptive Multi-Rate Wideband, also known as HD Voice. This codec is mandated in 3GPP networks that support 16 kHz sampling.[112]

Fraunhofer IIS has proposed and demonstrated "Full-HD Voice", an implementation of the AAC-ELD (Advanced Audio Coding – Enhanced Low Delay) codec for LTE handsets.[113] Where previous cell phone voice codecs only supported frequencies up to 3.5 kHz and upcoming wideband audio services branded as HD Voice up to 7 kHz, Full-HD Voice supports the entire bandwidth range from 20 Hz to 20 kHz. For end-to-end Full-HD Voice calls to succeed, however, both the caller's and recipient's handsets, as well as networks, have to support the feature.[114]
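The relationship between these audio bandwidths and the underlying sampling rates follows from the Nyquist limit (half the sampling rate), with real codecs band-limiting somewhat below it. A small sketch, using the standard sampling rates for each codec family and repeating the bandwidth figures quoted above:

```python
# Rough illustration of why wider codec sampling rates allow wider voice
# bandwidth: the Nyquist limit caps representable audio at half the sampling
# rate, and real codecs band-limit somewhat below that. Sampling rates are the
# standard ones for each codec family; the nominal-bandwidth column repeats
# the figures quoted in the text.

CODECS = [
    # (name, sampling rate in Hz, nominal upper audio bandwidth in Hz)
    ("AMR-NB (narrowband)",      8_000,  3_400),
    ("AMR-WB (HD Voice)",       16_000,  7_000),
    ("AAC-ELD (Full-HD Voice)", 48_000, 20_000),
]

for name, fs, nominal_bw in CODECS:
    nyquist = fs / 2
    print(f"{name:26s} fs={fs:>6} Hz  Nyquist limit={nyquist:>7.0f} Hz  "
          f"nominal bandwidth={nominal_bw:>6} Hz")
```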

Frequency bands


The LTE standard covers a range of many different bands, each of which is designated by both a frequency and a band number:

  • North America – 600, 700, 850, 1700, 1900, 2300, 2500, 2600, 3500, 5000 MHz (bands 2, 4, 5, 7, 12, 13, 14, 17, 25, 26, 28, 29, 30, 38, 40, 41, 42, 43, 46, 48, 66, 71)
  • Central America, South America and the Caribbean – 600, 700, 800, 850, 900, 1700, 1800, 1900, 2100, 2300, 2500, 2600, 3500, 5000 MHz (bands 1, 2, 3, 4, 5, 7, 8, 12, 13, 14, 17, 20, 25, 26, 28, 29, 38, 40, 41, 42, 43, 46, 48, 66, 71)
  • Europe – 450, 700, 800, 900, 1500, 1800, 2100, 2300, 2600, 3500, 3700 MHz (bands 1, 3, 7, 8, 20, 22, 28, 31, 32, 38, 40, 42, 43)[115][116]
  • Asia – 450, 700, 800, 850, 900, 1500, 1800, 1900, 2100, 2300, 2500, 2600, 3500 MHz (bands 1, 3, 5, 7, 8, 11, 18, 19, 20, 21, 26, 28, 31, 38, 39, 40, 41, 42)[117]
  • Africa – 700, 800, 850, 900, 1800, 2100, 2300, 2500, 2600 MHz (bands 1, 3, 5, 7, 8, 20, 28, 40, 41)[citation needed]
  • Oceania (incl. Australia[118][119] and New Zealand[120]) – 700, 850, 900, 1800, 2100, 2300, 2600 MHz (bands 1, 3, 5, 7, 8, 28, 40)

As a result, phones from one country may not work in other countries. Users will need a multi-band capable phone for roaming internationally.
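The roaming constraint amounts to a set intersection between the bands a handset supports and the bands deployed in a given country. The sketch below uses small illustrative band sets; the handset list is hypothetical, and the country lists are incomplete subsets of the allocations above.

```python
# Sketch of the multi-band roaming point above: a phone can use LTE in a
# country only if it supports at least one band deployed there. The band sets
# below are small illustrative subsets, not complete national allocations.

PHONE_BANDS = {1, 3, 5, 7, 8, 20, 28}          # hypothetical handset support

COUNTRY_BANDS = {
    "Germany":       {1, 3, 7, 8, 20, 28, 32},
    "United States": {2, 4, 5, 12, 13, 66, 71},
    "Japan":         {1, 3, 8, 11, 18, 19, 28, 41},
}

for country, bands in COUNTRY_BANDS.items():
    usable = sorted(PHONE_BANDS & bands)
    status = f"usable on bands {usable}" if usable else "no common band: LTE roaming unavailable"
    print(f"{country:14s}: {status}")
```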

Patents


According to the European Telecommunications Standards Institute's (ETSI) intellectual property rights (IPR) database, about 50 companies had declared, as of March 2012, holding essential patents covering the LTE standard.[121] ETSI has, however, made no investigation into the correctness of the declarations,[121] so "any analysis of essential LTE patents should take into account more than ETSI declarations."[122] Independent studies have found that about 3.3 to 5 percent of all revenues from handset manufacturers are spent on standard-essential patents. This is less than the combined published rates, due to reduced-rate licensing agreements such as cross-licensing.[123][124][125]

from Grokipedia
Long-Term Evolution (LTE) is a wireless communication standard developed by the 3rd Generation Partnership Project (3GPP) for providing high-speed access to mobile devices and data terminals, serving as the evolution of the Universal Mobile Telecommunications System (UMTS) toward an all-IP architecture. It aims to deliver enhanced spectral efficiency, user-plane latency below 5 milliseconds, and peak data rates of up to 100 Mbps in the downlink and 50 Mbps in the uplink for typical device categories, while supporting diverse quality-of-service (QoS) requirements for different services. LTE employs orthogonal frequency-division multiple access (OFDMA) in the downlink and single-carrier frequency-division multiple access (SC-FDMA) in the uplink, enabling efficient spectrum use across frequency-division duplex (FDD) and time-division duplex (TDD) modes.

The LTE standardization effort began in December 2004, when 3GPP launched a project to define requirements for the long-term evolution of radio access, focusing on improvements in performance, cost, and simplicity compared to prior generations. Initial specifications were finalized in Release 8 in December 2008, introducing the Evolved Packet System (EPS) comprising the Evolved Universal Terrestrial Radio Access Network (E-UTRAN) and Evolved Packet Core (EPC). Subsequent releases, starting with Release 9 (2010) and continuing through Release 11 (2012), added enhancements such as multimedia broadcast multicast services (MBMS) support in Release 9, carrier aggregation in Release 10, and coordinated multipoint transmission in Release 11. The first commercial LTE network was launched by the end of 2009, marking the rapid transition from standardization to real-world implementation.

At its core, LTE's architecture features E-UTRAN, consisting of evolved Node Bs (eNBs) that handle radio resource management, scheduling, and handover functions, interconnected via the X2 interface and linked to the EPC through the S1 interface. The EPC includes logical nodes like the Mobility Management Entity (MME) for control-plane functions, the Serving Gateway (S-GW) for user-plane routing, and the Packet Data Network Gateway (P-GW) for external network connectivity, enabling seamless mobility across radio access technologies and efficient packet routing. Notable features include support for guaranteed bit rate (GBR) and non-GBR bearers with QoS class identifiers (QCIs), discontinuous reception (DRX) for power saving, automatic neighbor relation (ANR) for self-optimization, and enhanced inter-cell interference coordination (eICIC) to boost capacity in dense deployments. These elements enable low-latency voice over LTE (VoLTE) via the IP Multimedia Subsystem (IMS), multicast capabilities, and positioning services, while prioritizing cost-effective flat IP networking.

By June 2025, LTE had achieved widespread global adoption, with 706 commercial networks operating across more than 170 countries, underpinning billions of connections and serving as a critical backbone for mobile broadband even amid the rise of 5G. Ongoing evolutions, such as LTE-Advanced Pro in Releases 13–15, have extended capabilities to include licensed-assisted access (LAA) for unlicensed spectrum integration and enhanced vehicle-to-everything (V2X) support, maintaining LTE's relevance for IoT applications and rural coverage. As 5G networks expand, LTE continues to provide fallback and non-standalone (NSA) anchoring, ensuring hybrid ecosystems that balance performance, coverage, and investment efficiency worldwide.

Introduction and Terminology

Overview

Long Term Evolution (LTE) is a standard for wireless broadband communication designed for mobile devices and data terminals, developed by the 3rd Generation Partnership Project (3GPP) as the core technology for 4G networks. It represents an evolution from earlier cellular standards, aiming to deliver enhanced performance for packet-based services through optimized radio access and core network designs. The primary goals of LTE include achieving peak data rates of up to 100 Mbps in the downlink and 50 Mbps in the uplink for low-mobility users, with user-plane latency reduced to under 10 ms and control-plane latency below 100 ms. These targets also encompass improved spectral efficiency, targeting 3 to 4 times that of High-Speed Downlink Packet Access (HSDPA) systems, to support higher capacity and better utilization of available spectrum.

Architecturally, LTE features a flat, all-IP-based network that simplifies the structure compared to prior generations, comprising the Evolved Universal Terrestrial Radio Access Network (E-UTRAN) for radio access via evolved Node Bs (eNodeBs) and the Evolved Packet Core (EPC) for packet routing, mobility management, and connectivity to external networks. In contrast to the 3G Universal Mobile Telecommunications System (UMTS), which combined circuit-switched domains for voice with packet-switched domains for data, LTE adopts a fully packet-switched, IP-centric approach to streamline operations and enhance efficiency for data-dominant services. Globally, LTE has been instrumental in advancing mobile broadband capabilities, enabling widespread adoption of data-intensive applications such as video streaming and Internet of Things (IoT) connectivity by providing scalable, high-throughput access worldwide.

Key Terminology

In LTE networks, the eNodeB (evolved Node B) functions as the primary base station within the E-UTRAN, managing radio resource allocation, admission control, and scheduling for user equipment in both the uplink and downlink directions. Unlike the Node B in 3G UMTS systems, which relies on a separate Radio Network Controller (RNC) for higher-layer radio functions, the eNodeB integrates these responsibilities to enable a flatter, more efficient architecture without centralized control elements. The UE (User Equipment) refers to any device, such as a smartphone or data terminal, that connects to the LTE network via the air interface, performing measurements and reporting channel conditions to the eNodeB. The EPS (Evolved Packet System) encompasses the complete end-to-end framework for packet-switched services in LTE, comprising the E-UTRAN for radio access and the EPC (Evolved Packet Core) for core network functions like bearer management and policy enforcement.

Within the LTE protocol stack, the PDCP (Packet Data Convergence Protocol) operates at the top of the user-plane layers in both the UE and eNodeB, providing header compression, ciphering, and integrity protection for robust data transfer. The RLC (Radio Link Control) sublayer below PDCP handles segmentation and reassembly of data units, along with error correction through automatic repeat request mechanisms, ensuring reliable delivery across the radio link. The MAC (Medium Access Control) layer manages multiplexing of logical channels onto transport channels, logical channel prioritization, and random access procedures to coordinate multiple UEs sharing radio resources. At the bottom, the PHY (Physical Layer) deals with the actual transmission and reception of radio signals, including modulation, coding, and mapping of transport channels such as the downlink shared channel (DL-SCH) onto physical channels.

Key performance indicators in LTE include the CQI (Channel Quality Indicator), which the UE reports to the eNodeB to convey current downlink channel conditions, aiding in adaptive modulation and coding for optimal throughput. The HARQ (Hybrid Automatic Repeat Request) process combines forward error correction with retransmissions to enhance reliability, operating asynchronously in the downlink and synchronously in the uplink between the UE and eNodeB. Interconnection in the LTE architecture relies on the S1 interface, which links each eNodeB to the EPC for both signaling (S1-MME to the Mobility Management Entity) and user-plane data (S1-U to the Serving Gateway), supporting many-to-many relationships across network elements. The X2 interface enables direct communication between eNodeBs for functions such as handover coordination, load balancing, and interference management, facilitating seamless intra-E-UTRAN mobility without core network involvement.
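The user-plane stack described above can be summarized as a simple ordered list of layers and responsibilities. The snippet below is purely descriptive, not a protocol implementation:

```python
# Descriptive sketch of the LTE user-plane protocol stack ordering described
# above (top to bottom). This only captures layer names and responsibilities;
# it is not a protocol implementation.

LTE_USER_PLANE_STACK = [
    ("PDCP", "header compression, ciphering, integrity protection"),
    ("RLC",  "segmentation/reassembly, ARQ error correction"),
    ("MAC",  "multiplexing, prioritization, HARQ, random access"),
    ("PHY",  "modulation, coding, mapping onto physical channels"),
]

def describe_stack(stack):
    for depth, (layer, role) in enumerate(stack):
        print(f"{'  ' * depth}{layer}: {role}")

describe_stack(LTE_USER_PLANE_STACK)
```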

Historical Development

Standardization Process

The 3rd Generation Partnership Project (3GPP) played a central role in developing Long Term Evolution (LTE) standards, initiating work in December 2004 through a study item for an evolved Universal Terrestrial Radio Access Network (E-UTRAN) and an all-IP-based Evolved Packet Core (EPC). This effort aimed to create a packet-switched, non-voice baseline for 4G, distinct from legacy circuit-switched systems. LTE specifications were first completed in Release 8, frozen in December 2008, which established the foundational radio access technology (RAT) for high-speed data services while ensuring interworking with existing networks via the EPC.

Subsequent releases built incrementally on this foundation. Release 9, frozen in March 2010, introduced minor enhancements such as support for additional spectrum bands (e.g., 800 MHz and 1500 MHz), evolved self-organizing networks (SON), and initial multimedia broadcast/multicast service (eMBMS) capabilities, all while maintaining compatibility with Release 8 deployments. Release 10, frozen in March 2011, defined LTE-Advanced as the first iteration to meet International Mobile Telecommunications-Advanced (IMT-Advanced) criteria, enabling higher peak data rates and improved efficiency through features like carrier aggregation. The International Telecommunication Union Radiocommunication Sector (ITU-R) certified LTE-Advanced under Release 10 as an IMT-Advanced technology in October 2010, allowing the "4G" label for compliant systems.

Key technical specifications for LTE are detailed in the 3GPP TS 36 series documents, which encompass the E-UTRAN radio interface, including physical layer procedures (TS 36.2xx), Radio Resource Control (RRC) protocols (TS 36.331), and overall architecture (TS 36.300), as well as aspects of the EPC core network and associated protocols. These specifications address critical challenges in network evolution, such as interworking with legacy systems through handover mechanisms and shared core network elements, and spectrum flexibility via scalable bandwidth options from 1.4 MHz to 20 MHz in various frequency bands.

Carrier Adoption and Deployment Timeline

The rollout of LTE networks commenced with pioneering efforts in Sweden and Norway. In early 2009, TeliaSonera conducted the world's first LTE pilots in Stockholm, Sweden, and Oslo, Norway, marking the initial real-world testing of the technology. This was followed by the commercial launch of the first LTE network on December 14, 2009, in Stockholm and Oslo, supplied with Ericsson and Huawei equipment. In the United States, MetroPCS achieved a milestone as the first carrier to deploy commercial LTE services on September 21, 2010, initially in Las Vegas, Nevada, enabling download speeds up to 12 Mbps for customers. These early adoptions were propelled by favorable spectrum allocations, notably the U.S. Federal Communications Commission's 2008 Auction 73 of the 700 MHz band, where Verizon Wireless acquired licenses specifically earmarked for advanced wireless technologies like LTE, facilitating broader coverage, including indoors.

The year 2011 represented a pivotal expansion phase, as carriers worldwide accelerated deployments amid maturing device ecosystems and regulatory support. The introduction of the first LTE handsets, such as the Samsung SCH-r900 in September 2010, catalyzed demand by enabling LTE access on consumer devices. In Europe, commercial LTE services launched in late 2010; Vodafone initiated LTE services in Germany in December 2010, achieving approximately 35% household coverage by mid-year through upgrades to over 2,700 base stations. Deutsche Telekom, having piloted LTE earlier, significantly scaled its network, introducing the "LTE Speed" option for smartphones with download rates up to 100 Mbps across major cities. In Japan, NTT Docomo expanded its Xi LTE service, launched in December 2010, reaching over 2 million subscribers by March 2012 and covering key urban areas. In China, the major operators conducted large-scale TD-LTE and FDD-LTE pilots across dozens of cities, setting the stage for commercial services: China Mobile's TD-LTE rollout began in December 2013 in 16 cities, while the other operators followed with TD-LTE in 2014.

By 2025, LTE had attained near-universal coverage in developed markets, with networks accessible to 93% of the global population and over 99% in leading regions. In emerging markets, expansions continued with a focus on rural connectivity; in Africa, operators invested in LTE to bridge urban-rural divides, with Sub-Saharan coverage reaching 80% and ongoing efforts targeting remote areas through satellite partnerships and spectrum efficiency. Similarly, 37 operators across 18 countries advanced LTE alongside 5G trials, emphasizing rural deployments to serve underserved populations, supported by $62 billion in projected investments through 2030. Globally, LTE connections exceeded 5 billion by 2025, reflecting its role as the backbone of mobile connectivity according to industry data.

Operational Modes

Frequency Division Duplex (FDD)

Frequency Division Duplex (FDD) is a duplexing scheme employed in LTE networks where uplink and downlink transmissions occur simultaneously on separate, paired frequency bands, allowing for independent and concurrent communication paths. This approach contrasts with time-based methods by leveraging frequency separation to avoid signal overlap, as defined in 3GPP Release 8 specifications for Evolved Universal Terrestrial Radio Access (E-UTRA). In operation, FDD LTE utilizes guard bands and emission controls to mitigate interference between the uplink and downlink channels; for instance, unwanted emissions are restricted within 10 MHz of the operating band edges to prevent adjacent channel leakage. Channel bandwidths typically range from 1.4 MHz to 20 MHz, scalable to accommodate various spectrum allocations while maintaining orthogonality through orthogonal frequency-division multiple access (OFDMA) for the downlink and single-carrier frequency-division multiple access (SC-FDMA) for the uplink.

FDD offers advantages in scenarios with symmetric traffic patterns, such as voice communications, by enabling constant bidirectional flow without time-slot switching, which results in lower latency compared to time-division alternatives suited to asymmetric loads. This makes it particularly effective for real-time applications like VoLTE, where consistent uplink and downlink performance is essential. As the primary duplexing mode for early LTE deployments, FDD facilitated rapid commercialization starting in 2009, with widespread adoption in paired spectrum bands such as Band 1 (2100 MHz) for global operators and Band 13 (700 MHz) for U.S. carriers like Verizon, as well as Band 17 (700 MHz) for AT&T. The lower-frequency bands enhanced coverage in initial rollouts, supporting the transition from 3G networks. A key limitation of FDD is its spectrum inefficiency when applied to unpaired allocations, as it requires symmetric pairing of frequency blocks for uplink and downlink, potentially underutilizing resources in scenarios with imbalanced traffic demands.

Time Division Duplex (TDD)

Time Division Duplex (TDD) in LTE enables uplink and downlink transmissions to share the same frequency band by allocating distinct time slots, allowing flexible ratios such as 3:1 downlink-to-uplink to accommodate asymmetric traffic patterns. This approach contrasts with Frequency Division Duplex (FDD), which separates uplink and downlink using paired bands for symmetric operations. LTE TDD was introduced in 3GPP Release 8 in 2008 alongside FDD, featuring a dedicated frame structure (Type 2) to support time-based duplexing, and it drew influences from the earlier TD-SCDMA standard developed for unpaired spectrum in China. The standardization integrated TD-SCDMA's time-division concepts into LTE to enable efficient use of unpaired bands, with the first specifications frozen by the end of 2008.

The LTE TDD frame structure consists of 10 ms radio frames divided into 1 ms subframes, configurable as downlink (D), uplink (U), or special (S) subframes for switching between directions. Seven uplink-downlink configurations (0 through 6) define the subframe patterns with varying ratios, from uplink-heavy configuration 0 (DSUUUDSUUU) to heavily downlink-weighted configuration 5 (DSUDDDDDDD). Special subframes, detailed in nine configurations, include a downlink pilot time slot (DwPTS), a guard period (GP) for switching, and an uplink pilot time slot (UpPTS), with lengths varying by cyclic prefix type; for instance, special subframe configuration 1 allocates 9 OFDM symbols to DwPTS, 4 to GP, and 1 to UpPTS under the normal cyclic prefix. These setups support 5 ms or 10 ms switch-point periodicity, and TDD networks require synchronization across cells, often via GPS or IEEE 1588, to align frames and mitigate interference.

A key advantage of LTE TDD is its flexibility in adjusting downlink-to-uplink ratios to match data-intensive applications like video streaming, which typically demand more downlink capacity. It is particularly efficient for unpaired spectrum, enabling deployments in bands such as 2.3 GHz in India and China or 2.5 GHz used by Sprint in the United States, where contiguous blocks support higher capacity without needing paired allocations. However, LTE TDD faces challenges from cross-link interference, where downlink transmissions from one cell overlap with uplink receptions in adjacent cells due to timing misalignment. This is addressed through Inter-Cell Interference Coordination (ICIC), which coordinates resource allocation and power levels across base stations to reduce interference, as specified in Release 8 and enhanced in later releases. Synchronized networks further support ICIC by ensuring frame alignment, minimizing base-station-to-base-station and user-equipment-to-user-equipment interference in dense deployments.
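The seven uplink-downlink configurations are compact enough to encode directly. The sketch below lists the standard subframe patterns (D = downlink, U = uplink, S = special) and counts the resulting split per 10 ms frame; it deliberately ignores the DwPTS/GP/UpPTS internals of the special subframe:

```python
# Sketch of the seven LTE TDD uplink-downlink configurations (frame structure
# type 2). Each string covers one 10 ms frame of ten 1 ms subframes:
# D = downlink, U = uplink, S = special (switching) subframe.
# Counting subframe types gives the nominal DL:UL split per configuration.

TDD_CONFIGS = {
    0: "DSUUUDSUUU",
    1: "DSUUDDSUUD",
    2: "DSUDDDSUDD",
    3: "DSUUUDDDDD",
    4: "DSUUDDDDDD",
    5: "DSUDDDDDDD",
    6: "DSUUUDSUUD",
}

for cfg, pattern in TDD_CONFIGS.items():
    dl, ul, sp = (pattern.count(c) for c in "DUS")
    print(f"config {cfg}: {pattern}  ->  {dl} DL, {ul} UL, {sp} special subframes")
```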

Technical Features

Spectral Efficiency and Data Rates

LTE's spectral efficiency represents a major advancement over prior mobile broadband technologies, enabling higher data rates within limited spectrum resources. In the downlink, peak spectral efficiency reaches up to 16.3 bps/Hz using 4-layer spatial multiplexing in Release 8 configurations. This outperforms 3G systems such as HSPA, which typically achieve 2-3 bps/Hz, thanks to the use of OFDMA for downlink transmissions, which optimizes frequency resource allocation and mitigates interference more effectively than CDMA-based schemes. Uplink efficiency benefits from SC-FDMA, which maintains similar gains while reducing the peak-to-average power ratio for better battery life in user equipment.

Theoretical peak data rates for LTE are defined under ideal conditions with a 20 MHz bandwidth. The downlink peaks at 300 Mbps using 4x4 MIMO and 64QAM modulation, while the uplink reaches 75 Mbps with 64QAM. These rates assume error-free transmission, full resource utilization, and maximum modulation order without higher-layer overhead. In real-world deployments, however, downlink throughputs typically range from 20-50 Mbps in early networks, scaling to over 100 Mbps with optimizations like advanced receivers and denser deployments. One-way user-plane latency averages 5-10 ms, aligning closely with the sub-5 ms design target to support responsive applications.

Key factors driving these efficiencies include adaptive modulation schemes, from QPSK for robust coverage to 64QAM (and 256QAM in later releases) for high-rate scenarios, along with variable coding rates and precise resource block assignments. Each resource block spans 12 subcarriers across 7 OFDM symbols per 0.5 ms slot (with the normal cyclic prefix), forming the granular unit for scheduling and allocation. MIMO contributes by enabling multiple parallel streams, briefly noted here as it multiplies effective layers without altering core single-carrier efficiency calculations. Throughput can be fundamentally expressed as R = B × η × L, where R is the data rate in bps, B is the system bandwidth in Hz, η is the spectral efficiency in bps/Hz, and L is the number of spatial layers. This relation highlights how LTE scales performance through bandwidth expansion, efficiency gains from advanced modulation and coding, and layered transmission, though practical values are reduced by factors like control signaling and fading margins.
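A worked example of this relation, using the Release 8 downlink peak figures; the per-layer efficiency value is chosen so that the result reproduces the quoted ~300 Mbit/s, and real links fall short of it because of control overhead and channel conditions:

```python
# Worked example of the throughput relation R = B * eta * L quoted above,
# using the Release 8 downlink peak figures: 20 MHz bandwidth, 64QAM, and
# four spatial layers. The per-layer efficiency value is chosen so that the
# result reproduces the ~300 Mbit/s peak; real links are reduced by control
# overhead and channel conditions.

def peak_rate_bps(bandwidth_hz: float, efficiency_bps_per_hz: float, layers: int) -> float:
    """R = B * eta * L, the idealized peak-rate relation from the text."""
    return bandwidth_hz * efficiency_bps_per_hz * layers

if __name__ == "__main__":
    rate = peak_rate_bps(bandwidth_hz=20e6, efficiency_bps_per_hz=3.75, layers=4)
    print(f"Idealized downlink peak: {rate / 1e6:.0f} Mbit/s")   # -> 300 Mbit/s
```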

Multiple Input Multiple Output (MIMO)

Multiple Input Multiple Output (MIMO) technology in LTE leverages multiple antennas at the base station (eNodeB) and the user equipment (UE) to transmit and receive data simultaneously over the same frequency resources, enabling spatial multiplexing and diversity gains for enhanced capacity and reliability. This approach exploits multipath propagation in wireless channels to create independent spatial streams, significantly boosting overall system performance without requiring additional spectrum. By separating signal streams in the spatial domain, MIMO addresses key LTE goals of higher data rates and improved spectral efficiency, forming a cornerstone of the standard's design from Release 8 onward.

LTE supports a range of MIMO configurations tailored to the downlink and uplink directions, with progressive enhancements across releases. In the downlink, Release 8 introduces support for up to 4×4 configurations, allowing up to four spatial layers for transmission, while LTE-Advanced in Release 10 extends this to 8×8 configurations with eight layers to accommodate higher-order modulation and increased throughput. In the uplink, Release 8 supports single-layer transmission with configurations such as 1×2 (a single transmit antenna at the UE with two receive antennas at the eNodeB) for receive diversity, evolving to spatial multiplexing of up to four layers in Release 10 to enable multi-layer transmission from the UE while maintaining single-carrier properties for power efficiency. These setups are governed by specific transmission modes defined in the specifications: modes 3 and 4 facilitate spatial multiplexing via open-loop (no channel feedback) and closed-loop (with feedback) operation, respectively, while modes 6 through 9 support beamforming and advanced single-user transmission with progressive refinements for non-codebook-based operations and higher antenna ports.

Core techniques in LTE MIMO include precoding, layer mapping, and channel feedback mechanisms to optimize signal transmission. Precoding applies a matrix from a predefined codebook to the spatial layers, rotating the signal to maximize orthogonality and minimize interference at the receiver, particularly in closed-loop modes where UE feedback informs the eNodeB. Layer mapping assigns codewords (modulated data blocks) to one or more spatial layers before precoding, supporting flexible ranks from 1 to the maximum configuration (e.g., two codewords mapped to four layers in 4×4 downlink), ensuring efficient resource utilization across antenna ports. For feedback, Channel State Information Reference Signals (CSI-RS) were introduced in Release 10, providing dedicated downlink pilots for UEs to estimate channel conditions in high-antenna configurations, enabling precise precoding and rank adaptation beyond the limitations of cell-specific reference signals in earlier releases.

The primary benefits of MIMO in LTE stem from its ability to transmit multiple parallel data streams, effectively doubling throughput in a 2×2 setup and up to quadrupling it in 4×4 configurations under favorable conditions, while also enhancing signal quality in multipath-rich environments through diversity gains that combat fading. This results in spectral efficiency improvements, with 2×2 MIMO delivering up to a 2× gain over single-input single-output (SISO) systems by exploiting spatial multiplexing, as validated in performance evaluations for typical urban deployments. MIMO evolved from a single-user (SU-MIMO) focus in Release 8 to multi-user MIMO (MU-MIMO) in Release 9, allowing the eNodeB to serve multiple UEs concurrently on the same time-frequency resources via spatial separation and interference coordination. This extension builds on downlink beamforming to enable space-division multiple access, further amplifying cell capacity in dense scenarios by scheduling orthogonal streams to different users based on channel feedback.
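A minimal numerical sketch of the precoding and layer-mapping step described above: two layers of symbols are multiplied by a 2×2 precoding matrix before being placed on two antenna ports. The scaled-identity matrix used here is only an illustrative codebook-style entry; the actual LTE codebooks are defined in the 3GPP specifications.

```python
# Minimal numerical sketch of the precoding step described above: two spatial
# layers are multiplied by a 2x2 precoding matrix before being sent on two
# antenna ports. The scaled-identity matrix used here is illustrative only.
import numpy as np

# Two QPSK-like symbols per layer (layer mapping: one codeword per layer here).
layers = np.array([
    [1 + 1j, -1 + 1j],   # layer 0 symbols
    [1 - 1j, -1 - 1j],   # layer 1 symbols
]) / np.sqrt(2)

# Example 2x2 precoding matrix (scaled identity keeps the two streams separate).
W = np.array([[1, 0],
              [0, 1]]) / np.sqrt(2)

antenna_ports = W @ layers   # signals actually transmitted on each antenna port
print("Per-antenna-port symbols:\n", antenna_ports)
```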

Carrier Aggregation

Carrier aggregation (CA) was introduced in 3GPP Release 10 as a core feature of LTE-Advanced to enable the combination of multiple component carriers (CCs) for increased effective bandwidth beyond the 20 MHz limit of earlier LTE releases. It allows aggregation of up to five CCs, each up to 20 MHz wide, achieving a total bandwidth of 100 MHz and supporting peak downlink data rates of up to 1 Gbps when combined with 4x4 MIMO. The technique aggregates frequency blocks assigned to the same user equipment (UE), enhancing throughput while maintaining compatibility with Release 8 and 9 devices that operate on individual CCs.

CA supports three main types: intra-band contiguous, where CCs are adjacent within the same frequency band; intra-band non-contiguous, where CCs are separated within the same band (introduced in Release 11); and inter-band, where CCs span different frequency bands. Each CC functions as an independent LTE carrier with its own synchronization and control signaling, but they are logically grouped for the UE. The primary CC (PCC), or primary cell (PCell), handles initial access, mobility, and the physical uplink control channel (PUCCH), while secondary CCs (SCCs), or secondary cells (SCells), provide additional data channels without PUCCH. Activation and deactivation of SCells occur dynamically via MAC control elements (MAC CEs) sent from the eNodeB to the UE, allowing rapid adjustment based on traffic needs, while the PCell remains always active. Configured SCells start in a deactivated state upon addition or handover and are activated only when needed, to conserve UE battery and processing resources. This signaling ensures backward compatibility, as non-CA UEs ignore CA-specific messages and operate solely on the PCC.

The primary benefits of CA include seamless extension of bandwidth from fragmented spectrum holdings and load balancing across different bands to optimize coverage and capacity. For instance, operators can aggregate a low-frequency band for wide-area coverage with a high-frequency band for higher capacity in dense urban areas. A representative example is the inter-band combination of 1800 MHz (Band 3) and 2600 MHz (Band 7), which provides 40 MHz of aggregated bandwidth to balance coverage and throughput in urban deployments. CA integrates with MIMO techniques to further amplify these gains by exploiting spatial diversity across the aggregated carriers.
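The arithmetic behind carrier aggregation is straightforward: total bandwidth is the sum of the component carriers, and the idealized peak rate scales with that total. The sketch below assumes the implied Release 8 downlink efficiency of 15 bit/s/Hz (300 Mbit/s over 20 MHz), which is an illustrative simplification:

```python
# Sketch of carrier aggregation arithmetic: total bandwidth is the sum of the
# component carriers (up to five, each at most 20 MHz), and the idealized peak
# rate scales with that total. The 15 bit/s/Hz figure is the implied Release 8
# downlink efficiency (300 Mbit/s over 20 MHz), used here only for illustration.

MAX_CCS = 5
MAX_CC_BW_MHZ = 20.0
DL_EFFICIENCY_BPS_PER_HZ = 15.0

def aggregate(component_carriers_mhz: list[float]) -> tuple[float, float]:
    """Return (total bandwidth in MHz, idealized peak downlink rate in Mbit/s)."""
    if len(component_carriers_mhz) > MAX_CCS:
        raise ValueError("LTE-Advanced aggregates at most five component carriers")
    if any(bw > MAX_CC_BW_MHZ for bw in component_carriers_mhz):
        raise ValueError("Each component carrier is at most 20 MHz")
    total_mhz = sum(component_carriers_mhz)
    peak_mbps = total_mhz * DL_EFFICIENCY_BPS_PER_HZ
    return total_mhz, peak_mbps

if __name__ == "__main__":
    # Inter-band example from the text: 20 MHz in Band 3 plus 20 MHz in Band 7.
    print(aggregate([20.0, 20.0]))   # -> (40.0, 600.0)
```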

Voice and Multimedia Support

Voice over LTE (VoLTE)

Voice over LTE (VoLTE) serves as the native all-IP voice solution for LTE networks, enabling high-quality voice calls directly over the packet-switched domain without relying on circuit-switched fallback to legacy 2G or 3G systems. Introduced in Release 9, VoLTE leverages the LTE radio access network and evolved packet core to deliver voice services with improved efficiency and capacity compared to traditional circuit-switched telephony. This approach allows operators to optimize spectrum usage and prepare for future all-IP communications.

The core architecture of VoLTE is built on the IP Multimedia Subsystem (IMS), which provides session control and service delivery. IMS employs the Session Initiation Protocol (SIP) for signaling to establish, modify, and terminate voice sessions, while the Real-time Transport Protocol (RTP) handles the transport of voice media packets. Key IMS components include the Proxy-Call Session Control Function (P-CSCF) for interfacing with the UE, the Interrogating-CSCF (I-CSCF) for routing, and the Serving-CSCF (S-CSCF) for session management, ensuring end-to-end connectivity and authentication (3GPP TS 23.228). VoLTE utilizes advanced audio codecs to support high-definition voice. The Adaptive Multi-Rate Wideband (AMR-WB) codec, standardized in Release 7, enables wideband speech at bit rates from 6.6 to 23.85 kbps, delivering clearer sound than narrowband alternatives. In later releases, starting from Release 12, the Enhanced Voice Services (EVS) codec extends this capability with super-wideband and fullband support at rates up to 32 kbps, offering even more natural audio quality while maintaining interoperability with AMR-WB.

Critical procedures ensure reliable operation across diverse scenarios. Single Radio Voice Call Continuity (SRVCC), defined in Release 8, allows seamless handover of an ongoing VoLTE call from LTE to 3G (UTRAN) or 2G (GERAN) networks when coverage weakens, using a single radio in the device to avoid call drops. Additionally, VoLTE mandates support for emergency calls, routing them via IMS with priority handling to public safety answering points, including location information and fallback if IMS is unavailable. Deployment requirements emphasize robust network performance for carrier-grade service. Quality of Service (QoS) is enforced through dedicated EPS bearers allocated with QoS Class Identifier (QCI) 1, a guaranteed bit rate (GBR) class prioritizing conversational voice with a packet delay budget of 100 ms and a packet error loss rate of 10^-2. This setup requires LTE networks with sufficient capacity, typically viable in deployments supporting at least 100 Mbps peak downlink speeds, to handle voice alongside data traffic without degradation (3GPP TS 23.203).

Adoption accelerated rapidly after the first commercial launch in 2012 by MetroPCS in the United States, with interoperability agreements among major carriers such as AT&T and Verizon enabling nationwide VoLTE-to-VoLTE calling by 2015. By this time, VoLTE became mandatory for many operators' LTE devices, driving the phase-out of circuit-switched fallback and enabling full all-IP voice ecosystems. As of May 2025, 320 operators have launched or committed to launch VoLTE services in 156 countries and territories, marking it as a cornerstone of mobile voice services.
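The QoS parameters assigned to the VoLTE voice bearer can be captured in a small descriptive structure. The dictionary below simply records the standardized QCI 1 values mentioned above; it is not a configuration format used by any real EPC:

```python
# Descriptive sketch of the dedicated-bearer QoS parameters the text assigns
# to VoLTE: QCI 1 is a guaranteed-bit-rate class with a 100 ms packet delay
# budget and a 1e-2 packet error loss rate. The dictionary below just records
# those standardized values; it is not an EPC configuration format.

VOLTE_VOICE_BEARER = {
    "qci": 1,
    "resource_type": "GBR",            # guaranteed bit rate
    "priority": 2,
    "packet_delay_budget_ms": 100,
    "packet_error_loss_rate": 1e-2,
    "example_service": "conversational voice (VoLTE media)",
}

for key, value in VOLTE_VOICE_BEARER.items():
    print(f"{key}: {value}")
```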

Enhanced Voice Quality and RCS

Enhanced voice quality in LTE networks builds upon the foundational Voice over LTE (VoLTE) platform by incorporating advanced codecs that provide higher-fidelity audio than traditional narrowband telephony. The Adaptive Multi-Rate Wideband (AMR-WB) codec enables High Definition (HD) voice, supporting an audio bandwidth of 50 Hz to 7,000 Hz compared with the 300 Hz to 3,400 Hz range of narrowband codecs such as AMR. This wider bandwidth delivers clearer, more natural-sounding speech with improved intelligibility, particularly for consonants and higher-frequency sounds, as standardized in 3GPP specifications.

Further advancements arrived with the Enhanced Voice Services (EVS) codec, introduced in Release 12, which supports super-wideband audio for even greater clarity, including music-like quality in conversational settings. EVS operates efficiently at bit rates up to 32 kbps in super-wideband mode, offering robust performance in noisy environments and seamless interoperability with legacy AMR-WB systems while reducing bandwidth demands through source-controlled encoding. These enhancements ensure high-quality voice transmission over LTE's IP-based infrastructure, prioritizing low delay and error resilience for real-time communication.

The Rich Communication Suite (RCS), standardized by the GSMA, extends LTE's voice capabilities into a unified IP multimedia framework over the IP Multimedia Subsystem (IMS), evolving legacy SMS and MMS into feature-rich, data-enhanced interactions. RCS enables IP-based messaging with capabilities such as file sharing, video calling, presence indication to show user availability, and group chats supporting multiple participants with multimedia exchanges. As of August 2025, RCS had reached 473 million monthly active users across 90 operator deployments worldwide. Integrated with VoLTE, RCS provides a seamless experience in which voice calls can transition into or complement rich media sessions, all while maintaining Quality of Service (QoS) guarantees.

Key benefits of these enhancements include reduced end-to-end latency, often below 100 ms for RCS interactions, which improves conversational naturalness in real-time scenarios. Backward compatibility is ensured through fallback mechanisms, whereby RCS features degrade gracefully to SMS/MMS or circuit-switched voice if IP connectivity falters, preserving service continuity across hybrid networks. Despite these advantages, deployment faces challenges in device and network interoperability, as varying implementations across vendors can lead to inconsistent feature support and service reliability. The GSMA addresses this through certification programs and the RCS Universal Profile, which mandates standardized compliance to ensure seamless operation across ecosystems.
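A small comparison makes the codec differences concrete; the AMR and AMR-WB figures below are those quoted above, while the EVS super-wideband upper edge is an approximation added purely for illustration:

# Rough comparison of the audio band each codec family carries.
CODEC_AUDIO_BAND_HZ = {
    "AMR (narrowband)":     (300, 3_400),
    "AMR-WB (HD voice)":    (50, 7_000),
    "EVS (super-wideband)": (50, 14_000),  # approximate upper edge, assumption
}

for name, (lo, hi) in CODEC_AUDIO_BAND_HZ.items():
    print(f"{name}: {lo}-{hi} Hz ({(hi - lo) / 1000:.1f} kHz of audio bandwidth)")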

Spectrum Utilization

Frequency Bands

LTE frequency bands are standardized by the 3rd Generation Partnership Project (3GPP) in Technical Specification (TS) 36.101, which defines operating frequencies for Long-Term Evolution (LTE) networks to ensure interoperability across devices and infrastructure. Bands are categorized into Frequency Division Duplex (FDD) bands, generally numbered 1–32 and from 65 upward, which use paired uplink and downlink spectrum, and Time Division Duplex (TDD) bands, numbered 33–53, which use unpaired spectrum with time-separated transmissions. Channel bandwidths supported across bands typically range from 1.4 MHz to 20 MHz, enabling scalable deployments based on available spectrum.

Low-frequency bands below 1 GHz, such as Band 8 (900 MHz FDD), prioritize wide-area coverage due to favorable propagation characteristics that penetrate buildings and extend over longer distances, making them well suited to rural and suburban deployments. Mid-frequency bands between 1.8 GHz and 2.6 GHz, exemplified by Band 1 (2100 MHz FDD) and Band 3 (1800 MHz FDD), balance coverage and capacity, supporting higher data throughput in urban environments where user density demands efficient spectrum use. TDD bands like Band 40 (2300–2400 MHz, 100 MHz unpaired) offer flexible allocation for asymmetric traffic patterns and are often deployed in regions with abundant unpaired spectrum.

Global allocations align with International Telecommunication Union (ITU) regions, which influence band preferences. In ITU Region 1 (Europe, Africa, and parts of the Middle East), Band 3 is prevalent because its harmonized 1800 MHz spectrum supports broad compatibility. In ITU Region 2 (the Americas), Band 13 (700 MHz FDD) is widely used for enhanced coverage, notably in the United States, leveraging digital dividend spectrum. ITU Region 3 (Asia-Pacific) favors TDD configurations like Band 40 for high-capacity services in densely populated markets such as China and India. These regional variations stem from national spectrum auctions and regulatory frameworks, promoting ecosystem efficiency without universal harmonization.

The band portfolio has evolved through releases, with initial definitions in Release 8 and expansions in later versions; for instance, Band 71 (600 MHz FDD) was introduced in Release 13 to extend low-band coverage, utilizing spectrum repurposed from television broadcasting. Subsequent releases added bands such as Band 42 (3500 MHz TDD) in Release 10 for increased capacity in mid-to-high frequencies. Release 18 (2024) further expanded the portfolio with bands such as 106 (900 MHz FDD) for secure utility communications and 31/72 (450 MHz FDD) for extended coverage in low-power scenarios. This progression accommodates growing demand and new spectrum opportunities while maintaining backward compatibility. The following table summarizes key LTE frequency bands, selected for their widespread adoption and representation of duplex modes, bandwidth options, and regional focus, based on TS 36.101 (Release 18).
Band | Duplex Mode | Frequency Range (MHz) | Channel Bandwidths (MHz) | Primary Regions/Notes
1 | FDD | UL: 1920–1980; DL: 2110–2170 | 5, 10, 15, 20 | Global (IMT core); mid-band capacity
3 | FDD | UL: 1710–1785; DL: 1805–1880 | 5, 10, 15, 20 | Europe/Asia (Regions 1/3); roaming favorite
8 | FDD | UL: 880–915; DL: 925–960 | 5, 10 | Europe/Asia; low-band coverage
13 | FDD | UL: 777–787; DL: 746–756 | 5, 10 | Americas (Region 2); coverage extension
40 | TDD | 2300–2400 | 5, 10, 15, 20 | China/India; unpaired mid-band capacity
71 | FDD | UL: 663–698; DL: 617–652 | 5, 10, 15, 20 | North America; added in Release 13 for low-band enhancement
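Within each FDD band, TS 36.101 maps channel numbers (EARFCNs) to carrier frequencies linearly, with F_DL = F_DL_low + 0.1 × (N_DL − N_Offs-DL). The Python sketch below applies this relation for Bands 1 and 3 only; the offset values are quoted from the specification table as remembered and should be verified against the current release before use:

# Downlink EARFCN-to-frequency relation per TS 36.101 section 5.7.3.
DL_BAND_PARAMS = {
    # band: (F_DL_low in MHz, N_Offs_DL, valid EARFCN range)
    1: (2110.0, 0,    range(0, 600)),
    3: (1805.0, 1200, range(1200, 1950)),
}

def earfcn_to_dl_mhz(band: int, n_dl: int) -> float:
    f_dl_low, n_offs, valid = DL_BAND_PARAMS[band]
    if n_dl not in valid:
        raise ValueError(f"EARFCN {n_dl} is outside Band {band}")
    return f_dl_low + 0.1 * (n_dl - n_offs)

print(earfcn_to_dl_mhz(3, 1575))  # mid-Band 3 carrier at 1842.5 MHz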

Band Allocation and Harmonization

The allocation and harmonization of spectrum bands for LTE networks are governed by a complex interplay of national and international regulatory frameworks aimed at ensuring efficient use, interference avoidance, and global coordination. In the United States, the Federal Communications Commission (FCC) manages non-federal spectrum allocations, including auctions of LTE-compatible bands to promote mobile broadband expansion. In Europe, the European Telecommunications Standards Institute (ETSI) contributes to harmonized technical specifications through its role in 3GPP, while the European Conference of Postal and Telecommunications Administrations (CEPT) facilitates cross-border consistency. Globally, the International Telecommunication Union (ITU) provides coordination via its Radio Regulations, which set out international principles to prevent interference and support equitable access.

A notable example of these efforts is the 2015 spectrum auction in Germany, the first in Europe to include the 700 MHz band, which raised over €5 billion and allocated low-frequency spectrum well suited to LTE coverage, enabling operators to enhance rural broadband. Such auctions underscore the regulatory push to reassign "digital dividend" spectrum from television broadcasting to mobile services, balancing revenue generation with technological advancement.

Harmonization initiatives have been pivotal in standardizing LTE spectrum use to foster device ecosystems and reduce market fragmentation. The GSMA promotes global device compatibility through its certification programs, which verify interoperability across operator networks in over 200 countries, thereby streamlining manufacturing and deployment. Complementing this, 3GPP defines standardized frequency bands in its technical specifications, such as TS 36.101, to ensure consistent radio access and avoid regional silos that could hinder roaming and economies of scale. These efforts align with broader calls for international spectrum harmonization to support seamless LTE operations.

Despite these advances, challenges persist in spectrum allocation, particularly in refarming legacy bands from 2G and 3G technologies to LTE. For instance, reallocating the 1800 MHz band from GSM requires careful planning to maintain service continuity during the transition, often involving dynamic spectrum sharing to minimize disruption for existing users while boosting LTE capacity. Additionally, trials of unlicensed spectrum use, such as LTE-U in the 5 GHz band, have tested coexistence with Wi-Fi but faced regulatory hurdles over fair access and interference mitigation, with early studies showing mixed results on performance equity.

Regional variations in LTE spectrum preferences reflect diverse regulatory and market priorities, with Asia-Pacific favoring Time Division Duplex (TDD) modes for their flexibility with asymmetric traffic, as seen in deployments in China and India using bands like 2300–2400 MHz. In contrast, the Americas emphasize Frequency Division Duplex (FDD) for its established symmetry in voice and data, predominant in U.S. and Latin American networks via bands such as 700 MHz. By 2025, updates include ongoing refarming efforts and preparatory allocations in higher frequencies, serving as precursors to mmWave for 5G transitions while sustaining LTE enhancements. These outcomes have significantly improved efficiency, enabling fewer device configurations to support broader global coverage and reducing operational complexity for manufacturers and operators.

Intellectual Property

Patent Landscape

The development of Long Term Evolution (LTE) technology has produced a substantial patent landscape, with over 20,000 patents declared as essential to the standard by contributors to the European Telecommunications Standards Institute (ETSI). These declarations encompass core innovations such as orthogonal frequency-division multiple access (OFDMA) for efficient spectrum use, multiple-input multiple-output (MIMO) antenna techniques for enhanced capacity, and hybrid automatic repeat request (HARQ) for reliable data transmission. Major patent holders dominate the LTE SEP portfolio, with Huawei holding approximately 12% of declared essential patent families, Samsung around 9%, and Qualcomm about 9% (as of 2021). These companies, along with others such as Nokia and Ericsson, have contributed significantly to the ETSI IPR database through their R&D efforts in wireless communications. Declarations primarily cover radio interface technologies, with additional contributions in core network and related areas.

The declaration process for LTE patents follows the 3GPP Intellectual Property Rights (IPR) framework, in which participants must disclose potentially essential patents to ETSI and commit to licensing them on Fair, Reasonable, and Non-Discriminatory (FRAND) terms to ensure broad adoption of the standard. Patent filings for LTE peaked between 2005 and 2010, coinciding with the standardization effort for 3GPP Release 8, after which activity stabilized as the technology matured. This period also saw extensive cross-licensing agreements among major vendors to facilitate interoperability and reduce litigation risks in a fragmented IP environment.

Licensing and Standards Essential Patents

The development of LTE standards by 3GPP under the European Telecommunications Standards Institute (ETSI) requires participants to declare any standards-essential patents (SEPs) they hold and to commit to licensing them on fair, reasonable, and non-discriminatory (FRAND) terms, ensuring broad access for implementers worldwide. This FRAND obligation, set out in ETSI's Intellectual Property Rights (IPR) Policy, requires SEP holders to offer licenses to all willing third parties without discrimination, promoting competition and preventing hold-up tactics that could stifle adoption of the standard. ETSI's rules emphasize timely disclosure of SEPs during standardization to facilitate evaluation and balanced contributions from members.

LTE SEP licensing typically follows per-device royalty models, with rates often structured as a percentage of the device's selling price or module cost, capped to avoid excessive burdens. For instance, Qualcomm, a major SEP holder, applies royalties for its cellular SEPs of up to 3.25% of the handset price, with a per-device cap that historically aligned with $10–20 for LTE-enabled phones, though its terms have evolved to include multi-standard coverage. Patent pools streamline this process by aggregating SEPs from multiple owners into a single licensing entity; Avanci's IoT platform, for example, offers combined 2G, 3G, and 4G (LTE) essential patents for connected devices such as smart meters, charging flat or percentage-based fees to reduce complexity for licensees. These models balance innovator compensation with affordability, particularly for low-cost IoT applications.

Significant disputes have shaped LTE licensing practices, including the 2017–2019 Apple–Qualcomm conflict, in which Apple accused Qualcomm of overcharging on FRAND royalties and sought to invalidate key patents, culminating in a global settlement that dismissed all litigation and renewed a multi-year chip supply and licensing agreement. Similarly, Ericsson and Samsung resolved ongoing SEP disputes in 2021 through a multi-year cross-license covering cellular standards including LTE, ending lawsuits across multiple jurisdictions. These cases drew antitrust scrutiny, with the U.S. Federal Trade Commission (FTC) challenging Qualcomm's practices in a suit that produced a 2019 district court ruling against the company, though the Ninth Circuit overturned the key findings in 2020, and with the European Commission investigating exclusivity arrangements before its fine was annulled in 2022.

Cross-license agreements have mitigated litigation risks, as seen in Microsoft's 2014 acquisition of Nokia's devices business, which included a broad license enabling Microsoft to use Nokia's extensive LTE portfolio while Nokia retained rights for its networking business, fostering mutual access and reducing enforcement actions. By 2025, the LTE ecosystem had stabilized through expanded pools such as Sisvel's Cellular IoT program, which covers over 50% of declared narrowband LTE SEPs, and Avanci's offerings, collectively addressing a substantial share of essential patents and simplifying compliance for manufacturers. LTE SEPs remain integral to multi-mode licensing agreements covering 4G/5G devices, with continued declarations for enhancements such as LTE-Advanced Pro supporting IoT and vehicular communications. Such arrangements have contained the economic impact, with cumulative SEP royalties estimated at approximately 3–5% of a device's selling price for mid-range smartphones (as of 2016 data).
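The per-device royalty model can be illustrated with simple arithmetic; the sketch below uses the 3.25% rate and the upper end of the $10–20 cap quoted above purely as an example, since actual licence terms vary by agreement:

# Toy illustration of a capped percentage-of-selling-price royalty.
def sep_royalty(selling_price_usd: float,
                rate: float = 0.0325,
                cap_usd: float = 20.0) -> float:
    return min(selling_price_usd * rate, cap_usd)

for price in (150, 400, 900):
    print(price, round(sep_royalty(price), 2))
# 150 -> 4.88, 400 -> 13.0, 900 -> 20.0 (the cap applies)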

Global Deployments and Evolution

Worldwide Adoption Status

As of June 2025, LTE networks provide extensive global coverage, approaching universal population coverage in developed regions while achieving substantial penetration in key emerging markets. Worldwide, LTE infrastructure encompasses around 5 million base stations, supporting near-ubiquitous connectivity in urban and suburban areas. In contrast, rural areas in many developing regions continue to face coverage gaps, and operators there are increasingly integrating satellite-based hybrid solutions to extend LTE access to underserved populations. Many operators are also sunsetting 2G and 3G networks by the end of 2025 to refarm spectrum for LTE and 5G, further boosting LTE capacity and coverage.

LTE subscriber numbers stand at approximately 4.8 billion connections globally, representing about 55% of all mobile subscriptions, according to the Ericsson Mobility Report of June 2025. The Asia-Pacific region dominates LTE adoption, driven by massive deployments in China and India, followed by the Americas and EMEA, highlighting regional disparities in scale. These figures underscore LTE's maturity as the primary technology for mobile broadband, though growth has plateaued amid migration to newer standards.

Economically, LTE has supported increases in average revenue per user (ARPU) through higher data consumption, with global mobile data traffic reaching volumes that boost operator revenues in data-centric markets. However, market saturation in mature regions is prompting spectrum refarming strategies, in which LTE bands are repurposed to enhance capacity for evolving services. Performance metrics reflect this stability, with average download speeds ranging from 30 to 50 Mbps across most networks and LTE-Advanced Pro features deployed in about 60% of operational systems to deliver enhanced efficiency and throughput.

Integration with 5G and Future Enhancements

In the transition to 5G networks, Long-Term Evolution (LTE) plays a pivotal role through Non-Standalone (NSA) architecture, where LTE serves as the control and mobility anchor for 5G New Radio (NR). This configuration, known as E-UTRA-NR Dual Connectivity (EN-DC), was standardized in Release 15 and enables early 5G deployments by leveraging existing LTE infrastructure for signaling and mobility management while adding NR for higher data rates. EN-DC allows operators to boost capacity without an immediate move to a full standalone 5G core, facilitating a phased rollout that maintains backward compatibility with LTE devices.

Spectrum refarming further integrates LTE with 5G by reallocating LTE-assigned frequency bands, such as the 1800 MHz range, to carry 5G traffic. This process is accelerated by Dynamic Spectrum Sharing (DSS), which permits simultaneous operation of LTE and 5G NR on the same carrier through dynamic resource allocation at the subframe level, avoiding the need for immediate hardware overhauls. DSS optimizes spectrum utilization in mid-band frequencies, enabling operators to introduce 5G services progressively while preserving LTE coverage for legacy users.

LTE enhancements for emerging use cases underscore its adaptability in the 5G ecosystem. LTE-M, introduced in Release 13 as enhanced Machine-Type Communications (eMTC) or LTE Category M1, optimizes LTE for low-power wide-area Internet of Things (IoT) applications with features such as reduced bandwidth (1.4 MHz) and power-saving modes supporting up to 10-year battery life. Narrowband IoT (NB-IoT), also from Release 13, integrates seamlessly with LTE networks by operating in standalone, in-band, or guard-band modes within LTE carriers, providing deep coverage for massive IoT deployments with minimal spectrum overhead. Additionally, Release 14 introduced LTE-based vehicle-to-everything (V2X) sidelink communications, enabling direct device-to-device transmissions for vehicular safety and traffic efficiency without relying on network infrastructure.

Looking ahead, LTE is expected to remain a mainstay of mobile networks through at least 2030, with ongoing maintenance releases ensuring compatibility and optimization alongside 5G. Projections indicate that NB-IoT and LTE-M will drive massive IoT growth, reaching approximately 3 billion connections by 2030 to support applications in smart metering, asset tracking, and industrial automation. Despite these advancements, integrating LTE with 5G presents challenges, including a spectrum crunch from surging data demands that strains refarming efforts and requires efficient allocation strategies. Energy efficiency is also a key concern, as denser 5G integration with LTE increases power consumption in base stations and devices, necessitating innovations such as advanced sleep modes and AI-driven optimization to support sustainable, energy-efficient networks.
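The 10-year battery-life targets for LTE-M and NB-IoT follow from simple duty-cycle arithmetic; in the Python sketch below, every current, capacity, and timing value is an assumption chosen for illustration rather than a figure from 3GPP specifications:

# Back-of-the-envelope estimate of device lifetime when power-saving modes
# keep the radio asleep for almost the entire day.
def battery_life_years(capacity_mah: float,
                       sleep_ma: float,
                       active_ma: float,
                       active_seconds_per_day: float) -> float:
    day_seconds = 86_400.0
    sleep_seconds = day_seconds - active_seconds_per_day
    avg_ma = (active_ma * active_seconds_per_day + sleep_ma * sleep_seconds) / day_seconds
    return capacity_mah / avg_ma / 24 / 365

# A metering device that wakes for 20 s a day and spends the rest in PSM:
print(round(battery_life_years(capacity_mah=5000, sleep_ma=0.015,
                               active_ma=120, active_seconds_per_day=20), 1))
# roughly 13 years with these assumed values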

References
