from Wikipedia

2G refers to the second generation of cellular network technology, which was rolled out globally starting in the early 1990s. The main differentiator from previous mobile telephone systems, retrospectively dubbed 1G, is that 2G networks use digital rather than analog radio signals for communication between mobile devices and base stations. In addition to voice telephony, 2G also made data services possible.

The most common 2G technology has been the GSM standard, which became the first globally adopted framework for mobile communications. Other 2G technologies include cdmaOne and the now-discontinued Digital AMPS (D-AMPS/TDMA),[1] as well as the Personal Digital Cellular (PDC) and Personal Handy-phone System (PHS) in Japan.

The transition to digital technology enabled the implementation of encryption for voice calls and data transmission, significantly improving the security of mobile communications while also increasing capacity and efficiency compared to earlier analog systems. 2G networks were primarily designed to support voice calls and Short Message Service (SMS), with later advancements such as General Packet Radio Service (GPRS) enabling always-on packet data services, including email and limited internet access. 2G was succeeded by 3G technology, which provided higher data transfer rates and expanded mobile internet capabilities.

History


In 1990, AT&T Bell Labs engineers Jesse Russell, Farhad Barzegar and Can A. Eryaman filed a patent for a digital mobile phone that supports the transmission of digital data. Their patent was cited several years later by Nokia and Motorola when they were developing 2G digital mobile phones.[2]

2G was first commercially launched in 1991 by Radiolinja (now part of Elisa Oyj) in Finland in the form of GSM, which was defined by the European Telecommunications Standards Institute (ETSI).[3] The Telecommunications Industry Association (TIA) defined the cdmaOne (IS-95) 2G standard, with an eight- to ten-fold increase in voice call capacity compared to analog AMPS.[4] The first deployment of cdmaOne was in 1995.[5] In North America, Digital AMPS (IS-54 and IS-136) and cdmaOne (IS-95) were dominant, but GSM was also used.

Later 2G releases in the GSM space, often referred to as 2.5G and 2.75G, include General Packet Radio Service (GPRS) and Enhanced Data Rates for GSM Evolution (EDGE). GPRS allows 2G networks to achieve a theoretical maximum transfer speed of 40 kbit/s (5 kB/s). EDGE increases this capacity, providing a theoretical maximum transfer speed of 384 kbit/s (48 kB/s).

Three primary benefits of 2G networks over their 1G predecessors were: digitally encrypted phone conversations; significantly more efficient use of the radio frequency spectrum, allowing more users per frequency band; and data services, beginning with SMS text messaging.

Evolution of GSM 2G

Cellular network standards and generation timeline. (Large titles on the colored area refer to the lines to their right.)

2.5G (GPRS)


2.5G ("second-and-a-half generation") refers to 2G systems that incorporate a packet-switched domain alongside the existing circuit-switched domain, most commonly implemented through General Packet Radio Service (GPRS).[6] GPRS enables packet-based data transmission by dynamically allocating multiple timeslots to users, improving network efficiency. However, this does not inherently provide faster speeds, as similar techniques, such as timeslot bundling, are also employed in circuit-switched data services like High-Speed Circuit-Switched Data (HSCSD). Within GPRS-enabled 2G systems, the theoretical maximum transfer rate is 40 kbit/s (5 kB/s).[7]

2.75G (EDGE)


2.75G refers to the evolution of GPRS networks into EDGE (Enhanced Data Rates for GSM Evolution) networks, achieved through the introduction of 8PSK (8 Phase Shift Keying) encoding. While the symbol rate remained constant at 270.833 kilosymbols per second, the use of 8PSK allowed each symbol to carry three bits instead of one, significantly increasing data transmission efficiency. Enhanced Data Rates for GSM Evolution (EDGE), also known as Enhanced GPRS (EGPRS) or IMT Single Carrier (IMT-SC), is a backward-compatible digital mobile phone technology built as an extension to standard GSM. First deployed in 2003 by AT&T in the United States, EDGE offers a theoretical maximum transfer speed of 384 kbit/s (48 kB/s).[7]
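The effect of the modulation change can be checked with a little arithmetic; the following sketch (illustrative Python, names ours) compares the raw bits per symbol of GMSK and 8PSK at the shared GSM symbol rate. The resulting gross carrier rates exclude channel coding and burst overhead, which is why EDGE's usable ceiling is the 384 kbit/s quoted above rather than the raw figure.

```python
import math

SYMBOL_RATE_KSYM = 270.833  # GSM carrier symbol rate (kilosymbols/s), shared by 8 timeslots

def bits_per_symbol(constellation_points: int) -> int:
    """Bits carried by each symbol for an M-ary modulation."""
    return int(math.log2(constellation_points))

gmsk = bits_per_symbol(2)  # 1 bit/symbol, as used by GSM/GPRS
psk8 = bits_per_symbol(8)  # 3 bits/symbol, as used by EDGE

print(SYMBOL_RATE_KSYM * gmsk)  # ~270.8 kbit/s gross carrier rate with GMSK
print(SYMBOL_RATE_KSYM * psk8)  # ~812.5 kbit/s gross carrier rate with 8PSK (3x)
```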

2.875G (EDGE Evolution)


Evolved EDGE (also known as EDGE Evolution or 2.875G) is an enhancement of the EDGE mobile technology that was introduced as a late-stage upgrade to 2G networks. While EDGE was first deployed in the early 2000s as part of GSM networks, Evolved EDGE was launched much later, coinciding with the widespread adoption of 3G technologies such as HSPA and just before the emergence of 4G networks. This timing limited its practical application.

Evolved EDGE increased data throughput and reduced latencies (down to 80 ms) by utilizing improved modulation techniques, dual carrier support, dual antennas, and turbo codes. It achieved peak data rates of up to 1 Mbit/s, significantly enhancing network efficiency for operators that had not yet transitioned to 3G or 4G infrastructures. However, despite its technical improvements, Evolved EDGE was never widely deployed. By the time it became available, most network operators were focused on implementing more advanced technologies like UMTS and LTE. As of 2016, no commercial networks were reported to support Evolved EDGE.

Phase-out


2G, understood as GSM and cdmaOne, has been superseded by newer technologies such as 3G (UMTS / CDMA2000), 4G (LTE / WiMAX) and 5G (5G NR). However, 2G networks were still available as of 2023 in most parts of the world, though notably not from the majority of carriers in North America, East Asia, and Australia.[8][9][10]

Many modern LTE-enabled devices can fall back to 2G for phone calls, which is especially necessary in rural areas where later generations have not yet been implemented.[11] In some places, the successor 3G is being shut down rather than 2G – Vodafone previously announced that it had switched off 3G across Europe in 2020 but still retains 2G as a fallback service.[12] In the US, T-Mobile shut down its 3G services while retaining its 2G GSM network.[13][14]

Various carriers have announced that 2G networks in the United States, Japan, Australia, and other countries are in the process of being shut down, or have already been shut down, so that carriers can re-use the frequencies for newer technologies (e.g. 4G, 5G).[15][16]

As a legacy protocol, 2G connectivity is considered insecure.[17] Well-known methods for attacking weaknesses in GSM have existed since 2009,[18] with practical use in crime.[19] Attack routes against 2G cdmaOne were found later and remain less publicized.[20]

Android 12 and later provide a network setting to disable 2G connectivity for the device.[21] iOS 16 and later can disable 2G connectivity by enabling Lockdown Mode.[22]

Criticism


In some parts of the world, including the United Kingdom, 2G remains widely used for older feature phones and for internet of things (IoT) devices such as smart meters, eCall systems and vehicle trackers to avoid the high patent licensing cost of newer technologies.[23] Terminating 2G services could leave vulnerable people who rely on 2G infrastructure unable to communicate even with emergency contacts, causing harm and possibly deaths.[24]

Past 2G networks

Country Status Network Shutdown date Standard Notes
Åland Ålcom 2025-02-03 GSM [25]
Anguilla Digicel active GSM 900 MHz: 5 MHz GSM + 5 MHz UMTS
1900 MHz: 5 MHz UMTS [26][27][28][29]
FLOW 2024-04-22 GSM [30][31]
Antigua and Barbuda No Service APUA 2018-04-01 GSM [32]
Digicel 2024-05-31 GSM [33]
FLOW 2024-07-31 GSM [34]
Aruba partially unconfirmed Digicel 2024-06-30 GSM [35]
SETAR active GSM GSM-900 & GSM-1900
Australia No Service Hutchison 3 2006-08-09 cdmaOne [36][37][38][39][40]
Optus 2017-08-01 GSM 2G shut down in WA and NT on 3 Apr 2017.[41][42]
Telstra 2008-04-28 cdmaOne [43][44][45][46][47]
Telstra 2016-12-01 GSM [48]
Vodafone 2018-06-14 GSM [49]
Bahamas No Service Aliv N/A (no 2G)
BTC 2024-06-30 GSM [50][51][52]
Bahrain Batelco 2021-11-30 GSM [53]
Barbados partially unconfirmed Digicel 2025-03-31 GSM 900 MHz: 6 MHz GSM / 1800 MHz: 12 MHz GSM [54]
FLOW 2024-04-22 GSM
Belgium Orange 2030 GSM [55]
Telenet 2027 GSM [56]
Proximus 2027 GSM [57]
Bermuda Digicel active GSM 1900 MHz: 15 MHz GSM + 15 MHz LTE [58]
One active GSM 1900 MHz: 5 MHz GSM + 20 MHz LTE [58]
Bonaire partially unconfirmed Digicel 2025-03-31 GSM
FLOW 2024-04-22 GSM
British Virgin Islands CCT active GSM 1900 MHz: 10 MHz GSM + 20 MHz LTE [59]
Digicel active GSM 1800 MHz: 15 MHz GSM
1900 MHz: 5 MHz GSM + 10 MHz UMTS [59]
FLOW 2024-04-22 GSM [60]
Brunei No Service UNN 2021-06-01 GSM National Wholesale Network used by DSTCom, Progresif and imagine.[61][62]
Canada Bell 2019-04-30 cdmaOne Shutdown of CDMA transmitters commenced in remote areas in 2017, followed by an official announcement in June 2018 that 2G devices will lose service soon.[63][64]
Rogers Wireless TBD GSM 1900 MHz shutdown in Jun 2021.
850 MHz remains active.[65][66][67][68]
SaskTel 2017-07-31 cdmaOne [69][70]
Telus Mobility 2017-05-31 cdmaOne [71][72]
Cayman Islands partially unconfirmed Digicel 2020-07-01 GSM [73][74]
FLOW 2024-04-22 GSM
China China Mobile active GSM 900 MHz: 15 MHz GSM
1800 MHz: 25 MHz GSM [75]
China Telecom 2025 cdmaOne Local shutdown commenced on 01 Jun 2020.
CDMA2000 1xRTT, EV-DO Rev. A/B (3G) service also terminates.[75][76][77]
China Unicom 2025 GSM Local shutdown commenced on 18 Apr 2018.[75][78][79][77][80]
Chile Entel 2024 Q3 GSM Local shutdown commenced on 22 Jul 2024 in the Arica and Parinacota Region.[81]
Colombia Claro 2023-02-23 GSM [82][83]
Tigo 2022-11-01 GSM [84]
Curaçao Digicel 2025-03-31 GSM
FLOW 2024-02-29 GSM [85][86]
Dominica partially unconfirmed Digicel 2027-03-31 GSM
FLOW 2024-03-?? GSM [87]
France Bouygues 2026-12-31 GSM [88]
Orange 2026-09 GSM [55]
SFR 2026 GSM [89]
Germany Deutsche Telekom 2028-06-30 GSM [90]
Vodafone 2028-10-01 / 2030-12-31 (IoT) GSM [91][92]
Telefónica (O2) TBD GSM
Grenada unconfirmed Digicel 2024-03-31 GSM
FLOW 2024-04-22 GSM
Guam unconfirmed GTA Teleguam ? GSM
Hong Kong 3 2008-11-20 cdmaOne Shut down due to license expiry. Government originally did not allow the license to be renewed due to unpopularity, however the government later reversed the decision and held an auction for CDMA2000 service, which PCCW-HKT won the auction and provided CDMA2000 service immediately after 3's license expiry.
3 2021-09-30 GSM [93]
CMHK active GSM
CSL 2005 D-AMPS Service previously provided by Pacific Link, which subsequently merged into CSL. Shut down due to license expiry. Government did not allow the license to be renewed due to unpopularity.
CSL 2017-10-31 cdmaOne Service previously provided by PCCW. After acquisition of CSL by HKT, its mobile business PCCW Mobile was merged into CSL. No service for local customers, only served incoming roaming tourists.
CSL terminated its CDMA family business upon its licence expiry, and CDMA2000 1xRTT, EV-DO Rev. A (3G) service has also terminated along with cdmaOne.[94]
CSL 2024-11-08 GSM [95]
SmarTone 2022-10-14 GSM [96]
Iceland Nova 2025-01-28 GSM [97]
Síminn 2025 Q4 GSM [98]
Sýn 2025-06-01 GSM [99]
Israel Hot Mobile 2019-12-31 iDEN [100]
2025 GSM Per government statement.[101]
Jamaica No Service Digicel 2024-08-31 GSM [102][103]
FLOW 2024-04-15 GSM [104][103][105]
Japan No Service au KDDI 2012-07-22 cdmaOne [106]
NTT Docomo 2012-03-31 PDC [107]
Softbank 2010-03-31 PDC [108]
Jordan Umniah 2021-03-11 GSM [109]
Luxembourg Orange 2030 GSM [55]
Macau No Service CTM 2019-08-01 GSM Service for local customers terminated on 4 Jun 2015, but remained for roaming users.[110][111][112]
3 2019-08-01 GSM Service for local customers terminated on 4 Jun 2015, but remained for roaming users.[110][111]
SmarTone 2019-08-01 GSM Service for local customers terminated on 4 Jun 2015, but remained for roaming users.[110][111]
Mexico AT&T 2019-09-01 GSM [113]
Local shutdown commenced in Q1 2019.
Movistar 2021-01-01 GSM [114]
Montserrat unconfirmed Digicel ? GSM
FLOW 2024-04-22 GSM
Netherlands KPN 2027-12-01 GSM [115]
T-Mobile 2021-06-01 / 2023-11-15 (IoT) GSM [116]
New Caledonia OPT-NC 2025 GSM Shutdown commenced in 2022.[117]
New Zealand 2degrees 2018-03-15 GSM [118]
Spark 2012-07-31 cdmaOne [119][120]
Norway Telenor 2027 GSM [121]
Telia 2025 GSM [121]
Panama Digicel 2022-06-30 GSM Complete shutdown of operations and market exit.[122][123][124][125]
Peru Bitel N/A (no 2G)
Poland Orange 2030 GSM [55]
Romania Orange 2030 GSM [55]
Saint Kitts and Nevis Digicel 2027-03-31 GSM
FLOW 2024-04-22 GSM [126]
Saint Lucia partially unconfirmed Digicel 2027-03-31 GSM
FLOW 2024-04-22 GSM [127]
Saint Vincent and the Grenadines Digicel 2027-03-31 GSM
FLOW 2023-09-30 GSM [128][129][130][131]
Saudi Arabia stc 2025 Q4 GSM [132]
Singapore No Service M1 2017-04-18 GSM [133]
Singtel 2017-04-18 GSM [133]
StarHub 2017-04-18 GSM [133]
Sint Maarten / Saba / Sint Eustatius No Service TelCell 2019-01-01 GSM [134]
FLOW (UTS) 2017-09-26 GSM [135]
Slovakia Orange 2030 GSM [55]
South Africa TBD GSM Per government statement.[136][137]
South Korea No Service KT 2012-03-19 cdmaOne CDMA2000 1xRTT, EV-DO Rel. 0 (3G) service has also terminated.[138]
LG Uplus 2021-06-30 cdmaOne CDMA2000 1xRTT, EV-DO Rev. A/B (3G) service has also terminated.[139]
SK Telecom 2020-07-27 cdmaOne CDMA2000 1xRTT, EV-DO Rel. 0 (3G) service has also terminated.[140]
Spain Orange 2030 GSM [55]
Sweden Net4Mobility (Telenor/Tele2) 2025-12-31 GSM 2G network will be shut down by the end of 2025.[141][142][143]
Telia 2027 GSM Shutdown pushed back from 2025 to 2027.[144][145]
 Switzerland No Service Salt 2020-12-31 GSM Shutdown commenced on 1 Jul 2020. A few single 2G-only sites remained until Sep 2023 to preserve CSFB functionality.[146][147][148]
Sunrise 2023-01-03 GSM With the introduction of S-RAN in 2018 phaseout was previously postponed to 2022.[149][150][151]
Swisscom 2021-04-07 GSM Official shutdown on 31 Dec 2020 (guaranteed availability).[152][153][154]
Taiwan No Service Chunghwa Telecom 2017-06-30 GSM [155]
FarEasTone 2017-06-30 GSM [155]
Taiwan Mobile 2017-06-30 GSM [155]
Trinidad and Tobago Digicel 2024-12-31 GSM [156][157][158]
bmobile (TSTT) TBD GSM 850 MHz: 2.5 MHz GSM + 5 MHz UMTS [159][160]
Turks and Caicos Islands Digicel 2025-06-30 GSM 900 MHz: 9.8 MHz GSM [161]
FLOW 2024-04-22 GSM [162]
United Arab Emirates No Service Du 2023-12-31 GSM [163]
Etisalat 2023-12-31 GSM [164]
United Kingdom 2033 GSM Per government statement on confirmation by mobile providers.[165][166][167]
United States / Puerto Rico / US Virgin Islands AT&T 2008-02-18 D-AMPS TDMA (D-AMPS) on 1900 MHz shut down on 15 July 2007.[168]
2017-01-01 GSM [169]
Cellcom (US only) 2023-12-01 cdmaOne CDMA2000 1xRTT, EV-DO Rev. A (3G) service has also terminated.[170]
Claro (PR only) 2028-12-31 GSM
Commnet Wireless (Choice) (US only) 2022-12-31 cdmaOne CDMA2000 1xRTT, EV-DO Rev. A (3G) service has also terminated.[171]
Copper Valley Wireless 2022-09-30 cdmaOne CDMA2000 1xRTT, EV-DO Rev. A (3G) service has also terminated.[172]
Edge Wireless 2007-06-30 D-AMPS [173]
Inland Cellular 2025-09-03 cdmaOne CDMA2000 1xRTT, EV-DO Rev. A (3G) service has also terminated.[174]
T-Mobile 2025 GSM Shutdown commenced on 9 Feb 2025.[175][176]
T-Mobile (Sprint) 2022-05-31 cdmaOne CDMA2000 1xRTT, EV-DO Rev. A (3G) service has also terminated.
Shutdown commenced on 31 Mar 2022.[177][178][179][180]
UScellular (US only) 2009-02 D-AMPS [181]
UScellular (US only) 2024-01-14 cdmaOne CDMA2000 1xRTT, EV-DO Rev. A (3G) service has also terminated.[182][183]
Verizon (US only) 2022-12-31 cdmaOne CDMA2000 1xRTT, EV-DO Rev. A (3G) service has also terminated.[184]
Venezuela Digitel 2024-03-08 GSM Shutdown commenced in May 2021.[185][186]
Movilnet 2025 GSM [185]
Movistar 2025 GSM Shutdown commenced on 01 Jul 2022.[185]
Vietnam No Service Gmobile 2024-10-16 GSM per government regulation[187]
Mobifone 2024-10-16 GSM per government regulation[187]
Vietnamobile N/A (no 2G)
Viettel 2024-10-16 GSM per government regulation[187]
Vinaphone 2024-10-16 GSM per government regulation[187]

References

from Grokipedia
2G (second generation) cellular technology encompasses the digital mobile telecommunication standards that replaced analog 1G systems in the early 1990s, providing digitized voice transmission via techniques such as TDMA and GMSK modulation, thereby improving call quality, network capacity, and security through encryption. The primary standard, GSM (Global System for Mobile Communications), developed by the European Telecommunications Standards Institute (ETSI), introduced short message service (SMS) for text messaging and basic circuit-switched data rates up to 9.6 kbit/s, marking the shift from voice-only analog networks to foundational digital services that enabled global roaming and widespread adoption. First commercially deployed in Finland in 1991, GSM achieved over 90% global market share by the mid-2010s, evolving through enhancements like GPRS and EDGE to bridge toward 3G while facing challenges in spectrum allocation, notably controversies over non-auction allocation methods in countries such as India, where massive public losses were alleged but ultimately not judicially confirmed.

Origins and Standardization

Precursors to Digital Mobile Networks

The first-generation (1G) mobile networks, deployed primarily in the late 1970s and early 1980s, relied on analog transmission of voice signals using frequency-division multiple access (FDMA) techniques. Notable examples included the Advanced Mobile Phone System (AMPS), launched commercially in the United States on October 13, 1983, by Ameritech in Chicago; the Nordic Mobile Telephone (NMT) system, introduced in 1981 across Scandinavian countries operating at 450 MHz and later 900 MHz; and the Total Access Communications System (TACS), a variant of AMPS deployed in the United Kingdom in 1985. These systems allocated dedicated radio channels per call, enabling basic voice telephony but lacking support for data services or encryption.

These analog networks faced inherent limitations in spectrum efficiency and reliability, as FDMA provided low spectral efficiency—typically 30 kHz per voice channel—and required wide guard bands to mitigate interference, resulting in cluster sizes of 7 or more cells for frequency reuse. Analog signals were prone to noise, fading, and cross-talk, yielding inconsistent voice quality, while their unencrypted nature facilitated easy eavesdropping via simple radio scanners. By the mid-1980s, surging subscriber demand exacerbated these issues; in the United States, subscribers grew from approximately 340,000 in 1985 to 5 million by 1990, overwhelming available spectrum and causing capacity shortages in urban areas. Similar pressures emerged in Europe, where fragmented national analog systems hindered roaming and efficient spectrum use, prompting recognition that analog FDMA could not scale to meet projected growth without digital multiplexing methods like time-division multiple access (TDMA) or code-division multiple access (CDMA) to boost capacity by factors of 3 to 10 per cell.

To address these challenges and foster a unified European market, the Conference of European Postal and Telecommunications Administrations (CEPT) established the Groupe Spécial Mobile (GSM) in December 1982 as a study group tasked with developing a single digital standard for public land mobile service in the 900 MHz band, aiming to replace incompatible 1G systems and enable seamless pan-European roaming. This initiative was driven by the need for higher spectral efficiency, improved signal processing via digital techniques, and inherent security through encryption, rather than incremental analog enhancements. Early validation came through field trials conducted in autumn 1986 across candidate technologies, culminating in a February 1987 meeting in Madeira where GSM selected TDMA as the basis for its proof-of-concept, confirming digital viability for increased capacity and quality over analog predecessors. These developments underscored the underlying shift: analog 1G's physical constraints on signal propagation and multiplexing necessitated digital architectures to accommodate exponential demand without proportional spectrum expansion.

Development of Key Standards

The transition to second-generation (2G) mobile networks addressed inherent limitations of first-generation analog systems, which suffered from low spectral efficiency, restricted capacity (typically supporting fewer than 100 simultaneous calls per cell), poor voice quality susceptible to noise, and lack of encryption, leading to easy interception. Digital 2G standards employed time-division multiple access (TDMA) or code-division multiple access (CDMA) to multiplex signals, enabling 3-10 times higher capacity through efficient spectrum reuse and error correction, while providing clearer audio via digital speech coding and foundational security via authentication.

The primary European standard, GSM, emerged from efforts by the European Conference of Postal and Telecommunications Administrations (CEPT) and was formalized by the European Telecommunications Standards Institute (ETSI). Phase 1 specifications, completed in 1990, defined a TDMA-based digital system operating in the 900 MHz frequency band, encompassing radio access, core switching, and the subscriber identity module (SIM) for authentication and portability. These standards prioritized interoperability across borders, with the SIM card facilitating user mobility without device changes. The first commercial GSM deployment occurred on July 1, 1991, via Radiolinja in Finland, marking the operational debut of digital cellular service.

In parallel, Qualcomm advanced CDMA technology, culminating in the IS-95 interim standard approved by the Telecommunications Industry Association (TIA) in 1993 for North American markets. IS-95 utilized CDMA, assigning unique codes to users for simultaneous transmission in the same band, achieving 10-15 times the voice capacity of analog systems through soft handoff and power control to mitigate interference. This approach demonstrated superior efficiency in field trials, contrasting with TDMA's time-slot division, though initial adoption faced resistance due to complexity in implementation. While GSM achieved widespread harmonization in Europe and much of Asia and Africa, enabling international roaming via standardized SIM protocols, the coexistence of GSM (TDMA) and IS-95 (CDMA) created regional divides that initially limited seamless global interoperability, as devices and networks required dual-mode support for cross-technology roaming. The International Telecommunication Union (ITU) played a coordinative role in recognizing these as stepping stones toward unified third-generation frameworks like IMT-2000, though 2G remained fragmented without a single global mandate.

Technical Framework

Core GSM Architecture

The GSM network architecture is divided into the Base Station Subsystem (BSS) and the Network Switching Subsystem (NSS). The BSS, which interfaces with mobile stations over the radio link, includes Base Transceiver Stations (BTS) for radio transmission/reception and Base Station Controllers (BSC) for managing radio resources and handovers across multiple BTS units. The NSS handles call routing, switching, and subscriber management, comprising the Mobile Switching Center (MSC) for call setup and switching, the Home Location Register (HLR) for permanent subscriber data storage, and the Visitor Location Register (VLR) for temporary location and service profile data during roaming. This modular separation enhances scalability by localizing radio-specific functions in the BSS while centralizing core switching in the NSS, allowing independent upgrades without network-wide disruption.

The air interface between mobile stations and BTS employs a hybrid of Time Division Multiple Access (TDMA) and Frequency Division Multiple Access (FDMA), with FDMA dividing the spectrum into 200 kHz carrier channels and TDMA allocating eight time slots per 4.615 ms frame to support up to eight simultaneous voice calls per carrier. Speech signals are digitized at 8 kHz and compressed using Regular Pulse Excitation-Long Term Prediction (RPE-LTP) coding at 13 kbit/s, which exploits linear prediction to model vocal tract characteristics efficiently while preserving perceptual quality. Error correction incorporates convolutional coding (typically rate 1/2) on key bits, interleaved to combat burst errors from fading, alongside cyclic redundancy checks for detection; handover mechanisms, triggered by signal quality thresholds and executed via BSC-MSC signaling, ensure seamless mobility by pre-allocating target channels.

Operating primarily in the 900 MHz (uplink 890-915 MHz, downlink 935-960 MHz) and 1800 MHz (uplink 1710-1785 MHz, downlink 1805-1880 MHz) bands, GSM enables frequency reuse patterns such as 4/12 or 7-cell clusters, where identical frequencies are reassigned to non-adjacent cells to minimize co-channel interference. This reuse, combined with digital modulation's narrower effective bandwidth and interference rejection via synchronization and coding, yields capacity gains empirically observed in early deployments—up to eightfold per carrier over analog systems like AMPS, which supported only one call per 30 kHz channel due to continuous analog signaling prone to interference and lacking error correction. The digital approach reduces signal degradation by discretizing signals, enabling receiver-side equalization and error mitigation that analog lacked, thus improving reliability in multipath environments and supporting denser cell layouts for greater capacity.
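To make the capacity arithmetic concrete, the following sketch (illustrative Python; the helper names and the simple division by the reuse factor are our own simplifications, ignoring guard bands and control channels) estimates how many 200 kHz carriers fit in the 25 MHz GSM 900 uplink band and how many full-rate voice channels one cell receives under a 7-cell reuse pattern.

```python
def carriers_in_band(bandwidth_khz: float, carrier_khz: float = 200.0) -> int:
    """Number of 200 kHz GSM carriers fitting in a band (guard bands ignored)."""
    return int(bandwidth_khz // carrier_khz)

def voice_channels_per_cell(total_carriers: int, reuse_factor: int,
                            slots_per_carrier: int = 8) -> float:
    """Full-rate voice channels available to one cell under a simple reuse pattern."""
    return total_carriers / reuse_factor * slots_per_carrier

carriers = carriers_in_band(25_000)          # 890-915 MHz uplink -> 25 MHz of spectrum
print(carriers)                              # 125 raw slots (GSM 900 defines 124 after guard band)
print(voice_channels_per_cell(carriers, 7))  # ~143 full-rate channels per cell with 7-cell reuse
```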

Alternative 2G Systems (e.g., CDMA)

The IS-95 standard, also known as cdmaOne, represented a primary alternative to GSM in 2G networks, utilizing code-division multiple access (CDMA) with direct-sequence spread-spectrum modulation. Standardized by the Telecommunications Industry Association as TIA/EIA/IS-95 in 1995 following development led by Qualcomm, it enabled multiple users to share the same frequency band through unique orthogonal codes, facilitating soft handoffs during cell transitions and voice activity detection to suppress transmission during silence periods, thereby enhancing capacity. Deployments commenced in 1995, with Sprint PCS initiating commercial CDMA services in select U.S. markets in late 1996 using Qualcomm chipsets, achieving initial coverage in urban areas such as Washington, D.C. IS-95 operated with a chip rate of 1.2288 Mcps, occupying approximately 1.25 MHz of bandwidth per carrier, which supported interference rejection via processing gain—typically 21 dB for voice channels—and rake receivers that combined multipath signals constructively for improved reception in urban multipath environments. Field tests in urban settings demonstrated CDMA's resilience to multipath fading, yielding lower bit error rates compared to TDMA systems under similar conditions, though signal-strength measurements in some locales showed variability in low-interference rural zones. Market adoption in North America stemmed from allocated spectrum bands (e.g., PCS at 1.9 GHz) suiting wider channels, contrasting GSM's 200 kHz carriers, with IS-95 capturing over 50% of U.S. digital subscribers by 2000 due to demonstrated capacity gains in high-density deployments.

In Japan, Personal Digital Cellular (PDC), a TDMA-based standard using pi/4-DQPSK modulation, launched commercially in March 1993 via NTT Docomo's Digital mova service, targeting high-capacity urban voice services in the 800 MHz and 1.5 GHz bands. PDC's design emphasized smaller cell sizes and frequency hopping for interference mitigation, enabling up to three times the voice channels per MHz relative to analog predecessors in dense areas, though it faced delays from domestic R&D rather than international patent conflicts. By December 2002, PDC served 60 million subscribers exclusively in Japan, reflecting operator choices for localized optimization over global interoperability. Complementing PDC, Japan's Personal Handy-phone System (PHS), introduced nationwide in 1995, functioned as a microcellular variant with transmit powers limited to 80 mW and cell radii under 100 meters, prioritizing low-cost infrastructure over wide-area mobility. PHS offered tariff advantages, with per-minute rates around ¥40 versus ¥150–190 for cellular, and supported data rates up to 64 kbps, but its stationary microcell architecture confined use to pedestrian urban scenarios, amassing peak adoption in metropolitan hubs before 3G transitions. These contrasts highlight regional trade-offs: CDMA's spread spectrum yielded 2–4 times higher theoretical capacity in interference-prone environments versus GSM or PDC's TDMA, influencing Americas-centric deployment, while Japan's insular standards reflected spectrum policy and density-driven innovations without broader export.
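The 21 dB processing-gain figure quoted above follows directly from the ratio of the chip rate to the voice data rate; a quick illustrative check in Python (function name ours):

```python
import math

def processing_gain_db(chip_rate_hz: float, data_rate_hz: float) -> float:
    """Spread-spectrum processing gain in dB: chip rate over information rate."""
    return 10 * math.log10(chip_rate_hz / data_rate_hz)

# IS-95: 1.2288 Mcps chip rate spreading a 9.6 kbit/s voice channel.
print(round(processing_gain_db(1_228_800, 9_600), 1))  # ~21.1 dB, matching the figure above
```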

Capabilities and Innovations

Voice and Messaging Services

The core voice service in 2G networks, particularly GSM, relied on the full-rate RPE-LTP speech codec, which digitally encoded and compressed audio signals at a bitrate of 13 kbit/s, allowing for more efficient spectrum use and clearer transmission than analog predecessors. This codec processed 13-bit linear PCM input into frames of 260 bits every 20 ms, enabling reliable telephony-grade speech over narrowband channels while supporting handover between cells without perceptible interruption. Digital modulation reduced susceptibility to interference, yielding subjective voice quality improvements equivalent to wireline standards in low-error environments, in contrast to analog systems prone to noise and fading. Supplementary voice features included call waiting to alert users of incoming calls during active sessions, multi-party conferencing for up to several participants, and international roaming enabled by SIM-card-based authentication, which verified subscriber identity against the home location register via the visited network. These capabilities, standardized in GSM Phase 1 from 1991, leveraged circuit-switched architecture for consistent performance, with roaming agreements proliferating after initial European deployments in 1991.

Messaging debuted with the Short Message Service (SMS), a store-and-forward system limited to 160 characters per message, first demonstrated on December 3, 1992, when engineer Neil Papworth sent "Merry Christmas" from a computer to a Vodafone executive's Orbitel 901 handset on the UK Vodafone network. SMS delivery employed SS7 signaling for routing through the mobile switching center and the short message service center (SMSC), independent of voice channels to avoid congestion. As a low-overhead application, SMS achieved explosive adoption, surpassing 250 billion global messages annually by 2002 and underpinning revenue streams for operators through its simplicity and universality across 2G variants. Early SMS lacked native multimedia support or threading, restricting it to alphanumeric text, though it proved resilient in bandwidth-constrained scenarios.
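Both headline numbers in this section can be reproduced with simple arithmetic; the snippet below (illustrative Python; the 140-octet SMS user-data payload is a standard GSM value not stated above) derives the 13 kbit/s codec rate from the 260-bit/20 ms frame and the 160-character SMS limit from repacking that payload into 7-bit characters.

```python
# GSM full-rate (RPE-LTP) speech codec: 260 bits produced every 20 ms.
frame_bits, frame_seconds = 260, 0.020
print(frame_bits / frame_seconds / 1000)  # 13.0 kbit/s, the codec rate quoted above

# SMS: a 140-octet user-data payload repacked as 7-bit characters.
payload_octets = 140
print(payload_octets * 8 // 7)            # 160 characters per message
```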

Initial Data Transmission Features

The initial data transmission capabilities of 2G networks centered on circuit-switched services, with Circuit Switched Data (CSD) providing asynchronous or synchronous connections at a rate of 9.6 kbit/s by utilizing a single GSM time slot for modem-like data transfer over dedicated traffic channels. High Speed Circuit Switched Data (HSCSD), specified in GSM Phase 2+ releases starting in 1997, enhanced this by supporting a less heavily coded channel for 14.4 kbit/s per time slot and allowing aggregation of up to four downlink slots, though uplink was typically limited to one or two for asymmetry. These mechanisms also accommodated facsimile transmission compliant with Group 3 standards, integrating with the existing voice-oriented architecture. The circuit-switched paradigm inherently tied up full traffic channels for the entire session duration, regardless of actual data flow, making it poorly suited for the bursty or intermittent traffic patterns common in early data use.

SMS, introduced as a core feature, offered a more efficient alternative for low-volume text data, capped at 160 seven-bit characters to align with signaling channel constraints established in the protocol design. Unlike CSD or HSCSD, SMS operated over control and signaling channels, avoiding resource contention with voice calls. Early 2G data services, including CSD and nascent WAP-enabled browsing precursors, suffered from setup latencies of 300–1000 ms inherent to circuit establishment, compounded by per-minute billing models that inflated costs for sporadic access. While these enabled rudimentary applications like text-based email, the combination of low throughput, resource inefficiency, and economic barriers limited data to a marginal share of network traffic, with voice dominating over 99% of usage in initial deployments. Critics noted that such constraints impeded development of interactive applications, as sustained channel occupation deterred experimentation beyond basic messaging and early WAP-style browsing.
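A small sketch (illustrative Python, names ours) of the circuit-switched rate arithmetic: the single-slot CSD baseline versus HSCSD bundling at the 14.4 kbit/s per-slot rate, up to the four downlink slots mentioned above.

```python
CSD_RATE_KBPS = 9.6          # single-timeslot circuit-switched data
HSCSD_SLOT_RATE_KBPS = 14.4  # per-timeslot rate with the enhanced channel coding

def hscsd_rate_kbps(downlink_slots: int) -> float:
    """Aggregate HSCSD downlink rate for a given number of bundled timeslots."""
    return HSCSD_SLOT_RATE_KBPS * downlink_slots

print(CSD_RATE_KBPS)       # 9.6 kbit/s baseline
print(hscsd_rate_kbps(4))  # 57.6 kbit/s with the four-slot maximum noted above
```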

Deployment and Economic Impact

Global Rollout Timeline

The rollout of 2G networks commenced in Europe under the GSM standard, with the first commercial service launching on July 1, 1991, in Finland via Radiolinja, enabling encrypted digital voice calls distinct from analog systems. Test calls on the same date connected sites in several other European countries, signaling coordinated pan-European deployment. Expansion ensued swiftly, with full commercial operations in Germany by mid-1992 and additional networks in several other countries; by 1993, 36 GSM networks served 22 countries, primarily in Europe. Deployment extended to Asia shortly thereafter, as SmarTone initiated Hong Kong's—and Asia's—first GSM service in March 1993, leveraging the DCS 1800 MHz variant for urban coverage. In Africa, Vodacom activated South Africa's inaugural GSM network in October 1994, establishing an early foothold on the continent amid post-apartheid infrastructure liberalization. These regional advances reflected GSM's emphasis on standardization, facilitating cross-border roaming from inception.

In the United States, 2G adoption diverged via competing standards, with IS-54 TDMA (D-AMPS) entering service in 1992 as an upgrade from analog AMPS, while CDMA pilots preceded commercial launches around 1995 by carriers like Sprint and Verizon predecessors. The Federal Communications Commission's inaugural spectrum auctions in July 1994 awarded Personal Communications Services (PCS) licenses in the 1.9 GHz band, injecting capital for nationwide digital overlays and GSM-compatible PCS 1900 MHz deployments by new entrants. This bifurcated path—TDMA/CDMA dominance over GSM—stemmed from domestic incumbency advantages and Qualcomm's CDMA advocacy, though the PCS auctions marked a 1995 inflection point for scaled buildouts.

Global connections surpassed 100 million by 1998, coinciding with handset prices declining from over $1,000 in 1991 to approximately $200–$400 by decade's end, broadening accessibility beyond elites to mass markets through economies of scale in production. This subscriber threshold underscored 2G's momentum, as standardized SIM cards and falling device costs enabled prepaid models and international expansion into emerging regions by the late 1990s.

Market Penetration and Subscriber Growth

Global 2G subscriber numbers expanded rapidly following commercial deployments in the early 1990s, starting from approximately 10-20 million users by 1993—primarily in Europe via GSM networks—and surpassing 1 billion worldwide in the early 2000s, with total mobile subscriptions (overwhelmingly 2G at the time) reaching over 2 billion by 2005. This growth was fueled by falling handset prices, from thousands of dollars in the early 1990s to under $100 by the late 1990s, alongside network expansions that lowered marginal costs for operators. Regional variations in penetration reflected differences in standardization and regulatory approaches: Western Europe achieved penetration rates exceeding 50% by 2000 due to the unified GSM standard enabling seamless roaming and economies of scale, while Asia saw explosive adoption in the late 1990s, with countries like China adding hundreds of millions of subscribers through rapid infrastructure builds and affordable devices. In contrast, the United States experienced comparatively slower initial growth, with 2G (primarily TDMA and CDMA) penetration lagging until the mid-1990s owing to prolonged reliance on analog systems and fragmented standards that increased device costs and compatibility issues.

Economically, the 2G rollout spurred telco employment surges during network construction phases, with operators hiring thousands for deployments and maintenance in adopting regions. Empirical analyses link 2G connectivity gains to GDP increases, estimating that a 10% rise in 2G coverage boosted output by 0.37-0.81% through enhanced productivity in sectors such as agriculture and small-scale commerce via basic voice services. Spectrum auctions, implemented in many markets from the mid-1990s, promoted competition by enabling new entrants, which correlated with service price drops of up to 90% in competitive environments, broadening access beyond urban elites; however, early monopoly or duopoly structures in some regions sustained elevated tariffs, constraining penetration among lower-income groups until additional licenses were issued.

Security and Vulnerabilities

Encryption Mechanisms and Weaknesses

The GSM authentication process relies on the COMP128 algorithm, a hash-based function implemented on the SIM card: the network challenges the handset with a random number (RAND), and the SIM computes a signed response (SRES) and a 64-bit ciphering key (Kc) using the subscriber's secret key (Ki). This one-way authentication verifies the subscriber to the network but lacks mutual verification, enabling rogue base stations to impersonate legitimate ones without detection. The derived Kc initializes the A5 family of stream ciphers for encrypting voice and signaling data over the radio interface: A5/1, a stream cipher using three linear feedback shift registers (LFSRs) with irregular clocking, used in most networks, and the deliberately weakened A5/2 variant mandated for export to certain countries to comply with cryptographic restrictions. These ciphers generate a keystream XORed with the plaintext, providing link-layer confidentiality that exceeded the unencrypted analog transmissions of 1G systems.

A5/1 employs a 64-bit effective key length, rendering it vulnerable to cryptanalytic attacks leveraging its linear structure; correlation and divide-and-conquer methods, combined with time-memory tradeoffs via precomputed tables, allow decryption of intercepted traffic in seconds to minutes on modern hardware following table generation. Such weaknesses stem from the algorithm's late-1980s design, optimized against then-prevalent computational constraints rather than advances in parallel processing and storage that enable exhaustive coverage of the keyspace today. A5/2 exacerbates this with even poorer resistance, succumbing to its deliberate weakening and simple known-plaintext attacks shortly after its public disclosure in 1999. Encryption applies solely between the mobile station and the base transceiver station, offering no protection against interception at the base station controller or beyond, thus exposing voice and SMS traffic to network insiders or compromised infrastructure without end-to-end safeguards. Later adoption of A5/3, based on the stronger KASUMI block cipher, mitigated some flaws in upgraded networks but was not universally deployed in core 2G implementations.
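The structural idea behind A5/1, an LFSR-driven keystream XORed with the traffic, can be illustrated with a deliberately simplified toy (illustrative Python; a single register with arbitrary taps, not the real three-register A5/1 design or its majority-clocking rule). It shows why recovering the 64-bit Kc, or precomputing its keystreams, defeats the cipher entirely.

```python
def lfsr_keystream(state: int, taps: tuple, length_bits: int, width: int = 19):
    """Toy linear feedback shift register keystream (NOT the real A5/1 registers)."""
    bits = []
    for _ in range(length_bits):
        bits.append(state & 1)            # output the low bit
        feedback = 0
        for t in taps:                    # XOR the tapped positions
            feedback ^= (state >> t) & 1
        state = (state >> 1) | (feedback << (width - 1))
    return bits

def xor_stream(data_bits, keystream_bits):
    """Stream-cipher encryption/decryption: XOR data with the keystream."""
    return [d ^ k for d, k in zip(data_bits, keystream_bits)]

plaintext = [1, 0, 1, 1, 0, 0, 1, 0]
keystream = lfsr_keystream(state=0b1011011, taps=(0, 2, 5), length_bits=len(plaintext))
ciphertext = xor_stream(plaintext, keystream)
print(xor_stream(ciphertext, keystream) == plaintext)  # True: same keystream decrypts
```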

Documented Exploits and Mitigation Efforts

IMSI catchers, devices that impersonate legitimate base stations to intercept International Mobile Subscriber Identity (IMSI) numbers and force connections over unencrypted 2G channels, have been documented in operational use since at least 2014, exploiting GSM's lack of mutual authentication between devices and networks. In 2018, U.S. federal authorities detected unauthorized IMSI catchers in Washington, D.C., capable of hijacking connections to major carriers like Verizon and AT&T for surveillance without user awareness. These exploits enable real-time location tracking and call interception, with empirical evidence from field detections showing non-zero prevalence in urban areas reliant on legacy 2G coverage. SS7 signaling protocol vulnerabilities, integral to 2G core networks, have facilitated location-tracking hijacks by allowing unauthorized queries for subscriber positions, with documented cases involving surveillance vendors tricking networks into revealing coordinates worldwide as recently as July 2025. Attackers exploit SS7's trust-based architecture, originally designed without robust access controls, to send forged messages that bypass authentication, as demonstrated in controlled tests and intercepted operations targeting mobile users. While SS7 attacks span generations, their impact on 2G is pronounced in regions with persistent legacy infrastructure, where over 870 million active 2G subscriptions remained as of early 2025, sustaining exposure for machine-to-machine (M2M) and IoT devices.

2G fallback mechanisms from LTE networks enable downgrade attacks, where adversaries broadcast stronger signals to force devices into 2G mode for interception, as analyzed in studies from 2014 onward highlighting protocol weaknesses in fallback procedures. A critical vulnerability in Google Pixel 6 modems, disclosed in August 2023, allowed remote hijacking via a zero-click 2G call, prompting patches and recommendations to disable 2G connectivity. Such bidding-down exploits have been verified in lab and field scenarios, with prevalence low in advanced markets but elevated in legacy-heavy areas where 2G supports fallback for voice services. Mitigation efforts include device-level 2G disabling, introduced in Android 12 for supported hardware and later extended to allow enterprise enforcement, preventing fallback connections at the modem layer. Carriers have deployed SS7 firewalls and signaling filters to block anomalous queries and detect suspicious signaling activity, as recommended in 2018 congressional assessments, though implementation varies and does not fully eliminate risks for unauthenticated 2G handovers. These measures have curtailed casual eavesdropping in modern deployments, yet persistent vulnerabilities in 2025 IoT ecosystems underscore incomplete coverage, with empirical detections indicating ongoing but infrequent real-world incidents in regions delaying 2G shutdowns beyond 2030.

Major Controversies

Spectrum Allocation Scandals

In 2008, the Indian Department of Telecommunications under minister A. Raja allocated 122 unified access service licenses for 2G spectrum using a first-come-first-served (FCFS) policy at 2001 prices, rather than through competitive auctions, to expedite telecom expansion in a developing economy. This approach, intended to prioritize rapid network rollout and affordability over revenue maximization, involved advance notice to select applicants, enabling queue manipulation and perceived favoritism toward certain firms like Uninor and Swan Telecom. The Comptroller and Auditor General (CAG) report in 2010 estimated a presumptive loss of ₹1.76 lakh crore (approximately $39 billion at then-exchange rates) by presuming higher revenues from hypothetical auctions, a notional figure criticized for ignoring policy goals of affordable access. The Supreme Court of India, in its February 2, 2012 ruling, quashed all 122 licenses as arbitrary and unconstitutional due to deviations from FCFS norms, such as arbitrary cut-off dates and failure to return excess applications, but emphasized procedural irregularities rather than proven financial loss or criminal conspiracy. Subsequent trials in a special CBI court culminated in December 2017 acquittals for all 17 accused, including the former minister and corporate executives; Judge O.P. Saini ruled that no evidence supported claims of deliberate revenue loss, describing the "scam" as media-amplified perception without substantive proof, as spectrum pricing aligned with extant policy that did not mandate auctions. Empirical impacts included delayed competition and spectrum scarcity, distorting markets until re-auctions in 2012-2015, yet the policy's intent—boosting teledensity from 18% in 2007 to over 70% by 2012—demonstrated trade-offs between fiscal gains and developmental access, with no verified illicit kickbacks per the court's findings.

Globally, similar non-auction methods prevailed in the early mobile era; the U.S. allocated initial cellular licenses via lotteries in the 1980s, favoring entrepreneurial entry but inviting speculation and inefficiency until auctions replaced them in 1994 under the Omnibus Budget Reconciliation Act. Proponents of auctions argue they ensure efficient allocation and revenue capture, as evidenced by U.S. proceeds exceeding $33 billion from 2001-2010 sales, while critics of administrative methods like India's FCFS highlight risks of favoritism and rent-seeking; however, access-oriented policies in low-income contexts prioritize deployment velocity, with India's case underscoring policy design flaws over systemic graft, as the final judgments found no criminal predicate for the hyped "scam" narrative.

Criticisms of Phase-Out Policies

Policies to phase out 2G networks cite spectrum refarming to support 4G and 5G deployments as a primary rationale, alongside 2G's security weaknesses, such as vulnerability to interception due to flawed encryption protocols like A5/1. Operators also highlight operational cost reductions from consolidating legacy infrastructure. These shutdowns have freed bandwidth in the regions that have completed them, with 19 countries having finished 2G decommissioning by mid-2025, enabling enhanced services. Critics contend that hasty phase-outs overlook 2G's persistent utility for hundreds of millions of users in developing nations, where feature phones provide basic voice and SMS access amid limited alternatives, exacerbating digital exclusion for low-income and rural populations lacking 4G/5G coverage or smartphones. In India, an estimated 250 million subscribers remained on 2G as of early 2024, with resistance to mandated shutdowns stemming from concerns over forced device upgrades that strain affordability. Telco-driven advocacy for shutdowns, such as Reliance Jio's calls for government-mandated 2G/3G termination in India, has drawn scrutiny for prioritizing commercial interests—Jio operates solely on 4G/5G and seeks to migrate users to higher-ARPU services—over equitable access, as evidenced by competitors' opposition and regulatory deference to market-led timelines.

Environmental justifications for phase-outs, emphasizing energy efficiency gains from modern networks, remain empirically contested, as life-cycle assessments indicate that device replacement cycles generate substantial e-waste—potentially mirroring the volumes seen in analogous technology shutdowns—without guaranteed net reductions in overall impact when factoring in production emissions. For machine-to-machine applications, 2G offers reliable coverage in remote or coverage-gap areas where newer-generation deployment lags, supporting low-bandwidth IoT uses in utilities and transport that demand consistent connectivity over intermittent higher-generation alternatives. Abrupt shutdowns thus risk disrupting these systems absent comprehensive migration plans, prioritizing spectral efficiency over proven operational dependability.

Enhancements and Transitions

Packet-Switched Extensions (GPRS/EDGE)

The General Packet Radio Service (GPRS), standardized in Release 97 of the GSM specifications and published in 1998 (with work later continued under the 3rd Generation Partnership Project, 3GPP), introduced packet-switched data transmission to existing GSM networks, enabling always-on connectivity without dedicating full circuits for data sessions. This upgrade, often termed 2.5G, allowed theoretical maximum downlink speeds of 171.2 kbit/s by aggregating up to eight time slots using coding schemes like CS-4, though practical rates typically ranged from 40 to 115 kbit/s due to channel sharing and coding efficiencies. GPRS supported multislot classes, categorized from 1 to 45, defining the maximum uplink and downlink time slots allocatable simultaneously, with common classes like 10 permitting four downlink and two uplink slots for asymmetric flows such as web browsing. Billing shifted from time-based circuit charges to volume-based packet pricing, reducing costs for bursty traffic and encouraging early mobile data adoption. Devices classified as Class A, B, or C determined concurrent voice and data handling, with Class A allowing simultaneous circuit-switched voice and packet data but limited by hardware complexity.

Enhanced Data rates for GSM Evolution (EDGE), deployed commercially starting in 2003, extended GPRS capabilities—termed 2.75G—through 8-phase shift keying (8-PSK) modulation, which transmitted three bits per symbol versus Gaussian minimum shift keying's one bit, achieving theoretical peak speeds of 384 kbit/s downlink across eight slots. This modulation upgrade, backward-compatible with GSM/GPRS infrastructure, roughly quadrupled capacity over GPRS, supporting user bit-rates around 250 kbit/s and handling four times the traffic volume. These extensions served as a cost-effective bridge to 3G by leveraging existing spectrum and base stations, minimizing capital expenditure for operators while enabling multimedia messaging and basic web services; however, voice remained circuit-switched, and packet channels were prone to latency from contention among users, limiting real-world performance below theoretical maxima. Post-rollout empirical data indicated substantial capacity gains, with EDGE networks processing significantly higher data loads than pure GPRS, though exact usage surges varied by market without uniform 10-fold increases universally documented.
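The multislot-class figures above translate into per-class rate ceilings with simple multiplication; the sketch below (illustrative Python; the small class table is a subset of the standard GPRS multislot classes, and the names are ours) uses the CS-4 per-slot rate to reproduce the class 10 allocation and the 171.2 kbit/s eight-slot ceiling.

```python
# (max downlink slots, max uplink slots) for a few common GPRS multislot classes.
MULTISLOT_CLASSES = {8: (4, 1), 10: (4, 2), 12: (4, 4)}
CS4_SLOT_RATE_KBPS = 21.4  # per-timeslot rate with coding scheme CS-4

def class_rates_kbps(multislot_class: int) -> tuple:
    """Theoretical CS-4 (downlink, uplink) rates for a GPRS multislot class."""
    down, up = MULTISLOT_CLASSES[multislot_class]
    return down * CS4_SLOT_RATE_KBPS, up * CS4_SLOT_RATE_KBPS

print(class_rates_kbps(10))    # (85.6, 42.8) kbit/s for the class 10 example above
print(8 * CS4_SLOT_RATE_KBPS)  # 171.2 kbit/s: the eight-slot theoretical maximum
```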

Path to 3G Compatibility

The migration from 2G to 3G emphasized evolutionary compatibility, reusing the GSM core network including the Mobile Application Part (MAP) for signaling and mobility management, while introducing WCDMA as the new radio access technology. This shared infrastructure enabled seamless handover and load sharing between GSM and UMTS, with dual-mode handsets supporting both technologies to facilitate gradual subscriber transition. The 3GPP Release 99 specifications, finalized in 2000, formalized this dual-mode operation, laying the groundwork for interoperability without requiring immediate full network replacement. Spectrum auctions in 2000 marked a pivotal event, with the UK auction concluding on April 27, 2000, after 150 rounds and raising £22.5 billion for five licenses, funding infrastructure but imposing heavy debt on operators. Similar auctions across Europe, including Germany's, allocated spectrum but highlighted the high financial barriers to rapid deployment. Operators pursued spectrum refarming pilots and incremental enhancements to existing 2G spectrum, delaying comprehensive 3G rollout as cost analyses favored extending 2G viability over revolutionary overhauls. Proprietary extensions in vendor-specific implementations during the GSM-to-UMTS evolution exacerbated challenges, contributing to vendor lock-in that bound operators to particular suppliers for upgrades and maintenance. This reliance on non-standardized elements in core transitions increased long-term costs and hindered multi-vendor flexibility, despite efforts toward openness. The persistence of substantial 2G traffic into the 2010s underscored the economic rationale of these phased approaches, as operators balanced upgrade expenses against ongoing 2G demand in mature and emerging markets.

Current Status and Legacy

Ongoing Networks and Use Cases

As of mid-2025, 2G networks operate in the vast majority of countries, with only 19 having fully completed shutdowns, primarily in developed regions such as North America, East Asia, and parts of Europe. The remaining networks persist especially in Africa and South Asia, where they underpin voice services for millions reliant on affordable, low-data handsets amid economic barriers to upgrading infrastructure or devices. Examples include holdouts such as India and several African and Southeast Asian markets, where operators have delayed or abandoned phase-outs due to risks of digital exclusion for low-income users and dependencies in sectors such as ride-hailing backups. In machine-to-machine (M2M) and Internet of Things (IoT) niches, 2G excels for applications demanding minimal power and bandwidth, such as asset trackers, utility meters (e.g., water and electricity monitoring), and basic telemetry in remote or low-density areas. Its simple, widely deployed architecture ensures high reliability for infrequent, low-volume transmissions where newer alternatives such as NB-IoT or LTE-M prove uneconomical or power-intensive, sustaining deployments that would otherwise fail in coverage gaps. While 2G's global connection share has fallen below 10% in advanced markets, it maintains essential coverage—estimated at 10-20% in persistent markets—where spectrum refarming to newer technologies remains unfeasible without disrupting basic access. Industry tracking data indicate 61 planned 2G or 3G closures for 2025 among 131 scheduled by 2030, yet many face challenges from entrenched dependencies, rendering full global shutdown impractical in the near term. This endurance highlights 2G's role in bridging connectivity voids, prioritizing proven reliability over rapid obsolescence.

Shutdown Timelines and Challenges

T-Mobile in the United States began reducing 2G network capacity and coverage in February 2025 to repurpose spectrum for 4G and 5G enhancements, marking one of the final major 2G shutdowns in North America following earlier discontinuations by competitors such as AT&T and Verizon. In Europe, timelines vary by operator and region, with France's Orange scheduling a phased 2G decommissioning starting December 31, 2025, in select areas and completing nationwide by the end of 2026 to improve energy efficiency and 4G/5G performance. Other French carriers, including SFR and Bouygues Telecom, align with similar 2026 endpoints for 2G, prioritizing coordinated spectrum refarming amid regulatory oversight. In Asia, phase-out schedules remain uneven, with countries like India postponing 2G closures indefinitely due to persistent reliance on low-cost feature phones—comprising 73% of 55-60 million annual shipments as of 2024—for basic voice and SMS services among rural and low-income populations. Some markets, such as China and Vietnam, completed or neared completion of 2G shutdowns by 2025 in tandem with 3G retirements, while others maintain 2G for machine-to-machine applications but plan gradual transitions through 2027.

Key challenges encompass legacy device obsolescence, where millions of 2G-only endpoints, including IoT sensors and alarm systems, risk disconnection without viable upgrades, complicating global service continuity for industries such as vehicle tracking and metering. Rural deployment lags exacerbate coverage gaps, as 4G/5G infrastructure investments prioritize urban densities, potentially widening access disparities in underdeveloped areas. E-waste generation surges from discarded hardware, with improper disposal of legacy phones and modules contributing to environmental hazards from materials such as lead and cadmium, straining recycling capacities in regions with high 2G penetration. Operators emphasize spectrum efficiency gains—2G's low data throughput (around 0.1 bits per second per hertz) hinders modern demands—enabling reallocations that accelerate 5G rollout and reduce operational costs. Conversely, stakeholders in developing markets underscore 2G's foundational role in extending connectivity to underserved users, where abrupt shutdowns could isolate populations without affordable alternatives, as evidenced by slowed transitions in high-dependency areas. Empirical transitions, such as those in advanced markets, show initial service disruptions prompting device upgrades that ultimately boost higher-generation adoption, though uncoordinated timelines across borders challenge multinational IoT deployments.
