Information and communications technology
Information and communications technology (ICT) is an extensional term for information technology (IT) that stresses the role of unified communications[1] and the integration of telecommunications (telephone lines and wireless signals) and computers, as well as the necessary enterprise software, middleware, storage, and audiovisual systems that enable users to access, store, transmit, understand, and manipulate information.
ICT is also used to refer to the convergence of audiovisuals and telephone networks with computer networks through a single cabling or link system. There are large economic incentives to merge the telephone networks with the computer network system using a single unified system of cabling, signal distribution, and management. ICT is an umbrella term that includes any communication device, encompassing radio, television, cell phones, computer and network hardware, satellite systems and so on, as well as the various services and appliances associated with them, such as video conferencing and distance learning. ICT also includes analog technology, such as paper communication, and any mode that transmits communication.[2]
ICT is a broad subject and the concepts are evolving.[3] It covers any product that will store, retrieve, manipulate, process, transmit, or receive information electronically in a digital form (e.g., personal computers including smartphones, digital television, email, or robots). Skills Framework for the Information Age is one of many models for describing and managing competencies for ICT professionals in the 21st century.[4]
Etymology
The phrase "information and communication technologies" has been used by academic researchers since the 1980s.[5] The abbreviation "ICT" became popular after it was used in a report to the UK government by Dennis Stevenson in 1997,[6] and then in the revised National Curriculum for England, Wales and Northern Ireland in 2000. However, in 2012, the Royal Society recommended that the use of the term "ICT" should be discontinued in British schools "as it has attracted too many negative connotations".[7] From 2014, the National Curriculum has used the word computing, which reflects the addition of computer programming into the curriculum.[8]
Variations of the phrase have spread worldwide. The United Nations has created a "United Nations Information and Communication Technologies Task Force" and an internal "Office of Information and Communications Technology".[9]
Monetization
The money spent on IT worldwide has been estimated as US$3.8 trillion[10] in 2017 and has been growing at less than 5% per year since 2009. The estimated 2018 growth of the entire ICT sector is 5%. The biggest growth, of 16%, is expected in the area of new technologies (IoT, robotics, AR/VR, and AI).[11]
The 2014 IT budget of the US federal government was nearly $82 billion.[12] IT costs, as a percentage of corporate revenue, have grown 50% since 2002, putting a strain on IT budgets. When looking at current companies' IT budgets, 75% are recurrent costs, used to "keep the lights on" in the IT department, and 25% are the cost of new initiatives for technology development.[13]
The average IT budget has the following breakdown:[13]
- 34% personnel costs (internal), 31% after correction
- 16% software costs (external/purchasing category), 29% after correction
- 33% hardware costs (external/purchasing category), 26% after correction
- 17% costs of external service providers (external/services), 14% after correction
The estimated amount of money spent in 2022 is just over US$6 trillion.[14]
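The average budget breakdown above can be applied to a concrete figure. A minimal sketch in Python, where the budget total is a hypothetical value chosen purely for illustration:

```python
# Split a hypothetical IT budget using the average (pre-correction)
# breakdown cited above; the total is an assumed example value.
budget = 1_000_000  # hypothetical annual IT budget in US dollars

breakdown = {
    "personnel (internal)": 0.34,
    "software (external/purchasing)": 0.16,
    "hardware (external/purchasing)": 0.33,
    "external service providers": 0.17,
}

for item, share in breakdown.items():
    print(f"{item}: ${budget * share:,.0f}")

# The four shares account for the whole budget.
assert abs(sum(breakdown.values()) - 1.0) < 1e-9
```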
Technological capacity
The world's technological capacity to store information grew from 2.6 (optimally compressed) exabytes in 1986 to 15.8 in 1993, over 54.5 in 2000, 295 (optimally compressed) exabytes in 2007, and some 5 zettabytes in 2014.[15][16] This is the informational equivalent of 1.25 stacks of CD-ROM from the earth to the moon in 2007, and of 4,500 stacks of printed books from the earth to the sun in 2014. The world's technological capacity to receive information through one-way broadcast networks was 432 exabytes of (optimally compressed) information in 1986, 715 (optimally compressed) exabytes in 1993, 1.2 (optimally compressed) zettabytes in 2000, and 1.9 zettabytes in 2007.[15] The world's effective capacity to exchange information through two-way telecommunication networks was 281 petabytes of (optimally compressed) information in 1986, 471 petabytes in 1993, 2.2 (optimally compressed) exabytes in 2000, 65 (optimally compressed) exabytes in 2007,[15] and some 100 exabytes in 2014.[17] The world's technological capacity to compute information with humanly guided general-purpose computers grew from 3.0 × 10^8 MIPS in 1986 to 6.4 × 10^12 MIPS in 2007.[15]
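The storage figures above imply a steep compound growth rate. A small sketch, using only the 1986 and 2007 data points cited in this paragraph, that estimates it:

```python
# Estimate the compound annual growth rate (CAGR) of the world's
# technological storage capacity from the figures cited above.
def cagr(start, end, years):
    """Average annual growth, as a fraction, over `years` years."""
    return (end / start) ** (1 / years) - 1

# Optimally compressed exabytes: 2.6 EB (1986) -> 295 EB (2007).
storage_growth = cagr(2.6, 295, 2007 - 1986)
print(f"Storage capacity grew about {storage_growth:.0%} per year")  # ~25%
```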
Sector in the OECD
The following is a list of OECD countries by share of ICT sector in total value added in 2013.[18]
| Rank | Country | ICT sector in % |
|---|---|---|
| 1 | | 10.7 |
| 2 | | 7.02 |
| 3 | | 6.99 |
| 4 | | 6.82 |
| 5 | | 6.09 |
| 6 | | 5.89 |
| 7 | | 5.87 |
| 8 | | 5.74 |
| 9 | | 5.60 |
| 10 | | 5.53 |
| 11 | | 5.33 |
| 12 | | 4.87 |
| 13 | | 4.84 |
| 14 | | 4.54 |
| 15 | | 4.63 |
| 16 | | 4.33 |
| 17 | | 4.26 |
| 18 | | 4.06 |
| 19 | | 4.00 |
| 20 | | 3.86 |
| 21 | | 3.72 |
| 22 | | 3.72 |
| 23 | | 3.56 |
| 24 | | 3.43 |
| 25 | | 3.33 |
| 26 | | 3.32 |
| 27 | | 3.31 |
| 28 | | 2.87 |
| 29 | | 2.77 |
ICT Development Index
The ICT Development Index ranks and compares the level of ICT use and access across the various countries around the world.[19] In 2014 the ITU (International Telecommunication Union) released the latest rankings of the IDI, with Denmark attaining the top spot, followed by South Korea. The top 30 countries in the rankings include most high-income countries where the quality of life is higher than average, which includes countries from Europe and other regions such as "Australia, Bahrain, Canada, Japan, Macao (China), New Zealand, Singapore, and the United States; almost all countries surveyed improved their IDI ranking this year."[20]
The WSIS process and development goals
On 21 December 2001, the United Nations General Assembly approved Resolution 56/183, endorsing the holding of the World Summit on the Information Society (WSIS) to discuss the opportunities and challenges facing today's information society.[21] According to this resolution, the General Assembly related the Summit to the United Nations Millennium Declaration's goal of implementing ICT to achieve the Millennium Development Goals. It also emphasized a multi-stakeholder approach to achieve these goals, involving all stakeholders including civil society and the private sector, in addition to governments.
To help anchor and expand ICT to every habitable part of the world, "2015 is the deadline for achievements of the UN Millennium Development Goals (MDGs), which global leaders agreed upon in the year 2000."[22]
In education
There is evidence that, to be effective in education, ICT must be fully integrated into the pedagogy. Specifically, when teaching literacy and math, using ICT in combination with Writing to Learn[23][24] produces better results than traditional methods alone or ICT alone.[25] The United Nations Educational, Scientific and Cultural Organisation (UNESCO), a division of the United Nations, has made integrating ICT into education part of its efforts to ensure equity and access to education. The following, taken directly from a UNESCO publication on educational ICT, explains the organization's position on the initiative.
Information and Communication Technology can contribute to universal access to education, equity in education, the delivery of quality learning and teaching, teachers' professional development and more efficient education management, governance, and administration. UNESCO takes a holistic and comprehensive approach to promote ICT in education. Access, inclusion, and quality are among the main challenges they can address. The Organization's Intersectoral Platform for ICT in education focuses on these issues through the joint work of three of its sectors: Communication & Information, Education and Science.[26]

Despite the power of computers to enhance and reform teaching and learning practices, improper implementation is a widespread issue that increased funding and technological advances alone cannot solve, and there is little evidence that teachers and tutors are properly integrating ICT into everyday learning.[27] Intrinsic barriers, such as a belief in more traditional teaching practices, individual attitudes towards computers in education, and teachers' own comfort with computers and ability to use them, all result in varying effectiveness in the integration of ICT in the classroom.[28]
Mobile learning for refugees
School environments play an important role in facilitating language learning. However, language and literacy barriers are obstacles preventing refugees from accessing and attending school, especially outside camp settings.[29]
Mobile-assisted language learning apps are key tools for language learning. Mobile solutions can provide support for refugees' language and literacy challenges in three main areas: literacy development, foreign language learning and translations. Mobile technology is relevant because communicative practice is a key asset for refugees and immigrants as they immerse themselves in a new language and a new society. Well-designed mobile language learning activities connect refugees with mainstream cultures, helping them learn in authentic contexts.[29]
Developing countries
Africa
ICT has been employed as an educational enhancement in Sub-Saharan Africa since the 1960s. Beginning with television and radio, it extended the reach of education from the classroom to the living room, and to geographical areas that had been beyond the reach of the traditional classroom. As the technology evolved and became more widely used, efforts in Sub-Saharan Africa were also expanded. In the 1990s a massive effort to push computer hardware and software into schools was undertaken, with the goal of familiarizing both students and teachers with computers in the classroom. Since then, multiple projects have endeavoured to continue the expansion of ICT's reach in the region, including the One Laptop Per Child (OLPC) project, which by 2015 had distributed over 2.4 million laptops to nearly two million students and teachers.[30]
The inclusion of ICT in the classroom, often referred to as M-Learning, has expanded the reach of educators and improved their ability to track student progress in Sub-Saharan Africa. In particular, the mobile phone has been most important in this effort. Mobile phone use is widespread, and mobile networks cover a wider area than internet networks in the region. The devices are familiar to students, teachers, and parents, and allow increased communication and access to educational materials. In addition to benefits for students, M-learning also offers the opportunity for better teacher training, which leads to a more consistent curriculum across the educational service area. In 2011, UNESCO started a yearly symposium called Mobile Learning Week with the purpose of gathering stakeholders to discuss the M-learning initiative.[30]
Implementation is not without its challenges. While mobile phone and internet use are increasing much more rapidly in Sub-Saharan Africa than in other developing countries, the progress is still slow compared to the rest of the developed world, with smartphone penetration only expected to reach 20% by 2017.[30] Additionally, there are gender, social, and geo-political barriers to educational access, and the severity of these barriers varies greatly by country. Overall, 29.6 million children in Sub-Saharan Africa were not in school in the year 2012, owing not just to the geographical divide, but also to political instability, the importance of social origins, social structure, and gender inequality. Once in school, students also face barriers to quality education, such as teacher competency, training and preparedness, access to educational materials, and lack of information management.[30]
Growth in modern society and developing countries
In modern society, ICT is ever-present, with over three billion people having access to the Internet.[31] With approximately 8 out of 10 Internet users owning a smartphone, information and data are increasing by leaps and bounds.[32] This rapid growth, especially in developing countries, has made ICT a keystone of everyday life, in which life without some facet of technology would render most clerical, work, and routine tasks dysfunctional.
The most recent authoritative data, released in 2014, shows "that Internet use continues to grow steadily, at 6.6% globally in 2014 (3.3% in developed countries, 8.7% in the developing world); the number of Internet users in developing countries has doubled in five years (2009–2014), with two-thirds of all people online now living in the developing world."[20]
Limitations
However, hurdles are still large. "Of the 4.3 billion people not yet using the Internet, 90% live in developing countries. In the world's 42 Least Connected Countries (LCCs), which are home to 2.5 billion people, access to ICTs remains largely out of reach, particularly for these countries' large rural populations."[33] ICT has yet to penetrate the remote areas of some countries, with many developing countries lacking any type of Internet access. This also includes the availability of telephone lines, particularly the availability of cellular coverage, and other forms of electronic transmission of data. The latest "Measuring the Information Society Report" cautiously stated that the increase in the aforementioned cellular data coverage is ostensible, as "many users have multiple subscriptions, with global growth figures sometimes translating into little real improvement in the level of connectivity of those at the very bottom of the pyramid; an estimated 450 million people worldwide live in places which are still out of reach of mobile cellular service."[31]
Favourably, the gap between Internet access and mobile coverage has decreased substantially in the last fifteen years, in which "2015 was the deadline for achievements of the UN Millennium Development Goals (MDGs), which global leaders agreed upon in the year 2000, and the new data show ICT progress and highlight remaining gaps."[22] ICT continues to take on new forms, with nanotechnology set to usher in a new wave of ICT electronics and gadgets. ICT's newest additions to the modern electronic world include smartwatches such as the Apple Watch, smart wristbands such as the Nike+ FuelBand, and smart TVs such as Google TV. With desktops soon becoming part of a bygone era and laptops becoming the preferred method of computing, ICT continues to insinuate itself into the ever-changing globe.
Information communication technologies play a role in facilitating accelerated pluralism in new social movements today. The internet, according to Bruce Bimber, is "accelerating the process of issue group formation and action",[34] and he coined the term accelerated pluralism to explain this new phenomenon. ICTs are tools for "enabling social movement leaders and empowering dictators",[35] in effect promoting societal change. ICTs can be used to garner grassroots support for a cause, because the internet allows for political discourse and direct interventions with state policy.[36] Furthermore, ICTs in a household are associated with women rejecting justifications for intimate partner violence. According to a study published in 2017, this is likely because "access to ICTs exposes women to different ways of life and different notions about women's role in society and the household, especially in culturally conservative regions where traditional gender expectations contrast observed alternatives".[37]
In government
Governments use ICT in various ways. UK government minister Francis Maude, endorsing the use of open standards in government IT, stated in 2012 that "Government must be better connected to the people it serves and partners who can work with it - especially small businesses, voluntary and community organisations."[38] ICT can also change the way complaints from the populace are handled by governments.[citation needed]
In health care
In science
Applications of ICTs in science, research and development, and academia include:
- Internet research
- Online research methods
- Science communication and communication between scientists
- Scholarly databases
- Applied metascience
Models of access
Scholar Mark Warschauer defines a "models of access" framework for analyzing ICT accessibility. In the second chapter of his book, Technology and Social Inclusion: Rethinking the Digital Divide, he describes three models of access to ICTs: devices, conduits, and literacy.[41] Devices and conduits are the most common descriptors for access to ICTs, but they are insufficient for meaningful access without the third model, literacy.[41] Combined, these three models roughly incorporate all twelve of the criteria of "Real Access" to ICT use, conceptualized by a non-profit organization called Bridges.org in 2005:[42]
- Physical access to technology
- Appropriateness of technology
- Affordability of technology and technology use
- Human capacity and training
- Locally relevant content, applications, and services
- Integration into daily routines
- Socio-cultural factors
- Trust in technology
- Local economic environment
- Macro-economic environment
- Legal and regulatory framework
- Political will and public support
Devices
The most straightforward model of access for ICT in Mark Warschauer's theory is devices.[41] In this model, access is defined most simply as the ownership of a device such as a phone or computer.[41] Warschauer identifies many flaws with this model, including its inability to account for additional costs of ownership such as software, access to telecommunications, knowledge gaps surrounding computer use, and the role of government regulation in some countries.[41] Therefore, Warschauer argues that considering only devices understates the magnitude of digital inequality. For example, the Pew Research Center notes that 96% of Americans own a smartphone,[43] although most scholars in this field would contend that comprehensive access to ICT in the United States is likely much lower than that.
Conduits
A conduit requires a connection to a supply line, which for ICT could be a telephone line or Internet line. Accessing the supply requires investment in the proper infrastructure from a commercial company or local government and recurring payments from the user once the line is set up. For this reason, conduits usually divide people based on their geographic locations. As a Pew Research Center poll reports, Americans in rural areas are 12% less likely to have broadband access than other Americans, thereby making them less likely to own the devices.[44] Additionally, these costs can be prohibitive to lower-income families accessing ICTs. These difficulties have led to a shift toward mobile technology; fewer people are purchasing broadband connections and are instead relying on their smartphones for Internet access, which can be found for free at public places such as libraries.[45] Indeed, smartphones are on the rise, with 37% of Americans using smartphones as their primary medium for internet access[45] and 96% of Americans owning a smartphone.[43]
Literacy
In 1981, Sylvia Scribner and Michael Cole studied a tribe in Liberia, the Vai people, who have their own local script. Since about half of those literate in Vai have never had formal schooling, Scribner and Cole were able to test more than 1,000 subjects to measure the mental capabilities of literates over non-literates.[46] This research, which they laid out in their book The Psychology of Literacy,[46] allowed them to study whether the literacy divide exists at the individual level. Warschauer applied their literacy research to ICT literacy as part of his model of ICT access.
Scribner and Cole found no generalizable cognitive benefits from Vai literacy; instead, individual differences on cognitive tasks were due to other factors, like schooling or living environment.[46] The results suggested that there is "no single construct of literacy that divides people into two cognitive camps; [...] rather, there are gradations and types of literacies, with a range of benefits closely related to the specific functions of literacy practices."[41] Furthermore, literacy and social development are intertwined, and the literacy divide does not exist on the individual level.
Warschauer draws on Scribner and Cole's research to argue that ICT literacy functions similarly to literacy acquisition, as they both require resources rather than a narrow cognitive skill. Conclusions about literacy serve as the basis for a theory of the digital divide and ICT access, as detailed below:
There is not just one type of ICT access, but many types. The meaning and value of access varies in particular social contexts. Access exists in gradations rather than in a bipolar opposition. Computer and Internet use brings no automatic benefit outside of its particular functions. ICT use is a social practice, involving access to physical artifacts, content, skills, and social support. And acquisition of ICT access is a matter not only of education but also of power.[41]
Therefore, Warschauer concludes that access to ICT cannot rest on devices or conduits alone; it must also engage physical, digital, human, and social resources.[41] Each of these categories of resources has iterative relations with ICT use. If ICT is used well, it can promote these resources, but if it is used poorly, it can contribute to a cycle of underdevelopment and exclusion.[46]
Environmental impact
Progress during the century
In the early 21st century, ICT services and electronic devices developed rapidly: the number of internet servers multiplied by a factor of 1,000 to 395 million, and it is still increasing. This increase can be explained by Moore's law, which states that the development of ICT increases by 16–20% every year, so it doubles every four to five years.[47] Alongside this development, and the high investment in meeting the growing demand for ICT-capable products, came a high environmental impact: by 2008, software and hardware development and production were already causing the same amount of CO2 emissions as global air travel.[47]
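The growth-to-doubling-time relationship cited here follows from the rule log(2)/log(1+r); a quick check in Python:

```python
import math

# Doubling time implied by a steady annual growth rate, checking the
# "16-20% per year => doubles every four to five years" figure above.
def doubling_time(annual_rate):
    """Years for a quantity growing at `annual_rate` per year to double."""
    return math.log(2) / math.log(1 + annual_rate)

print(f"16%/yr doubles in {doubling_time(0.16):.1f} years")  # ~4.7
print(f"20%/yr doubles in {doubling_time(0.20):.1f} years")  # ~3.8
```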
There are two sides to ICT: positive environmental possibilities and a shadow side. On the positive side, studies have shown, for instance, that in OECD countries a 1% increase in ICT capital causes a 0.235% reduction in energy use.[48] On the other side, the more digitization happens, the more energy is consumed: for OECD countries, a 1% increase in internet users raises electricity consumption per capita by 0.026%, and for emerging countries the impact is more than four times as high.
Current scientific forecasts project an increase to 30,700 TWh by 2030, 20 times more than in 2010.[48]
Implication
To tackle the environmental issues of ICT, the EU Commission plans proper monitoring and reporting of the GHG emissions of different ICT platforms, countries, and infrastructure in general. Further, the establishment of international norms for reporting and compliance is promoted to foster transparency in this sector.[49]
Moreover, scientists suggest making further ICT investments to exploit ICT's potential to alleviate CO2 emissions in general, and implementing a more effective coordination of ICT, energy, and growth policies.[50] Consequently, applying the principle of the Coase theorem makes sense: it recommends making investments where the marginal avoidance costs of emissions are lowest, which is in developing countries with comparatively lower technological standards and policies than high-tech countries. With these measures, ICT can reduce environmental damage from economic growth and energy consumption by facilitating communication and infrastructure.
In problem-solving
ICTs could also be used to address environmental issues, including climate change, in various ways, including ways beyond education.[51][52][53]
See also
- Behavioral change support system
- Cloud computing
- Cognitive infocommunications
- DICOM
- Digital divide
- Example of information and communication technologies for education
- Gender digital divide
- Global e-Schools and Communities Initiative
- Infocommunications
- Information Age
- Market information systems
- Mobile web
- Picture archiving and communication system
- 21st century skills
- World Innovation, Technology and Services Alliance
References
- ^ Murray, James (2011-12-18). "Cloud network architecture and ICT - Modern Network Architecture". TechTarget ITKnowledgeExchange. Archived from the original on 2017-09-20. Retrieved 2013-08-18.
- ^ Ozdamli, Fezile; Ozdal, Hasan (May 2015). "Life-long Learning Competence Perceptions of the Teachers and Abilities in Using Information-Communication Technologies". Procedia - Social and Behavioral Sciences. 182: 718–725.
- ^ "ICT - What is it?". www.tutor2u.net. Archived from the original on 2015-11-02. Retrieved 2015-09-01.
- ^ "IEEE-CS Adopts Skills Framework for the Information Age • IEEE Computer Society". www.computer.org. Archived from the original on 13 June 2018. Retrieved 14 March 2018.
- ^ William Melody et al., Information and Communication Technologies: Social Sciences Research and Training: A Report by the ESRC Programme on Information and Communication Technologies, ISBN 0-86226-179-1, 1986. Roger Silverstone et al., "Listening to a long conversation: an ethnographic approach to the study of information and communication technologies in the home", Cultural Studies, 5(2), pages 204–227, 1991.
- ^ The Independent ICT in Schools Commission, Information and Communications Technology in UK Schools: An Independent Inquiry, 1997. Impact noted in Jim Kelly, What the Web is Doing for Schools Archived 2011-07-11 at the Wayback Machine, Financial Times, 2000.
- ^ "Shut down or restart? The way forward for computing in UK schools" (PDF). Royal Society. January 2012. p. 18. Retrieved 2024-12-14.
- ^ Department for Education, "National curriculum in England: computing programmes of study".
- ^ United Nations Office of Information and Communications Technology, About Archived 2018-02-04 at the Wayback Machine
- ^ "IDC - Global ICT Spending - 2018 - $3.8T". IDC: The premier global market intelligence company. Retrieved 2018-09-24.
- ^ "IDC - Global ICT Spending - Forecast 2018 – 2022". IDC: The premier global market intelligence company. Retrieved 2018-09-24.
- ^ "Federal Information Technology FY2014 Budget Priorities" (PDF). obamawhitehouse.archives.gov.
- ^ a b "IT Costs – The Costs, Growth And Financial Risk Of Software Assets". OMT-CO Operations Management Technology Consulting GmbH. Archived from the original on 12 August 2013. Retrieved 26 June 2011.
- ^ "IDC - Global ICT Spending - Forecast 2018 – 2022". IDC: The premier global market intelligence company. Retrieved 2018-09-24.
- ^ a b c d "The World's Technological Capacity to Store, Communicate, and Compute Information", Martin Hilbert and Priscila López (2011), Science, 332(6025), 60–65; see also "free access to the study" and "video animation".
- ^ Gillings, Michael R; Hilbert, Martin; Kemp, Darrell J (2016). "Information in the Biosphere: Biological and Digital Worlds". Trends in Ecology & Evolution. 31 (3): 180–189. Bibcode:2016TEcoE..31..180G. doi:10.1016/j.tree.2015.12.013. PMID 26777788. S2CID 3561873.
- ^ Hilbert, Martin (2016). "The bad news is that the digital access divide is here to stay: Domestically installed bandwidths among 172 countries for 1986–2014". Telecommunications Policy. 40 (6): 567–581. doi:10.1016/j.telpol.2016.01.006.
- ^ Figure 1.9 Share of ICT sector in total value added, 2013, doi:10.1787/888933224163
- ^ "Measuring the Information Society" (PDF). International Telecommunication Union. 2011. Retrieved 25 July 2013.
- ^ a b "ITU releases annual global ICT data and ICT Development Index country ranking - librarylearningspace.com". 2014-11-30. Retrieved 2015-09-01.
- ^ "Basic information: about was". International Telecommunication Union. 17 January 2006. Retrieved 26 May 2012.
- ^ a b "ICT Facts and Figures – The world in 2015". ITU. Retrieved 2015-09-01.
- ^ "What is Writing to Learn, WAC Clearinghouse".
- ^ "Evidence for How Writing Can Improve Reading, Carnegie.Org 2010" (PDF).
- ^ Genlott, Annika Agélii; Grönlund, Åke (August 2016). "Closing the gaps – Improving literacy and mathematics by ict-enhanced collaboration". Computers & Education. 99: 68–80. doi:10.1016/j.compedu.2016.04.004.
- ^ "ICT in Education". Unesco. Retrieved 10 March 2016.
- ^ Birt, Jacqueline; Safari, Maryam; de Castro, Vincent Bicudo (2023-03-20). "Critical analysis of integration of ICT and data analytics into the accounting curriculum: A multidimensional perspective". Accounting & Finance. 63 (4): 4037–4063. doi:10.1111/acfi.13084. ISSN 0810-5391. S2CID 257675501.
- ^ Blackwell, C.K., Lauricella, A.R. and Wartella, E., 2014. Factors influencing digital technology use in early childhood education. Computers & Education, 77, pp.82-90.
- ^ a b UNESCO (2018). A Lifeline to learning: leveraging mobile technology to support education for refugees. UNESCO. ISBN 978-92-3-100262-5.
- ^ a b c d Agence Française de Développement (February 2015). "Digital services for education in Africa" (PDF). unesco.org. Retrieved 19 May 2018.
- ^ a b "ITU releases annual global ICT data and ICT Development Index country rankings". www.itu.int. Retrieved 2015-09-01.
- ^ "Survey: 1 In 6 Internet Users Own A Smartwatch Or Fitness Tracker". ARC. Retrieved 2015-09-01.
- ^ "ITU releases annual global ICT data and ICT Development Index country rankings". www.itu.int. Retrieved 2015-09-01.
- ^ Bimber, Bruce (1998-01-01). "The Internet and Political Transformation: Populism, Community, and Accelerated Pluralism". Polity. 31 (1): 133–160. doi:10.2307/3235370. JSTOR 3235370. S2CID 145159285.
- ^ Hussain, Muzammil M.; Howard, Philip N. (2013-03-01). "What Best Explains Successful Protest Cascades? ICTs and the Fuzzy Causes of the Arab Spring". International Studies Review. 15 (1): 48–66. doi:10.1111/misr.12020. hdl:2027.42/97489. ISSN 1521-9488.
- ^ Kirsh, David (2001). "The Context of Work". Human Computer Interaction. 16 (2–4): 305–322. doi:10.1207/S15327051HCI16234_12. S2CID 28915179.
- ^ Cardoso LG, Sorenson SB. Violence against women and household ownership of radios, computers, and phones in 20 countries. American Journal of Public Health. 2017; 107(7):1175–1181.
- ^ Cabinet Office, Government bodies must comply with Open Standards Principles, published on 1 November 2012, accessed on 3 September 2025
- ^ Novak, Matt. "Telemedicine Predicted in 1925". Smithsonian Magazine. Retrieved 27 January 2022.
- ^ Albritton, Jordan; Ortiz, Alexa; Wines, Roberta; Booth, Graham; DiBello, Michael; Brown, Stephen; Gartlehner, Gerald; Crotty, Karen (7 December 2021). "Video Teleconferencing for Disease Prevention, Diagnosis, and Treatment" (PDF). Annals of Internal Medicine. 175 (2): 256–266. doi:10.7326/m21-3511. ISSN 0003-4819. PMID 34871056. S2CID 244923066.
- ^ a b c d e f g h Warschauer, Mark (2004). Technology and Social Inclusion. Cambridge, Massachusetts: The MIT Press. pp. 39–49. ISBN 0-262-23224-3.
- ^ "The Real Access / Real Impact framework for improving the way that ICT is used in development" (PDF). 26 December 2005.
- ^ a b "Mobile Fact Sheet". Pew Research Center. 13 November 2024.
- ^ Perrin, Andrew (19 August 2021). "Digital gap between rural and nonrural America persists". Pew Research Center.
- ^ a b Anderson, Monica (13 June 2019). "Mobile Technology and Home Broadband 2019". Pew Research Center.
- ^ a b c d Scribner and Cole, Sylvia and Michael (1981). The Psychology of Literacy. ISBN 978-0-674-43301-4.
- ^ a b Gerhard, Fettweis; Zimmermann, Ernesto (2008). "ITC Energy Consumption - Trends and Challenges". The 11th International Symposium on Wireless Personal Multimedia Communications (WPMC 2008) – via ResearchGate.
- ^ a b Lange, Steffen; Pohl, Johanna; Santarius, Tilman (2020-10-01). "Digitalization and energy consumption. Does ICT reduce energy demand?". Ecological Economics. 176: 106760. Bibcode:2020EcoEc.17606760L. doi:10.1016/j.ecolecon.2020.106760. ISSN 0921-8009. S2CID 224947774.
- ^ "Rolling Plan for ICT standardization 2021". Joinup. European Commission. 2021. Retrieved 2022-01-08.
- ^ Lu, Wen-Cheng (2018-12-01). "The impacts of information and communication technology, energy consumption, financial development, and economic growth on carbon dioxide emissions in 12 Asian countries". Mitigation and Adaptation Strategies for Global Change. 23 (8): 1351–1365. Bibcode:2018MASGC..23.1351L. doi:10.1007/s11027-018-9787-y. ISSN 1573-1596. S2CID 158412820.
- ^ Fox, Evan Michael (2019). "Mobile Technology: A Tool to Increase Global Competency Among Higher Education Students". The International Review of Research in Open and Distributed Learning. 20 (2). doi:10.19173/irrodl.v20i2.3961. ISSN 1492-3831. S2CID 242492985.
- ^ "Digitalisation for a circular economy: A driver for European Green Deal". EPC. Archived from the original on Oct 8, 2023.
- ^ Charfeddine, Lanouar; Umlai, Mohamed (2023). "ICT sector, digitization and environmental sustainability: A systematic review of the literature from 2000 to 2022". Renewable and Sustainable Energy Reviews. 184: 113482. Bibcode:2023RSERv.18413482C. doi:10.1016/j.rser.2023.113482.
Sources
This article incorporates text from a free content work, licensed under CC BY-SA 3.0 IGO. Text taken from A Lifeline to learning: leveraging mobile technology to support education for refugees, UNESCO.
Further reading
- Cantoni, L., & Danowski, J. A. (Eds.). (2015). Communication and Technology. Berlin: De Gruyter Mouton.
- Carnoy, Martin. "ICT in Education: Possibilities and Challenges." Universitat Oberta de Catalunya, 2005.
- "Good Practice in Information and Communication Technology for Education." Asian Development Bank, 2009.
- Grossman, G.; Helpman, E. (2005). "Outsourcing in a global economy". Review of Economic Studies. 72: 135–159. CiteSeerX 10.1.1.159.5158. doi:10.1111/0034-6527.00327.
- Feridun, Mete; Karagiannis, Stelios (2009). "Growth Effects of Information and Communication Technologies: Empirical Evidence from the Enlarged EU". Transformations in Business and Economics. 8 (2): 86–99.
- Oliver, Ron. "The Role of ICT in Higher Education for the 21st Century: ICT as a Change Agent for Education." University, Perth, Western Australia, 2002.
- Walter Ong, Orality and Literacy: The Technologizing of the Word (London, UK: Routledge, 1988), in particular Chapter 4
- Measuring the Information Society: The ICT Development Index (PDF). International Telecommunication Union. 2013. p. 254.
- Measuring the Information Society Report: 2014. International Telecommunication Union.
Definition and Scope
Etymology and Terminology
The term "information and communications technology" (ICT) denotes the ensemble of technologies enabling the capture, processing, storage, transmission, and presentation of information, encompassing both computing and telecommunications infrastructures.[13] The acronym ICT specifically highlights the integration of communication systems with data-handling capabilities, distinguishing it from narrower usages.[14]

The foundational phrase "information technology" (IT) originated in a 1958 Harvard Business Review article by Harold J. Leavitt and Thomas L. Whisler, which described the emerging application of electronic computers, programming, and systems analysis to managerial functions in organizations.[15] This coinage captured the post-World War II shift toward automated data processing, initially focused on hardware like mainframe computers and punch-card systems rather than interpersonal communication networks.[16] The term ICT expanded from IT during the 1970s and 1980s amid technological convergence, particularly with the advent of packet-switched networks and microprocessors that blurred the lines between computing devices and telecommunication apparatuses.[17] By the 1990s, ICT gained traction in policy and educational contexts to emphasize unified systems for voice, data, and video transmission, as seen in international standards bodies like the International Telecommunication Union (ITU), which adopted the term to frame global digital infrastructure development.[18]

In usage, ICT often serves as a synonym for IT in American English, but in British, European, and developing-world contexts it deliberately includes broadcasting, mobile telephony, and satellite systems to reflect broader societal applications.[13] Related terms include "information systems" (IS), which prioritizes organizational data flows over hardware, and "digital technology", a more contemporary descriptor for post-analog innovations; however, ICT remains the standard in regulatory frameworks, such as those defining spectrum allocation for wireless communications.[19]

Etymologically, "information" derives from the Latin informare (to give form to the mind), while "communication" stems from communicare (to share), underscoring the field's roots in shaping and disseminating knowledge through technical means.[20]

Distinction from Related Fields
Information and communications technology (ICT) differs from information technology (IT) primarily in scope: ICT integrates IT's focus on data processing and storage with technologies enabling interpersonal and machine-to-machine communication, such as telephony, broadcasting, and networking infrastructure.[21][22] IT, by contrast, centers on the management, storage, and utilization of data through hardware, software, and internal systems, often within organizational contexts like enterprise resource planning or cybersecurity, without inherently emphasizing external communication channels.[21] This distinction arose in the late 1990s as digital convergence blurred the lines between computing and telecom, prompting ICT to emerge as a broader umbrella term; for instance, IT might deploy servers for database management, whereas ICT would encompass those servers alongside VoIP systems for global voice data transmission.[23]

ICT also contrasts with computer science, which prioritizes theoretical foundations such as algorithms, computational complexity, and software design principles over practical deployment and integration.[24][25] Computer science, formalized in the mid-20th century through figures like Alan Turing and institutions like MIT, abstracts computing into mathematical models (e.g., Turing machines for proving undecidability), whereas ICT applies these concepts to real-world systems, including hardware interoperability and user-centric interfaces for information exchange.[26] By 2023, enrollment data showed computer science degrees emphasizing programming paradigms like object-oriented design, while ICT curricula incorporated applied modules on network protocols and multimedia transmission, reflecting ICT's orientation toward scalable, end-to-end solutions rather than pure innovation in computation.[27]

Relative to telecommunications, ICT extends beyond mere signal transmission (telecom's core domain since the invention of the telegraph in the 1830s) by fusing it with information processing capabilities, such as data encoding, compression, and analytics within communication pipelines.[28][29] Telecommunications engineering, governed by standards like ITU-T recommendations since 1865, handles physical-layer challenges like spectrum allocation and modulation (e.g., 5G's millimeter-wave bands achieving 10 Gbps speeds in 2019 trials), but lacks ICT's holistic inclusion of endpoint devices and software ecosystems for content creation and consumption.[29] Thus, while telecom provides the conduits (e.g., fiber-optic cables spanning 1.4 million km globally by 2022), ICT orchestrates their use in convergent systems like IP-based unified communications, where data packets carry both voice and metadata analytics.[28] This boundary has shifted with IP convergence since the 1990s, yet telecom remains narrower, focused on reliable transport rather than the full information lifecycle.[30]

Core Components and Boundaries
Information and communications technology (ICT) encompasses the hardware, software, networks, and data systems that enable the creation, storage, processing, transmission, and exchange of information. Core components include hardware such as computers, servers, smartphones, and networking equipment like routers and switches, which provide the physical infrastructure for computation and connectivity.[31][32] Software forms another foundational element, comprising operating systems, applications, middleware, and databases that manage data operations and user interactions.[31][33] Networks, both wired and wireless, integrate these elements by facilitating data transfer across local, wide-area, and global scales, including protocols for internet and telecommunications infrastructure.[31] Data itself, as digitized information, relies on storage solutions and processing capabilities to be actionable within these systems.[31]

The boundaries of ICT are defined by its emphasis on integrated information handling and communication, distinguishing it from narrower fields. Unlike information technology (IT), which primarily focuses on computer-based data processing, storage, and management, ICT explicitly incorporates telecommunications for transmission and real-time exchange, such as through mobile networks and the internet.[21][34] Telecommunications, by contrast, centers on signal transmission over distances via mediums like cables or radio waves, but excludes the broader data manipulation and software ecosystems central to ICT.[35][36] ICT's scope thus extends to any technology enabling information dissemination, including satellite systems and audiovisual tools, but excludes non-technological domains like print media or purely analog broadcasting without digital integration.[37]

These components and boundaries have evolved with technological convergence; for instance, the integration of IP-based protocols since the 1990s has blurred the lines between traditional telecom and computing, expanding ICT to encompass cloud computing and IoT devices as unified systems for information flow.[38] However, ICT remains delimited from adjacent areas like cybersecurity (a supportive function) and media production (an application layer), focusing on enabling technologies rather than content creation or end-user practices.[39] This delineation keeps ICT focused on systemic capabilities for scalable, efficient information ecosystems, as reflected in global standards from bodies like the ITU, which define it as tools for gathering, storing, and exchanging data across boundaries.[37]

Historical Development
Precursors to Modern ICT (Pre-1940s)
The development of precursors to modern information and communications technology before the 1940s laid foundational principles for data processing, automated calculation, and electrical signaling over distances. Early mechanical innovations, such as Joseph Marie Jacquard's programmable loom introduced in 1801, used punched cards to control weaving patterns, an initial application of binary-like instructions for automating complex tasks that influenced later data storage methods. In the 1820s, Charles Babbage conceived the Difference Engine to compute mathematical tables mechanically, followed in 1837 by the Analytical Engine, a design for a general-purpose programmable machine capable of performing any calculation through punched cards for input, storage, and conditional operations, elements akin to modern programming and memory.[40] Although never fully built due to technical and funding limitations, Babbage's engines represented a shift toward programmable computation, driven by the need for accurate logarithmic and astronomical tables.

Electrical communication emerged in the electromechanical era beginning around 1840, transforming information transmission from physical transport to near-instantaneous signaling. Samuel F. B. Morse developed the electric telegraph between 1832 and 1835, enabling messages via coded electrical pulses over wires; the first public demonstration occurred in 1838, and the inaugural long-distance line transmitted "What hath God wrought" from Washington, D.C., to Baltimore on May 24, 1844.[41] The system reduced message delivery times from days to minutes, facilitating rapid coordination for businesses, governments, and news services, with over 50,000 miles of lines in the U.S. by 1861. Building on telegraphy, Alexander Graham Bell patented the telephone on March 7, 1876, allowing voice transmission over wires through electromagnetic conversion of sound waves, which spurred global network expansion to millions of subscribers by the early 1900s.[42]

Data processing advanced with electromechanical tabulation systems, exemplified by Herman Hollerith's punched-card machines deployed for the 1890 U.S. Census. Hollerith's electric tabulator, using cards with holes representing demographic data, processed over 60 million cards to complete population tallies in months rather than years, reducing processing time by up to 90% compared with manual methods and enabling scalable statistical analysis.[43] Wireless extensions followed: Guglielmo Marconi achieved the first transatlantic radio transmission on December 12, 1901, from Poldhu, Cornwall, to Signal Hill, Newfoundland, sending Morse code signals over 2,000 miles without wires and revolutionizing maritime and military communications by eliminating terrain-dependent cabling.[44]

These pre-1940s advancements, rooted in practical needs for efficiency in calculation, record-keeping, and signaling, established causal pathways, such as encoded instructions and electromagnetic propagation, that proved integral to later digital integration, despite the limits on scale and reliability imposed by mechanical and analog constraints.

Post-War Foundations and Analog Era (1940s-1970s)
The post-World War II era laid critical foundations for information and communications technology through advances in electronic computing and analog transmission systems, driven largely by military and commercial demands for faster calculation and reliable long-distance signaling. Electronic digital computers emerged as tools for complex numerical processing, supplanting mechanical predecessors, while telecommunications infrastructure expanded using continuous-wave analog methods to handle voice, video, and emerging data signals.

In 1945, the ENIAC (Electronic Numerical Integrator and Computer) became operational at the University of Pennsylvania, the first large-scale, general-purpose electronic digital computer, designed for U.S. Army ballistic trajectory calculations.[45] It employed approximately 18,000 vacuum tubes, spanned 1,800 square feet, and performed 5,000 additions per second, though reconfiguration for new tasks required manual rewiring.[45] The machine demonstrated the feasibility of electronic computation at speeds unattainable by electromechanical devices, influencing subsequent designs like the stored-program architecture outlined by John von Neumann in 1945.[45]

The invention of the transistor in December 1947 at Bell Laboratories by John Bardeen, Walter Brattain, and William Shockley revolutionized electronics by replacing fragile vacuum tubes with solid-state semiconductors capable of amplification and switching.[46] The point-contact transistor, demonstrated using germanium, amplified signals up to 100 times, enabling more compact, reliable, and energy-efficient systems that powered second-generation computers in the 1950s and 1960s.[46] Integrated circuits, pioneered by Jack Kilby at Texas Instruments in 1958, further miniaturized components, setting the stage for scaled computing hardware.[47]

Communications technologies during this period relied on analog modulation techniques, such as amplitude and frequency modulation for radio and television broadcasting, which proliferated post-war as consumer television sets reached millions of households by the 1950s. Long-distance telephony advanced through microwave relay networks and coaxial cables, but a breakthrough came with TAT-1, the first transatlantic submarine telephone cable, activated on September 25, 1956, linking Scotland to Newfoundland and initially supporting 36 simultaneous voice channels via analog frequency-division multiplexing.[48] The cable, spanning 2,200 miles and incorporating repeaters every 70 miles to boost signals, reduced latency and dependence on shortwave radio, handling up to 72 channels by the 1970s before digital alternatives emerged.[48]

Satellite communications debuted with Telstar 1, launched on July 10, 1962, by NASA in collaboration with Bell Laboratories and AT&T, the first active repeater satellite relaying analog television, telephone, and facsimile signals across the Atlantic.[49] Orbiting at about 600 miles altitude, Telstar enabled the first live transatlantic TV broadcast on July 23, 1962, though its low-Earth orbit required ground-station tracking and limited visibility to windows of about 20 minutes per pass.[49] These developments underscored analog systems' strengths in bandwidth for voice and video but highlighted their susceptibility to noise and limits on scalability, paving the way for digital modulation in later decades.

Digital Revolution and Personal Computing (1980s-1990s)
The digital revolution in information and communications technology during the 1980s and 1990s marked the widespread adoption of digital electronics for data processing and storage, supplanting analog systems and enabling personal-scale computing. This era saw the transition from mainframe-dominated environments to affordable microcomputers, driven by advances in semiconductor technology such as the Intel 8086 microprocessor family, which reduced costs and increased processing power for individual users. By the mid-1980s, personal computers were entering households and offices, handling tasks like word processing, spreadsheets, and basic data communications via modems, with global PC shipments rising from approximately 724,000 units in 1980 to millions annually by the decade's end.[50][51]

A pivotal development was the release of the IBM Personal Computer (model 5150) on August 12, 1981, priced at $1,565 for the base configuration with 16 KB RAM and an Intel 8088 processor running PC-DOS (a variant of Microsoft's MS-DOS). IBM's adoption of an open architecture, using off-the-shelf components from third parties like Intel for the CPU and Microsoft for the OS, encouraged compatibility and cloning, which eroded IBM's market share but accelerated industry growth; by 1986, IBM-compatible PCs accounted for over 50% of sales, with 5 million units shipped that year. Apple's Macintosh 128K, introduced on January 24, 1984, for $2,495, popularized graphical user interfaces (GUIs) and mouse-based input, building on Xerox PARC innovations but tailored for consumer appeal through integrated hardware and software like Mac OS.[52][53][54]

In the 1990s, personal computing matured with enhanced portability and multimedia capabilities. Microsoft's Windows 3.0, launched in May 1990, and Windows 3.1, released in 1992, sold over 10 million copies in their first two years by improving GUI stability on MS-DOS and supporting applications like Microsoft Office, solidifying Windows' dominance on Intel-based PCs. Hardware advances included the IBM PC AT (1984) with its 80286 processor for multitasking, the rise of portables beginning with Compaq's Portable in 1982, and processors such as Intel's Pentium (1993), which boosted performance for internet access and CD-ROM-based media. By the mid-1990s, PC penetration in U.S. households reached about 20-30%, enabling early digital communications like bulletin board systems (BBS) and fax modems, though bandwidth limitations constrained widespread networking until later protocols matured.[55][56][57]

Internet Expansion and Mobile Era (2000s-2010s)
The dot-com bubble's collapse in 2000-2001 triggered a sharp contraction in the ICT sector, with the NASDAQ Composite Index dropping over 75% from its peak and leading to widespread startup failures and layoffs; yet it paradoxically accelerated infrastructure deployment, as excess fiber-optic capacity from overinvestment became available at lower cost, facilitating the subsequent broadband rollout.[58][59] By the mid-2000s, broadband internet had supplanted dial-up connections, with global fixed broadband subscriptions rising from negligible levels in 2000 to approximately 500 million by 2010, driven by DSL, cable, and early fiber deployments that enabled the higher-speed access essential for data-intensive applications.[60][61]

The emergence of Web 2.0 in the mid-2000s shifted the internet toward interactive, user-generated content platforms, exemplified by Facebook's founding in 2004, YouTube in 2005, and Twitter in 2006, which collectively amassed billions of users by decade's end and transformed information dissemination from static websites to dynamic social networks.[62] Facebook alone reached 500 million monthly active users by July 2010, underscoring the era's causal link between participatory tools and exponential network effects in content creation and sharing.[63] Over this period, global internet users expanded from about 413 million in 2000 (6.7% penetration) to 1.97 billion by 2010 (28.7% penetration), with penetration in developed regions exceeding 70% by 2010 thanks to affordability gains and infrastructure investment.[64]

The mobile era accelerated in 2007 with Apple's iPhone launch on June 29, integrating touchscreen interfaces, app ecosystems, and mobile web browsing, which catalyzed smartphone adoption from a 3% global market share in 2007 to over 50% of mobile devices by 2015.[65][66] Google's Android platform followed in September 2008, fostering open-source competition and the rapid proliferation of affordable devices, while 4G LTE networks rolled out around 2010 to support high-speed mobile data, enabling ubiquitous internet access beyond fixed lines.[65][67] In the United States, smartphone ownership surged from 35% in 2011 to 91% by 2021, reflecting broader global trends in which mobile subscriptions outpaced fixed broadband and drove internet penetration in developing regions.[68]

Cloud computing gained traction as a scalable infrastructure model, with Amazon Web Services (AWS) publicly launching its Elastic Compute Cloud (EC2) and Simple Storage Service (S3) in 2006, allowing on-demand access to computing resources and lowering barriers to ICT innovation by shifting from capital-intensive hardware ownership to utility-based provisioning.[69] This complemented mobile growth by providing backend support for apps and data services, with AWS's model influencing competitors and contributing to the era's efficiency in handling surging data volumes from social and mobile usage.[70] By the 2010s, these developments intertwined to make ICT pervasive, with mobile internet traffic comprising a majority of global data flows and fostering applications in e-commerce, streaming, and real-time communication.[71]

Contemporary Advances (2020s Onward)
The 2020s have seen accelerated integration of artificial intelligence into ICT infrastructures, driven by the COVID-19 pandemic's demand for remote capabilities and subsequent computational scaling. Generative AI models, such as OpenAI's GPT-3 released in June 2020 with 175 billion parameters, marked a shift toward large-scale language processing, enabling applications in natural language understanding and code generation. ChatGPT's public launch in late 2022 demonstrated conversational AI's viability, reaching over 100 million users within two months and spurring enterprise adoption for tasks like content creation and data analysis. Agentic AI, capable of autonomous decision-making, emerged as a 2025 trend, with systems executing multi-step workflows without constant human oversight, as forecast by Gartner.[72]

Wireless network advances centered on 5G commercialization, with global deployments surpassing 100 operators by August 2020 and coverage projected to reach 65% of the world's population by mid-decade.[73] Ericsson reported 5G connections reaching 1.76 billion by end-2023, enabling low-latency applications in industrial IoT and autonomous vehicles, though spectrum auctions and infrastructure costs delayed full standalone (SA) core implementations in some regions until 2024. Research into 6G commenced in earnest after 2020, focusing on terahertz frequencies for data rates up to 100 Gbps; by 2025, about 30% of research efforts targeted THz communications, with demonstrations at MWC showcasing AI-native architectures for self-optimizing networks.[74] Ericsson's 2025 prototypes integrated sensing and communication, aiming for commercialization around 2030.[75]

Semiconductor innovation addressed AI's compute demands through node shrinks and specialized architectures. TSMC's 3nm process entered volume production in late 2022, powering chips like Apple's A17 Pro with 19 billion transistors and improving efficiency for mobile AI inference. Advanced packaging techniques, such as 3D stacking, became critical by 2025 for high-bandwidth memory (HBM) in AI accelerators, mitigating the slowdown of Moore's Law and enabling accelerators such as Nvidia's H100 GPU, which delivers roughly 4 petaflops in FP8 precision.[76] Infineon's scaling of GaN-based wafers to 300mm in 2024 reduced power losses in RF amplifiers, supporting the energy efficiency of 5G base stations.[77]

Quantum computing progressed from noisy intermediate-scale regimes toward error-corrected prototypes. IBM's 2023 roadmap targeted 100,000 qubits by 2033, with 2025 milestones including modular systems via quantum-centric supercomputing hybrids and logical qubits for practical simulations in materials science.[78] By mid-2025, post-quantum cryptography had moved toward deployment, with NIST finalizing algorithms such as CRYSTALS-Kyber to counter harvest-now-decrypt-later threats from advancing quantum capabilities.[79] These developments, while not yet fault-tolerant at scale, underscored ICT's shift toward hybrid classical-quantum paradigms for optimization problems intractable on classical hardware.[80]

Technical Foundations
Hardware Evolution
The evolution of hardware in information and communications technology (ICT) began with electronic components enabling computation and data transmission, transitioning from bulky vacuum-tube systems in the 1940s to compact, high-performance semiconductors. Early computers like the ENIAC (1945) relied on over 17,000 vacuum tubes, which were power-hungry, generated excessive heat, and failed frequently, limiting reliability and scalability for ICT applications such as signal processing and early data networks.[81] The invention of the transistor at Bell Laboratories in 1947 marked a pivotal shift, replacing vacuum tubes with solid-state devices that amplified and switched electrical signals more efficiently, reducing size, power consumption, and cost while increasing speed, and enabling second-generation computers like the IBM 7090 (1959) for scientific and communication tasks.[82][83]

The development of the integrated circuit (IC) in 1958 by Jack Kilby at Texas Instruments placed multiple transistors on a single silicon chip, facilitating the miniaturization and mass production essential for ICT hardware.[84] This led to third-generation systems in the 1960s, such as IBM's System/360 (1964), which incorporated ICs for modular computing and peripheral interfaces supporting early telecommunications.
The microprocessor, exemplified by the Intel 4004 (1971) with 2,300 transistors on a 4-bit chip operating at 740 kHz, centralized processing on a single chip, powering calculators and eventually personal computers like the Altair 8800 (1975), which spurred ICT accessibility through hobbyist kits with memory expandable to 64 KB.[85] Moore's Law, observed by Gordon Moore in 1965, predicted transistor density doubling approximately every two years, driving exponential improvements in processor performance; by the 1980s, chips like the Intel 80386 (1985) featured 275,000 transistors at up to 40 MHz, enabling multitasking in networked ICT environments.[86]

Memory and storage hardware evolved in parallel to support data-intensive ICT functions. Magnetic core memory, introduced in MIT's Whirlwind computer (1953), provided non-volatile storage of about 2 KB with access times under 10 microseconds, superior to earlier delay-line memory for real-time applications like radar data processing.[87] Semiconductor RAM emerged in the late 1960s, with dynamic RAM (DRAM) chips like Intel's 1103 (1970) offering 1 KB per chip and scaling to gigabytes by the 2000s via denser fabrication.
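The two-year doubling rule of Moore's Law lends itself to a quick back-of-the-envelope check. The sketch below (an illustration only, not a precise model of fabrication history) projects transistor counts from the Intel 4004's 2,300 transistors and lands close to the 80386's 275,000 fourteen years later:

```python
def moore_projection(base_count: int, base_year: int, target_year: int,
                     doubling_period: float = 2.0) -> float:
    """Project a transistor count under Moore's Law-style doubling."""
    doublings = (target_year - base_year) / doubling_period
    return base_count * 2 ** doublings

# From the Intel 4004 (1971, ~2,300 transistors): 1985 is seven
# doublings later, giving 2300 * 2**7 = 294,400, in the same range
# as the 80386's ~275,000 transistors.
print(round(moore_projection(2300, 1971, 1985)))  # 294400
```

The gap between projection and actual count reflects that the "law" is an empirical trend in density, not an exact schedule for any one product line.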
Storage advanced from IBM's 305 RAMAC hard disk drive (1956), which stored 5 MB on 50 disks and weighed over a ton, to solid-state drives (SSDs) using NAND flash, with enterprise models reaching 100 TB by 2023 through 3D stacking techniques.[81] Networking hardware, including modems for analog-to-digital conversion (first commercial units in 1958) and Ethernet transceivers (invented in 1973), was integrated into routers and switches by the 1980s, facilitating TCP/IP-based communications at speeds from 10 Mbps to fiber-optic gigabits.[83]

In the mobile and embedded ICT era from the 1990s onward, hardware miniaturized further with system-on-chip (SoC) designs combining processors, memory, and radios; early handhelds like the IBM Simon personal communicator (1994) paved the way for smartphones, and ARM-based mobile processors, including Apple's A-series (2008 onward), integrated billions of transistors for on-device computing and, later, 5G modems. GPUs, originally designed for graphics (NVIDIA's GeForce 256, 1999, with 23 million transistors), evolved into parallel processors for AI workloads, with NVIDIA's A100 (2020) delivering 19.5 TFLOPS for tensor operations in data centers.[86] Specialized accelerators like Google's Tensor Processing Units (TPUs, first deployed in 2016) optimized matrix multiplications for machine learning, reaching up to about 100 petaFLOPS in v4 pods by 2021. In the 2020s, process nodes shrank to 3 nm (e.g., TSMC's 2022 production), enabling chips with over 100 billion transistors, while chiplet architectures in AMD's EPYC processors (introduced 2017) improved yields for high-performance computing.
Quantum hardware prototypes, such as IBM's 433-qubit Osprey (2022), explore superposition for ICT problems intractable classically, such as cryptanalysis, though error rates remain high, limiting practical deployment.[88] These advances, grounded in semiconductor physics and fabrication scaling, have enabled ICT's expansion by exponentially increasing computational density and energy efficiency, from kilowatts in early mainframes to watts in edge devices.[82]

Software and Algorithms
Software in information and communications technology (ICT) comprises programs, procedures, and associated documentation that enable hardware to process, store, and transmit data efficiently. It transforms inert computing devices into functional systems capable of handling complex tasks such as real-time communication and data analytics. System software, including operating systems and network protocols, provides the foundational layer for resource allocation and device coordination, while application software delivers user-facing tools like email clients and web browsers. Middleware facilitates interoperability between disparate systems, such as in enterprise resource planning integrations.[89][90] Algorithms underpin software functionality by specifying step-by-step computational procedures to solve problems, with efficiency evaluated through metrics like time complexity and space usage via Big O notation. In ICT contexts, algorithms optimize data routing in networks—employing methods like Dijkstra's shortest-path algorithm, formulated in 1956 for graph traversal—to minimize latency in packet-switched environments. Compression algorithms, such as Huffman coding developed in 1952, reduce bandwidth demands for media transmission, while error-correcting codes ensure data integrity over noisy channels, as formalized in Claude Shannon's 1948 mathematical theory of communication. Encryption algorithms like RSA, introduced in 1977, secure confidential exchanges in protocols such as HTTPS.[91][92][93] Historical evolution of ICT software traces to the 1940s-1950s pioneering era of machine-code programming for early computers like ENIAC, which required manual reconfiguration for tasks. The 1960s introduced structured programming paradigms to enhance modularity and reduce errors, exemplified by languages like ALGOL 60. 
By the 1970s, UNIX—initially released in 1971 at Bell Labs—established portable, multi-user operating systems pivotal for networked ICT, influencing modern Linux kernels first distributed in 1991. The TCP/IP suite, designed in 1974 and implemented widely via 1983 Berkeley distributions, standardized internetworking software, enabling scalable global communications. By the 1990s, object-oriented designs in languages like C++ (first released in 1985) promoted reusable code for distributed systems, while the World Wide Web's software stack, prototyped 1989-1991 at CERN, integrated hypertext transfer protocols with graphical browsers.[94] In contemporary ICT (post-2010s), software leverages cloud-native architectures for elasticity, with containers like Docker (open-sourced in 2013) virtualizing environments to support microservices in 5G infrastructures. Machine learning algorithms, including convolutional neural networks refined since the 1980s but accelerated by 2012's AlexNet breakthrough, drive adaptive features in ICT applications such as predictive routing and anomaly detection in cybersecurity. Agile methodologies, emerging from the 2001 Manifesto, have supplanted waterfall models for iterative development, reducing deployment times from months to days in DevOps pipelines. However, algorithmic biases—arising from skewed training data—can propagate systemic errors in decision-making tools, necessitating rigorous validation against empirical benchmarks. Software defects persist as failure points; the 2021 Log4Shell vulnerability in Apache Log4j affected millions of ICT systems, underscoring the causal link between unpatched code and widespread disruptions.[95][94][96]

| Key Algorithm Categories in ICT | Examples | Primary Function |
|---|---|---|
| Routing and Networking | Dijkstra (1956), BGP (1989) | Path optimization for data packets across topologies.[92] |
| Data Compression | Huffman (1952), LZ77 (1977) | Bandwidth-efficient storage and transmission of information.[91] |
| Security and Cryptography | RSA (1977), AES (2001) | Protection of data confidentiality and integrity in communications.[96] |
| Machine Learning | Gradient Descent (1847 origins, modern 1950s+), Neural Networks | Pattern recognition and automation in signal processing and user interfaces.[96] |
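The routing entry in the table above can be made concrete with a minimal sketch of Dijkstra's shortest-path algorithm. The four-node topology and its link costs are hypothetical; production routers use heavily optimized link-state implementations (e.g., within OSPF), not this simplified form.

```python
import heapq

def dijkstra(graph, source):
    """Compute the cheapest total cost from source to every reachable node.

    graph: dict mapping node -> list of (neighbor, link_cost) pairs.
    Returns a dict mapping node -> minimum cost from source.
    """
    dist = {source: 0}
    queue = [(0, source)]  # min-heap ordered by tentative cost
    while queue:
        cost, node = heapq.heappop(queue)
        if cost > dist.get(node, float("inf")):
            continue  # stale heap entry; a cheaper path was already found
        for neighbor, weight in graph.get(node, []):
            new_cost = cost + weight
            if new_cost < dist.get(neighbor, float("inf")):
                dist[neighbor] = new_cost
                heapq.heappush(queue, (new_cost, neighbor))
    return dist

# Hypothetical 4-router topology; link costs could represent latency in ms
network = {
    "A": [("B", 1), ("C", 4)],
    "B": [("C", 2), ("D", 5)],
    "C": [("D", 1)],
    "D": [],
}
print(dijkstra(network, "A"))  # {'A': 0, 'B': 1, 'C': 3, 'D': 4}
```

Note how the direct A→C link (cost 4) loses to the A→B→C path (cost 3), mirroring how a router prefers the cheapest total path rather than the fewest hops.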
Networks, Protocols, and Infrastructure
Computer networks in information and communications technology (ICT) are systems that interconnect devices to facilitate data exchange, categorized primarily by geographic scope and scale. Local Area Networks (LANs) connect devices within a limited area, such as a building or campus, typically using Ethernet standards to achieve high-speed, low-latency communication over distances up to a few kilometers.[97] Wide Area Networks (WANs), including the global Internet, span larger regions or continents, relying on routers and diverse transmission media to manage higher latency and integrate disparate local networks.[97] Metropolitan Area Networks (MANs) bridge the gap, covering city-wide extents for applications like municipal services or enterprise connectivity.[98] Protocols define the rules for data formatting, transmission, and error handling across these networks, with the TCP/IP suite serving as the foundational standard for the Internet. Developed in the 1970s by Vinton Cerf and Robert Kahn to interconnect heterogeneous networks, TCP/IP was formalized in the early 1980s and adopted by ARPANET on January 1, 1983, replacing the earlier Network Control Protocol (NCP).[99] TCP ensures reliable, ordered delivery of data packets, while IP handles addressing and routing; together, they enable end-to-end connectivity without centralized control.[100] The Internet Engineering Task Force (IETF), established in 1986, oversees protocol evolution through open working groups and Request for Comments (RFC) documents, producing standards like HTTP for web communication and DNS for domain resolution.[101] This decentralized, consensus-driven process has sustained Internet scalability, though it prioritizes functionality over strict security in legacy designs.[101] Physical and logical infrastructure underpins these networks, comprising transmission media, switching equipment, and supporting facilities. 
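The IP half of the TCP/IP split described above — addressing and routing — can be illustrated with Python's standard `ipaddress` module. The addresses below come from the TEST-NET-1 documentation range and do not refer to real hosts; this is an illustrative sketch, not a routing implementation.

```python
import ipaddress

# A /24 prefix such as a campus LAN might advertise to its WAN router
lan = ipaddress.ip_network("192.0.2.0/24")  # TEST-NET-1 documentation range

host = ipaddress.ip_address("192.0.2.42")
print(host in lan)        # True: a longest-prefix match would route this host locally
print(lan.num_addresses)  # 256 addresses in a /24

# Splitting the LAN into two /25 subnets, as at a router or VLAN boundary
subnets = list(lan.subnets(prefixlen_diff=1))
print([str(s) for s in subnets])  # ['192.0.2.0/25', '192.0.2.128/25']
```

The prefix-length notation (/24, /25) is the same CIDR scheme routers use to aggregate and match destinations, so membership tests like `host in lan` model the core routing decision IP performs on every packet.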
Fiber-optic cables dominate backbone infrastructure, with submarine systems carrying over 99% of international data traffic; as of 2025, 570 such cables are operational globally, with 81 more planned to address surging demand from cloud computing and AI.[102] Investments in new subsea cables from 2025 to 2027 exceed $13 billion, driven by hyperscale data centers that process and store petabytes of data.[103] Terrestrial infrastructure includes coaxial and fiber links, supplemented by wireless technologies: 5G networks, deployed commercially since 2019, offer peak speeds up to 20 Gbps—over 100 times faster than 4G—and latencies under 1 millisecond, enabling applications like autonomous vehicles and remote surgery.[104] Data centers, numbering over 10,000 worldwide in 2025, host servers for edge computing and cloud services, with power consumption reaching 2-3% of global electricity amid efficiency challenges from dense AI workloads.[105] Emerging trends include software-defined networking (SDN) for dynamic resource allocation and satellite constellations like Starlink, providing WAN alternatives in underserved regions with latencies around 20-40 ms.[106]

Economic Role and Impacts
Industry Structure and Monetization Models
The information and communications technology (ICT) industry is structured around four core segments: hardware manufacturing, software development, information technology (IT) services, and telecommunications infrastructure provision. Global IT spending, which largely overlaps with ICT expenditures, totaled an estimated $5.43 trillion in 2025, marking a 7.9% year-over-year increase driven by demand for cloud infrastructure and AI capabilities.[107] IT services formed the dominant segment at $1.50 trillion in revenue for 2025, surpassing hardware at approximately $141 billion.[108][109] Market concentration varies by subsector; cloud computing exhibits oligopolistic traits, with Amazon Web Services, Microsoft Azure, and Google Cloud commanding over 60% combined share as of 2024, enabling control over scalable computing resources essential for AI deployment.[110] The semiconductor supply chain similarly features tight oligopolies among foundries like TSMC and Samsung, which produced over 50% of advanced nodes in 2024, constraining upstream innovation due to capital barriers exceeding $20 billion per facility.[111] Leading firms dominate revenue generation, with Microsoft topping IT services providers at over $200 billion in 2024, followed by Alphabet (Google) at $283 billion and Samsung Electronics at $234 billion across hardware and software-integrated products.[112][113] Vertical integration is common among top players; for instance, Apple controls design, manufacturing, and ecosystem services, capturing higher margins than fragmented competitors.[113] In the U.S., which held the largest national ICT market share in 2024, IT services accounted for 38% of activity, underscoring a services-led structure amid hardware commoditization.[114][115] This segmentation fosters interdependence, as hardware relies on software ecosystems for value addition, while services integrate telecom networks for enterprise solutions. 
Monetization models prioritize recurring revenues over transactional sales to stabilize cash flows amid rapid obsolescence. Hardware segments generate income via outright device sales (e.g., smartphones and servers) and leasing, with margins pressured by supply chain costs but bolstered by proprietary components.[109] Software has transitioned to subscription licensing and SaaS, where users pay periodic fees for access rather than perpetual licenses; this model, exemplified by Microsoft's Office 365, yielded over 70% recurring revenue by 2024, reducing piracy risks and enabling continuous updates.[116][117] Complementary freemium and pay-as-you-go variants attract volume users before upselling premium features, as seen in tools like Zoom or AWS usage billing.[117] IT services monetize through fixed-fee projects, time-and-materials contracts, and outcome-based outsourcing, with global firms like Accenture deriving 80% of earnings from long-term enterprise deals averaging multi-year durations.[118] Telecommunications traditionally employs flat-rate subscriptions for connectivity (e.g., $50-100 monthly per consumer line) augmented by metered data usage, but operators increasingly bundle ICT services like cloud storage or cybersecurity into "super-apps" for 20-30% revenue uplift.[119] Emerging streams include data monetization, where anonymized datasets fuel advertising or analytics sales—Google's model generated $224 billion in ad revenue in 2023—though regulatory scrutiny limits direct sales.[120] Overall, the shift to platform-mediated models enhances scalability but heightens dependency on user lock-in and network effects for sustained profitability.[121]

| Segment | Key Monetization Models | Revenue Characteristics |
|---|---|---|
| Hardware | Device sales, component licensing, leasing | Transactional, cyclical with upgrades |
| Software | SaaS subscriptions, freemium, pay-per-use | Recurring, high margins post-acquisition |
| IT Services | Project contracts, managed services, outsourcing | Long-term, service-intensity driven |
| Telecommunications | Subscriptions, usage fees, bundled ICT add-ons | Stable base with variable overages |
Contributions to Global Productivity and Growth
Information and communications technology (ICT) has driven substantial gains in global labor productivity through automation, data processing efficiencies, and enhanced resource allocation, with empirical studies consistently showing positive correlations between ICT adoption and output per worker. For instance, a 10% increase in ICT capital investment is associated with approximately 0.6% higher economic growth rates across analyzed economies.[122] In OECD countries, ICT investments have contributed to multi-factor productivity growth, particularly in sectors with high intangible asset intensity, where digital tools complement human capital by enabling faster decision-making and reducing operational redundancies.[123] These effects stem from causal mechanisms such as network effects in broadband infrastructure, which amplify information flows and foster specialization, though gains vary by institutional quality and complementary investments in skills and regulation. The digital economy, encompassing ICT goods, services, and enabling infrastructure, accounted for 15.5% of global GDP by 2016, expanding at rates exceeding twice the overall economic average, with business e-commerce sales rising nearly 60% from 2016 to 2022 across 43 countries representing three-quarters of world GDP.[124][125] In the OECD, the ICT sector grew at an average annual rate of 6.3% from 2013 to 2023—three times the pace of the broader economy—propelling aggregate productivity through innovations like cloud computing and enterprise software that lower transaction costs and scale operations globally.[126] Country-level data further illustrate this: in the United States, IT-related investments contributed 0.35 percentage points to value-added growth in 2019, while recent surges in data center spending accounted for nearly all GDP expansion in the first half of 2025, underscoring ICT's role in sustaining momentum amid decelerating traditional sectors.[127][128] Emerging technologies within 
ICT, such as artificial intelligence and high-speed networks, are projected to yield further macroeconomic productivity boosts, with models estimating significant output per capita increases over the next decade in G7 economies through task automation and predictive analytics.[129] However, realization of these gains depends on overcoming barriers like skill mismatches and uneven infrastructure deployment, as evidenced by meta-analyses confirming stronger ICT-growth linkages in contexts with robust human capital and policy support.[130] Fixed broadband penetration, in particular, has been linked to accelerated per capita income growth in developing and developed settings alike, via channels including e-commerce expansion and supply chain optimization.[131] Overall, ICT's contributions reflect a compounding effect, where initial investments in hardware and connectivity yield sustained growth through iterative software advancements and data-driven efficiencies.

Innovation Drivers and Market Dynamics
Innovation in information and communications technology (ICT) is primarily propelled by private sector investments, competitive pressures, and breakthroughs in foundational technologies such as artificial intelligence (AI), edge computing, and sustainable infrastructure. In 2024, U.S. venture capital firms closed 14,320 deals worth $215.4 billion, with AI-related investments surging 52% year-over-year, enabling startups to pioneer advancements like autonomous agents and hyperautomation.[132][133] These funds concentrate in hubs like Silicon Valley, where market competition incentivizes firms to enhance innovation efficiency, as empirical studies of the IT sector demonstrate a causal link between product market rivalry and increased patenting and R&D output.[134] While intense competition can occasionally reduce collaborative knowledge-sharing, it overall fosters dynamic entry by new entrants, countering monopolistic complacency.[135] Government policies further shape these drivers, with divergent approaches across regions amplifying or constraining progress. 
In the United States, a relatively permissive regulatory environment and emphasis on intellectual property protection have sustained leadership, underpinning public R&D that complements private efforts in semiconductors and AI.[136] China's state-directed model, involving substantial subsidies and technology security strategies, accelerates catch-up in areas like 5G infrastructure and AI hardware, though it risks inefficiencies from over-centralization.[137][138] The European Union, prioritizing regulatory frameworks like data privacy mandates, has spurred innovations in ethical AI but trails in raw investment scale, with state aid reaching 1.4% of GDP amid efforts to bolster digital sovereignty.[139] This policy variance underscores how lighter-touch regimes correlate with higher innovation velocity, as evidenced by the concentration of top AI startups in the U.S.[140] Market dynamics reflect a winner-take-all structure dominated by a few hyperscalers—such as Alphabet, Amazon, Apple, Meta, and Microsoft—whose network effects and scale economies reinforce barriers to entry, yet paradoxically fuel ecosystem-wide innovation through platform APIs and cloud services. Global ICT spending is projected to reach $5.43 trillion in 2025, growing 7.9% from 2024, driven by enterprise AI adoption and infrastructure upgrades.[107] Regional imbalances persist, with North America capturing over 40% of VC inflows, while Asia-Pacific growth in manufacturing and deployment offsets slower European expansion.[141] Antitrust scrutiny in jurisdictions like the EU aims to curb concentration, but evidence suggests that curbing dominant firms' R&D could inadvertently slow sector-wide progress unless balanced against competitive incentives.[142] Overall, these dynamics exhibit resilience, with 2025 outlooks pointing to sustained expansion amid AI integration, though geopolitical tensions and supply chain vulnerabilities pose risks to uninterrupted scaling.[143]

Sectoral Applications
In Education and Learning
Information and communications technology (ICT) in education encompasses the integration of digital devices, software, and networks into teaching and learning processes to facilitate access to information, interactive instruction, and personalized education. Common applications include computers, tablets, internet connectivity for online resources, learning management systems like Moodle or Google Classroom, and educational software for simulations and adaptive learning. By 2022, approximately 50% of lower secondary schools worldwide had internet connectivity, reflecting accelerated adoption during the COVID-19 pandemic when remote learning became widespread.[144] Empirical evidence on ICT's impact on student outcomes remains mixed, with meta-analyses indicating modest positive effects in specific contexts such as STEM education and deep learning, where effect sizes range from small to moderate depending on implementation. For instance, a 2023 meta-analysis found digital technology-assisted STEM instruction significantly boosted academic achievement, attributed to interactive visualizations enhancing conceptual understanding. However, broader reviews, including those from PISA data, show no consistent positive relationship between ICT use and performance across subjects, often due to inadequate teacher training or overuse leading to distractions. In high-income countries, only about 10% of 15-year-old students reported frequent classroom ICT use in 2018, suggesting persistent underutilization despite availability.[145][146][147][148] Adoption rates surged in the 2020s, with K-12 EdTech usage increasing 99% since 2020, driven by platforms for virtual collaboration and AI-assisted tutoring. In higher education, studies report improved engagement and efficiency; in K-12, 63% of teachers reported incorporating generative AI by 2025.
Yet, in practice, benefits hinge on pedagogical integration rather than mere access; poorly designed tech can exacerbate cognitive overload or reduce face-to-face interaction without yielding superior outcomes compared to traditional methods.[149][150][151] Significant challenges persist, particularly the digital divide, which widens educational inequities. An estimated 1.3 billion school-aged children lacked home internet access as of 2023, disproportionately affecting rural and low-income areas, leading to learning losses during disruptions. Urban-rural disparities in teacher digital literacy and infrastructure further hinder equitable implementation, with empirical data linking socioeconomic status to ICT proficiency gaps that perpetuate achievement disparities. Over-reliance on screens also raises concerns about attention spans and social development, though rigorous longitudinal studies on these effects are limited.[152][153][154]

In Healthcare Delivery
Information and communications technology (ICT) has transformed healthcare delivery by enabling electronic health records (EHRs), telemedicine, artificial intelligence (AI)-assisted diagnostics, wearable monitoring devices, and data analytics platforms. EHRs facilitate the digitization and sharing of patient data, reducing duplication of tests and delays in treatment while providing alerts for improved safety.[155][156] Implementation of EHRs correlates with enhanced clinical workflows, better care coordination, and up to 18% lower readmission rates in fully adopting hospitals.[157][158] However, interoperability challenges persist, limiting full realization of these benefits without standardized protocols.[159] Telemedicine, leveraging video conferencing and remote monitoring protocols, expanded rapidly post-2019, with U.S. physician adoption rising from 15.4% to 86.5% by 2021, addressing physician shortages projected at 86,000 by 2036.[160][161] Overall adoption reached 80% for certain services like prescription care by 2025, with patient satisfaction at 55% for virtual visits due to convenience.[162][163] The global telehealth market is forecasted to exceed $55 billion by end-2025, driven by hybrid models integrating AI for triage, though low-value care utilization remains a concern in some analyses.[164][165] AI applications in diagnostics show variable performance; a meta-analysis of 83 studies reported 52.1% overall accuracy, comparable to physicians but susceptible to bias, with accuracy dropping 11.3% under systematically flawed inputs.[166][167] Large language models like ChatGPT Plus yielded no significant diagnostic improvement over standard resources in controlled tests.[168] Despite this, AI aids workload reduction and early detection in specific contexts, such as 85.7% accuracy in sepsis prediction across 52,000 patients.[169][170] Wearable devices enable continuous health tracking, proving effective for increasing physical activity across populations 
and monitoring chronic conditions like cardiovascular disease to prevent escalations.[171][172] They yield cost savings and quality-adjusted life year gains, though usage disparities exist, with lower adoption among the populations that stand to benefit most, raising equity concerns.[173][174] Healthcare data analytics optimizes resource allocation, identifying inefficiencies to cut wasteful spending—estimated at 25% of U.S. healthcare costs—and enabling savings from $126 to over $500 per patient via predictive interventions.[175][176] Integration with health information exchanges further reduces readmissions and administrative burdens when embedded in workflows.[177] These tools collectively enhance delivery efficiency and outcomes, contingent on addressing data privacy, equity gaps, and validation against empirical benchmarks.[178]

In Scientific Research
Information and communications technology (ICT) facilitates scientific research by enabling the processing of vast datasets, execution of intricate simulations, and coordination among distributed teams. High-performance computing (HPC) systems, comprising clusters of processors operating in parallel, allow researchers to model complex phenomena such as nuclear reactions, climate dynamics, and molecular interactions that exceed the scope of physical experimentation.[179] For instance, facilities like Lawrence Livermore National Laboratory employ HPC for realistic engineering simulations that complement empirical testing.[180] In fields generating petabyte-scale data, ICT underpins analytics and storage infrastructures essential for discovery. The Large Hadron Collider (LHC) at CERN produces on the order of one petabyte of collision data per day, processed via the Worldwide LHC Computing Grid (WLCG), a distributed network granting near real-time access to over 12,000 physicists worldwide for event reconstruction and pattern analysis.[181] Similarly, genomics research leverages big data analytics to interpret DNA sequences from high-throughput sequencing, decoding functional information through computational and statistical methods to advance understanding of disease mechanisms and personalized medicine.[182] Machine learning applications within ICT have accelerated breakthroughs in predictive modeling.
DeepMind's AlphaFold, released in 2021, achieved atomic-level accuracy in protein structure prediction by integrating neural networks trained on evolutionary data, solving structures for nearly all known human proteins and enabling rapid hypothesis testing in biology that previously required decades of lab work.[183] Validation through competitions like CASP14 confirmed its superiority over prior methods, though predictions for novel proteins without close homologs remain subject to experimental verification.[184] ICT also supports remote instrumentation and collaborative platforms, allowing real-time data sharing across global consortia. In astronomy and earth sciences, simulations on supercomputers like those at Idaho National Laboratory model seismic events or planetary atmospheres, integrating observational data for predictive accuracy unattainable by manual computation.[185] These tools, while transformative, depend on robust middleware for resource optimization and data integrity, as seen in grid systems that standardize access to heterogeneous hardware. Overall, ICT's integration has shortened research timelines, from years to months in cases like structural biology, by automating analysis and scaling computational power.[186]

In Business and Commerce
Information and communications technology (ICT) underpins modern business operations by enabling automation, data integration, and real-time decision-making through systems like enterprise resource planning (ERP) and customer relationship management (CRM). ERP software centralizes core processes such as inventory management, financial reporting, and supply chain coordination, reducing manual errors and operational delays; for instance, implementations have streamlined back-office functions and built historical data for forecasting in manufacturing firms.[187] CRM platforms aggregate customer interactions, sales pipelines, and marketing analytics, fostering targeted outreach and retention strategies that enhance revenue per client. Integration of ERP and CRM systems synchronizes data flows, minimizing silos and boosting overall profitability by automating routine tasks across departments.[188] Empirical evidence from manufacturing enterprises indicates that such digital tools elevate production efficiency by optimizing resource allocation and minimizing downtime.[189] Cloud computing has accelerated ICT adoption in commerce, with over 94% of enterprises utilizing public or hybrid models for scalable storage, computing power, and collaboration tools as of 2025. This shift allows businesses to deploy applications without heavy upfront infrastructure investments, supporting remote workforces and dynamic scaling during demand fluctuations; global end-user spending on public cloud services reached $723.4 billion in 2025. In sectors like retail and logistics, cloud-based platforms facilitate predictive analytics for demand forecasting and just-in-time inventory, cutting costs by up to 30% in optimized supply chains according to enterprise case studies.[190][191] E-commerce, a cornerstone of ICT-driven commerce, generated $6.01 trillion in global retail sales in 2024, projected to rise to $6.42 trillion in 2025 amid penetration rates exceeding 20% of total retail.
Platforms leveraging ICT for secure transactions, personalized recommendations via machine learning, and global logistics tracking have democratized market access for small enterprises, enabling cross-border sales without physical storefronts. Digital marketplaces like those powered by AWS or similar infrastructures process billions of transactions annually, with growth fueled by mobile integration and AI-driven fraud detection.[192][193][194] Broader digital transformation via ICT correlates with gains in total factor productivity (TFP), as firms adopting integrated technologies report enhanced labor productivity through process automation and data-driven insights; studies of Chinese enterprises, for example, quantify a positive TFP uplift from reduced production costs and improved innovation mechanisms. In commerce, big data analytics from ICT systems enable granular market segmentation and pricing optimization, with platforms analyzing consumer behavior to predict trends and mitigate risks. However, realization of these benefits hinges on robust implementation, as incomplete integrations can exacerbate inefficiencies, underscoring the need for strategic alignment over mere tool deployment.[195][196]

Societal and Developmental Dimensions
Access Frameworks and Digital Divides
Access frameworks in information and communications technology (ICT) encompass regulatory policies and mechanisms designed to ensure equitable availability of basic services, such as voice telephony, internet connectivity, and broadband, particularly in underserved areas. These include universal service obligations imposed on operators to provide minimum service levels at affordable prices, and universal service and access funds (USAFs) that collect levies from telecom revenues to subsidize infrastructure deployment in remote or low-income regions.[197][198] By 2024, over 100 countries had established such funds or policies, often expanding from traditional telephony to broadband as digital services became essential for economic participation.[199] These frameworks operate through public-private partnerships, competitive bidding for subsidized projects, and incentives like tax breaks for rural deployments, aiming to extend physical infrastructure such as fiber optics and mobile towers where market forces alone fail due to high costs and low population density. In practice, effectiveness varies; for instance, Latin American USAFs have financed thousands of community access points, but inefficiencies like poor project monitoring have limited outcomes in some cases.[200][199] Periodic policy reviews adapt to technological shifts, such as integrating satellite and 5G solutions, to maintain relevance amid evolving ICT needs.[201] Digital divides refer to disparities in ICT access and usage that exacerbate inequalities, manifesting along geographic, economic, demographic, and skill-based lines. 
Globally, as of 2024, 5.5 billion people (68% of the population) use the internet, leaving 2.6 billion offline, with high-income countries achieving 93% penetration compared to 27% in low-income ones.[202] Urban-rural gaps persist starkly, with 83% internet usage in cities versus 48% in rural areas, driven by infrastructure deficits like sparse network coverage and high deployment costs in low-density zones.[203] Gender disparities show 70% of men online versus 65% of women, equating to a 189 million person gap, often rooted in cultural barriers and device ownership differences in developing regions.[204] Within countries, divides compound across income and education levels; for example, OECD data from 2024 indicate fixed broadband speed gaps between urban and rural areas widened to 58 Mbps from 22 Mbps five years prior, hindering high-bandwidth applications like remote work and education in peripheral regions.[205] Affordability remains a barrier, with 49% of non-users citing lack of need or cost as reasons, alongside skills gaps that limit effective utilization even where access exists.[206] These divides causally impede economic mobility, as unconnected populations miss opportunities in e-commerce, online learning, and job markets reliant on digital tools. Bridging efforts rely on subsidies and infrastructure investments, such as the U.S. 
NTIA's administration of nearly $50 billion in 2024 for broadband expansion targeting unserved areas, including affordability vouchers and rural fiber builds.[207] Globally, governments promote shared infrastructure and digital literacy programs, though studies suggest affordability subsidies often yield faster adoption gains than pure infrastructure outlays in demand-constrained markets.[208] Despite progress, with internet users rising by 227 million from 2023 to 2024, structural challenges such as regulatory hurdles and private-investment reluctance in low-return areas sustain these divides, necessitating sustained, targeted interventions rather than broad-spectrum approaches.[209][210]

ICT in developing regions
In developing regions, characterized by low- and middle-income economies in sub-Saharan Africa, South Asia, and Latin America, ICT adoption has accelerated primarily through mobile technologies, bypassing traditional fixed-line infrastructure. As of 2024, mobile cellular subscriptions reach over 90% penetration in many such areas, enabling leapfrogging to digital services, though fixed broadband remains below 10% in least developed countries (LDCs).[211] Internet usage stands at approximately 35% in LDCs, compared to the global average of 68%, with 2.6 billion people worldwide—predominantly in low-income regions—remaining offline due to uneven coverage.[212][202][213] Empirical evidence indicates that ICT deployment correlates with economic growth in these regions, particularly via mobile broadband, which exhibits a stronger positive relationship in low-income per capita areas than in wealthier ones. Studies across developing economies show ICT infrastructure contributing to GDP increases through enhanced productivity, job creation, and financial inclusion, with mobile adoption driving regional growth rates up to 1-2% annually in affected sectors.[214][215] A notable example is Kenya's M-Pesa, launched in 2007 by Safaricom, which has facilitated mobile money transfers for over 51 million users across East Africa, processing $236.4 billion in transactions in 2022 and enabling unbanked populations to save, remit, and access credit, thereby boosting local economies and reducing poverty by improving financial access.[216][217] This model has spurred broader mobile money adoption, with 40% of adults in developing economies holding financial accounts by 2024, a 16-percentage-point rise since 2021, primarily via phone-based services.[218] Despite these gains, persistent challenges hinder equitable ICT diffusion, including inadequate infrastructure, unreliable electricity, and high data costs relative to income—often exceeding 10% of average monthly earnings in LDCs. 
Digital literacy gaps and institutional weaknesses, such as weak regulatory enforcement, exacerbate adoption barriers, leaving rural and female populations disproportionately excluded; for instance, only 27% of low-income country residents access the internet, widening the global digital divide.[219][220][213] Conflicts, climate disasters, and underinvestment in skills training further compound these issues, risking long-term exclusion from digital economies unless addressed through targeted infrastructure and policy reforms.[221][222]

Metrics and indices of adoption
The ICT Development Index (IDI), compiled by the International Telecommunication Union (ITU), evaluates national levels of ICT access, use, and skills across 164 countries, with the 2025 edition reporting a global average score of 78 out of 100, reflecting incremental advances in universal and meaningful connectivity despite persistent gaps in skills and usage.[223][224] Fixed broadband subscriptions reached 19.6 per 100 people worldwide in 2024, while mobile-cellular subscriptions averaged 112 per 100 inhabitants, underscoring mobile networks' dominance in extending access, particularly in low-income regions.[225][226] Internet penetration stood at 67.9% globally as of early 2025, equating to 5.56 billion users, with China hosting the largest absolute number at 1.11 billion (78.2% of its population) and Northern European countries like Iceland and Denmark exceeding 98% coverage.[227][228] Mobile-broadband subscriptions neared parity with cellular subscriptions in many markets, at 87 per 100 people in 2023, driven by 4G expansions and early 5G rollouts, though lagging fixed-broadband deployment in developing areas limited high-speed applications.[229] The Networked Readiness Index (NRI), produced by the Portulans Institute, gauges broader digital ecosystem maturity, including technology adoption, governance, and impact; in 2024, the United States led with a score of 77.19, followed by Singapore (76.94) and Finland (75.76), while India improved to 49th place amid gains in AI and fiber-optic infrastructure.[230] These indices reveal adoption disparities: high-income economies average IDI scores above 90, versus below 50 in least-developed countries, where infrastructure costs and regulatory hurdles impede progress.[231] Regional leaders like South Korea in broadband speeds (averaging 200 Mbps download in 2024) contrast with sub-Saharan Africa's 40% internet penetration, highlighting causal factors such as investment density and policy stability over mere population metrics.[232]

| Metric | Global Value (Latest) | Source |
|---|---|---|
| IDI Score | 78/100 (2025) | ITU[223] |
| Internet Penetration | 67.9% (2025) | DataReportal[227] |
| Mobile Subscriptions | 112/100 people (2024) | World Bank[233] |
| Fixed Broadband Subscriptions | 19.6/100 people (2024) | ITU[225] |
| NRI Top Rank | United States (2024) | Portulans Institute |
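The headline ratios in the table are simple per-100-inhabitant rates; the following minimal sketch checks them against the figures cited above (the world-population value of 8.19 billion is an assumption for illustration, not a figure from the text):

```python
def per_100(count: float, population: float) -> float:
    """Rate per 100 inhabitants, the convention used for ICT subscription and penetration metrics."""
    return 100.0 * count / population

# Figures cited in the section; world population is an assumed value.
internet_users = 5.56e9    # early-2025 internet users (DataReportal)
world_population = 8.19e9  # assumed world population, early 2025

penetration = per_100(internet_users, world_population)
offline = world_population - internet_users

print(f"Internet penetration: {penetration:.1f}%")         # ≈ 67.9%
print(f"Offline population: {offline / 1e9:.2f} billion")  # ≈ 2.63 billion
```

Note that subscription-based metrics such as mobile-cellular rates can exceed 100 per 100 inhabitants (multi-SIM ownership), which is why they count subscriptions rather than unique users.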