Criticism of technology
Criticism of technology is an analysis of adverse impacts of industrial and digital technologies. It is argued that, in all advanced industrial societies (not necessarily only capitalist ones), technology becomes a means of domination, control, and exploitation,[1] or more generally something which threatens the survival of humanity. Some of the technology opposed by the most radical critics may include everyday household products, such as refrigerators, computers, and medication.[2] However, criticism of technology comes in many shades.
Overview
Some authors, such as Chellis Glendinning and Kirkpatrick Sale, consider themselves Neo-Luddites and hold that technological progress has had a negative impact on humanity. Their work focuses on seeking meaning in technological change, specifically wrestling with the question of "how tools and their affordances change and alter the fabric of everyday life."[3] Jacques Ellul, for instance, maintained that when people assert that technology is an instrument of freedom, the means to achieve historical destiny, or the execution of divine vocation, the result is the glorification and sanctification of Technique, so that it becomes that which gives meaning and value to life rather than a mere ensemble of materials.[4] This is echoed by rhetorical critics who cite the way technological discourse damages institutions, and the individuals who make up those institutions, through its idealization and its capacity to define social hierarchies.[5]
At its most extreme, criticism of technology produces analyses of technology as potentially leading to catastrophe. For instance, activist Naomi Klein has described how technology is employed by capitalism in its commitment to a "shock doctrine", which promotes a series of crises so that speculative profit can be accumulated.[4] Some theorists also cite the 2008 financial crisis as well as the Chernobyl and Fukushima disasters to support their critique.[4] Critiques also focus on specific issues, such as how technology—through robotics, automation, and software—is destroying people's jobs faster than it is creating them, contributing to poverty and inequality.[6]
In the 1970s in the US, the critique of technology became the basis of a new political perspective called anarcho-primitivism, which was forwarded by thinkers such as Fredy Perlman, John Zerzan, and David Watson. They proposed differing theories about how it was industrial society, and not capitalism as such, that lay at the root of contemporary social problems. This theory was developed in the journal Fifth Estate in the 1970s and 1980s, and was influenced by the Frankfurt School, the Situationist International, Jacques Ellul, and others.
The critique of technology overlaps with the philosophy of technology, but whereas the latter tries to establish itself as an academic discipline, the critique of technology is basically a political project and is not limited to academia. It features prominently in neo-Marxism (Herbert Marcuse, Andrew Feenberg), ecofeminism (Vandana Shiva), and post-development theory (Ivan Illich).
References
- ^ Lorenzano, Pablo; Rheinberger, Hans-Jörg; Ortiz, Eduardo; Galles, Carlos (2010). History and Philosophy of Science and Technology. Oxford: EOLSS Publishers Co. Ltd. pp. 124–125. ISBN 9781848267763.
- ^ Glendinning, Chellis. Notes towards a Neo-Luddite manifesto. Utne Reader, 1990.
- ^ Watson, Sara (October 2016). "Toward a Constructive Technology Criticism". Columbia Journalism Review. Retrieved 2018-10-19.
- ^ a b c Jeronimo, Helena; Garcia, Jose; Mitcham, Carl (2013). Jacques Ellul and the Technological Society in the 21st Century. Dordrecht: Springer Science+Business Media. p. 116. ISBN 9789400766570.
- ^ Enos, Theresa (2013). Encyclopedia of Rhetoric and Composition: Communication from Ancient Times to the Information Age. New York: Routledge. p. 619. ISBN 978-0824072001.
- ^ Rotman, David. "How Technology Is Destroying Jobs". MIT Technology Review. Retrieved 2018-10-19.
Further reading
- Wendy Hui Kyong Chun, Control and Freedom (2006)
- Donna J. Haraway, A Cyborg Manifesto (1985)
- Gilles Deleuze, Postscript on the Societies of Control (1992)
- Lewis Mumford, Technics and Civilization (1934)
- Layla AbdelRahim, Children's Literature, Domestication, and Social Foundation: Narratives of Civilization and Wilderness, Routledge, 2018 paperback ISBN 9781138547810; 2015 hardback ISBN 9780415661102
- Michael Adas, Machines as the Measure of Men: Science, Technology, and Ideologies of Western Dominance, Cornell University Press 1990
- Ernest Braun, Futile Progress: Technology's Empty Promise, Routledge, 2009
- Jacques Ellul, The Technological Society, Trans. John Wilkinson. New York: Knopf, 1964. London: Jonathan Cape, 1965. Rev. ed.: New York: Knopf/Vintage, 1967. with introduction by Robert K. Merton (professor of sociology, Columbia University).
- Jacques Ellul, The Technological Bluff, Trans. Geoffrey W. Bromiley. Grand Rapids: Eerdmans, 1990.
- Andrew Feenberg, Transforming Technology. A Critical Theory Revisited, Oxford University Press, 2nd edition 2002, ISBN 0-19-514615-8 - Feenberg offers a "coherent starting point for anticapitalist technical politics" [citation needed] to overcome what he considers to be the "fatalism" of Ellul, Heidegger, and other proponents of "substantive" theories of technology.
- Martin Heidegger, The Question Concerning Technology, and Other Essays, B&T 1982, ISBN 0-06-131969-4
- Michael H. Huesemann and Joyce A. Huesemann, Technofix: Why Technology Won't Save Us or the Environment, New Society Publishers, Gabriola Island, British Columbia, Canada, 2011, ISBN 0865717044
- Derrick Jensen and George Draffan, Welcome to the Machine: Science, Surveillance, and the Culture of Control, Chelsea Green Publishing Company, 2004, ISBN 1-931498-52-0
- Jerry Mander, In the Absence of the Sacred: The Failure of Technology and the Survival of the Indian Nations, Sierra Club Books, 1992
- Neil Postman, Technopoly: The Surrender of Culture to Technology, Vintage, 1993
- David Watson, Against the Megamachine, Brooklyn: Autonomedia, 1998, ISBN 1-57027-087-2
- Joseph Weizenbaum, Computer Power and Human Reason: From Judgement to Calculation, W. H. Freeman & Co., new edition 1976
- Langdon Winner, Autonomous Technology: Technics-Out-Of-Control as a Theme in Political Thought, MIT Press 1977, ISBN 978-0-262-23078-0
- Peter Zelchenko, "Exploring Alternatives to Hype", Educational Leadership 56(5), 1999, pp. 78–81
- Theodore John Kaczynski, Anti-Tech Revolution: Why and How, Fitch & Madison, 2016
Historical Foundations
Ancient and Pre-Industrial Critiques
One of the earliest recorded critiques of technology appears in Plato's Phaedrus (c. 370 BCE), where Socrates recounts the Egyptian myth of Theuth, the god credited with inventing writing, and King Thamus's response to it. Thamus warns that writing would induce forgetfulness in the mind by providing an external aid to memory rather than cultivating genuine recollection and wisdom, leading people to rely on "reminders" instead of internal knowledge stored in the soul.[7] This oral critique, preserved through Plato's dialogues, reflects a broader Socratic skepticism toward innovations that might undermine dialectical reasoning and the pursuit of truth via live discourse, as writing fixed words without the adaptability of spoken exchange.[8] In ancient Hebrew tradition, the Tower of Babel narrative in Genesis 11:1-9 (composed c. 6th-5th century BCE) portrays human technological ambition—constructing a massive brick tower to reach the heavens—as an act of collective hubris that provokes divine intervention, scattering languages and halting the project. Interpreted by some scholars as a caution against overreliance on unified material engineering to achieve godlike status, the story underscores the limits of human invention when divorced from ethical or transcendent constraints, emphasizing instead humility before natural and divine order.[9] Taoist philosophy in ancient China, particularly Laozi's Tao Te Ching (c. 6th-4th century BCE), critiques technological contrivances as deviations from the natural Tao (way), arguing that excessive reliance on tools and artifices disrupts simplicity, fosters dependency, and leads to societal disorder. 
Laozi advocates returning to unadorned existence, viewing inventions as products of human excess that complicate rather than harmonize with the spontaneous flow of nature, a perspective echoed in later Daoist texts warning against mechanical interference in organic processes.[10] In pre-industrial Europe, Jean-Jacques Rousseau's Discourse on the Sciences and Arts (1750) extended these themes by contending that advancements in knowledge and mechanical arts, while ostensibly purifying society, actually corrupted morals by promoting vanity, inequality, and luxury over virtue and self-sufficiency. Drawing on historical examples from Sparta's simplicity versus decadent civilizations, Rousseau argued empirically that scientific progress inversely correlated with civic integrity, as evidenced by the moral decline in refined societies like 18th-century France compared to rustic or ancient republics.[11] This critique, which won the Dijon Academy's essay prize, influenced later romantic skepticism toward unchecked innovation, prioritizing causal links between technological sophistication and social fragmentation over unexamined progress narratives.[12]
Industrial Revolution and Mechanization Fears
The advent of mechanized production in Britain's textile sector during the late 18th century elicited early apprehensions about labor displacement and economic insecurity. Devices such as James Hargreaves' spinning jenny (1764) and Richard Arkwright's water frame (1769) automated spinning, diminishing demand for hand spinners, while Edmund Cartwright's power loom (patented 1785) threatened weavers by enabling factories to produce cloth with fewer skilled operators.[13] In 1786, Leeds woolen workers petitioned Parliament, decrying machinery for causing "great want and misery" through unemployment and wage suppression, as machines replaced multiple artisans with one overseer and unskilled assistants.[14] These concerns culminated in the Luddite disturbances of 1811–1816, primarily among skilled framework knitters in Nottinghamshire, Derbyshire, Leicestershire, and Yorkshire, who smashed wide knitting frames and cropping frames that facilitated the use of cheaper, lower-quality yarn and reduced the need for expertise.[15] Attributed to a legendary figure "Ned Ludd," the movement involved coordinated nighttime attacks on factories, with over 1,000 frames destroyed by 1812, driven by fears that mechanization enabled employers to hire women and children at reduced rates—sometimes half the men's wages—eroding craft guilds and community standards.[16] Protesters distinguished between beneficial innovations and those widening inequality, targeting only frames that deskilled labor and bypassed traditional apprenticeships.[17] The British government's suppression was severe: the Frame Breaking Act (1812) classified machine destruction as a felony punishable by death or transportation, leading to the deployment of 12,000 troops—more than in the Peninsular War against Napoleon—and the execution or imprisonment of dozens, including 17 hangings after trials at York in 1813.[15] Economic theorists grappled with these issues; David Ricardo, in the 1821 third edition of On the Principles of Political Economy and Taxation, appended a chapter conceding that machinery could harm workers by accelerating capital accumulation at labor's expense, temporarily lowering wages as displaced artisans flooded the market before new sectors absorbed them—a shift from his earlier view that technological progress uniformly benefited society.[18] Such critiques underscored mechanization's potential to disrupt established livelihoods without immediate compensatory gains, fueling broader debates on technology's societal trade-offs.[19]
20th-Century Philosophical Critiques
In the early 20th century, Lewis Mumford's Technics and Civilization (1934) critiqued the historical trajectory of technology as intertwined with cultural and social forces, distinguishing between "eotechnic" phases emphasizing human-scale tools and the "paleotechnic" era of coal-powered machinery that prioritized efficiency over human well-being, leading to urban squalor, labor alienation, and environmental degradation.[20] Mumford argued that this mechanization fostered a "will-to-order" that predated machines, transforming humans into extensions of production systems before complex devices were perfected, thus inverting causal priorities where cultural mechanization enabled technological dominance.[21] He advocated a "biotechnic" future integrating biology, aesthetics, and ethics to counterbalance raw technics, warning that unchecked power technologies absorbed human values into market-driven expansion.[22] Mid-century, Martin Heidegger's essay "The Question Concerning Technology" (1954) shifted focus to technology's ontological essence, positing it as Gestell (enframing), a revealing mode that reduces nature and humans to "standing-reserve"—orderable resources extractable on demand—rather than mere instrumental tools for ends.[23] Heidegger contended this essence challenges forth the earth aggressively, concealing other ways of disclosing Being, such as poetic dwelling, and endangers authentic human existence by narrowing thought to calculative efficiency, exemplified in hydroelectric dams treating rivers as mere energy stocks.[24] He rejected optimistic views of technology as neutral, insisting its totalizing frame demands a "free relation" through meditative thinking to avert self-destruction, though critics note his analysis overlooks technology's empirical benefits like poverty reduction. 
Jacques Ellul, in The Technological Society (1954), introduced "technique" as an autonomous, self-augmenting system of efficient means dominating all human domains, from economics to politics, eroding freedom and morality by rendering ends subordinate to optimized processes.[1] Ellul described technique's "necessity" as propagating total mobilization, where self-critique becomes impossible due to its adaptive rationality, leading to a "technical milieu" that fragments society into specialized functions and supplants organic human judgment with quantified norms. He evidenced this in state bureaucracies and propaganda, arguing it fosters dependency and loss of transcendence, with no political ideology—left or right—able to resist, as technique infiltrates both.[25] Later, Günther Anders's works, including The Obsolescence of Man (1956 onward), highlighted "Promethean shame"—human inadequacy before god-like technologies like nuclear weapons, where makers disavow their creations' scale, fostering moral numbness and self-adaptation to machines rather than vice versa.[26] Anders critiqued mass media and automation for exiling humanity into artifactual worlds, urging "antiquatedness" awareness to reclaim agency against technological overproduction that outpaces ethical imagination.[27] Hans Jonas's The Imperative of Responsibility (1979) addressed technology's unprecedented power over future generations, proposing an ethics prioritizing vulnerability: "Act so that the effects of your action are compatible with the permanence of genuine human life on earth."[28] Jonas applied this to biotechnology and environmental risks, arguing traditional heuristics fail against irreversible impacts like genetic engineering, necessitating a precautionary imperative to curb hubris in altering nature.[29] He distinguished this from mere prudential calculation, grounding it in ontological respect for life's fragility amid technology's "demigod" potential.[30]
Environmental and Resource Criticisms
Resource Extraction and Depletion
The manufacture of electronic devices, renewable energy systems, and electric vehicles depends on the extraction of critical minerals including rare earth elements, lithium, and cobalt, which critics contend accelerates the depletion of non-renewable global reserves amid surging demand. Global extraction of non-metallic minerals has increased fivefold since 1970, with technology sectors contributing significantly to this trend through requirements for specialized materials in semiconductors, batteries, and magnets.[31][32] In 2024, demand for cobalt, nickel, graphite, and rare earths rose by 6-8%, primarily from energy technologies like electric vehicles, outpacing supply chain adaptations in many cases.[32] Rare earth elements, essential for components in consumer electronics and wind turbines, saw global production reach 390,000 metric tons in 2024, up from 376,000 metric tons the prior year, with reserves estimated at 30 million tons as of 2025.[33][34] Critics highlight that while reserves appear substantial, extraction is heavily concentrated—China processed over 80% of global supply in recent years—leading to vulnerabilities from geopolitical disruptions and potential long-term shortages if demand doubles by 2040 as projected for clean energy transitions.[35] Lithium, a key battery material, faces similar scrutiny: known reserves of about 22 million tonnes could theoretically yield batteries for 2.8 billion electric vehicles assuming 8 kilograms per unit, yet current production lags demand, with studies indicating even a tenfold ramp-up may insufficiently match electric vehicle growth by the 2040s due to mining bottlenecks.[36][37] Cobalt extraction, vital for lithium-ion batteries in electronics and vehicles, is predominantly from the Democratic Republic of Congo, which supplied over 70% of global output in 2024, raising depletion concerns from artisanal and industrial mining that exhausts high-grade ores without adequate recycling offsets.[38] Supply risk assessments identify cobalt, alongside lithium and rare earths, as facing high depletion pressures from cumulative demand in technology applications, potentially exacerbating price volatility and innovation hurdles if reserves—estimated at under 10 million tonnes globally—do not expand through new discoveries.[39] Empirical analyses of mining sectors, such as iron ore, reveal that resource depletion has often outpaced technological efficiency gains over the past decade, suggesting analogous risks for tech-dependent minerals absent recycling breakthroughs or substitution.[40] These dynamics fuel arguments that unchecked technological expansion prioritizes short-term innovation over sustainable resource stewardship, potentially leading to critical supply thresholds.[41]
Pollution, E-Waste, and Ecosystem Disruption
The production and disposal of electronic devices generate substantial electronic waste (e-waste), which reached 62 million tonnes globally in 2022, equivalent to 7.8 kilograms per person.[42] This volume is increasing by 2.6 million tonnes annually and is projected to hit 82 million tonnes by 2030, outpacing documented recycling efforts fivefold.[43] Only 22.3% of e-waste was formally collected and recycled in 2022, with rates expected to decline to 20% by 2030 due to insufficient infrastructure and regulatory enforcement, leaving 77.7% of e-waste's fate undocumented—often resulting in informal processing, landfilling, incineration, or indefinite storage.[42][44] E-waste contains hazardous materials such as lead, mercury, cadmium, and brominated flame retardants, which leach into soil, water, and air when improperly managed, contaminating ecosystems and entering food chains.[45][46] Open burning and acid leaching in informal recycling sites release toxic fumes and heavy metals, exacerbating air pollution and groundwater contamination in regions like parts of Asia and Africa where much e-waste is exported.[47] Electronics manufacturing itself contributes to pollution through chemical-intensive processes, including etching with hydrofluoric acid and solvent use in semiconductor fabrication, which discharge effluents high in heavy metals and volatile organic compounds into waterways.[48] Mining for rare earth elements (REEs) essential to tech components, such as neodymium in magnets for hard drives and electric motors, disrupts ecosystems via habitat clearance, soil erosion, and tailings discharge laden with radioactive thorium and uranium.[49] Processing one tonne of REE ore can yield up to 12 tonnes of toxic and radioactive waste, leading to biodiversity loss, acid mine drainage, and long-term soil infertility in mining hotspots like China's Bayan Obo district.[50] These activities have caused measurable declines in local flora and fauna, with indirect effects amplifying through polluted watercourses affecting downstream aquatic life.[51] Tech infrastructure, including data centers supporting cloud computing and AI, adds to pollution via emissions from backup diesel generators and grid-supplied fossil fuel power, releasing nitrogen oxides, particulate matter, and sulfur dioxide that contribute to smog and acid rain.[52] In vulnerable communities near data centers, elevated pollution burdens have been linked to respiratory health risks, with facilities often sited in already high-impact areas scoring in the top 20% for environmental hazards.[53] While technological advancements like cleaner manufacturing could mitigate some effects, current practices underscore a causal link between rapid device proliferation and persistent environmental degradation, as recycling lags behind consumption driven by planned obsolescence and short product lifecycles.[54]
Energy Consumption and Climate Attribution
Data centers, which power much of modern computing including cloud services and artificial intelligence applications, consumed approximately 415 terawatt-hours (TWh) of electricity globally in 2024, equivalent to about 1.5% of total global electricity demand.[55][56] This figure is projected to grow rapidly, with electricity use potentially increasing by 15% annually through 2030 due to AI-driven demand, outpacing overall electricity growth by a factor of four.[55] Critics, including environmental analysts, argue this surge exacerbates energy shortages and grid instability, particularly in regions reliant on fossil fuels, as hyperscale facilities often require dedicated power infrastructure that delays renewable transitions.[57] Artificial intelligence models amplify these concerns through intensive training and inference phases. For instance, training OpenAI's GPT-4 is estimated to have required around 50 gigawatt-hours (GWh) of electricity, comparable to the annual consumption of thousands of households, while individual queries can consume 0.3 to 40 watt-hours depending on complexity.[58][59] Such demands have drawn criticism for scaling inefficiently with model size, potentially leading to data center expansions that double U.S. power needs to 78 gigawatts by 2035 if unchecked.[60] Detractors contend that tech companies underreport full lifecycle emissions, including cooling and hardware manufacturing, which could inflate the sector's footprint beyond initial estimates.[57] Cryptocurrency mining, particularly Bitcoin, represents another focal point of energy critiques, with the network consuming 143-172 TWh annually in 2024—roughly 0.5% of global electricity and exceeding the usage of countries like Norway (124 TWh) or Poland.[61][62] This proof-of-work mechanism is faulted for its inelastic energy appetite, as miners chase profitability by relocating to low-cost, often coal-dependent regions like parts of the U.S. and Kazakhstan, contributing to localized emission spikes without proportional societal benefits.[63] Estimates suggest U.S. crypto mining alone accounted for 0.6-2.3% of national electricity in recent years, prompting regulatory scrutiny for environmental externalities.[63] In terms of climate attribution, the information and communications technology (ICT) sector is estimated to emit 1.4-4% of global greenhouse gases, with user devices and data centers driving the majority.[64][65] Critics from organizations like Greenpeace highlight that unchecked growth could push ICT emissions toward aviation levels by 2030 if fossil fuels persist in the energy mix, arguing tech's "dematerialization" benefits (e.g., remote work reducing travel) are overstated relative to direct hardware and operational impacts.[66] However, International Energy Agency analyses indicate that while AI and data centers contribute to rising demand, their global emissions share remains under 1% currently, and fears of accelerating climate change may be exaggerated given potentials for renewable integration and efficiency gains, though rapid deployment challenges persist.[67][68] Projections vary, but without policy interventions, sector emissions could rise disproportionately if electricity decarbonization lags behind compute scaling.[69]
| Sector/Component | Annual Energy Use (2024 est.) | Global Electricity Share | Equivalent Comparison |
|---|---|---|---|
| Global Data Centers | 415 TWh | ~1.5% | Similar to Netherlands' total use |
| Bitcoin Mining | 143-172 TWh | ~0.5% | Exceeds Norway (124 TWh) |
| AI Training (e.g., GPT-4) | ~50 GWh (single model) | Negligible individually, cumulative rising | Thousands of U.S. households annually |
| ICT Sector GHG | 1.4-4% global | N/A | Comparable to aviation in upper estimates |
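The comparisons in the table can be cross-checked with simple arithmetic. The short sketch below uses only the figures cited in this section (415 TWh for data centers at a ~1.5% global share, the 143-172 TWh Bitcoin range, and Norway's 124 TWh) to recover the implied global electricity total and verify the Bitcoin-Norway comparison; it is a back-of-envelope illustration, not an independent estimate.

```python
# Back-of-envelope cross-check of the energy figures cited above.

data_center_twh = 415        # global data center use, 2024 estimate
data_center_share = 0.015    # cited as ~1.5% of global electricity demand

# Implied total global electricity demand (TWh)
global_twh = data_center_twh / data_center_share
print(f"Implied global electricity demand: {global_twh:,.0f} TWh")  # ~27,667 TWh

# Bitcoin mining range midpoint vs. Norway's national consumption
bitcoin_twh = (143 + 172) / 2   # midpoint of the cited 143-172 TWh range
norway_twh = 124
print(f"Bitcoin midpoint: {bitcoin_twh} TWh; exceeds Norway: {bitcoin_twh > norway_twh}")
```

The implied global total of roughly 28,000 TWh is consistent with the ~0.5% share attributed to Bitcoin mining (157.5 / 27,667 ≈ 0.57%), suggesting the table's percentages and absolute figures are internally coherent.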