History of computing hardware

The history of computing hardware spans the developments from early devices used for simple calculations to today's complex computers, encompassing advancements in both analog and digital technology.

The first aids to computation were purely mechanical devices which required the operator to set up the initial values of an elementary arithmetic operation, then manipulate the device to obtain the result. In later stages, computing devices began representing numbers in continuous forms, such as by distance along a scale, rotation of a shaft, or a specific voltage level. Numbers could also be represented in the form of digits, automatically manipulated by a mechanism. Although this approach generally required more complex mechanisms, it greatly increased the precision of results. The development of transistor technology, followed by the invention of integrated circuit chips, led to revolutionary breakthroughs.

Transistor-based computers and, later, integrated circuit-based computers enabled digital systems to gradually replace analog systems, increasing both efficiency and processing power. Metal-oxide-semiconductor (MOS) large-scale integration (LSI) then enabled semiconductor memory and the microprocessor, leading to another key breakthrough, the miniaturized personal computer (PC), in the 1970s. The cost of computers gradually became so low that personal computers by the 1990s, and then mobile computers (smartphones and tablets) in the 2000s, became ubiquitous.

Devices have been used to aid computation for thousands of years, mostly using one-to-one correspondence with fingers. The earliest counting device was probably a form of tally stick. The Lebombo bone, from the mountains between Eswatini and South Africa, may be the oldest known mathematical artifact: it dates from 35,000 BCE and consists of 29 distinct notches deliberately cut into a baboon's fibula. Later record-keeping aids throughout the Fertile Crescent included calculi (clay spheres, cones, etc.), which represented counts of items, probably livestock or grains, sealed in hollow unbaked clay containers; counting rods are another example. The abacus was used early on for arithmetic tasks: what we now call the Roman abacus was used in Babylonia as early as c. 2700–2300 BC, and many other forms of reckoning boards or tables have since been invented. In a medieval European counting house, a checkered cloth would be placed on a table and markers moved around on it according to certain rules, as an aid to calculating sums of money.

Several analog computers were constructed in ancient and medieval times to perform astronomical calculations. These included the astrolabe and Antikythera mechanism from the Hellenistic world (c. 150–100 BC). In Roman Egypt, Hero of Alexandria (c. 10–70 AD) made mechanical devices including automata and a programmable cart. The steam-powered automatic flute described by the Book of Ingenious Devices (850) by the Persian-Baghdadi Banū Mūsā brothers may have been the first programmable device.

Other early mechanical devices used to perform one or another type of calculation include the planisphere and other mechanical computing devices invented by Al-Biruni (c. AD 1000); the equatorium and universal latitude-independent astrolabe by Al-Zarqali (c. AD 1015); the astronomical analog computers of other medieval Muslim astronomers and engineers; and the astronomical clock tower of Su Song (1094) during the Song dynasty. The castle clock, a hydropowered mechanical astronomical clock invented by Ismail al-Jazari in 1206, has been described as the first programmable analog computer, though this claim is disputed. Ramon Llull invented the Lullian Circle: a notional machine for calculating answers to philosophical questions (in this case, to do with Christianity) via logical combinatorics. This idea was taken up by Leibniz centuries later, making it one of the founding elements of computing and information science.

Scottish mathematician and physicist John Napier discovered that the multiplication and division of numbers could be performed by the addition and subtraction, respectively, of the logarithms of those numbers. While producing the first logarithmic tables, Napier needed to perform many tedious multiplications. It was at this point that he designed his 'Napier's bones', an abacus-like device that greatly simplified calculations that involved multiplication and division.
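Napier's principle can be illustrated in a few lines of Python. This is an illustrative sketch only, not period-accurate: Napier worked from printed logarithm tables and an anti-log lookup, whereas here `math.log` and `math.exp` stand in for those tables.

```python
import math

# Napier's insight: log(a * b) = log(a) + log(b).
# A multiplication therefore reduces to looking up two logarithms,
# adding them, and looking up the anti-logarithm of the sum.
a, b = 37.0, 91.0
product_via_logs = math.exp(math.log(a) + math.log(b))
print(round(product_via_logs))  # 3367, i.e. 37 * 91
```

The same identity with subtraction, log(a / b) = log(a) - log(b), reduces division to a subtraction of logarithms, which is why logarithm tables so dramatically cut the labor of hand computation.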

Since real numbers can be represented as distances or intervals on a line, the slide rule was invented in the 1620s, shortly after Napier's work, to allow multiplication and division to be carried out significantly faster than was previously possible. Edmund Gunter built a calculating device with a single logarithmic scale at the University of Oxford; his device greatly simplified arithmetic calculations, including multiplication and division. William Oughtred greatly improved on this in 1630 with his circular slide rule, and followed it in 1632 with the modern slide rule, essentially a combination of two Gunter rules held together by hand. Slide rules were used by generations of engineers and other mathematically inclined professionals until the advent of the pocket calculator.
