Reversible computing

Reversible computing is any model of computation where every step of the process is time-reversible. This means that, given the output of a computation, it is possible to perfectly reconstruct the input. In systems that progress deterministically from one state to another, a key requirement for reversibility is a one-to-one correspondence between each state and its successor. Reversible computing is considered an unconventional approach to computation and is closely linked to quantum computing, where the principles of quantum mechanics inherently ensure reversibility (as long as quantum states are not measured or "collapsed").
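This one-to-one requirement can be illustrated with a small sketch (illustrative code, not from the source): a reversible gate such as CNOT lets the input be reconstructed from the output, while an ordinary AND gate destroys information.

```python
# A reversible step is a bijection on states, so the input can be
# recovered from the output; an irreversible step merges states.

def cnot(a: int, b: int) -> tuple[int, int]:
    """Controlled-NOT: flips b iff a is 1. A classic reversible gate."""
    return a, a ^ b

def and_gate(a: int, b: int) -> int:
    """Plain AND: two bits in, one bit out -- information is lost."""
    return a & b

# CNOT is its own inverse, so applying it twice recovers the input.
for a in (0, 1):
    for b in (0, 1):
        assert cnot(*cnot(a, b)) == (a, b)

# AND is not invertible: three distinct inputs map to the same output 0,
# so the input cannot be reconstructed from the output alone.
outputs = {(a, b): and_gate(a, b) for a in (0, 1) for b in (0, 1)}
print(outputs)  # (0,0), (0,1), (1,0) all map to 0
```

Because CNOT permutes the state space rather than collapsing it, running it backwards is as well-defined as running it forwards.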

There are two major, closely related types of reversibility that are of particular interest for energy-efficient computation: physical reversibility and logical reversibility.

A process is said to be physically reversible if it results in no increase in physical entropy; it is isentropic. There is a style of circuit design ideally exhibiting this property that is referred to as charge recovery logic, adiabatic circuits, or adiabatic computing (see Adiabatic process). Although in practice no nonstationary physical process can be exactly physically reversible or isentropic, there is no known limit to the closeness with which we can approach perfect reversibility, in systems that are sufficiently well isolated from interactions with unknown external environments, when the laws of physics describing the system's evolution are precisely known.

A motivation for the study of technologies aimed at implementing reversible computing is that they offer what is predicted to be the only potential way to improve the computational energy efficiency (i.e. useful operations performed per unit energy dissipated) of computers beyond the fundamental von Neumann–Landauer limit of kT ln(2) energy dissipated per irreversible bit operation.
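The limit itself is easy to evaluate numerically. A minimal sketch (assuming room temperature of 300 K; the Boltzmann constant is the exact 2019 SI value):

```python
import math

# Landauer limit: minimum energy dissipated per irreversible bit
# operation, E = k_B * T * ln(2).
K_B = 1.380649e-23  # Boltzmann constant, J/K (exact, 2019 SI)
T = 300.0           # assumed room temperature, K

e_landauer = K_B * T * math.log(2)
print(f"Landauer limit at {T} K: {e_landauer:.3e} J per bit")
# ~2.9e-21 J per irreversible bit operation
```

At room temperature the limit is on the order of 10^-21 joules per bit, many orders of magnitude below what practical logic gates dissipate today.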

The Landauer limit was millions of times below the energy consumption of computers in the 2000s and thousands of times below it in the 2010s. Proponents of reversible computing argue that a significant portion of this remaining energy consumption is due to architectural overheads: the energy costs of non-computational parts of the system, such as wires, transistors, and memory, that are required to make a computer work. On this view, these overheads make it difficult for conventional technology to achieve much greater energy efficiency without adopting reversible computing principles.

As was first argued by Rolf Landauer while working at IBM, in order for a computational process to be physically reversible, it must also be logically reversible. Landauer's principle is the observation that the oblivious erasure of n bits of known information must always incur a cost of nk ln(2) in thermodynamic entropy, corresponding to nkT ln(2) of energy dissipated as heat. A discrete, deterministic computational process is said to be logically reversible if the transition function that maps old computational states to new ones is a one-to-one function; i.e. the output logical states uniquely determine the input logical states of the computational operation.
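The one-to-one condition can be checked mechanically by enumerating a gate's truth table. A hedged sketch (illustrative code, not from the source) contrasting the reversible Toffoli gate with a gate that overwrites, and thereby erases, a bit:

```python
from itertools import product

# A gate is logically reversible iff its transition function is
# injective: no two input states map to the same output state.

def is_logically_reversible(gate, n_bits: int) -> bool:
    outputs = [gate(bits) for bits in product((0, 1), repeat=n_bits)]
    return len(set(outputs)) == len(outputs)  # injective => no collisions

def toffoli(bits):
    a, b, c = bits
    return (a, b, c ^ (a & b))  # flips c iff a AND b; reversible

def and_erase(bits):
    a, b, _ = bits
    return (a, b, a & b)  # overwrites the third bit: erases information

print(is_logically_reversible(toffoli, 3))    # True
print(is_logically_reversible(and_erase, 3))  # False
```

The Toffoli gate is a permutation of the eight 3-bit states, so it loses nothing; `and_erase` maps two distinct inputs (differing in the third bit) to the same output, which is exactly the erasure that Landauer's principle charges for.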

For computational processes that are nondeterministic (in the sense of being probabilistic or random), the relation between old and new states is not a single-valued function, and the requirement needed to obtain physical reversibility becomes a slightly weaker condition, namely that the size of a given ensemble of possible initial computational states does not decrease, on average, as the computation proceeds forwards.
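The weaker ensemble condition can be sketched as follows (a toy illustration under assumed dynamics, not from the source): a step that fans one state out to several possible successors never shrinks the ensemble, whereas a step that merges states does.

```python
# Toy model: track the set (ensemble) of possible computational states.
# Fanning out is compatible with physical reversibility; merging is not.

def fan_out(state: str) -> set[str]:
    # One state can lead to several successors: the ensemble grows.
    return {state + "0", state + "1"}

def merge(state: str) -> set[str]:
    # Every state leads to the same successor: the ensemble shrinks.
    return {"0"}

ensemble = {"0", "1"}
after_fan = set().union(*(fan_out(s) for s in ensemble))
after_merge = set().union(*(merge(s) for s in ensemble))
print(len(ensemble), len(after_fan), len(after_merge))  # 2 4 1
```

In the merging case, information about which state the system started in is lost, which is what forces an entropy cost on the physical implementation.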

Landauer's principle (and indeed, the second law of thermodynamics) can also be understood to be a direct logical consequence of the underlying reversibility of physics, as is reflected in the general Hamiltonian formulation of mechanics, and in the unitary time-evolution operator of quantum mechanics more specifically.
