Law of excluded middle
from Wikipedia

In logic, the law of excluded middle or the principle of excluded middle states that for every proposition, either this proposition or its negation is true.[1][2] It is one of the three laws of thought, along with the law of noncontradiction and the law of identity; however, no system of logic is built on just these laws, and none of these laws provides inference rules, such as modus ponens or De Morgan's laws. The law is also known as the law/principle of the excluded third, in Latin principium tertii exclusi. Another Latin designation for this law is tertium non datur or "no third [possibility] is given". In classical logic, the law is a tautology.

In contemporary logic the principle is distinguished from the semantical principle of bivalence, which states that every proposition is either true or false. The principle of bivalence always implies the law of excluded middle, while the converse is not always true. A commonly cited counterexample uses statements unprovable now, but provable in the future to show that the law of excluded middle may apply when the principle of bivalence fails.[3]

History

Aristotle

The earliest known formulation is in Aristotle's discussion of the principle of non-contradiction, first proposed in On Interpretation,[4] where he says that of two contradictory propositions (i.e. where one proposition is the negation of the other) one must be true, and the other false.[5] He also states it as a principle in the Metaphysics book 4, saying that it is necessary in every case to affirm or deny,[6] and that it is impossible that there should be anything between the two parts of a contradiction.[7]

Aristotle wrote that ambiguity can arise from the use of ambiguous names, but cannot exist in the facts themselves:

It is impossible, then, that "being a man" should mean precisely "not being a man", if "man" not only signifies something about one subject but also has one significance. … And it will not be possible to be and not to be the same thing, except in virtue of an ambiguity, just as if one whom we call "man", and others were to call "not-man"; but the point in question is not this, whether the same thing can at the same time be and not be a man in name, but whether it can be in fact. (Metaphysics 4.4, W. D. Ross (trans.), GBWW 8, 525–526).

Aristotle's assertion that "it will not be possible to be and not to be the same thing" would be written in propositional logic as ~(P ∧ ~P). In modern so called classical logic, this statement is equivalent to the law of excluded middle (P ∨ ~P), through distribution of the negation in Aristotle's assertion. The former claims that no statement is both true and false, while the latter requires that any statement is either true or false. (Refer to the List of logic symbols for the meaning of symbols used in this article).
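This equivalence can be verified mechanically with a two-row truth table. The following Python sketch (an illustrative check, not part of the original article) evaluates both formulas for each truth value of P and confirms they are both classical tautologies:

```python
# ~(P ∧ ~P): Aristotle's assertion (law of non-contradiction)
def non_contradiction(p):
    return not (p and not p)

# P ∨ ~P: the law of excluded middle
def excluded_middle(p):
    return p or (not p)

# In two-valued (classical) semantics both formulas are true on
# every row of the truth table, hence classically equivalent.
for p in (True, False):
    assert non_contradiction(p) is True
    assert excluded_middle(p) is True
print("~(P ∧ ~P) and P ∨ ~P are both classical tautologies")
```

Note that this check relies on two-valued semantics; the intuitionist objection discussed later is precisely that such a truth-table argument is not available in general.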

But Aristotle also writes, "since it is impossible that contradictories should be at the same time true of the same thing, obviously contraries also cannot belong at the same time to the same thing" (Book IV, CH 6, p. 531). He then proposes that "there cannot be an intermediate between contradictories, but of one subject we must either affirm or deny any one predicate" (Book IV, CH 7, p. 531). In the context of Aristotle's traditional logic, this is a remarkably precise statement of the law of excluded middle, P ∨ ~P.

Yet in On Interpretation Aristotle seems to deny the law of excluded middle in the case of future contingents, in his discussion on the sea battle.

Leibniz

The law's usual form is: "Every judgment is either true or false" [footnote 9] (from Kolmogorov in van Heijenoort, p. 421). Footnote 9: "This is Leibniz's very simple formulation (see Nouveaux Essais, IV, 2)" (ibid, p. 421).

Bertrand Russell and Principia Mathematica

The principle was stated as a theorem of propositional logic by Russell and Whitehead in Principia Mathematica as:

✸2.1 ⊢ ~p ∨ p.[8]

So just what are "truth" and "falsehood"? At its opening, PM quickly announces some definitions:

Truth-values. The "truth-value" of a proposition is truth if it is true and falsehood if it is false* [*This phrase is due to Frege] … the truth-value of "p ∨ q" is truth if the truth-value of either p or q is truth, and is falsehood otherwise … that of "~ p" is the opposite of that of p …" (pp. 7–8)

This is not much help. But later, in a much deeper discussion ("Definition and systematic ambiguity of Truth and Falsehood" Chapter II part III, p. 41 ff), PM defines truth and falsehood in terms of a relationship between the "a" and the "b" and the "percipient". For example "This 'a' is 'b'" (e.g. "This 'object a' is 'red'") really means "'object a' is a sense-datum" and "'red' is a sense-datum", and they "stand in relation" to one another and in relation to "I". Thus what we really mean is: "I perceive that 'This object a is red'" and this is an undeniable-by-3rd-party "truth".

PM further defines a distinction between a "sense-datum" and a "sensation":

That is, when we judge (say) "this is red", what occurs is a relation of three terms, the mind, and "this", and "red". On the other hand, when we perceive "the redness of this", there is a relation of two terms, namely the mind and the complex object "the redness of this" (pp. 43–44).

Russell reiterated his distinction between "sense-datum" and "sensation" in his book The Problems of Philosophy (1912), published at the same time as PM (1910–1913):

Let us give the name of "sense-data" to the things that are immediately known in sensation: such things as colours, sounds, smells, hardnesses, roughnesses, and so on. We shall give the name "sensation" to the experience of being immediately aware of these things … The colour itself is a sense-datum, not a sensation. (p. 12)

Russell further described his reasoning behind his definitions of "truth" and "falsehood" in the same book (Chapter XII, Truth and Falsehood).

Consequences of the law of excluded middle in Principia Mathematica

From the law of excluded middle, formula ✸2.1 in Principia Mathematica, Whitehead and Russell derive some of the most powerful tools in the logician's argumentation toolkit. (In Principia Mathematica, formulas and propositions are identified by a leading asterisk and two numbers, such as "✸2.1".)

✸2.1 ~p ∨ p "This is the Law of excluded middle" (PM, p. 101).

The proof of ✸2.1 is roughly as follows: "primitive idea" 1.08 defines p → q = ~p ∨ q. Substituting p for q in this definition yields p → p = ~p ∨ p. Since p → p is true (this is Theorem 2.08, which is proved separately), then ~p ∨ p must be true.

✸2.11 p ∨ ~p (Permutation of the assertions is allowed by axiom 1.4)
✸2.12 p → ~(~p) (Principle of double negation, part 1: if "this rose is red" is true then it's not true that "'this rose is not-red' is true".)
✸2.13 p ∨ ~{~(~p)} (Lemma together with 2.12 used to derive 2.14)
✸2.14 ~(~p) → p (Principle of double negation, part 2)
✸2.15 (~p → q) → (~q → p) (One of the four "Principles of transposition". Similar to 1.03, 1.16 and 1.17. A very long demonstration was required here.)
✸2.16 (p → q) → (~q → ~p) (If it's true that "If this rose is red then this pig flies" then it's true that "If this pig doesn't fly then this rose isn't red.")
✸2.17 (~p → ~q) → (q → p) (Another of the "Principles of transposition".)
✸2.18 (~p → p) → p (Called "The complement of reductio ad absurdum. It states that a proposition which follows from the hypothesis of its own falsehood is true" (PM, pp. 103–104).)
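All of the theorems in the list above are two-valued tautologies, which can be confirmed by brute-force truth-table evaluation. The Python sketch below (an illustrative check, not PM's own derivations) encodes material implication as in ✸1.01 and tests every truth assignment:

```python
from itertools import product

# Material implication per PM *1.01: p -> q is defined as ~p v q.
imp = lambda p, q: (not p) or q

theorems = {
    "*2.1  ~p v p":               lambda p, q: (not p) or p,
    "*2.11 p v ~p":               lambda p, q: p or (not p),
    "*2.12 p -> ~~p":             lambda p, q: imp(p, not (not p)),
    "*2.14 ~~p -> p":             lambda p, q: imp(not (not p), p),
    "*2.15 (~p->q) -> (~q->p)":   lambda p, q: imp(imp(not p, q), imp(not q, p)),
    "*2.16 (p->q) -> (~q->~p)":   lambda p, q: imp(imp(p, q), imp(not q, not p)),
    "*2.17 (~p->~q) -> (q->p)":   lambda p, q: imp(imp(not p, not q), imp(q, p)),
    "*2.18 (~p->p) -> p":         lambda p, q: imp(imp(not p, p), p),
}

# Every theorem must be true under all four assignments to p, q.
for name, f in theorems.items():
    assert all(f(p, q) for p, q in product((True, False), repeat=2)), name
print("all listed PM theorems are two-valued tautologies")
```

This kind of semantic check is, of course, exactly what intuitionists reject as a justification: truth-table validity presupposes bivalence.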

Most of these theorems—in particular ✸2.1, ✸2.11, and ✸2.14—are rejected by intuitionism. These tools are recast into another form that Kolmogorov cites as "Hilbert's four axioms of implication" and "Hilbert's two axioms of negation" (Kolmogorov in van Heijenoort, p. 335).

Propositions ✸2.12 and ✸2.14, "double negation": The intuitionist writings of L. E. J. Brouwer refer to what he calls "the principle of the reciprocity of the multiple species, that is, the principle that for every system the correctness of a property follows from the impossibility of the impossibility of this property" (Brouwer, ibid, p. 335).

This principle is commonly called "the principle of double negation" (PM, pp. 101–102). From the law of excluded middle (✸2.1 and ✸2.11), PM derives principle ✸2.12 immediately. We substitute ~p for p in 2.11 to yield ~p ∨ ~(~p), and by the definition of implication (i.e. 1.01 p → q = ~p ∨ q) then ~p ∨ ~(~p) = p → ~(~p). QED (The derivation of 2.14 is a bit more involved.)

Reichenbach

It is correct, at least for bivalent logic (as can be seen with a Karnaugh map), that this law removes "the middle" of the inclusive-or used in Reichenbach's law (29). This is the point of Reichenbach's demonstration that some believe the exclusive-or should take the place of the inclusive-or.

About this issue (in admittedly very technical terms) Reichenbach observes:

The tertium non datur
29. (x)[f(x) ∨ ~f(x)]
is not exhaustive in its major terms and is therefore an inflated formula. This fact may perhaps explain why some people consider it unreasonable to write (29) with the inclusive-'or', and want to have it written with the sign of the exclusive-'or'
30. (x)[f(x) ⊕ ~f(x)], where the symbol "⊕" signifies exclusive-or[9]
in which form it would be fully exhaustive and therefore nomological in the narrower sense. (Reichenbach, p. 376)

In line (30) the "(x)" means "for all" or "for every", a form used by Russell and Reichenbach; today the symbolism is usually written ∀x. Thus an example of the expression would look like this:

  • (pig): (Flies(pig) ⊕ ~Flies(pig))
  • (For all instances of "pig" seen and unseen): ("Pig does fly" or "Pig does not fly" but not both simultaneously)
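Reichenbach's point can be illustrated mechanically (an illustrative Python sketch, not from Reichenbach): in two-valued logic both the inclusive and the exclusive form hold for every truth value, but the exclusive form additionally rules out the "both" case that bivalence already excludes.

```python
# For every truth value of f(x), check both forms of line (29)/(30):
# inclusive:  f(x) v ~f(x)
# exclusive:  f(x) (+) ~f(x), i.e. exactly one disjunct is true
for fx in (True, False):
    inclusive = fx or (not fx)
    exclusive = fx != (not fx)       # XOR: true iff exactly one holds
    assert inclusive and exclusive
print("both the inclusive-or and exclusive-or forms are tautologies")
```

In bivalent semantics a proposition and its negation can never both be true, which is why the two forms coincide there; the exclusive form merely makes that exhaustiveness explicit.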

Formalists versus Intuitionists

From the late 1800s through the 1930s, a bitter, persistent debate raged between Hilbert and his followers on one side and Hermann Weyl and L. E. J. Brouwer on the other. Brouwer's philosophy, called intuitionism, started in earnest with Leopold Kronecker in the late 1800s.

Hilbert intensely disliked Kronecker's ideas:

Kronecker insisted that there could be no existence without construction. For him, as for Paul Gordan [another elderly mathematician], Hilbert's proof of the finiteness of the basis of the invariant system was simply not mathematics. Hilbert, on the other hand, throughout his life was to insist that if one can prove that the attributes assigned to a concept will never lead to a contradiction, the mathematical existence of the concept is thereby established (Reid p. 34)

It was his [Kronecker's] contention that nothing could be said to have mathematical existence unless it could actually be constructed with a finite number of positive integers (Reid p. 26)

The debate had a profound effect on Hilbert. Reid indicates that Hilbert's second problem (one of Hilbert's problems from the Second International Conference in Paris in 1900) evolved from this debate (italics in the original):

In his second problem, [Hilbert] had asked for a mathematical proof of the consistency of the axioms of the arithmetic of real numbers.
To show the significance of this problem, he added the following observation:
"If contradictory attributes be assigned to a concept, I say that mathematically the concept does not exist" (Reid p. 71)

Thus, Hilbert was saying: "If p and ~p are both shown to be true, then p does not exist", and was thereby invoking the law of excluded middle cast into the form of the law of contradiction.

And finally constructivists … restricted mathematics to the study of concrete operations on finite or potentially (but not actually) infinite structures; completed infinite totalities … were rejected, as were indirect proof based on the Law of Excluded Middle. Most radical among the constructivists were the intuitionists, led by the erstwhile topologist L. E. J. Brouwer (Dawson p. 49)

The rancorous debate continued through the early 1900s into the 1920s; in 1927 Brouwer complained about "polemicizing against it [intuitionism] in sneering tones" (Brouwer in van Heijenoort, p. 492). But the debate was fertile: it resulted in Principia Mathematica (1910–1913), and that work gave a precise definition to the law of excluded middle, and all this provided an intellectual setting and the tools necessary for the mathematicians of the early 20th century:

Out of the rancor, and spawned in part by it, there arose several important logical developments; Zermelo's axiomatization of set theory (1908a), that was followed two years later by the first volume of Principia Mathematica, in which Russell and Whitehead showed how, via the theory of types, much of arithmetic could be developed by logicist means (Dawson p. 49)

Brouwer reduced the debate to the use of proofs designed from "negative" or "non-existence" versus "constructive" proof:

According to Brouwer, a statement that an object exists having a given property means that, and is only proved, when a method is known which in principle at least will enable such an object to be found or constructed …
Hilbert naturally disagreed.
"pure existence proofs have been the most important landmarks in the historical development of our science," he maintained. (Reid p. 155)
Brouwer refused to accept the logical principle of the excluded middle. His argument was the following:
"Suppose that A is the statement 'There exists a member of the set S having the property P.' If the set is finite, it is possible—in principle—to examine each member of S and determine whether there is a member of S with the property P or that every member of S lacks the property P." For finite sets, therefore, Brouwer accepted the principle of the excluded middle as valid. He refused to accept it for infinite sets because if the set S is infinite, we cannot—even in principle—examine each member of the set. If, during the course of our examination, we find a member of the set with the property P, the first alternative is substantiated; but if we never find such a member, the second alternative is still not substantiated.
Since mathematical theorems are often proved by establishing that the negation would involve us in a contradiction, this third possibility which Brouwer suggested would throw into question many of the mathematical statements currently accepted.
"Taking the Principle of the Excluded Middle from the mathematician," Hilbert said, "is the same as … prohibiting the boxer the use of his fists."
"The possible loss did not seem to bother Weyl … Brouwer's program was the coming thing, he insisted to his friends in Zürich." (Reid, p. 149)
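Brouwer's finite/infinite distinction can be made concrete with a small sketch (illustrative code, not from Reid): for a finite set, exhaustive search decides between the two alternatives; for an infinite set the same search might never terminate, so neither alternative need ever be substantiated.

```python
def has_property(S, P):
    """Decide 'some member of S has property P' by exhaustive search.

    Terminates for any finite iterable S, so for finite sets the
    disjunction 'some member has P, or every member lacks P' is decidable.
    """
    for x in S:
        if P(x):
            return True      # first alternative substantiated
    return False             # whole set examined: second alternative

S = range(1, 100)                                   # a finite set
assert has_property(S, lambda n: n % 97 == 0)       # 97 is a member
assert not has_property(S, lambda n: n > 1000)      # no such member

# Over an infinite set the loop may run forever without ever
# substantiating either alternative -- this is Brouwer's objection.
```

The design point is that termination of the search is exactly what licenses excluded middle here; nothing in the code (or in Brouwer's argument) supplies a terminating procedure for the infinite case.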

In his lecture in 1941 at Yale and the subsequent paper, Gödel proposed a solution: "that the negation of a universal proposition was to be understood as asserting the existence … of a counterexample" (Dawson, p. 157)

Gödel's approach to the law of excluded middle was to assert that objections against "the use of 'impredicative definitions'" had "carried more weight" than "the law of excluded middle and related theorems of the propositional calculus" (Dawson p. 156). He proposed his "system Σ … and he concluded by mentioning several applications of his interpretation. Among them were a proof of the consistency with intuitionistic logic of the principle ~ (∀A: (A ∨ ~A)) (despite the inconsistency of the assumption ∃ A: ~ (A ∨ ~A))" (Dawson, p. 157)

The debate eventually subsided: mathematicians, logicians and engineers continue to use the law of excluded middle (and double negation) in their daily work.

Intuitionist definitions of the law (principle) of excluded middle

The following highlights the deep mathematical and philosophic problem behind what it means to "know", and also helps elucidate what the "law" implies (i.e. what the law really means). The intuitionists' difficulties with the law emerge here: they do not want to accept as true implications drawn from that which is unverifiable (untestable, unknowable) or from the impossible or the false. (All quotes are from van Heijenoort, italics added).

Brouwer offers his definition of "principle of excluded middle"; we see here also the issue of "testability":

On the basis of the testability just mentioned, there hold, for properties conceived within a specific finite main system, the "principle of excluded middle", that is, the principle that for every system every property is either correct [richtig] or impossible, and in particular the principle of the reciprocity of the complementary species, that is, the principle that for every system the correctness of a property follows from the impossibility of the impossibility of this property. (335)[citation needed]

Kolmogorov's definition cites Hilbert's two axioms of negation

  1. A → (~A → B)
  2. (A → B) → { (~A → B) → B }
Hilbert's first axiom of negation, "anything follows from the false", made its appearance only with the rise of symbolic logic, as did the first axiom of implication … while … the axiom under consideration [axiom 5] asserts something about the consequences of something impossible: we have to accept B if the true judgment A is regarded as false …
Hilbert's second axiom of negation expresses the principle of excluded middle. The principle is expressed here in the form in which it is used for derivations: if B follows from A as well as from ~A, then B is true. Its usual form, "every judgment is either true or false" is equivalent to that given above".
From the first interpretation of negation, that is, the interdiction from regarding the judgment as true, it is impossible to obtain the certitude that the principle of excluded middle is true … Brouwer showed that in the case of such transfinite judgments the principle of excluded middle cannot be considered obvious
footnote 9: "This is Leibniz's very simple formulation (see Nouveaux Essais, IV,2). The formulation "A is either B or not-B" has nothing to do with the logic of judgments.
footnote 10: "Symbolically the second form is expressed thus
A ∨ ~A

where ∨ means "or". The equivalence of the two forms is easily proved" (p. 421)

Examples

For example, if P is the proposition:

Socrates is mortal.

then the law of excluded middle holds that the logical disjunction:

Either Socrates is mortal, or it is not the case that Socrates is mortal.

is true by virtue of its form alone. That is, the "middle" position, that Socrates is neither mortal nor not-mortal, is excluded by logic, and therefore either the first possibility (Socrates is mortal) or its negation (it is not the case that Socrates is mortal) must be true.

An example of an argument that depends on the law of excluded middle follows.[10] We seek to prove that

there exist two irrational numbers a and b such that a^b is rational.

It is known that √2 is irrational (see proof). Consider the number

√2^√2.

Clearly (excluded middle) this number is either rational or irrational. If it is rational, the proof is complete, and

a = √2 and b = √2.

But if √2^√2 is irrational, then let

a = √2^√2 and b = √2.

Then

a^b = (√2^√2)^√2 = √2^(√2 · √2) = √2^2 = 2,

and 2 is certainly rational. This concludes the proof.

In the above argument, the assertion "this number is either rational or irrational" invokes the law of excluded middle. An intuitionist, for example, would not accept this argument without further support for that statement. This might come in the form of a proof that the number in question is in fact irrational (or rational, as the case may be); or a finite algorithm that could determine whether the number is rational.
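The arithmetic in the second case of the argument can at least be checked numerically (an illustrative sketch using floating-point arithmetic, which of course establishes nothing about rationality or irrationality):

```python
import math

sqrt2 = math.sqrt(2)
a = sqrt2 ** sqrt2      # the candidate number from the second case
b = sqrt2

# (sqrt(2)^sqrt(2))^sqrt(2) = sqrt(2)^(sqrt(2)*sqrt(2)) = sqrt(2)^2 = 2
assert abs(a ** b - 2) < 1e-9
print("(sqrt(2)^sqrt(2))^sqrt(2) evaluates to", a ** b)
```

The check confirms only the exponent algebra; deciding which case of the disjunction actually obtains is precisely what the non-constructive argument avoids.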

Non-constructive proofs over the infinite

The above proof is an example of a non-constructive proof disallowed by intuitionists:

The proof is non-constructive because it doesn't give specific numbers a and b that satisfy the theorem but only two separate possibilities, one of which must work. (Actually √2^√2 is irrational but there is no known easy proof of that fact.) (Davis 2000:220)

(Constructive proofs of the specific example above are not hard to produce; for example a = √2 and b = log₂9 are both easily shown to be irrational, and a^b = √2^(log₂9) = 3; a proof allowed by intuitionists).

By non-constructive Davis means that "a proof that there actually are mathematic entities satisfying certain conditions would not have to provide a method to exhibit explicitly the entities in question." (p. 85). Such proofs presume the existence of a totality that is complete, a notion disallowed by intuitionists when extended to the infinite—for them the infinite can never be completed:

In classical mathematics there occur non-constructive or indirect existence proofs, which intuitionists do not accept. For example, to prove there exists an n such that P(n), the classical mathematician may deduce a contradiction from the assumption for all n, not P(n). Under both the classical and the intuitionistic logic, by reductio ad absurdum this gives not for all n, not P(n). The classical logic allows this result to be transformed into there exists an n such that P(n), but not in general the intuitionistic … the classical meaning, that somewhere in the completed infinite totality of the natural numbers there occurs an n such that P(n), is not available to him, since he does not conceive the natural numbers as a completed totality.[11] (Kleene 1952:49–50)

David Hilbert and Luitzen E. J. Brouwer both give examples of the law of excluded middle extended to the infinite. Hilbert's example: "the assertion that either there are only finitely many prime numbers or there are infinitely many" (quoted in Davis 2000:97); and Brouwer's: "Every mathematical species is either finite or infinite." (Brouwer 1923 in van Heijenoort 1967:336). In general, intuitionists allow the use of the law of excluded middle when it is confined to discourse over finite collections (sets), but not when it is used in discourse over infinite sets (e.g. the natural numbers). Thus intuitionists absolutely disallow the blanket assertion: "For all propositions P concerning infinite sets D: P or ~P" (Kleene 1952:48).[12]

Putative counterexamples to the law of excluded middle include the liar paradox or Quine's paradox. Certain resolutions of these paradoxes, particularly Graham Priest's dialetheism as formalised in LP, have the law of excluded middle as a theorem, but resolve out the Liar as both true and false. In this way, the law of excluded middle is true, but because truth itself, and therefore disjunction, is not exclusive, it says next to nothing if one of the disjuncts is paradoxical, or both true and false.

Criticisms

The Catuṣkoṭi (tetralemma) is an ancient alternative to the law of excluded middle, which examines all four possible assignments of truth values to a proposition and its negation. It has been important in Indian logic and Buddhist logic as well as the ancient Greek philosophical school known as Pyrrhonism.

Many modern logic systems replace the law of excluded middle with the concept of negation as failure. Instead of a proposition's being either true or false, a proposition is either true or not able to be proved true.[13] These two dichotomies only differ in logical systems that are not complete. The principle of negation as failure is used as a foundation for autoepistemic logic, and is widely used in logic programming. In these systems, the programmer is free to assert the law of excluded middle as a true fact, but it is not built into these systems a priori.
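A minimal sketch of negation as failure (illustrative names only, not a real logic-programming engine): a proposition is treated as false exactly when it cannot be proved from the known facts.

```python
# A toy knowledge base of ground facts (hypothetical example data).
facts = {"bird(tweety)", "bird(polly)"}

def provable(p):
    """A proposition is 'provable' here iff it is a known fact."""
    return p in facts

def naf_not(p):
    """Negation as failure: 'not p' holds iff p cannot be proved."""
    return not provable(p)

assert provable("bird(tweety)")
assert naf_not("penguin(tweety)")    # unproved, hence assumed false

# In an incomplete system "not provable" is weaker than "false":
# penguin(tweety) might simply be missing from the knowledge base,
# which is exactly where this dichotomy diverges from excluded middle.
```

This mirrors the closed-world assumption used in logic programming (e.g. Prolog's failure-driven negation), though real engines resolve goals against rules, not just a fact set.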

Mathematicians such as L. E. J. Brouwer and Arend Heyting have also contested the usefulness of the law of excluded middle in the context of modern mathematics.[14]

In mathematical logic

In modern mathematical logic, the excluded middle has been argued to result in possible self-contradiction. It is possible in logic to make well-constructed propositions that can be neither true nor false; a common example is the "liar's paradox",[15] the statement "this statement is false", which is argued to be itself neither true nor false. Arthur Prior has argued that the paradox is not an example of a statement that cannot be true or false: the law of excluded middle still holds here, as the negation of this statement, "This statement is not false", can be assigned true. In set theory, such a self-referential paradox can be constructed by examining the set "the set of all sets that do not contain themselves". This set is unambiguously defined, but leads to Russell's paradox:[16][17] does the set contain, as one of its elements, itself? However, in the modern Zermelo–Fraenkel set theory, this type of contradiction is no longer admitted. Furthermore, paradoxes of self-reference can be constructed without even invoking negation at all, as in Curry's paradox.[citation needed]

Analogous laws

Some systems of logic have different but analogous laws. For some finite n-valued logics, there is an analogous law called the law of excluded n+1th. If negation is cyclic and "∨" is a "max operator", then the law can be expressed in the object language by (P ∨ ~P ∨ ~~P ∨ ... ∨ ~...~P), where "~...~" represents n−1 negation signs and "∨ ... ∨" n−1 disjunction signs. It is easy to check that the sentence must receive at least one of the n truth values (and not a value that is not one of the n).
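The check mentioned at the end of the paragraph can be carried out mechanically. The sketch below assumes the semantics just described (cyclic negation, "∨" as a max operator, values 0 through n−1 with n−1 designated as true), with n = 3 for concreteness:

```python
n = 3                                   # a 3-valued logic
values = range(n)                       # truth values 0 .. n-1
neg = lambda v: (v + 1) % n             # cyclic negation

for v in values:
    # Build P v ~P v ~~P v ... with up to n-1 negation signs:
    # n disjuncts in total, one per power of negation.
    disjuncts = []
    w = v
    for _ in range(n):
        disjuncts.append(w)
        w = neg(w)
    # Since negation cycles through all n values, the max-disjunction
    # always receives the designated (top) value n-1.
    assert max(disjuncts) == n - 1

print("the n-fold disjunction always takes the designated value")
```

The key observation is that cyclic negation is a permutation of order n, so the n disjuncts enumerate every truth value, and a max-disjunction over all values is necessarily the top one.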

Other systems reject the law entirely.[specify]

Law of the weak excluded middle

A particularly well-studied intermediate logic is given by De Morgan logic, which adds the axiom ¬P ∨ ¬¬P to intuitionistic logic; this axiom is sometimes called the law of the weak excluded middle.

This is equivalent to a few other statements:

  • Satisfying all of De Morgan's laws, including ¬(P ∧ Q) → (¬P ∨ ¬Q)
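One concrete way to separate the weak law from full excluded middle (an illustrative sketch, not from the sources above) is Gödel's three-valued semantics, a standard model of an intermediate logic: truth values 0, 1/2, 1; ¬P is 1 if P is 0 and 0 otherwise; disjunction is max.

```python
def g_neg(v):
    """Negation in Goedel 3-valued semantics: 1 iff v is 0, else 0."""
    return 1.0 if v == 0.0 else 0.0

for v in (0.0, 0.5, 1.0):
    wlem = max(g_neg(v), g_neg(g_neg(v)))   # ~P v ~~P
    lem = max(v, g_neg(v))                  # P v ~P
    assert wlem == 1.0                      # weak excluded middle: valid
    if v == 0.5:
        assert lem == 0.5                   # full excluded middle: fails

print("~P v ~~P is valid here, while P v ~P is not")
```

At the intermediate value 1/2 both P and ¬P fall short of 1, but ¬P and ¬¬P between them always cover it, which is the semantic content of the weak law.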

See also

  • Argument to moderation – Opposite logical fallacy to excluded middle
  • Brouwer–Hilbert controversy – Foundational controversy in twentieth-century mathematics: an account on the formalist-intuitionist divide around the Law of the excluded middle
  • Consequentia mirabilis – Pattern of reasoning in propositional logic
  • Constructive set theory
  • Diaconescu's theorem – Theorem in mathematical logic
  • Dichotomy – Splitting of a whole into exactly two non-overlapping parts; dyadic relations and processes
  • Homogeneity (linguistics) – Semantic property of plurals: cases where LEM appears to fail in natural language
  • Law of excluded fourth – System including an indeterminate value
  • Many-valued logic – Propositional calculus in which there are more than two truth values, such as ternary logic (a system including an indeterminate value) and fuzzy logic (a system for reasoning about vagueness), in which the law of excluded middle is untrue
  • Laws of thought – Axioms of rational discourse
  • Limited principle of omniscience – Mathematical concept
  • Logical graph – Type of diagrammatic notation for propositional logic: a graphical syntax for propositional logic
  • Logical determinism – The application of the excluded middle to modal propositions
  • Mathematical constructivism
  • Non-affirming negation in the Prasangika school of Tibetan Buddhism – Another system in which the law of excluded middle is untrue
  • Proof by contradiction
  • Peirce's law – Axiom used in logic and philosophy: another way of turning intuition classical

from Grokipedia
The law of the excluded middle (LEM), closely related to but distinct from the semantic principle of bivalence, is a foundational principle of classical logic asserting that for any proposition P, either P or its negation ¬P must be true, excluding any intermediate truth value or third option. This principle, formally expressed as P ∨ ¬P, underpins the binary truth valuation of classical propositional and predicate logics, where every well-formed statement is definitively true or false. It was explicitly formulated by Aristotle in his Metaphysics (Book IV, Chapter 7), where he argues that "there cannot be an intermediate between contradictories, but of one subject we must either affirm or deny any one predicate," emphasizing its role in avoiding ambiguity in predication and reasoning. In classical axiomatic systems the LEM is derivable from the basic axioms and serves as a cornerstone for indirect proof methods, including proof by contradiction, by enabling the inference that if ¬P leads to contradiction, then P must hold. It aligns with the semantic framework of two-valued logic, where truth tables assign T (true) or F (false) to all propositions without gap or overlap. However, the principle's universality has been challenged in non-classical logics; for instance, in intuitionistic logic, pioneered by L. E. J. Brouwer, the LEM is not generally accepted because that school demands constructive evidence for at least one disjunct, rejecting the law as an axiom in favor of propositions verifiable through intuition or proof. This rejection impacts constructive mathematics, where intuitionistic approaches support constructive proofs without relying on the LEM's assumption of existential completeness. The LEM's implications extend beyond formal logic to philosophy, influencing debates on determinism and future contingents, such as Aristotle's own sea-battle example, where he questioned its application to undetermined future events without endorsing a full rejection. In contemporary contexts it remains central to classical logic and mathematics, while alternatives such as paraconsistent logics explore systems tolerant of contradictions.

Definition and Basics

Formal Statement in Classical Logic

In classical logic, the law of excluded middle states that for any proposition P, either P is true or its negation ¬P is true, formally expressed as the disjunction P ∨ ¬P. This principle is a tautology in classical propositional logic, holding true under every possible assignment of truth values to the atomic propositions involved. In classical predicate logic, it applies similarly to all well-formed sentences, ensuring that no sentence lacks a definite truth value. The law is grounded in the principle of bivalence, which asserts that every proposition possesses exactly one truth value, either true or false, with no allowance for intermediate, undefined, or additional values. This bivalent semantics directly validates the excluded middle as a logical necessity, as the disjunction exhausts all possible cases without overlap or gap. Distinct from other connectives, the law of excluded middle specifically combines disjunction with negation to affirm exhaustive alternatives, whereas material implication, defined as P → Q ≡ ¬P ∨ Q, governs conditional relationships between distinct propositions. As a core axiom or derivable theorem, it underpins the deductive structure of classical logical systems, enabling proofs that rely on case analysis between a proposition and its denial.

Symbolic and Semantic Interpretations

In classical propositional logic, the law of excluded middle is symbolically expressed as P ∨ ¬P, where P is any proposition, ∨ denotes disjunction, and ¬ denotes negation. This formula is a tautology, meaning it is true under every possible truth assignment to its atomic components. To demonstrate this, consider the truth table for P ∨ ¬P, which evaluates the formula across all possible truth values of P in bivalent logic:

  P | ¬P | P ∨ ¬P
  T | F  |   T
  F | T  |   T

The final column is always true, confirming the tautological status regardless of whether P is true or false. Semantically, in bivalent semantics where propositions take only true or false values, every interpretation assigns truth to P ∨ ¬P because at least one disjunct must hold: if P is true, the first disjunct is true; if P is false, the second is true. This ensures the law's validity across all models in classical logic. The law extends to predicate logic, where for any formula φ(x) with free variable x and any domain, the universal quantification ∀x (φ(x) ∨ ¬φ(x)) holds true in every classical model, reflecting bivalence at the predicate level.

Historical Development

Ancient and Medieval Roots

The law of excluded middle finds its earliest philosophical articulation in ancient Greek thought, particularly in Aristotle's work in the fourth century BCE. In his Metaphysics (Book Gamma, or IV), Aristotle formulates the closely related principle of non-contradiction, stating that "it is impossible for the same thing to belong and not to belong at the same time to the same thing and in the same respect" (1005b19–20). This principle implies the excluded middle by rejecting any intermediate state between a predicate's affirmation and denial for a given subject, as Aristotle argues that "there cannot be an intermediate between contradictories, but of one subject we must either affirm or deny any one predicate" (1011b23–24). He defends this as an indemonstrable principle essential to rational discourse, positing that denying it leads to absurdities in which all assertions become indistinguishable. Aristotle's syllogistic logic, developed in the Prior Analytics, presupposes the excluded middle within its deductive framework. In this system, demonstrations rely on categorical propositions in which terms are either affirmed or denied without middle ground, ensuring that syllogisms proceed from definite predications to valid conclusions (e.g., Prior Analytics I.1–7). The structure of the syllogism, with its major, minor, and middle terms, embodies the bivalent commitment that contradictory opposites exhaust the possibilities for any attribute, integrating the principle into the mechanics of inference without explicit symbolic notation. During the medieval period, scholastic philosophers refined and theologized Aristotle's ideas, viewing the excluded middle, often termed tertium non datur ("a third is not given"), as an indemonstrable first principle of reason. Thomas Aquinas, in the thirteenth century, incorporated it into his synthesis of Aristotelian logic and Christian theology, treating it alongside non-contradiction as a self-evident axiom foundational to both speculative and practical knowledge. In his Summa Theologiae (I-II, q. 94, a.
2), Aquinas describes the first indemonstrable principles, such as non-contradiction, as immediate and undeniable, derived from the intellect's grasp of being and non-being, and essential for theological demonstrations that affirm God's existence without contradiction. This integration elevated the principle beyond pure logic, embedding it in proofs for divine simplicity and the incompatibility of divine attributes with their negations, as seen in Aquinas's commentaries on Aristotle's Posterior Analytics.

Modern Classical Formulations

In the 17th century, Gottfried Wilhelm Leibniz advanced logical principles within his philosophical framework, particularly in New Essays Concerning Human Understanding (written around 1704, published posthumously in 1765), where he defended innate principles including identity and non-contradiction as underlying definite knowledge. Responding to John Locke's skepticism about innate ideas, Leibniz argued that these principles, along with the principle of sufficient reason, operate unconsciously in the mind, enabling rational cognition without requiring explicit awareness; for instance, he countered Locke's claim that uneducated individuals do not contemplate such principles by asserting their dispositional presence in all intelligent beings. This characterization positioned these principles, which imply bivalence, not merely as logical tautologies but as foundational cognitive tools for distinguishing truth from falsehood in human understanding. George Boole's development of an algebra of logic in The Mathematical Analysis of Logic (1847) marked a pivotal formalization, integrating the law directly into a calculus analogous to ordinary algebra. Boole expressed it as the identity x + x̄ = 1, where x represents a logical variable and x̄ its complement, signifying that every proposition and its denial exhaust all possibilities, yielding unity in the domain. This formulation transformed the law from a philosophical axiom into an operational equation, facilitating the reduction of deductive reasoning to symbolic manipulation and laying the groundwork for later computational logics. Boole's approach emphasized the law's role in ensuring the completeness of logical operations within finite, interpretable systems. Gottlob Frege's Begriffsschrift (1879) introduced the first rigorous predicate calculus, explicitly incorporating the law of excluded middle as a core principle to underpin the derivation of arithmetic from pure logic.
In its two-dimensional notation, Frege treated the law as indispensable for bivalence, stating that for any content A, either A or its negation holds without intermediary, enabling quantification over predicates and the elimination of ambiguities in traditional syllogistic logic. This axiomatization allowed Frege to prove theorems of arithmetic, highlighting the law's necessity for extending logic to infinite domains while maintaining formal consistency. Bertrand Russell and Alfred North Whitehead's Principia Mathematica (volumes I–III, 1910–1913) culminated this evolution by presenting the law as a primitive proposition (*2.1) within their ramified theory of types, essential for reducing mathematics to logic. They formulated it generally as ⊢ φ ∨ ¬φ for any proposition φ, discussing its implications for handling infinity through the axiom of infinity and the theory of types, introduced to avoid paradoxes. Russell particularly acknowledged the law's non-intuitive character for infinite sets, noting in the introduction that while evident for finite aggregates, its universal application assumes the existence of completed infinities, which lacks direct intuitive justification but is required for mathematical rigor. This reflection underscored the law's foundational yet provisional status in formal systems.
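Boole's identity x + x̄ = 1 can be read arithmetically over the two-element domain {0, 1}, with the complement taken as x̄ = 1 − x. A brief sketch of that reading (an illustration, not Boole's own notation):

```python
# Boole's identity x + x-bar = 1, read over the domain {0, 1}
# with the complement defined arithmetically as 1 - x: a class and
# its complement together exhaust the universe.
for x in (0, 1):
    complement = 1 - x
    assert x + complement == 1
print("x + (1 - x) == 1 holds for x in {0, 1}")
```

The check is trivial by design: it mirrors how Boole's formulation turned the law into an equation open to symbolic manipulation.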

Role in Logical Systems

Acceptance in Classical Logic

In classical logic, the law of excluded middle (LEM) occupies a foundational axiomatic position, ensuring the system's adherence to bivalent truth values. In Hilbert-style proof systems, LEM is explicitly adopted as one of the core axioms, alongside axioms for implication, conjunction, and disjunction, enabling the derivation of all classical tautologies through minimal inference rules such as modus ponens. This axiomatic embedding guarantees the soundness and completeness of the system for classical propositional and predicate logic, where LEM serves as an indispensable primitive to capture the exhaustive nature of propositions. In contrast, natural deduction systems for classical logic often incorporate LEM either directly as an axiom or indirectly through derived rules that enforce its validity, such as allowing the introduction of disjunctions based on case analysis without constructive justification. A pivotal aspect of LEM's acceptance lies in its contribution to the completeness of first-order logic. Gödel's completeness theorem, proved in 1929, establishes that every semantically valid formula of first-order logic is provable in the classical proof calculus, a result that inherently depends on LEM to equate syntactic provability with semantic truth in bivalent models. Without LEM, the theorem does not hold in the same form, as intuitionistic variants of first-order logic require alternative semantics, such as Kripke models, to achieve analogous completeness properties. This reliance underscores LEM's necessity for classical logic's ability to prove all logical truths exhaustively. Within classical systems, LEM exhibits a deep logical equivalence to double negation elimination, the principle that ¬¬P → P for any proposition P. This equivalence means that adopting double negation elimination as an axiom or rule immediately yields LEM as a theorem, and vice versa, through standard derivations involving reductio ad absurdum and disjunction introduction. Such interconnections highlight LEM's role in reinforcing the stability of negation and disjunction in classical frameworks.
Finally, intuitionistic logic augmented by LEM yields precisely classical logic: adding the law to a constructive base, in which proofs must exhibit explicit constructions, produces a non-constructive system that permits indirect reasoning and exhaustive case splits. This extension preserves all intuitionistic theorems while adding the full expressive power of classical inference, including proofs by contradiction that rely on LEM's bivalence.
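The interderivability of LEM and double negation elimination is a proof-theoretic fact, but it can be illustrated semantically: in a three-valued Gödel-style model (an assumed illustration, not the classical semantics), the two principles succeed and fail at exactly the same truth values:

```python
# Three-valued Goedel semantics over the values 0, 0.5, 1:
#   neg(a)    = 1 if a == 0 else 0
#   imp(a, b) = 1 if a <= b else b
#   disjunction = max
def neg(a):
    return 1 if a == 0 else 0

def imp(a, b):
    return 1 if a <= b else b

for p in (0, 0.5, 1):
    lem = max(p, neg(p))        # P or not-P
    dne = imp(neg(neg(p)), p)   # not-not-P implies P
    # The two principles are designated (value 1) at exactly the same points:
    print(p, lem, dne, (lem == 1) == (dne == 1))
```

Both principles hold at the classical values 0 and 1 and fail together at the intermediate value 0.5, mirroring their proof-theoretic equivalence over the intuitionistic base.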

Integration in Formal Proof Systems

In the sequent calculus, a foundational proof system developed by Gerhard Gentzen, the law of excluded middle is derivable from the initial sequent P ⊢ P: the right negation rule yields ⊢ P, ¬P, and the right disjunction rules (∨R) together with contraction yield ⊢ P ∨ ¬P. The multiple-formula succedent that makes this derivation possible distinguishes the classical calculus LK from intuitionistic variants, in which succedents are restricted and the sequent ⊢ P ∨ ¬P is not derivable. From this sequent, structural rules such as weakening and contraction, together with the logical rules for disjunction and negation, propagate the principle into more complex classical tautologies. The reductio ad absurdum proof technique, central to classical reasoning, explicitly relies on the law of excluded middle to conclude the truth of a proposition from the assumption that its negation leads to a contradiction. To prove P, assume ¬P and derive a contradiction (⊥); this yields ¬¬P via implication introduction. Then, applying the law of excluded middle as P ∨ ¬P, the case ¬P is ruled out by the contradiction, leaving P as the only possibility via disjunctive syllogism (case analysis). Without the law of excluded middle, this step fails in intuitionistic systems, where ¬¬P does not imply P, limiting reductio to weaker forms. In automated theorem proving, the law of excluded middle underpins resolution algorithms, which are refutation-based methods for determining propositional satisfiability in classical logic. Resolution operates on clausal form, where every formula is expressed as a conjunction of disjunctions of literals, and the principle ensures that for any literal L, either L or ¬L holds, allowing complete pairwise resolution steps to reduce sets of clauses until the empty clause (a contradiction) is reached or satisfiability is confirmed. This completeness, established by Robinson's resolution principle, relies on the bivalence implied by the law of excluded middle to guarantee that unsatisfiable formulas yield a refutation.
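A minimal propositional resolution loop illustrates the refutation-based method described above; the clause representation and function names are illustrative assumptions, not a standard library:

```python
def resolve(c1, c2):
    """All resolvents of two clauses.

    A clause is a frozenset of literals; a literal is (name, polarity).
    """
    out = []
    for (name, pol) in c1:
        if (name, not pol) in c2:
            out.append((c1 - {(name, pol)}) | (c2 - {(name, not pol)}))
    return out

def unsatisfiable(clauses):
    """Saturate under resolution: True iff the empty clause is derivable."""
    clauses = set(map(frozenset, clauses))
    while True:
        new = set()
        for a in clauses:
            for b in clauses:
                if a is not b:
                    for r in resolve(a, b):
                        if not r:          # empty clause: contradiction found
                            return True
                        new.add(r)
        if new <= clauses:                 # no progress: set is satisfiable
            return False
        clauses |= new

# Refuting the negation of P or not-P, i.e. the clause set {{P}, {not P}},
# immediately yields the empty clause:
print(unsatisfiable([{("P", True)}, {("P", False)}]))  # True
```

The single clause set {{P}} is satisfiable, so the same routine returns False for it, confirming that the search terminates without a refutation.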

Criticisms and Rejections

Intuitionistic Perspectives

Intuitionism, developed by L. E. J. Brouwer in the early 20th century, fundamentally rejects the law of excluded middle on the grounds that it endorses non-constructive proofs of existence, which intuitionists deem invalid since mathematical truth must be established through explicit mental constructions rather than mere logical assumption. Brouwer argued that statements about infinite mathematical objects cannot be asserted as true or false without a constructive method to verify them, rendering the principle P ∨ ¬P unacceptable for transfinite domains where such constructions may be impossible. This rejection stems from Brouwer's philosophy that mathematics is a free creation of the human mind, independent of an external objective reality, and thus confined to what can be intuited and constructed step by step. Arend Heyting formalized Brouwer's ideas in the 1930s by developing intuitionistic propositional and predicate logic, which excludes the full law of excluded middle from its axioms and instead permits it only for decidable predicates, where a construction can distinguish between the proposition and its negation. In Heyting's system, proofs are understood via the Brouwer-Heyting-Kolmogorov (BHK) interpretation, in which a proof of a disjunction requires demonstrating one disjunct constructively, and no general axiom forces every proposition into this binary choice without evidence. This formalization ensures that intuitionistic logic remains faithful to constructive principles, avoiding the classical commitment to bivalence for undecidable statements. Andrey Kolmogorov provided an early interpretation of intuitionistic logic in 1925, viewing the law of excluded middle as applicable only to the "problem of the alternative" in finite, concrete cases where direct verification is feasible, but invalid for infinite or transfinite judgments lacking such effective methods.
Kolmogorov emphasized that in intuitionism the principle cannot serve as a universal axiom of logic, because transfinite arguments, such as those involving infinite sequences, do not yield obvious truth values without construction. A key critique within intuitionism targets infinite domains, where effective decision procedures are often absent, making assertions like P ∨ ¬P unverifiable and thus philosophically untenable; for instance, statements about the continuum or undecidable problems cannot be resolved without a method to construct either the proof or its refutation. This limitation underscores intuitionism's emphasis on constructive mathematics, where the law's classical acceptance would import non-constructive existential claims into foundational proofs.
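The BHK reading of disjunction can be sketched as a tagged union: a proof of A ∨ B carries a marker for which disjunct is proved together with the evidence for it. The class and function names below are hypothetical illustrations:

```python
from dataclasses import dataclass
from typing import Any

@dataclass
class Left:       # BHK proof of A or B via a proof of A
    proof: Any

@dataclass
class Right:      # BHK proof of A or B via a proof of B
    proof: Any

def prove_even_or_odd(n: int):
    """For the decidable predicate 'even', excluded middle is
    constructively valid: we compute which disjunct holds and
    exhibit the witness k."""
    if n % 2 == 0:
        return Left(proof=n // 2)    # witness k with n == 2*k
    return Right(proof=n // 2)       # witness k with n == 2*k + 1

print(prove_even_or_odd(7))  # Right(proof=3), since 7 == 2*3 + 1
```

For an undecidable predicate no such function exists, which is exactly why intuitionistic logic withholds the general axiom P ∨ ¬P.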

Other Non-Classical Challenges

Paraconsistent logics represent a significant departure from classical logic by permitting contradictions without the principle of explosion, under which a contradiction implies every statement. In some of these systems, the law of excluded middle (LEM) is weakened or restricted, particularly in variants with intermediate truth values, while others accept LEM fully, since a contradiction can still satisfy P ∨ ¬P without causing explosion. These approaches have been influential in handling inconsistent data, ensuring that reasoning remains coherent even when contradictory information arises. Fuzzy logic extends this rejection by adopting a multivalued truth framework, in which propositions are assigned truth values along a continuum from 0 to 1 rather than strictly true or false. Under this semantics, the disjunction P ∨ ¬P does not necessarily evaluate to 1: the truth value of the disjunction, typically computed as max(v(P), 1 - v(P)), is less than 1 whenever v(P) is intermediate, such as 0.5. This failure of LEM accommodates vagueness and gradations in truth, making fuzzy logic suitable for modeling imprecise or uncertain concepts without forcing binary decisions. In quantum logic, pioneered by Garrett Birkhoff and John von Neumann, the algebraic structure shifts to orthomodular lattices, which underlie the propositional calculus for quantum mechanics. Here, the distributive laws that underpin classical reasoning in Boolean algebras do not hold universally, replaced instead by weaker orthomodularity conditions that reflect non-commutative aspects of quantum observables. This non-distributivity means that classical inferences built on LEM cannot be generally carried over, as quantum propositions may lack the sharp complementarity assumed in classical logic, aligning the system with empirical observations in quantum physics where superposition defies classical bivalence.
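The failure of LEM under the standard max/(1 − v) fuzzy connectives described above is easy to tabulate; a brief sketch:

```python
def v_not(a):
    """Standard fuzzy negation: v(not P) = 1 - v(P)."""
    return 1 - a

def v_or(a, b):
    """Standard fuzzy disjunction: the maximum of the two truth values."""
    return max(a, b)

for p in (0.0, 0.3, 0.5, 1.0):
    value = v_or(p, v_not(p))   # truth value of P or not-P
    print(p, value)             # 1.0, 0.7, 0.5, 1.0: LEM fails strictly between 0 and 1
```

At v(P) = 0.5 the disjunction bottoms out at 0.5, the farthest LEM can fall under these connectives.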
The relevance of rejecting LEM extends to computer science, particularly in constructive logics embedded within type theory, which prioritize computable proofs over non-constructive existence claims. Systems like the Coq proof assistant, based on the Calculus of Inductive Constructions, eschew LEM by default to ensure that all proofs correspond to total, executable programs, avoiding the undecidable or infinite searches that LEM might implicitly endorse. By restricting themselves to intuitionistic principles, these frameworks guarantee termination and totality in verification tasks, such as formalizing software correctness or mathematical theorems. Non-classical logics challenging LEM have also found applications in AI for reasoning under uncertainty, where classical bivalence struggles with incomplete or conflicting data. Paraconsistent and fuzzy approaches, for example, enable robust knowledge representation in systems handling noisy inputs, such as expert systems or autonomous agents, by avoiding overcommitment to excluded alternatives. These developments underscore LEM's limitations in probabilistic or evidential contexts, fostering AI models that better mirror real-world ambiguity.

Examples and Applications

Constructive vs. Non-Constructive Proofs

The law of excluded middle (LEM) plays a pivotal role in distinguishing constructive proofs, which explicitly construct mathematical objects or decisions, from non-constructive proofs, which establish existence or truth without providing such constructions. In classical logic, LEM asserts that for any proposition P, either P or ¬P holds, enabling proof by cases that may not specify which case applies. This contrasts with constructive approaches, such as intuitionistic logic, in which proving a disjunction requires identifying and exhibiting a specific proof of one disjunct. A classic finite example illustrates this difference: proving that every natural number n is either even or odd. Define even(n) as ∃k ∈ ℕ (n = 2k) and odd(n) as ∃k ∈ ℕ (n = 2k + 1). Using LEM, one asserts even(n) ∨ ¬even(n) for arbitrary n, and since ¬even(n) is equivalent to odd(n) (as the definitions partition the naturals), it follows that even(n) ∨ odd(n). This yields the universal statement ∀n ∈ ℕ (even(n) ∨ odd(n)). The non-constructive aspect arises because LEM permits this proof by cases without exhibiting which disjunct holds for a given n; it relies on the logical principle rather than an algorithmic decision procedure. In contrast, a constructive proof proceeds by induction: the base case n = 0 is even (witness k = 0); the inductive step converts a witness for n into one for n + 1 (if n is even, then n + 1 is odd with k = n/2; if n is odd, then n + 1 is even with k = (n + 1)/2). This extracts an algorithm to decide parity, such as repeated division by 2, aligning with intuitionistic requirements. Similarly, for integers, the statement ∀n ∈ ℤ (even(n) ∨ odd(n)) follows non-constructively via LEM applied to even(n), with ¬even(n) implying odd(n) by the exhaustive definitions.
While parity is decidable in practice, this derivation highlights LEM's role in enabling case analysis without constructive witnesses, underscoring its utility in finite, decidable domains.
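The inductive proof sketched above is directly an algorithm: each step of the induction updates the tag and the witness k. A small Python rendering (the function name is illustrative):

```python
def parity_witness(n: int):
    """Constructive proof by induction that n is even or odd.

    Returns ('even', k) with n == 2*k, or ('odd', k) with n == 2*k + 1,
    mirroring the inductive step: even n gives odd n+1 with the same k,
    odd n gives even n+1 with witness k + 1.
    """
    tag, k = "even", 0                # base case: 0 == 2*0
    for _ in range(n):                # inductive step n -> n + 1
        tag, k = ("odd", k) if tag == "even" else ("even", k + 1)
    return tag, k

print(parity_witness(7))   # ('odd', 3), since 7 == 2*3 + 1
```

Unlike the LEM-based argument, the result always comes with an explicit witness, as intuitionistic logic demands.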

Implications in Infinite Domains

In infinite domains, the law of excluded middle (LEM) facilitates non-constructive proofs by allowing the assertion that for any proposition P, either P or ¬P holds, even without an algorithm to determine which. This is pivotal in real analysis and set theory, where it enables the establishment of existence results over uncountably infinite sets without providing explicit constructions, contrasting with the requirements of constructive mathematics. Such applications often lead to powerful theorems but raise concerns about their effectiveness in foundational systems that reject LEM. A prominent example is the Heine-Borel theorem, which states that in ℝⁿ a set is compact if and only if it is closed and bounded. The proof for infinite open covers relies on LEM to exhaustively consider cases in selecting finite subcovers, as the classical argument assumes that each point of the set belongs to some cover element and uses non-constructive choice to guarantee finiteness without an algorithm for extraction. In constructive settings, the theorem holds only for specific subclasses of covers, such as those satisfying uniform continuity conditions, highlighting its dependence on classical principles. The least upper bound property of the real numbers, that every nonempty subset bounded above has a supremum, also invokes LEM in its classical justification, particularly when derived alongside the Archimedean property. The Archimedean property posits that for any positive reals x and y, there exists a natural number n such that nx > y. To prove the existence of a least upper bound for a bounded set S ⊆ ℝ, one considers propositions P of the form "there exists an upper bound less than some fixed value." Applying LEM (P ∨ ¬P) allows pinpointing the infimum of the upper bounds via bisection, but without LEM this process may not terminate effectively, limiting the property to "located" sets in constructive analysis.
Cantor's diagonal argument demonstrates the uncountability of the real numbers by assuming a supposed enumeration of all reals in [0, 1] as infinite decimals and constructing a new real differing in the nth digit from the nth listed number. The argument explicitly specifies a method to compute the digits (e.g., choosing a digit different from the listed one at each position, with care to avoid equivalences like 0.999... = 1.0 by selecting from {1, 2, ..., 8}). The argument is constructive and valid in intuitionistic logic, as it directly produces a differing real without relying on LEM, though what it establishes is the non-existence of an effective enumeration. Intuitionists, following Brouwer, reject non-effective proofs, arguing that LEM fails for infinite domains where no finite procedure decides P or ¬P, rendering such claims vacuous without constructions. For instance, the Heine-Borel theorem's classical proof lacks intuitionistic validity for arbitrary covers, and the supremum may not exist for non-located sets, whereas Cantor's uncountability result is fully acceptable without modification, constructively establishing that no enumeration covers all the reals. This critique underscores the tension between classical efficiency and intuitionistic rigor in handling infinities.
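The diagonal construction is itself computable. A toy sketch on a finite prefix of an assumed enumeration (the particular listing below is an arbitrary illustration):

```python
def diagonal(listing, n):
    """Given digits listing[i][k] (digit k of the i-th enumerated real),
    compute n digits of a real differing from the k-th real in its
    k-th digit.  Digits are drawn from {1, 2} (a subset of {1, ..., 8})
    to sidestep equivalences like 0.999... == 1.000...
    """
    return [1 if listing[k][k] != 1 else 2 for k in range(n)]

# A toy "enumeration": digit k of real i is (i + k) % 10 (arbitrary).
listing = [[(i + k) % 10 for k in range(5)] for i in range(5)]
d = diagonal(listing, 5)
# The constructed real differs from every listed real at the diagonal:
assert all(d[k] != listing[k][k] for k in range(5))
print(d)
```

Every step here is an explicit computation on the given list, which is why the argument survives intuitionistic scrutiny.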

Law of Non-Contradiction

The law of non-contradiction, a fundamental principle of classical logic, asserts that for any proposition P, it cannot be the case that both P and its negation ¬P are true simultaneously; formally, ¬(P ∧ ¬P). This ensures the consistency of logical systems by prohibiting contradictions, thereby supporting core inferential rules and indirect proofs. In classical logic, it operates alongside bivalent truth values, where propositions are either true or false without intermediate states. Aristotle, in his Metaphysics (Book Gamma), prioritizes non-contradiction as the most certain and primary principle of all reasoning, describing it as "the same attribute cannot at the same time belong and not belong to the same subject and in the same respect." He views it as foundational for scientific inquiry and demonstrations, non-hypothetical, and prior to other axioms, including the law of excluded middle, which follows from it in his framework. This prioritization underscores its role as the bedrock for rational thought, implying the excluded middle by ruling out any third possibility beyond affirmation or denial of a predicate. In classical logic, non-contradiction and the law of excluded middle are interderivable, with their equivalence established via De Morgan's laws and double negation: ¬(P ∧ ¬P) transforms into ¬P ∨ ¬¬P, and eliminating the double negation yields the excluded middle P ∨ ¬P. This interdependence highlights how rejecting one in non-classical systems, such as intuitionistic logic, affects the other, though non-contradiction retains broader acceptance. Philosophically, non-contradiction serves as the foundation for rational discourse, enabling coherent communication, argumentation, and the rejection of incoherent positions, without which meaningful inquiry collapses. It reflects the structure of reality, ensuring that contradictory claims cannot coexist, thus underpinning ethical, metaphysical, and epistemological discussions across traditions.
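The De Morgan route between the two laws can be checked over both classical truth values; a brief sketch:

```python
# Classically, non-contradiction, its De Morgan transform, and excluded
# middle all evaluate to True under every valuation of P.
for p in (True, False):
    lnc = not (p and not p)            # law of non-contradiction
    demorgan = (not p) or (not not p)  # De Morgan applied to the conjunction
    lem = p or (not p)                 # after double negation elimination
    assert lnc == demorgan == lem == True
print("All three forms coincide under classical semantics")
```

The chain only goes through classically: intuitionistically the final double-negation-elimination step is unavailable, which is why the two laws come apart there.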

Weak and Double Negation Variants

The weak law of excluded middle (WEM), formulated as ¬P ∨ ¬¬P, is a weakened variant of the law of excluded middle that is provable in certain intermediate logics extending intuitionistic propositional calculus (IPC) but not in IPC itself. This principle axiomatizes logics such as KC (Jankov logic, or De Morgan logic), which adds ¬P ∨ ¬¬P to IPC and is complete with respect to Kripke frames of top width 1. In these systems, WEM allows a disjunction between a proposition's negation and its double negation without committing to the full bivalence of classical logic, enabling reasoning that avoids non-constructive assumptions while still accommodating limited forms of indirect proof. A prominent example of an intuitionistic extension accepting WEM is Gödel-Dummett logic (LC), defined by adding the axiom schema (P → Q) ∨ (Q → P) to IPC. This logic implies WEM, as the linear ordering of truth values in its semantics ensures that for any proposition either the negation or the double negation holds in a disjunctive sense, but LC remains strictly weaker than classical propositional calculus (CPC), since the full law of excluded middle P ∨ ¬P is not derivable. Intermediate logics in general occupy the lattice between IPC and CPC, and those validating WEM, along with generalizations φₖ involving iterated disjunctions of negations, characterize frame classes with bounded top width, providing a semantic characterization for non-classical reasoning. Closely related is the double negation elimination principle, ¬¬P → P, which serves as a key bridge to classical logic when adjoined to IPC. In intuitionistic systems, double negation introduction P → ¬¬P holds, but elimination does not; adding elimination recovers CPC, as it implies the full law of excluded middle: ¬¬(P ∨ ¬P) is already provable intuitionistically, and double negation elimination then yields P ∨ ¬P.
Thus, while intermediate logics such as KC validate ¬P ∨ ¬¬P without full elimination, the stronger ¬¬P → P enforces classical bivalence, and such logics accept WEM yet reject this implication to preserve constructivity. These variants also find application in resource-sensitive frameworks such as linear and relevance logics, where the full excluded middle may violate contraction or relevance principles, but weakened forms like WEM support controlled resource usage in proofs without unrestricted duplication.
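The contrast between WEM and full LEM already shows up in Gödel-Dummett-style semantics on [0, 1], where negation sends 0 to 1 and every positive value to 0; a brief illustrative sketch:

```python
# Goedel-Dummett-style semantics on [0, 1]: neg(a) = 1 if a == 0, else 0.
def neg(a):
    return 1 if a == 0 else 0

for p in (0, 0.25, 0.5, 1):
    wem = max(neg(p), neg(neg(p)))   # weak excluded middle: not-P or not-not-P
    lem = max(p, neg(p))             # full excluded middle: P or not-P
    print(p, wem, lem)               # wem is always 1; lem fails for 0 < p < 1
```

Because ¬P and ¬¬P can only take the values 0 and 1, one of them is always designated, validating WEM, while P itself may sit strictly between 0 and 1, defeating full LEM.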

References
