List of rules of inference
from Wikipedia

This is a list of rules of inference, logical laws that relate to mathematical formulae.

Introduction


Rules of inference are syntactic transformation rules which one can use to infer a conclusion from a premise, creating an argument. A set of rules is complete if it can be used to infer every valid conclusion, and sound if it never infers an invalid conclusion. A sound and complete set of rules need not include every rule in the following list, as many of the rules are redundant and can be proven from the other rules.

Discharge rules permit inference from a subderivation based on a temporary assumption. Below, the notation

  φ ⊢ ψ

indicates such a subderivation from the temporary assumption φ to ψ.

Rules for negations

Reductio ad absurdum (or Negation Introduction): from φ ⊢ ψ and φ ⊢ ¬ψ, infer ¬φ.
Reductio ad absurdum (related to the law of excluded middle): from ¬φ ⊢ ψ and ¬φ ⊢ ¬ψ, infer φ.
Ex contradictione quodlibet: from φ and ¬φ, infer ψ.

Rules for conditionals

Deduction theorem (or Conditional Introduction): from φ ⊢ ψ, infer φ → ψ.
Modus ponens (a type of Conditional Elimination): from φ → ψ and φ, infer ψ.
Modus tollens (a type of Conditional Elimination): from φ → ψ and ¬ψ, infer ¬φ.

Rules for conjunctions

Adjunction (or Conjunction Introduction): from φ and ψ, infer φ ∧ ψ.
Simplification (or Conjunction Elimination): from φ ∧ ψ, infer φ (or, symmetrically, ψ).

Rules for disjunctions

Addition (or Disjunction Introduction): from φ, infer φ ∨ ψ.
Case analysis (or Proof by Cases or Argument by Cases or Disjunction elimination): from φ ∨ ψ, φ → χ, and ψ → χ, infer χ.
Disjunctive syllogism: from φ ∨ ψ and ¬φ, infer ψ.
Constructive dilemma: from φ → χ, ψ → ξ, and φ ∨ ψ, infer χ ∨ ξ.

Rules for biconditionals

Biconditional introduction: from φ → ψ and ψ → φ, infer φ ↔ ψ.
Biconditional elimination: from φ ↔ ψ, infer φ → ψ (or ψ → φ).

Rules of classical predicate calculus

In the following rules, φ(β/α) is exactly like φ except for having the term β wherever φ has the free variable α.

Universal Generalization (or Universal Introduction): from φ(β/α), infer ∀α φ.

Restriction 1: β is a variable which does not occur in φ.
Restriction 2: β is not mentioned in any hypothesis or undischarged assumption.

Universal Instantiation (or Universal Elimination): from ∀α φ, infer φ(β/α).

Restriction: No free occurrence of α in φ falls within the scope of a quantifier quantifying a variable occurring in β.

Existential Generalization (or Existential Introduction): from φ(β/α), infer ∃α φ.

Restriction: No free occurrence of α in φ falls within the scope of a quantifier quantifying a variable occurring in β.

Existential Instantiation (or Existential Elimination): from ∃α φ and a subderivation of ψ from the temporary assumption φ(β/α), infer ψ.

Restriction 1: β is a variable which does not occur in φ.
Restriction 2: There is no occurrence, free or bound, of β in ψ.
Restriction 3: β is not mentioned in any hypothesis or undischarged assumption.

The following structural rules hold in classical logic but are restricted or rejected in substructural logics, such as linear logic.

Rule of weakening (or monotonicity of entailment): from φ ⊢ ψ, infer φ, χ ⊢ ψ. (Its rejection in linear logic parallels the no-deleting theorem.)
Rule of contraction (or idempotency of entailment): from φ, φ ⊢ ψ, infer φ ⊢ ψ. (Its rejection in linear logic parallels the no-cloning theorem.)

Table: Rules of Inference


The rules above can be summed up in the following table.[1] The "Tautology" column shows how to interpret the notation of a given rule.

| Rule of inference | Tautology | Name |
|---|---|---|
| p, p → q ∴ q | (p ∧ (p → q)) → q | Modus ponens |
| ¬q, p → q ∴ ¬p | ((p → q) ∧ ¬q) → ¬p | Modus tollens |
| p → q, q → r ∴ p → r | ((p → q) ∧ (q → r)) → (p → r) | Hypothetical syllogism |
| p → q ∴ p → (p ∧ q) | (p → q) → (p → (p ∧ q)) | Absorption |
| p, q ∴ p ∧ q | (p ∧ q) → (p ∧ q) | Conjunction introduction |
| p ∧ q ∴ p | (p ∧ q) → p | Conjunction elimination |
| p ∴ p ∨ q | p → (p ∨ q) | Disjunction introduction |
| p ∨ q, p → r, q → r ∴ r | ((p ∨ q) ∧ (p → r) ∧ (q → r)) → r | Disjunction elimination |
| p ∨ q, ¬p ∴ q | ((p ∨ q) ∧ ¬p) → q | Disjunctive syllogism |
| p ∨ p ∴ p | (p ∨ p) → p | Disjunctive simplification |
| p ∨ q, ¬p ∨ r ∴ q ∨ r | ((p ∨ q) ∧ (¬p ∨ r)) → (q ∨ r) | Resolution |
| p → q, q → p ∴ p ↔ q | ((p → q) ∧ (q → p)) → (p ↔ q) | Biconditional introduction |
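Each "Tautology" entry can be checked mechanically by evaluating it under every truth assignment. A small sketch in Python (the encodings of the rules are mine, written from the table's standard forms):

```python
from itertools import product

def implies(a, b):
    # Material conditional: a -> b is false only when a is true and b is false.
    return (not a) or b

# Tautology-column entries, encoded over variables p, q, r.
tautologies = {
    "Modus ponens":           lambda p, q, r: implies(p and implies(p, q), q),
    "Modus tollens":          lambda p, q, r: implies(implies(p, q) and not q, not p),
    "Hypothetical syllogism": lambda p, q, r: implies(implies(p, q) and implies(q, r), implies(p, r)),
    "Absorption":             lambda p, q, r: implies(implies(p, q), implies(p, p and q)),
    "Disjunctive syllogism":  lambda p, q, r: implies((p or q) and not p, q),
    "Resolution":             lambda p, q, r: implies((p or q) and ((not p) or r), q or r),
}

for name, f in tautologies.items():
    # A tautology must be true under all 8 assignments to p, q, r.
    assert all(f(p, q, r) for p, q, r in product([True, False], repeat=3)), name
print("all checked rows are tautologies")
```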

All rules use the basic logic operators. A complete table of logic operators is given by a truth table listing all 16 possible truth functions of two Boolean variables p and q:

p q | 0 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15
T T | F F F F F F F F T T T  T  T  T  T  T
T F | F F F F T T T T F F F  F  T  T  T  T
F T | F F T T F F T T F F T  T  F  F  T  T
F F | F T F T F T F T F T F  T  F  T  F  T

where T = true and F = false. Reading each column's four values as the outputs on the rows (p, q) = (T, T), (T, F), (F, T), (F, F), the columns are the logical operators: 0 is the contradiction (always false); 1 is NOR; 2 is converse nonimplication (¬p ∧ q); 3 is ¬p; 4 is material nonimplication (p ∧ ¬q); 5 is ¬q; 6 is XOR; 7 is NAND; 8 is AND (p ∧ q); 9 is XNOR (p ↔ q); 10 is q; 11 is implication (p → q); 12 is p; 13 is converse implication (q → p); 14 is OR (p ∨ q); and 15 is the tautology (always true).

Each logic operator can be used in an assertion about variables and operations, showing a basic rule of inference. Examples:

  • The column-14 operator (OR) shows the Addition rule: when p = T (the premise p selects the first two lines of the table), we see (at column 14) that p ∨ q = T.
    We can also see that, with the same premise, other conclusions are valid: columns 12, 14 and 15 are T.
  • The column-8 operator (AND) shows the Simplification rule: when p ∧ q = T (first line of the table), we see that p = T.
    With this premise, we also conclude that q = T, p ∨ q = T, etc., as shown by columns 9–15.
  • The column-11 operator (IF/THEN) shows the Modus ponens rule: when p → q = T and p = T, only one line of the truth table (the first) satisfies these two conditions. On this line, q is also true. Therefore, whenever p → q is true and p is true, q must also be true.

Machines and well-trained people can use this table-lookup approach to perform basic inferences, and to check whether other conclusions (from the same premises) can be obtained.
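The column-numbering convention above (column n's outputs are the binary digits of n, read down the rows TT, TF, FT, FF) can be checked directly; this sketch assumes that numbering:

```python
# Each column n of the 16-operator table lists the bits of n, with bit 3
# giving the output on row (T, T) and bit 0 the output on row (F, F).
rows = [(True, True), (True, False), (False, True), (False, False)]

def column(n):
    return [bool(n & (1 << (3 - i))) for i in range(4)]

assert column(8)  == [p and q for p, q in rows]        # column 8 is AND
assert column(14) == [p or q for p, q in rows]         # column 14 is OR
assert column(11) == [(not p) or q for p, q in rows]   # column 11 is IF/THEN
print("columns 8, 11 and 14 match AND, IF/THEN and OR")
```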

Example 1


Consider the following assumptions: "If it rains today, then we will not go on a canoe today. If we do not go on a canoe trip today, then we will go on a canoe trip tomorrow. Therefore (the mathematical symbol for "therefore" is ∴), if it rains today, we will go on a canoe trip tomorrow." To make use of the rules of inference in the above table, we let p be the proposition "It rains today", q be "We will not go on a canoe today", and r be "We will go on a canoe trip tomorrow". Then this argument is of the form:

p → q
q → r
∴ p → r

Example 2


Consider a more complex set of assumptions: "It is not sunny today and it is colder than yesterday", "We will go swimming only if it is sunny", "If we do not go swimming, then we will have a barbecue", and "If we have a barbecue, then we will be home by sunset" lead to the conclusion "We will be home by sunset." Proof by rules of inference: let p be the proposition "It is sunny today", q the proposition "It is colder than yesterday", r the proposition "We will go swimming", s the proposition "We will have a barbecue", and t the proposition "We will be home by sunset". Then the hypotheses become ¬p ∧ q, r → p, ¬r → s and s → t. Using our intuition we conjecture that the conclusion might be t. Using the rules of inference table we can prove the conjecture easily:

Step | Reason
1. ¬p ∧ q | Hypothesis
2. ¬p | Simplification using Step 1
3. r → p | Hypothesis
4. ¬r | Modus tollens using Steps 2 and 3
5. ¬r → s | Hypothesis
6. s | Modus ponens using Steps 4 and 5
7. s → t | Hypothesis
8. t | Modus ponens using Steps 6 and 7
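The same conclusion can be confirmed semantically: in every truth assignment where all four hypotheses hold, t holds too. A quick brute-force check, with proposition names as in the example:

```python
from itertools import product

def implies(a, b):
    # Material conditional: a -> b is false only when a is true and b is false.
    return (not a) or b

# Hypotheses of Example 2: ¬p ∧ q, r → p, ¬r → s, s → t; conclusion: t.
for p, q, r, s, t in product([True, False], repeat=5):
    hyps = ((not p) and q, implies(r, p), implies(not r, s), implies(s, t))
    if all(hyps):
        assert t, "hypotheses true but conclusion false"
print("the hypotheses entail t")
```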

from Grokipedia
A list of rules of inference enumerates the fundamental, sound principles in formal logic that enable the valid derivation of new statements from established premises, ensuring that conclusions logically follow without introducing falsehoods. These rules underpin deduction in systems such as propositional logic and predicate logic, where they facilitate the construction of proofs by transforming compound statements through tautological equivalences or direct inferences. In propositional logic, prominent rules include modus ponens (from P and P → Q, infer Q), modus tollens (from ¬Q and P → Q, infer ¬P), and hypothetical syllogism (from P → Q and Q → R, infer P → R), which handle connectives like implication, negation, and disjunction. Predicate logic builds upon these by incorporating quantifier-specific rules, such as universal instantiation (from ∀x P(x), infer P(c) for any constant c) and existential generalization (from P(c), infer ∃x P(x)), allowing inferences about individuals and classes within a domain. Together, these rules support rigorous argumentation in mathematics, philosophy, and computer science, with their application often structured in sequential proofs that justify each step to verify validity.

Fundamentals

Definition and Scope

A rule of inference is a valid argument form that permits the derivation of new propositions from given premises within a formal logical system, serving as a foundational mechanism for deductive reasoning. These rules function as templates or schemas that ensure the resulting conclusion preserves the truth of the premises when applied correctly. To apply them, one must first understand basic elements such as propositions, which are declarative statements that can be true or false; logical connectives, including negation (¬), conjunction (∧), disjunction (∨), implication (→), and biconditional (↔); and, in more expressive systems, quantifiers like the universal (∀) and existential (∃) that bind variables to predicates. Rules of inference operate on the syntactic level, providing schema-based derivations through formal proofs in a deductive system, where a conclusion is mechanically obtainable from premises via specified steps. This syntactic approach differs from semantic entailment, which evaluates whether the truth of the premises necessitates the truth of the conclusion across all possible interpretations or models of the language, without reference to proof procedures. In well-designed logical systems, soundness ensures that syntactic derivations yield semantically valid results, while completeness guarantees that all semantic entailments can be captured syntactically. The scope of rules of inference encompasses both classical logics, such as propositional and predicate logic, and non-classical variants, including intuitionistic logic, which rejects certain classical principles like the law of excluded middle, and substructural logics that omit or restrict structural rules like contraction or weakening. Beyond theoretical foundations, these rules are essential in practical applications like automated theorem proving, where computational systems apply inference rules systematically to verify mathematical statements. They also underpin AI reasoning mechanisms, enabling machines to perform valid deductions in domains such as expert systems and automated planning.

Historical Context

The origins of rules of inference trace back to the 4th century BCE, when Aristotle developed syllogisms as foundational patterns for deductive reasoning in ancient Greek logic. In his Prior Analytics, he systematically classified valid syllogistic moods, such as the first-figure Barbara syllogism ("All men are mortal; Socrates is a man; therefore, Socrates is mortal"), demonstrating how conclusions follow necessarily from categorical premises sharing a common term. This work represented the earliest comprehensive effort to identify and validate argument structures, influencing logical thought for over two millennia. In the late 19th and early 20th centuries, the formalization of logic advanced significantly through axiomatic systems pioneered by Gottlob Frege, Bertrand Russell, and David Hilbert, shifting rules toward symbolic notation and rigorous mathematical foundations. Frege's Begriffsschrift (1879) introduced a formal notation and axiomatic principles, including a generalized modus ponens, to express and derive truths in predicate logic without reliance on the ambiguities of natural language. Russell, collaborating with Alfred North Whitehead in Principia Mathematica (1910–1913), employed axiomatic methods with primitive rules like modus ponens and substitution to reduce mathematics to logic, incorporating the theory of types to avoid paradoxes. Concurrently, Hilbert's axiomatic approach, as outlined in Grundzüge der theoretischen Logik (1928, with Wilhelm Ackermann), emphasized finite rules and consistency proofs, formalizing propositional and predicate logics through schemas like implication introduction. A pivotal shift occurred in the 1930s with Gerhard Gentzen's introduction of natural deduction systems, which prioritized inference rules over extensive axiomatic lists to mirror intuitive reasoning processes. In his Untersuchungen über das logische Schließen (1934–1935), Gentzen formulated introduction and elimination rules for logical connectives, such as those for implication (from A and A → B infer B) and conjunction, proving properties like cut-elimination to ensure derivations used only necessary subformulas. This framework emphasized the constructive nature of proofs, influencing subsequent developments in proof theory.
Post-1950s, rules of inference extended into computer science via the Curry–Howard correspondence, which established an isomorphism between logical proofs and computational programs. Haskell Curry's foundational work on combinatory logic in the 1930s–1960s, culminating in Foundations of Mathematical Logic (1963), linked deduction to functional abstraction, while William Howard's 1969 manuscript (published 1980) formalized the "propositions as types" principle, mapping natural deduction rules to typed lambda terms. This correspondence highlighted how inference rules underpin typed programming languages, enabling type checking as proof validation. During this era, classical propositional and predicate logics were fully formalized as axiomatic and natural deduction systems.

Propositional Logic

Negation Rules

In propositional logic, negation (¬) is a unary connective that inverts the truth value of a proposition, and its associated rules facilitate deriving conclusions involving falsity or contradiction within classical systems. These rules are sound and complete in classical propositional logic, meaning they preserve truth and can derive all valid inferences from axioms or assumptions. Central to negation rules is the treatment of contradiction (⊥) as deriving any proposition, reflecting the explosion principle where inconsistency implies everything. Double negation elimination allows inferring a proposition P from its double negation ¬¬P, formally ¬¬P ⊢ P. This rule is valid in classical logic because ¬¬P is logically equivalent to P, as verified by truth tables where both expressions share the same truth value across all assignments: if P is true, ¬¬P is true; if P is false, ¬¬P is false. The equivalence holds due to the bivalence of classical logic, where propositions are strictly true or false, enabling the elimination without loss of validity. Ex falso quodlibet, or the principle of explosion, states that from a contradiction ⊥, any proposition P can be inferred: ⊥ ⊢ P. This rule captures the idea that falsehood implies anything, preventing triviality only in consistent systems; it is a cornerstone of classical deduction but is rejected in paraconsistent logics to handle inconsistencies non-explosively. Negation introduction, known as reductio ad absurdum, permits inferring ¬P if assuming P leads to a contradiction: from P ⊢ ⊥, conclude ¬P. This indirect proof method assumes the negation's opposite and derives absurdity to affirm the negation, valid under classical assumptions where contradiction entails falsehood. The law of excluded middle, P ∨ ¬P, underpins this and other negation rules by asserting every proposition is true or false, ensuring no "middle" ground and enabling exhaustive case analysis in proofs.
De Morgan's laws provide equivalences for negating compound propositions: ¬(P ∧ Q) ⊢ ¬P ∨ ¬Q, and its converse ¬P ∨ ¬Q ⊢ ¬(P ∧ Q), with the dual for disjunction ¬(P ∨ Q) ⊢ ¬P ∧ ¬Q. These bidirectional rules allow pushing negation inward, distributing it over conjunction and disjunction, and are tautologies in classical logic, confirmed by truth tables showing identical truth values for the equivalent forms.
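These equivalences are truth-functional, so a two-variable truth-table sweep verifies them; a minimal check:

```python
from itertools import product

# Verify double negation and both De Morgan laws on all four assignments.
for p, q in product([True, False], repeat=2):
    assert (not (not p)) == p                       # double negation
    assert (not (p and q)) == ((not p) or (not q))  # De Morgan for conjunction
    assert (not (p or q)) == ((not p) and (not q))  # De Morgan for disjunction
print("negation equivalences verified")
```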

Implication Rules

Implication rules in propositional logic govern inferences involving the conditional connective →, which represents hypothetical reasoning where the truth of the antecedent implies the truth of the consequent. These rules enable the derivation of new propositions from established conditionals and facts, forming the backbone of deductive arguments in propositional logic. They are sound, meaning they preserve truth: if the premises are true, the conclusion must be true.

Modus Ponens

Modus ponens, also known as the rule of detachment, allows the inference of the consequent Q from a conditional P → Q and the antecedent P. Formally, given P → Q and P, conclude Q. This rule is justified semantically by its corresponding tautology [P ∧ (P → Q)] → Q, which holds in all truth assignments. To verify via truth table, consider the premises P → Q and P; the only case where both are true occurs when P is true and Q is true, ensuring the conclusion Q is true. The truth table for the conditional is:

P Q | P → Q
T T | T
T F | F
F T | T
F F | T

No row shows both premises true and the conclusion false, confirming validity. For example, if "If it rains, the ground is wet" (P → Q) and "It rains" (P), then "The ground is wet" (Q).

Modus Tollens

Modus tollens permits inferring the negation of the antecedent, ¬P, from a conditional P → Q and the negation of the consequent, ¬Q. Formally, given P → Q and ¬Q, conclude ¬P. This is a valid argument form, supported by the tautology [(P → Q) ∧ ¬Q] → ¬P. The rule relies on the contrapositive equivalence P → Q ≡ ¬Q → ¬P, established by truth tables: both expressions are false exactly when P is true and Q is false, and true otherwise. Applying modus ponens to the contrapositive yields the result. For instance, if "If it rains, the ground is wet" (P → Q) and "The ground is not wet" (¬Q), then "It is not raining" (¬P).

Hypothetical Syllogism

Hypothetical syllogism, or the chain rule, infers a transitive conditional from two linked conditionals. Formally, given P → Q and Q → R, conclude P → R. This follows from the tautology [(P → Q) ∧ (Q → R)] → (P → R), which is true under all interpretations since the antecedent's truth forces the consequent via substitution. Semantically, if P implies Q and Q implies R, then P cannot be true without R being true, preserving the implication. An example is: "If it rains, clouds form" (P → Q); "If clouds form, it is overcast" (Q → R); therefore, "If it rains, it is overcast" (P → R).

Constructive Dilemma

Constructive dilemma combines conditionals with a disjunction to yield a disjunctive conclusion. Formally, given (P → Q) ∧ (R → S) and P ∨ R, conclude Q ∨ S. This is a valid argument form, derivable from modus ponens and disjunction elimination applied to cases. The justification stems from exhaustive case analysis: if P holds, then Q by the first conditional; if R holds, then S by the second, yielding the disjunction in either scenario. For example: "(If it rains, stay in) and (if sunny, go out)" ((P → Q) ∧ (R → S)); "It is raining or sunny" (P ∨ R); therefore, "Stay in or go out" (Q ∨ S).
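The case analysis justifying the dilemma can be replayed exhaustively; a short sketch:

```python
from itertools import product

def implies(a, b):
    # Material conditional: a -> b is false only when a is true and b is false.
    return (not a) or b

# Constructive dilemma: (P → Q) ∧ (R → S), P ∨ R ⊢ Q ∨ S, on all assignments.
for p, q, r, s in product([True, False], repeat=4):
    if implies(p, q) and implies(r, s) and (p or r):
        assert q or s
print("constructive dilemma is valid")
```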

Absorption

Absorption strengthens a conditional by incorporating the antecedent into the consequent. Formally, given P → Q, conclude P → (P ∧ Q). This is justified by the tautology (P → Q) → [P → (P ∧ Q)], which holds because assuming P true and P → Q true forces Q true, making P ∧ Q true. The rule reflects that the original implication already commits to Q under P, so conjoining P adds no new falsifying possibility. An illustrative case: from "If studying, one passes" (P → Q), infer "If studying, one studies and passes" (P → (P ∧ Q)).

Conjunction Rules

In propositional logic, conjunction rules facilitate the combination of propositions using the connective ∧ (logical "and") and the subsequent manipulation of such combinations. These rules form the core of deductions involving conjunctive statements, enabling the synthesis of multiple premises into a single assertion and the extraction of individual components for further reasoning. They are fundamental in natural deduction systems, where they ensure soundness and completeness relative to classical truth-functional semantics. The conjunction introduction rule (∧I), a basic synthesis rule, permits inferring the conjunction of two propositions from their separate establishment: given P and Q, one may conclude P ∧ Q. This rule reflects the intuitive notion that both propositions must hold for their joint assertion to be valid, and it was originally formulated as part of Gerhard Gentzen's natural deduction framework in 1934. Conjunction elimination (∧E), also known as simplification, allows extracting either conjunct from a given conjunction as a standalone proposition. This yields two symmetric inferences: from P ∧ Q, one may derive P (∧E₁), or symmetrically Q (∧E₂). These elimination rules enable the decomposition of complex formulas, streamlining proofs by isolating components, and originate from the same natural deduction system developed by Gentzen. De Morgan's law for conjunction establishes a key equivalence relating negation, conjunction, and disjunction: the negation of a disjunction is logically equivalent to the conjunction of the negations, ¬(P ∨ Q) ≡ ¬P ∧ ¬Q. As an inference rule, this permits deriving ¬P ∧ ¬Q from ¬(P ∨ Q), with the converse holding as well. These laws, which highlight the duality between conjunction and disjunction under negation, were first articulated by Augustus De Morgan in his 1847 treatise on formal logic.
The exportation rule addresses nested implications involving conjunctions, allowing the restructuring of a conditional with a conjunctive antecedent into an iterated implication: (P ∧ Q) → R ≡ P → (Q → R). This equivalence supports the redistribution of conjuncts in hypothetical reasoning, preserving logical validity. It is a standard derived rule in propositional logic systems, valid under classical semantics. Under truth-functional semantics, the conjunction connective ∧ is defined such that P ∧ Q is true precisely when both P and Q are true, and false otherwise; this binary truth condition underpins the validity of all conjunction rules in classical propositional logic. De Morgan's laws further illustrate the interplay between conjunction and disjunction through negation.
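The exportation equivalence is truth-functional and can be confirmed by enumeration; a brief sketch:

```python
from itertools import product

def implies(a, b):
    # Material conditional: a -> b is false only when a is true and b is false.
    return (not a) or b

# Exportation: (P ∧ Q) → R is equivalent to P → (Q → R) on all assignments.
for p, q, r in product([True, False], repeat=3):
    assert implies(p and q, r) == implies(p, implies(q, r))
print("exportation equivalence verified")
```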

Disjunction Rules

Disjunction rules in propositional logic govern the use of the disjunction connective (∨), which asserts that at least one of the disjuncts is true. These rules facilitate the introduction of disjunctions into proofs, their elimination through case analysis or negation, and transformations involving negation, enabling the construction of valid arguments by handling alternatives. Disjunction introduction allows the derivation of a disjunction from one of its disjuncts: from premise P, one may infer P ∨ Q for any formula Q; symmetrically, from Q, infer Q ∨ P. This rule reflects the inclusive nature of disjunction, where adding an alternative does not alter the truth if the original disjunct holds. Disjunctive syllogism provides a method for eliminating one disjunct via negation: given premises P ∨ Q and ¬P, the conclusion Q follows, as the falsity of P forces Q to be true to satisfy the disjunction. This rule is a derived rule in natural deduction systems and is fundamental for resolving binary choices in arguments. The destructive dilemma extends disjunctive reasoning to hypothetical scenarios: with (P → Q) ∧ (R → S) and ¬Q ∨ ¬S, one infers ¬P ∨ ¬R, effectively dismantling the implications by denying their consequents and disjoining the negated antecedents. This rule is particularly useful in refuting paired hypotheses through alternative denials. De Morgan's law for disjunction relates negation to conjunction and disjunction: from ¬(P ∧ Q), one derives ¬P ∨ ¬Q; conversely, from ¬P ∨ ¬Q, infer ¬(P ∧ Q). These equivalences, treated as bidirectional inference rules, allow pushing negation inward, transforming conjunctive negations into disjunctive forms and vice versa. Case analysis, or disjunction elimination, supports proof strategies by resolving disjunctions through exhaustive consideration: assuming P ∨ Q, if R is derived from P alone and also from Q alone, then R follows unconditionally. This rule underpins dilemma resolution in proofs, contrasting with conjunction's joint affirmation by emphasizing alternatives.
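Disjunctive syllogism, destructive dilemma, and case analysis are all validated by the same exhaustive sweep; a compact check:

```python
from itertools import product

def implies(a, b):
    # Material conditional: a -> b is false only when a is true and b is false.
    return (not a) or b

for p, q, r, s in product([True, False], repeat=4):
    if (p or q) and not p:
        assert q                          # disjunctive syllogism
    if implies(p, q) and implies(r, s) and ((not q) or (not s)):
        assert (not p) or (not r)         # destructive dilemma
    if (p or q) and implies(p, r) and implies(q, r):
        assert r                          # case analysis (disjunction elimination)
print("disjunction rules verified")
```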

Biconditional Rules

In propositional logic, the biconditional connective, denoted ↔, expresses logical equivalence between two propositions P and Q, asserting that P is true if and only if Q is true. This captures mutual implication, distinguishing it from the unidirectional implication (→). Rules of inference for the biconditional enable the derivation of equivalences from established implications and the extraction of implications from equivalences, forming a core component of natural deduction systems. The biconditional introduction rule allows the inference of P ↔ Q from the conjunction of the two directional implications: (P → Q) ∧ (Q → P) ⊢ P ↔ Q. This rule formalizes the construction of an equivalence by verifying both directions of implication, ensuring the propositions hold under identical conditions. Complementing this, the biconditional elimination rule permits the extraction of either implication from the biconditional: P ↔ Q ⊢ P → Q and P ↔ Q ⊢ Q → P. These elimination steps decompose the equivalence into its constituent conditionals, facilitating further deductions in proofs. The biconditional can also be defined in terms of other connectives through logical equivalence: P ↔ Q ≡ (P ∧ Q) ∨ (¬P ∧ ¬Q). This formulation highlights the biconditional as true precisely when both propositions share the same truth value, either both true or both false, providing a truth-functional basis without relying on a primitive ↔. In natural deduction, conditional proof supports biconditional derivations by structuring subproofs: to establish P ↔ Q, one assumes P and derives Q (yielding P → Q), then assumes Q and derives P (yielding Q → P), after which biconditional introduction applies. This method leverages assumption-based reasoning to build the required implications incrementally. Validation of these rules aligns with the biconditional's truth table, which confirms ↔ as equality of truth values:

P Q | P ↔ Q
T T | T
T F | F
F T | F
F F | T

Here, P ↔ Q evaluates to true only when P and Q match (both true or both false), underpinning the soundness of the introduction and elimination rules in propositional logic.

Predicate Logic

Quantifier Introduction and Elimination

In first-order predicate logic, the rules of inference for quantifiers allow the manipulation of universal (∀) and existential (∃) quantifiers to extend reasoning from specific instances to general statements or vice versa, building upon propositional logic by incorporating variables and domains. These rules ensure soundness and completeness in natural deduction systems, where universal introduction and elimination handle generalizations and instantiations for ∀, while existential introduction and elimination manage assertions of existence and their consequences for ∃. Universal instantiation, also known as universal elimination, permits deriving a specific instance from a universally quantified statement. Formally, from the premise ∀x P(x), one infers P(c), where c is any term in the language that is free for substitution in place of x to avoid variable capture. This rule applies without restrictions on c beyond ensuring the substitution yields a well-formed formula, allowing the universal claim to be applied to particular objects in the domain. For example, from ∀x (Human(x) → Mortal(x)), one may infer Human(Socrates) → Mortal(Socrates) by substituting the constant Socrates for x. Existential generalization, or existential introduction, reverses this process for existence claims by introducing ∃ from a specific instance. Formally, from P(c), one infers ∃x P(x), where c is a term substitutable for x without causing binding issues. This rule asserts that if a property holds for some particular object, then there exists at least one object with that property, applicable to any valid term c. For instance, from Mortal(Socrates), one derives ∃x Mortal(x). The substitution must preserve the formula's structure, ensuring no unintended variable bindings occur. Universal introduction, or universal generalization, enables inferring a universal quantifier from a statement about an arbitrary variable. Formally, assuming a set of premises Γ where the variable x is not free in any formula in Γ, and deriving P(x) from Γ, one infers ∀x P(x).

The key restriction is that x must be treated as arbitrary, meaning it does not appear free in undischarged assumptions, preventing dependence on specific values. This rule formalizes generalization over the domain, as in deriving ∀x (Human(x) → Mortal(x)) from a proof of Human(x) → Mortal(x) in which x is arbitrary and unbound in the assumptions. Variable restrictions ensure x remains a placeholder without prior bindings. Existential elimination, also called existential instantiation, extracts consequences from an existence claim by introducing a fresh variable. Formally, given ∃x P(x) and a subproof assuming P(y) (where y is a fresh variable not occurring free in Q, the undischarged assumptions, P(x), or ∃x P(x)), deriving Q, one infers Q. The freshness of y prevents capture errors and ensures the derivation does not rely on a specific identity for the existent object. For example, from ∃x (Human(x) ∧ Mortal(x)), assume Human(y) ∧ Mortal(y) and derive ∃z Mortal(z) (e.g., by existential generalization on Mortal(y)); one may then infer ∃z Mortal(z) outright, with y chosen anew to avoid conflicts. This scoped use of y maintains the rule's validity across derivations. Central to all quantifier rules are scope and binding conventions to prevent variable capture, where a substituted term's free variables become unintentionally bound by an outer quantifier. In formal systems, substitutions require terms to be "free for" the variable, meaning no free variable in the term is captured by a quantifier in the formula's scope. For instance, substituting a term containing a free variable z into a formula of the form ∀z Q(x, z) would capture z if not checked, invalidating the inference; thus, rules mandate renaming bound variables or using fresh names to resolve such conflicts. These safeguards ensure the rules preserve validity and avoid paradoxes in predicate logic derivations.
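Over a finite domain, ∀ and ∃ reduce to iterated conjunction and disjunction, which makes instantiation and generalization easy to spot-check; the domain and predicate below are illustrative choices, not from the text:

```python
# Illustrative finite domain and predicate (both chosen for the example).
domain = [0, 1, 2]

def P(x):
    return x >= 0  # holds of every element of this domain

# Universal instantiation: from ∀x P(x), infer P(c) for any c in the domain.
if all(P(x) for x in domain):
    for c in domain:
        assert P(c)

# Existential generalization: from P(c) for a particular c, infer ∃x P(x).
c = 1
if P(c):
    assert any(P(x) for x in domain)
print("quantifier rules hold on the finite domain")
```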

Predicate-Specific Inference Rules

In predicate logic, equality is treated as a special binary predicate symbol, subject to specific inference rules that ensure its reflexive and substitutive properties. The equality introduction rule, also known as reflexivity, states that for any term t, one may infer t = t without premises. This rule establishes the foundational property that every entity is identical to itself, forming the basis for all subsequent equalities in derivations. Complementing this is the equality elimination rule, or substitution rule, which permits replacing one side of an established equality with the other within any predicate or term. Formally, given premises a = b and P(a), where P is any predicate containing the term a, one infers P(b). This rule supports the indiscernibility of identicals by allowing consistent substitution across atomic formulas, preserving truth in interpretations. For instance, if x = y and Loves(z, x), then Loves(z, y). In systems without function symbols, substitution applies directly to variables or constants; with functions, it extends to nested terms while respecting binding scopes. Leibniz's law provides a generalization of equality elimination, capturing the principle of the indiscernibility of identicals across all predicates. It is expressed as the schema ∀x ∀y (x = y → (P(x) ↔ P(y))), where P ranges over all predicate symbols in the language. This schema ensures that identical terms share all properties definable by the predicates, serving as a foundational axiom schema in theories with equality. Derived rules from this schema allow substitution in complex formulas, such as replacing x with y in quantified contexts under the equality x = y, provided no variable capture occurs. Leibniz's law underpins the congruence properties of equality, making it essential for proving theorems in formal theories with equality.

In classical predicate logic with fixed domains, a Barcan-like formula ∀x P(x) → ∀y P(y) holds tautologically under the assumption of uniform domains, as the bound variables x and y are merely placeholders with identical scopes. This validity arises because classical interpretations maintain constant domains across the entire structure, rendering variable renaming inconsequential and ensuring the implication is always true regardless of the predicate P. While the full Barcan formula originates in modal extensions to handle varying domains, where it corresponds to the condition that accessible worlds do not introduce new individuals, its classical counterpart reinforces the stability of quantification over predicates without modal operators. Rules involving function symbols and term substitution are central to Herbrand semantics, which interprets formulas via ground terms without reference to external structures. In a Herbrand universe, the set of all ground terms is generated by applying function symbols to constants and to other ground terms, forming a countable domain where each function f of arity n maps n-tuples of terms to new terms. The substitution rule in this context allows replacing variables in formulas with terms from the Herbrand universe, enabling the instantiation of universal quantifiers to ground instances for theorem proving. For example, given a function f and term t, f(t) denotes a unique ground term, and substitutions like σ = {x ↦ f(c)} transform formulas while preserving satisfiability in Herbrand models. These rules facilitate resolution-based deduction by unifying terms through most general unifiers, ensuring completeness for unsatisfiability checks in clausal form. Quantifiers enable the application of predicates over these terms, but the substitution mechanics are predicate-independent.
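Term substitution over a Herbrand universe can be sketched with terms as nested tuples; the representation and names here are illustrative, not a standard library:

```python
# Terms: a string is a variable or constant; a tuple (f, t1, ..., tn) is a
# function application. A substitution maps variable names to terms.
def substitute(term, sigma):
    if isinstance(term, str):
        return sigma.get(term, term)      # replace variables, keep constants
    f, *args = term
    return (f, *(substitute(a, sigma) for a in args))

# The substitution sigma = {x -> f(c)} applied to g(x, c) yields the
# ground term g(f(c), c).
sigma = {"x": ("f", "c")}
assert substitute(("g", "x", "c"), sigma) == ("g", ("f", "c"), "c")
print("g(x, c) under sigma becomes g(f(c), c)")
```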

Non-Classical Logics

Substructural Logic Rules

Substructural logics constitute a class of non-classical logical systems that modify or omit certain structural rules of classical sequent calculus, such as weakening, contraction, and exchange, to model resource-sensitive reasoning or relevance constraints. These restrictions prevent unrestricted manipulation of assumptions, ensuring that premises are used in a controlled manner, in contrast with classical logic, where such rules allow free addition, duplication, or reordering of hypotheses.

The weakening rule, which permits deriving X, Y ⊢ A from X ⊢ A by adding irrelevant assumptions Y, is rejected in many substructural logics to enforce relevance between premises and conclusions. In relevant logic, this restriction ensures that every premise contributes to the derivation; for instance, classical theorems like p ⊢ q → p fail because q bears no relevance to p. Anderson and Belnap formalized this in their foundational work, emphasizing that entailment requires shared content between antecedent and consequent. Similarly, in linear logic, weakening's absence treats logical formulas as consumable resources, preventing their gratuitous introduction so as to reflect finite computational or physical constraints.

Contraction, the rule allowing X ⊢ A to be derived from X, X ⊢ A by merging duplicate assumptions, is also often absent, prohibiting automatic reuse of premises without explicit licensing. Linear logic exemplifies this by disallowing premise duplication, which avoids paradoxes arising from unlimited reuse, such as those in self-referential contexts; instead, the modality !A ("of course") introduces reusability for specific formulas, enabling controlled resource renewal in proofs. Girard introduced this mechanism to refine classical logic while preserving its expressive power for concurrent processes.

The exchange rule, or permutation, permits reordering premises in multi-premise inferences, such as deriving Y, X ⊢ A from X, Y ⊢ A. While present in many substructural systems, it is omitted in non-commutative variants like the Lambek calculus, where premise order models syntactic composition in linguistics, ensuring that X, Y ⊢ A does not imply Y, X ⊢ A.

In relevant logic, implication introduction imposes a relevance condition: to derive P → Q from the assumption P, the derivation of Q must actually use P, preventing irrelevant discharges that weaken inferential connections. This rule, central to relevance systems such as E and R, maintains logical discipline by requiring premise involvement, as detailed in Anderson and Belnap's analysis of entailment.

Cut-elimination theorems in substructural sequent calculi adapt Gentzen's classical result by accounting for restricted structural rules, often yielding analytic proofs without the cut rule (from X ⊢ A and Y, A ⊢ B, infer X, Y ⊢ B). In linear logic, cut-elimination holds despite the absence of weakening and contraction, facilitating normalization in resource-aware proofs. For related systems such as the full Lambek calculus FL, cut-admissibility has been demonstrated under suitable constraints, with algebraic studies confirming elimination in contraction-free settings.
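A toy model can make the resource reading concrete. The Python sketch below (an illustration, not a sequent calculus) records which premise each inference step of a derivation consumed and flags violations of the linear discipline: a premise consumed twice would require contraction, and a premise never consumed would require weakening.

```python
from collections import Counter

def linear_ok(num_premises, uses):
    """Check linear-logic resource discipline for a derivation.

    `uses` lists the premise index consumed by each step.  Linearity
    demands every premise be consumed exactly once: a count above one
    signals contraction, a count of zero signals weakening."""
    counts = Counter(uses)
    duplicated = any(n > 1 for n in counts.values())          # contraction
    unused = any(counts[i] == 0 for i in range(num_premises))  # weakening
    return not duplicated and not unused

# p, p -> q |- q consumes each premise exactly once: linear.
assert linear_ok(2, [0, 1])
# Deriving p /\ p from the single premise p would consume it twice.
assert not linear_ok(1, [0, 0])
# A derivation ignoring its second premise would need weakening.
assert not linear_ok(2, [0])
```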

Intuitionistic Logic Rules

Intuitionistic logic, developed as a formal basis for constructive mathematics, employs rules of inference that emphasize the existence of explicit proofs or constructions for assertions, diverging from classical logic by rejecting non-constructive principles. The Brouwer–Heyting–Kolmogorov (BHK) interpretation provides the foundational semantic understanding, where a proof of a proposition corresponds to a mental construction: for an implication A → B, it is a method transforming any proof of A into a proof of B; for ¬A, it is a construction leading to a contradiction from any supposed proof of A; and for a disjunction A ∨ B, it requires deciding which disjunct holds via a proof of A or of B. This interpretation, articulated by L. E. J. Brouwer in his early works on intuitionism, formalized by Arend Heyting in 1930, and refined by Andrey Kolmogorov in his problem-solving calculus, ensures that inferences are valid only if they preserve constructivity.

The rules for implication introduction and elimination in intuitionistic logic mirror those in classical logic, maintaining compatibility for basic hypothetical reasoning while adhering to constructive constraints. Implication introduction allows deriving φ → ψ by assuming φ and deriving ψ within that scope, discharging the assumption upon conclusion. Implication elimination, or modus ponens, permits inferring ψ from φ → ψ and φ. However, intuitionistic logic excludes Peirce's law, ((P → Q) → P) → P, which is classically valid but non-constructive, as it does not provide an effective method for deriving P without assuming the antecedent's proof conditions.

Negation is treated as an implication to falsehood, defined as ¬P ≡ P → ⊥, where ⊥ represents absurdity or contradiction. Negation introduction proceeds by assuming P and deriving ⊥, yielding ¬P; elimination from ¬P and P yields ⊥. This definition precludes double negation elimination, since ¬¬P ⊬ P in general: a construction showing the inconsistency of ¬P (i.e., a proof that P → ⊥ leads to ⊥) does not constructively establish P itself, only that P is not impossible.

Disjunction elimination requires exhaustive constructive cases: from P ∨ Q, a proof of R assuming P, and a proof of R assuming Q, one infers R. Under the BHK interpretation, this rule demands an effective decision procedure to select the appropriate case based on the proof of the disjunction, ensuring the inference yields a uniform construction for R.

Intuitionistic logic rejects the law of excluded middle, P ∨ ¬P, as a general theorem, since it would assert that for any proposition, either it or its negation is constructively provable, a claim Brouwer deemed unfounded in the absence of a prior resolution of the problem in question. This rejection aligns with the constructive ethos, avoiding assumptions of decidability for all statements.
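The BHK reading has a direct computational analogy, familiar from the Curry–Howard correspondence: a proof of an implication behaves like a function, and a proof of a disjunction like a tagged value naming the disjunct. The Python sketch below is an informal illustration of that analogy (not a proof checker):

```python
def modus_ponens(proof_impl, proof_a):
    """Implication elimination: a proof of A -> B is a method, so
    applying it to a proof of A yields a proof of B."""
    return proof_impl(proof_a)

def or_elim(proof_disj, case_left, case_right):
    """Disjunction elimination: a BHK proof of A \\/ B carries a tag
    deciding which disjunct holds, so we can dispatch on it."""
    tag, proof = proof_disj
    return case_left(proof) if tag == "left" else case_right(proof)

# Implication introduction: the identity method proves A -> A.
identity = lambda proof_a: proof_a
assert modus_ponens(identity, "evidence-for-A") == "evidence-for-A"

# Case analysis on a proof of the left disjunct.
assert or_elim(("left", 3), lambda n: n + 1, lambda n: n - 1) == 4
```

The rejected classical principles show up as gaps in this analogy: there is no uniform program that, for an arbitrary proposition, produces a tagged proof of P ∨ ¬P.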

Applications and Examples

Summary Table of Key Rules

The following table provides a concise overview of selected key rules of inference in propositional, predicate, and non-classical logics, presented in natural deduction style for clarity.
Rule Name | Schema | Logic Type | Brief Description
Modus Ponens (Implication Elimination) | A, A → B ⊢ B | Propositional | Infers the consequent from the antecedent and an implication.
Conjunction Introduction | A, B ⊢ A ∧ B | Propositional | Forms a conjunction from its separate conjuncts.
Conjunction Elimination | A ∧ B ⊢ A (or B) | Propositional | Extracts a conjunct from a conjunction.
Disjunction Introduction | A ⊢ A ∨ B (or from B) | Propositional | Introduces a disjunction by adding a disjunct.
Disjunction Elimination | A ∨ B; A ⊢ C; B ⊢ C ⟹ C | Propositional | Infers a common conclusion from a disjunction and subproofs for each case.
Negation Introduction | subproof A ⊢ ⊥ ⟹ ¬A | Propositional | Infers the negation of a hypothesis leading to contradiction (reductio ad absurdum).
Ex Falso Quodlibet | A, ¬A ⊢ B (any B) | Propositional | Derives any formula from a contradiction.
Universal Instantiation | ∀x A(x) ⊢ A(t) | Predicate | Applies a universal quantifier to a specific term.
Universal Generalization | subproof (arbitrary a) A(a) ⟹ ∀x A(x) | Predicate | Generalizes an instance to a universal quantification under suitable conditions.
Existential Instantiation | ∃x A(x); subproof A(a) ⊢ C ⟹ C (a fresh) | Predicate | Extracts a witness from an existential to derive a conclusion.
Existential Generalization | A(t) ⊢ ∃x A(x) | Predicate | Forms an existential from a specific instance.
Intuitionistic Implication Introduction | subproof A ⊢ B ⟹ A → B (constructive discharge) | Non-classical (intuitionistic) | Infers an implication only via a constructive proof from antecedent to consequent.
Intuitionistic Negation Elimination | if Γ ⊢ A and Γ ⊢ ¬A, then Γ ⊢ B (any B) | Non-classical (intuitionistic) | Derives any conclusion from premises entailing a formula and its negation, preserving constructivity.
Weakening (Structural) | Γ ⊢ A implies Γ, B ⊢ A | Non-classical (substructural, e.g. relevance) | Adds irrelevant premises (rejected in relevance logics to enforce premise relevance).
Contraction (Structural) | Γ, A, A ⊢ B implies Γ, A ⊢ B | Non-classical (substructural, e.g. linear) | Reuses a premise multiple times (absent in linear logic to model resource sensitivity).
Note: Negation rules vary between classical and intuitionistic logics; classical systems allow double negation elimination (¬¬A ⊢ A), while intuitionistic logics require constructive justification and treat negation as implication to falsehood (¬A ≡ A → ⊥).
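Several of the propositional rules in the table are directly mechanizable. The following Python sketch (an illustrative encoding, with atoms as strings and compound formulas as tuples) closes a set of premises under modus ponens and conjunction elimination:

```python
def closure(premises):
    """Close a set of formulas under modus ponens and /\\-elimination.

    A formula is an atom (string), an implication ('->', a, b), or a
    conjunction ('and', a, b)."""
    known = set(premises)
    changed = True
    while changed:
        changed = False
        for f in list(known):
            if not isinstance(f, tuple):
                continue
            # Modus ponens: from a -> b and a, add b.
            if f[0] == '->' and f[1] in known and f[2] not in known:
                known.add(f[2])
                changed = True
            # Conjunction elimination: from a /\ b, add a and b.
            if f[0] == 'and':
                for conjunct in f[1:]:
                    if conjunct not in known:
                        known.add(conjunct)
                        changed = True
    return known

# From the single premise p /\ (p -> q), conjunction elimination yields
# p and p -> q, and modus ponens then yields q.
facts = closure({('and', 'p', ('->', 'p', 'q'))})
assert 'q' in facts
```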

Practical Illustrations

In propositional logic, rules of inference enable the construction of valid proofs from premises to conclusions. Consider the tautology (P ∧ (P → Q)) → Q, which can be derived using conjunction elimination (∧E) and modus ponens (→E). The proof proceeds as follows: assume the antecedent P ∧ (P → Q) (line 1); apply ∧E to derive P (line 2) and P → Q (line 3); then apply →E (modus ponens) to lines 2 and 3 to obtain Q (line 4). By conditional proof, this discharges the assumption and yields the implication (lines 1–4). This illustrates how the rules systematically derive the consequent from a conjunction involving an implication, ensuring soundness in deductive reasoning.

In predicate logic, rules like universal instantiation (UI) and modus tollens facilitate derivations involving quantified statements. For the syllogism "All humans are mortal" (∀x (Human(x) → Mortal(x))) and "Zeus is not mortal" (¬Mortal(z)), the goal is to infer "Zeus is not human" (¬Human(z)). First, apply UI to the universal premise, yielding Human(z) → Mortal(z). Then, apply modus tollens using this implication and ¬Mortal(z) to conclude ¬Human(z). This derivation highlights the contrapositive application of the universal claim, demonstrating how predicate rules extend propositional inferences to individuals while preserving validity.

Intuitionistic logic rejects the law of the excluded middle (P ∨ ¬P) because it requires constructive evidence for disjunctions, which may not exist for undecidable statements. A classic illustration is Goldbach's conjecture, P: "Every even integer greater than 2 is the sum of two primes" (∀n ∈ ℕ (n > 2 ∧ n even → ∃p, q ∈ ℙ (p + q = n))).
Classically, either P or ¬P holds, but intuitionistically, without a proof of P or a counterexample establishing ¬P, neither disjunct can be asserted constructively. This shows how the rule fails in non-classical settings, emphasizing the need for explicit constructions over mere non-contradiction.

Rules of inference also underpin formal verification tools, such as SAT solvers and type checkers in programming languages. In SAT solvers, propositional rules like resolution (a generalization of disjunctive syllogism) enable efficient clause learning and conflict-driven search to determine the satisfiability of formulas in conjunctive normal form. Similarly, type checking employs rules akin to natural deduction, such as those in the Hindley–Milner system, where rules for instantiation and generalization infer types from program expressions, ensuring type safety without explicit annotations. These applications demonstrate the practical scalability of inference rules in automating logical consistency checks.

Common pitfalls in applying inference rules include quantifier scope errors, which alter logical meaning. For instance, confusing ∀x ∃y P(x, y) (for every x there exists a y such that P) with ∃y ∀x P(x, y) (there exists a single y that works for every x) can lead to invalid generalizations, such as assuming a single y works universally when it does not. Another error involves variable capture during instantiation, where reusing a variable in UI binds it unexpectedly, invalidating the step. These issues underscore the importance of precise scoping to avoid unsound derivations.
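The resolution rule used by SAT solvers can be sketched in a few lines of Python (an illustrative encoding, not a solver): clauses are frozensets of signed literals, and one resolution step cancels a complementary pair.

```python
def resolve(c1, c2):
    """Return all binary resolvents of two clauses.

    A clause is a frozenset of literals; a literal is a pair
    (name, sign), e.g. ('p', True) for p and ('p', False) for ~p.
    For each literal in c1 whose negation occurs in c2, the resolvent
    is the union of the remaining literals of both clauses."""
    resolvents = []
    for (name, sign) in c1:
        if (name, not sign) in c2:
            rest = (c1 - {(name, sign)}) | (c2 - {(name, not sign)})
            resolvents.append(frozenset(rest))
    return resolvents

# {p, q} and {~p, r} resolve on p to give {q, r}.
out = resolve(frozenset({('p', True), ('q', True)}),
              frozenset({('p', False), ('r', True)}))
assert frozenset({('q', True), ('r', True)}) in out
```

On unit clauses this reduces to disjunctive syllogism: resolving {p} with {~p, q} yields {q}. Deriving the empty clause signals unsatisfiability.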
