Logical matrix
from Wikipedia

A logical matrix, binary matrix, relation matrix, Boolean matrix, or (0, 1)-matrix is a matrix with entries from the Boolean domain B = {0, 1}. Such a matrix can be used to represent a binary relation between a pair of finite sets. It is an important tool in combinatorial mathematics and theoretical computer science.

Matrix representation of a relation


If R is a binary relation between the finite indexed sets X and Y (so R ⊆ X × Y), then R can be represented by the logical matrix M whose row and column indices index the elements of X and Y, respectively, such that the entries of M are defined by

    M_{i,j} = 1 if (x_i, y_j) ∈ R, and M_{i,j} = 0 otherwise.

In order to designate the row and column numbers of the matrix, the sets X and Y are indexed with positive integers: i ranges from 1 to the cardinality (size) of X, and j ranges from 1 to the cardinality of Y. See the article on indexed sets for more detail.

The transpose of the logical matrix of a binary relation corresponds to the converse relation.[1]
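This construction, and the correspondence between the transpose and the converse relation, can be checked directly. A minimal sketch (the helper names are illustrative, not from any particular library):

```python
# Build the logical matrix of a relation R ⊆ X × Y (a sketch; names are illustrative).
def relation_to_matrix(R, X, Y):
    return [[1 if (x, y) in R else 0 for y in Y] for x in X]

def transpose(M):
    return [list(row) for row in zip(*M)]

X, Y = [1, 2], ['a', 'b', 'c']
R = {(1, 'a'), (2, 'b'), (2, 'c')}
M = relation_to_matrix(R, X, Y)          # 2 × 3 logical matrix
converse = {(y, x) for (x, y) in R}      # the converse relation
# The transpose of M is the matrix of the converse relation.
assert transpose(M) == relation_to_matrix(converse, Y, X)
```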

Example


The binary relation R on the set {1, 2, 3, 4} is defined so that aRb holds if and only if a divides b evenly, with no remainder. For example, 2R4 holds because 2 divides 4 without leaving a remainder, but 3R4 does not hold because when 3 divides 4, there is a remainder of 1. The following set is the set of pairs for which the relation R holds.

{(1, 1), (1, 2), (1, 3), (1, 4), (2, 2), (2, 4), (3, 3), (4, 4)}.

The corresponding representation as a logical matrix is

    1 1 1 1
    0 1 0 1
    0 0 1 0
    0 0 0 1

which includes a diagonal of ones, since each number divides itself.
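A short sketch reproducing this matrix directly from the divisibility relation:

```python
# Divisibility relation on {1, 2, 3, 4}: entry (a, b) is 1 iff a divides b evenly.
n = 4
M = [[1 if b % a == 0 else 0 for b in range(1, n + 1)] for a in range(1, n + 1)]
assert M == [[1, 1, 1, 1],
             [0, 1, 0, 1],
             [0, 0, 1, 0],
             [0, 0, 0, 1]]
# Diagonal of ones: each number divides itself.
assert all(M[i][i] == 1 for i in range(n))
```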


Some properties

(Figure: multiplication of two logical matrices using Boolean arithmetic.)

The matrix representation of the equality relation on a finite set is the identity matrix I, that is, the matrix whose entries on the diagonal are all 1, while the others are all 0. More generally, if relation R satisfies IR, then R is a reflexive relation.

If the Boolean domain is viewed as a semiring, where addition corresponds to logical OR and multiplication to logical AND, the matrix representation of the composition of two relations is equal to the matrix product of the matrix representations of these relations. This product can be computed in expected time O(n²).[3]
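The Boolean matrix product can be sketched as follows (the helper name is illustrative); each entry is an OR over component-wise ANDs, which is exactly composition of relations:

```python
# Boolean matrix product: (A B)[i][j] = OR over k of (A[i][k] AND B[k][j]).
def bool_matmul(A, B):
    return [[int(any(A[i][k] and B[k][j] for k in range(len(B))))
             for j in range(len(B[0]))]
            for i in range(len(A))]

R = [[1, 1], [0, 1]]
S = [[0, 1], [1, 0]]
assert bool_matmul(R, S) == [[1, 1], [1, 0]]
```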

Frequently, operations on binary matrices are defined in terms of modular arithmetic mod 2—that is, the elements are treated as elements of the Galois field GF(2). They arise in a variety of representations and have a number of more restricted special forms. They are applied e.g. in XOR-satisfiability.

The number of distinct m-by-n binary matrices is equal to 2^(mn), and is thus finite.

Lattice


Let n and m be given and let U denote the set of all logical m × n matrices. Then U has a partial order given by

    A ≤ B  when  A_{ij} ≤ B_{ij} for all i, j  (that is, A_{ij} = 1 implies B_{ij} = 1).

In fact, U forms a Boolean algebra with the operations AND and OR between two matrices applied component-wise. The complement of a logical matrix is obtained by swapping all zeros and ones for their opposite.
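A sketch of the component-wise Boolean-algebra operations and the partial order (helper names are illustrative):

```python
# Component-wise Boolean-algebra operations on logical matrices (a sketch).
def bool_and(A, B): return [[a & b for a, b in zip(r, s)] for r, s in zip(A, B)]
def bool_or(A, B):  return [[a | b for a, b in zip(r, s)] for r, s in zip(A, B)]
def complement(A):  return [[1 - a for a in r] for r in A]
def leq(A, B):      return all(a <= b for r, s in zip(A, B) for a, b in zip(r, s))

A = [[1, 0], [0, 1]]
B = [[1, 1], [0, 1]]
assert leq(A, B)                                        # A ≤ B in the partial order
assert bool_or(A, complement(A)) == [[1, 1], [1, 1]]    # A OR ¬A is the all-ones top
assert bool_and(A, complement(A)) == [[0, 0], [0, 0]]   # A AND ¬A is the all-zeros bottom
```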

Every logical matrix A = (A_{ij}) has a transpose A^T = (A_{ji}). Suppose A is an m × n logical matrix with no columns or rows identically zero. Then the matrix product A A^T, using Boolean arithmetic, contains the m × m identity matrix, and the product A^T A contains the n × n identity.

As a mathematical structure, the Boolean algebra U forms a lattice ordered by inclusion; additionally it is a multiplicative lattice due to matrix multiplication.

Every logical matrix in U corresponds to a binary relation. These listed operations on U, and ordering, correspond to a calculus of relations, where the matrix multiplication represents composition of relations.[4]

Logical vectors

Group-like structures

Structure       | Total    | Associative | Identity | Divisible | Commutative
Partial magma   | Unneeded | Unneeded    | Unneeded | Unneeded  | Unneeded
Semigroupoid    | Unneeded | Required    | Unneeded | Unneeded  | Unneeded
Small category  | Unneeded | Required    | Required | Unneeded  | Unneeded
Groupoid        | Unneeded | Required    | Required | Required  | Unneeded
Magma           | Required | Unneeded    | Unneeded | Unneeded  | Unneeded
Quasigroup      | Required | Unneeded    | Unneeded | Required  | Unneeded
Unital magma    | Required | Unneeded    | Required | Unneeded  | Unneeded
Loop            | Required | Unneeded    | Required | Required  | Unneeded
Semigroup       | Required | Required    | Unneeded | Unneeded  | Unneeded
Monoid          | Required | Required    | Required | Unneeded  | Unneeded
Group           | Required | Required    | Required | Required  | Unneeded
Abelian group   | Required | Required    | Required | Required  | Required

If m or n equals one, then the m × n logical matrix (mij) is a logical vector or bit string. If m = 1, the vector is a row vector, and if n = 1, it is a column vector. In either case the index equaling 1 is dropped from denotation of the vector.

Suppose P = (p_1, …, p_m) and Q = (q_1, …, q_n) are two logical vectors. The outer product of P and Q results in an m × n rectangular relation

    m_{ij} = p_i ∧ q_j.

A reordering of the rows and columns of such a matrix can assemble all the ones into a rectangular part of the matrix.[5]

Let h be the vector of all ones. Then if v is an arbitrary logical vector, the relation R = v hᵀ has constant rows determined by v. In the calculus of relations such an R is called a vector.[5] A particular instance is the universal relation h hᵀ.
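The outer product and the vector relation R = v hᵀ can be sketched as follows (the helper name is illustrative):

```python
# Outer product of logical vectors: R[i][j] = v[i] AND h[j] (a sketch).
def outer(v, h):
    return [[vi & hj for hj in h] for vi in v]

h = [1, 1, 1]                 # vector of all ones
v = [1, 0, 1]
R = outer(v, h)
assert R == [[1, 1, 1], [0, 0, 0], [1, 1, 1]]   # constant rows determined by v
assert outer(h, h) == [[1, 1, 1]] * 3           # h hᵀ is the universal relation
```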

For a given relation R, a maximal rectangular relation contained in R is called a concept in R. Relations may be studied by decomposing into concepts, and then noting the induced concept lattice.

Consider the table of group-like structures, where "unneeded" can be denoted 0, and "required" denoted by 1, forming a logical matrix M. To calculate elements of M Mᵀ, it is necessary to use the logical inner product of pairs of logical vectors in rows of this matrix. If this inner product is 0, then the rows are orthogonal. In fact, small category is orthogonal to quasigroup, and groupoid is orthogonal to magma. Consequently there are zeros in M Mᵀ, and it fails to be a universal relation.
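The orthogonality claims can be checked against the rows of the table (column order: Total, Associative, Identity, Divisible, Commutative):

```python
# Rows of the group-like-structures matrix: 1 = required, 0 = unneeded.
small_category = [0, 1, 1, 0, 0]
quasigroup     = [1, 0, 0, 1, 0]
groupoid       = [0, 1, 1, 1, 0]
magma          = [1, 0, 0, 0, 0]

def inner(a, b):
    # Logical inner product: OR over component-wise ANDs.
    return int(any(x & y for x, y in zip(a, b)))

assert inner(small_category, quasigroup) == 0   # orthogonal rows
assert inner(groupoid, magma) == 0              # orthogonal rows
```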

Row and column sums


Adding up all the ones in a logical matrix may be accomplished in two ways: first summing the rows or first summing the columns. When the row sums are added, the sum is the same as when the column sums are added. In incidence geometry, the matrix is interpreted as an incidence matrix with the rows corresponding to "points" and the columns as "blocks" (generalizing lines made of points). A row sum is called its point degree, and a column sum is the block degree. The sum of point degrees equals the sum of block degrees.[6]
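Both orders of summation count the same total number of ones, which is why the sums of point degrees and block degrees agree. A minimal sketch using an arbitrary example matrix:

```python
# Sum of point degrees (row sums) equals sum of block degrees (column sums):
# both count the total number of ones in the incidence matrix.
M = [[1, 0, 1],
     [0, 1, 0],
     [1, 1, 0]]
point_degrees = [sum(row) for row in M]
block_degrees = [sum(col) for col in zip(*M)]
assert point_degrees == [2, 1, 2]
assert block_degrees == [2, 2, 1]
assert sum(point_degrees) == sum(block_degrees) == 5
```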

An early problem in the area was "to find necessary and sufficient conditions for the existence of an incidence structure with given point degrees and block degrees; or in matrix language, for the existence of a (0, 1)-matrix of type v × b with given row and column sums".[6] This problem is solved by the Gale–Ryser theorem.

from Grokipedia
In algebraic logic, a logical matrix is a semantic structure consisting of an algebra A and a non-empty subset D of its carrier set, where the elements of D are designated values that represent the acceptable or "true" truth values for formulas in a given logic. Logical matrices generalize the truth-table semantics of classical propositional logic to arbitrary algebraic structures, enabling the modeling of consequence relations in a precise algebraic framework. A set of formulas Γ entails a formula φ in a logic L if, for every logical matrix ⟨A, D⟩ that models L, any valuation sending all elements of Γ to designated elements in D also sends φ to D. This approach connects syntactic deduction to algebraic properties, such as filters and congruences, and is particularly powerful for protoalgebraic logics, where reduced matrices (obtained via the Leibniz operator Ω_A(D)) capture the logic's full semantics. The concept of logical matrices originated in the 1920s, building on implicit ideas in Jan Łukasiewicz's work on many-valued logics, and was further formalized by figures like Jerzy Łoś and Roman Suszko in the mid-20th century. Logical matrices form the foundation of matrix semantics, which applies to a wide range of non-classical logics, including intuitionistic, modal, and paraconsistent systems, by associating each logic with a class of such matrices that validate its theorems and rules. In particular, the Lindenbaum matrix, built from the formula algebra of the logic with its theorems as the designated set, ensures that every propositional logic admits a complete matrix semantics.

Fundamentals

Definition

A logical matrix is an m × n matrix A = (a_{ij}) whose entries a_{ij} belong to the two-element Boolean algebra {0, 1}, equipped with operations such as meet (∧), join (∨), and complement (¬). This is a special case of more general matrices over algebras B, where matrix operations use the algebraic operations of B (such as intersection for meet and union for join) rather than standard arithmetic, distinguishing them from real-valued matrices that rely on addition and multiplication over the reals. In this structure, 0 acts as the bottom element (falsehood), 1 as the top element (truth), ∧ corresponds to logical AND (intersection), ∨ to logical OR (union), and ¬ to logical NOT (complement). This yields a binary or (0, 1)-matrix, often termed a logical matrix in contexts involving binary relations. More generally, matrices over any Boolean algebra B, such as the power set algebra β_k of subsets of a k-element set (with ∩ as meet, ∪ as join, and set complement as negation) or algebras of intervals over the reals ordered by inclusion, can be considered, though the term "logical matrix" is typically reserved for the {0, 1} case. The dimensions m and n typically reflect the sizes of the domain and codomain sets in applications like relation representation.

Relation Representation

A logical matrix provides a standard way to encode a binary relation R ⊆ X × Y between finite sets X and Y, where |X| = m and |Y| = n. Assuming fixed enumerations X = {x_1, …, x_m} and Y = {y_1, …, y_n}, the corresponding m × n logical matrix A = [a_{ij}] is defined such that a_{ij} = 1 if (x_i, y_j) ∈ R and a_{ij} = 0 otherwise. This construction yields a bijection between the set of all binary relations R ⊆ X × Y and the set of all m × n matrices over {0, 1}, as each possible relation corresponds uniquely to a choice of entries in the matrix based on membership in R. The original relation can be recovered from the matrix by determining that (x_i, y_j) ∈ R if and only if a_{ij} = 1. When X and Y have different cardinalities, the resulting logical matrices are rectangular (non-square), reflecting the asymmetry in the dimensions of the Cartesian product X × Y.

Examples

Basic Example

A fundamental example of a logical matrix is the two-valued matrix for classical propositional logic. The algebra A is the two-element Boolean algebra with carrier set A = {0, 1}, where 0 represents falsehood and 1 truth. The operations are defined as follows: conjunction ∧ by min(x, y) or xy, disjunction ∨ by max(x, y) or x + y − xy, and negation ¬ by 1 − x. The set of designated values is D = {1}. In this matrix ⟨A, D⟩, an inference is valid if every valuation (homomorphism from the formula algebra to A) maps its conclusion to 1 whenever the premises are mapped to 1. This captures the truth-table semantics of classical logic, where tautologies always evaluate to 1. For instance, the law of excluded middle p ∨ ¬p evaluates to 1 for any assignment to p. To verify, consider the truth tables implicit in the operations: for p ∧ q, the value is 1 only if both are 1, matching classical AND.
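A minimal sketch of this two-valued matrix, checking the excluded-middle claim by exhausting all valuations:

```python
# Two-valued matrix for classical logic: operations on the carrier {0, 1}.
conj = lambda x, y: min(x, y)   # x ∧ y
disj = lambda x, y: max(x, y)   # x ∨ y
neg  = lambda x: 1 - x          # ¬x
D = {1}                         # designated values

# Excluded middle p ∨ ¬p is designated under every valuation of p.
assert all(disj(p, neg(p)) in D for p in (0, 1))
# Conjunction matches classical AND: 1 only when both arguments are 1.
assert [conj(p, q) for p in (0, 1) for q in (0, 1)] == [0, 0, 0, 1]
```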

Further Examples

For many-valued logics, consider Łukasiewicz's three-valued logic. The carrier set is A = {0, 1/2, 1}, with conjunction ∧ defined as min(x, y), disjunction ∨ as max(x, y), and implication → as min(1, 1 − x + y). The designated set can be D = {1} (strong) or D = {1/2, 1} (weak), depending on the variant. This matrix models intermediate truth values, useful for vague or uncertain statements. Another example is the matrix for intuitionistic logic, based on a Heyting algebra A (e.g., the chain A = {0 < a < 1} for a simple finite case). Operations include implication defined as the relative pseudocomplement: x → y = max{z ∣ x ∧ z ≤ y}, with D = {1}. Unlike classical logic, the law of excluded middle p ∨ ¬p does not always evaluate to 1, reflecting the intuitionistic rejection of proof by contradiction without constructive evidence. These examples demonstrate how logical matrices adapt algebras to define semantics for non-classical logics, with designated values determining validity.
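The three-valued Łukasiewicz matrix can be sketched directly, using exact fractions to avoid floating-point noise; it shows excluded middle failing under the strong designation D = {1}:

```python
from fractions import Fraction

# Łukasiewicz three-valued matrix: carrier {0, 1/2, 1} (a sketch).
half = Fraction(1, 2)
carrier = (Fraction(0), half, Fraction(1))
disj = lambda x, y: max(x, y)                    # x ∨ y
neg  = lambda x: 1 - x                           # ¬x (Łukasiewicz negation)
impl = lambda x, y: min(Fraction(1), 1 - x + y)  # x → y

# Excluded middle fails under D = {1}: for p = 1/2 it evaluates to 1/2.
assert disj(half, neg(half)) == half
# But p → p is designated under every valuation.
assert all(impl(p, p) == 1 for p in carrier)
```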

Properties

Core Properties

Logical matrices provide a semantic framework for propositional logics through their algebraic operations and designated values. A key property is the definition of the consequence relation: for a class of matrices M, a set of formulas Γ entails φ if, in every matrix ⟨A, D⟩ ∈ M, every valuation v : Form → A such that v(γ) ∈ D for all γ ∈ Γ also satisfies v(φ) ∈ D. This ensures that the semantics capture the logic's deductive rules in an algebraically precise manner. Another fundamental property involves reduced matrices, which eliminate redundancies in the carrier set. A matrix ⟨A, D⟩ is reduced if the Leibniz congruence Ω_A(D)—the largest congruence of A compatible with D, relating a and b only when a ∈ D ⟺ b ∈ D holds in every definable context—is the equality relation on A. The reduced form is obtained by quotienting A by this congruence, and for protoalgebraic logics, the reduced matrices fully determine the semantics. Logical matrices are particularly useful in protoalgebraic logics, where there exists a set of equivalence formulas Δ(p, q) such that the Leibniz congruence for any filter coincides with the relation defined by Δ. This property bridges syntax and semantics, allowing the logic to be characterized by algebraic filters and congruences. Filters in the algebra A are subsets closed under the logic's operations and containing the designated values, playing a role analogous to theories in syntactic terms.

Lattice Structure

In the context of logical matrices, the underlying algebras A are often lattice-based, such as Boolean algebras for classical logic or Heyting algebras for intuitionistic logic, where the carrier A forms a lattice under meet and join operations. The set of all congruences on A forms an algebraic lattice, ordered by inclusion, with the trivial congruence as the bottom element and the full relation as the top; joins and meets correspond to the transitive closures and intersections of relations, respectively. This lattice structure facilitates the study of quotient matrices and semantic equivalences. Additionally, the collection of filters on A (upward closed subsets compatible with the designated set D) inherits a lattice structure under intersection (meet) and union-generated filters (join), forming a complete lattice that models the theories of the logic. For example, in classical propositional logic, the filters of the Boolean algebra correspond to the prime filters, yielding the Stone representation. This lattice-theoretic perspective underscores the connections between logical matrices and order theory in algebraic semantics.

Extensions

Logical Vectors

In the context of Boolean-valued logical matrices used for relation representation, a logical vector can be viewed as an n × 1 or 1 × n matrix with entries from the Boolean algebra {0, 1}. These arise as rows or columns of such matrices, capturing characteristic functions of subsets or preimages/images under relations. Operations on logical vectors are component-wise Boolean operations: disjunction ∨ (OR), conjunction ∧ (AND), and negation ¬. For example, for vectors a = (1, 0, 1) and b = (0, 1, 1), a ∨ b = (1, 1, 1). The Boolean inner product ⟨a, b⟩ = ⋁_{i=1}^{n} (a_i ∧ b_i) indicates whether their supports overlap.
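The component-wise operations and the Boolean inner product on the example vectors can be sketched as:

```python
# Component-wise operations and Boolean inner product on logical vectors (a sketch).
a = (1, 0, 1)
b = (0, 1, 1)
assert tuple(x | y for x, y in zip(a, b)) == (1, 1, 1)   # a ∨ b
assert tuple(x & y for x, y in zip(a, b)) == (0, 0, 1)   # a ∧ b
assert tuple(1 - x for x in a) == (0, 1, 0)              # ¬a
inner = int(any(x & y for x, y in zip(a, b)))            # OR over component-wise ANDs
assert inner == 1   # supports overlap: both vectors have a 1 in the third position
```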

Row and Column Sums

For logical matrices representing binary relations or directed graphs over the domain, the cardinal row sum s_i = Σ_j a_{ij} gives the out-degree of element i, and the column sum t_j = Σ_i a_{ij} the in-degree. In the Boolean semiring (addition ∨, multiplication ∧), the row sum s_i = ⋁_j a_{ij} is 1 if the row has at least one 1, indicating the existence of at least one related element. For the matrix

    A = 1 0 1
        0 1 0
        1 1 0

the cardinal row sums are 2, 1, 2 and the column sums are 2, 2, 1, while every Boolean row and column sum is 1.