Finite-state machine
from Wikipedia

Classes of automata (nested, from least to most powerful): combinational logic, finite-state machine, pushdown automaton, Turing machine; all are studied in automata theory.

A finite-state machine (FSM) or finite-state automaton (FSA, plural: automata), finite automaton, or simply a state machine, is a mathematical model of computation. It is an abstract machine that can be in exactly one of a finite number of states at any given time. The FSM can change from one state to another in response to some inputs; the change from one state to another is called a transition.[1] An FSM is defined by a list of its states, its initial state, and the inputs that trigger each transition. Finite-state machines are of two types—deterministic finite-state machines and non-deterministic finite-state machines.[2][better source needed] For any non-deterministic finite-state machine, an equivalent deterministic one can be constructed.[citation needed]

The behavior of state machines can be observed in many devices in modern society that perform a predetermined sequence of actions depending on a sequence of events with which they are presented. Simple examples are vending machines, which dispense products when the proper combination of coins is deposited; elevators, whose sequence of stops is determined by the floors requested by riders; traffic lights, which change sequence when cars are waiting; and combination locks, which require the input of a sequence of numbers in the proper order.

The finite-state machine has less computational power than some other models of computation such as the Turing machine.[3] The computational power distinction means there are computational tasks that a Turing machine can do but an FSM cannot. This is because an FSM's memory is limited by the number of states it has. A finite-state machine has the same computational power as a Turing machine that is restricted such that its head may only perform "read" operations, and always has to move from left to right. FSMs are studied in the more general field of automata theory.

Example: coin-operated turnstile

State diagram for a turnstile
A turnstile

An example of a simple mechanism that can be modeled by a state machine is a turnstile.[4][5] A turnstile, used to control access to subways and amusement park rides, is a gate with three rotating arms at waist height, one across the entryway. Initially the arms are locked, blocking the entry, preventing patrons from passing through. Depositing a coin or token in a slot on the turnstile unlocks the arms, allowing a single customer to push through. After the customer passes through, the arms are locked again until another coin is inserted.

Considered as a state machine, the turnstile has two possible states: Locked and Unlocked.[4] There are two possible inputs that affect its state: putting a coin in the slot (coin) and pushing the arm (push). In the locked state, pushing on the arm has no effect; no matter how many times the input push is given, it stays in the locked state. Putting a coin in – that is, giving the machine a coin input – shifts the state from Locked to Unlocked. In the unlocked state, putting additional coins in has no effect; that is, giving additional coin inputs does not change the state. A customer pushing through the arms gives a push input and resets the state to Locked.

The turnstile state machine can be represented by a state-transition table, showing for each possible state, the transitions between them (based upon the inputs given to the machine) and the outputs resulting from each input:

Current State   Input   Next State   Output
Locked          coin    Unlocked     Unlocks the turnstile so that the customer can push through.
Locked          push    Locked       None
Unlocked        coin    Unlocked     None
Unlocked        push    Locked       When the customer has pushed through, locks the turnstile.

The turnstile state machine can also be represented by a directed graph called a state diagram (above). Each state is represented by a node (circle). Edges (arrows) show the transitions from one state to another. Each arrow is labeled with the input that triggers that transition. An input that doesn't cause a change of state (such as a coin input in the Unlocked state) is represented by a circular arrow returning to the original state. The arrow into the Locked node from the black dot indicates it is the initial state.
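The same behaviour translates almost directly into code. The following C sketch is illustrative only (the enum and function names are not taken from any particular library); it encodes the two states and two inputs and applies the transition table above:

```c
#include <stdio.h>

typedef enum { LOCKED, UNLOCKED } State;
typedef enum { COIN, PUSH } Input;

/* Apply one transition of the turnstile state machine. */
static State step(State s, Input in) {
    switch (s) {
    case LOCKED:
        return (in == COIN) ? UNLOCKED : LOCKED;   /* push is ignored */
    case UNLOCKED:
        return (in == PUSH) ? LOCKED : UNLOCKED;   /* extra coins are ignored */
    }
    return s;
}

int main(void) {
    State s = LOCKED;                               /* initial state */
    Input sequence[] = { PUSH, COIN, COIN, PUSH };
    for (size_t i = 0; i < sizeof sequence / sizeof sequence[0]; i++) {
        s = step(s, sequence[i]);
        printf("state = %s\n", s == LOCKED ? "Locked" : "Unlocked");
    }
    return 0;
}
```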

Concepts and terminology


A state is a description of the status of a system that is waiting to execute a transition. A transition is a set of actions to be executed when a condition is fulfilled or when an event is received. For example, when using an audio system to listen to the radio (the system is in the "radio" state), receiving a "next" stimulus results in moving to the next station. When the system is in the "CD" state, the "next" stimulus results in moving to the next track. Identical stimuli trigger different actions depending on the current state.

In some finite-state machine representations, it is also possible to associate actions with a state:

  • an entry action: performed when entering the state, and
  • an exit action: performed when exiting the state.

Representations

Fig. 1 UML state chart example (a toaster oven)
Fig. 2 SDL state machine example
Fig. 3 Example of a simple finite-state machine

State/Event table


Several state-transition table types are used. The most common representation is shown below: the combination of current state (e.g. B) and input (e.g. Y) shows the next state (e.g. C). By itself, the table cannot completely describe the action, so it is common to use footnotes. Other related representations may not have this limitation. For example, an FSM definition including the full action's information is possible using state tables (see also virtual finite-state machine).

State-transition table
             Current state
Input        State A    State B    State C
Input X      ...        ...        ...
Input Y      ...        State C    ...
Input Z      ...        ...        ...
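A state/event table of this kind maps naturally onto a two-dimensional array lookup. The sketch below is a hypothetical C encoding of the table above; the "..." cells, which the table leaves unspecified, are filled with self-loops purely as placeholders:

```c
/* Table-driven next-state lookup: rows are inputs, columns are current states,
 * mirroring the layout of the state-transition table above. */
typedef enum { STATE_A, STATE_B, STATE_C, NUM_STATES } State;
typedef enum { INPUT_X, INPUT_Y, INPUT_Z, NUM_INPUTS } Input;

static const State next_state[NUM_INPUTS][NUM_STATES] = {
    /*            A        B        C      */
    /* X */  { STATE_A, STATE_B, STATE_C },   /* "..." entries kept as self-loops */
    /* Y */  { STATE_A, STATE_C, STATE_C },   /* B on input Y goes to C, as in the table */
    /* Z */  { STATE_A, STATE_B, STATE_C },   /* "..." entries kept as self-loops */
};

static State step(State s, Input in) {
    return next_state[in][s];
}
```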

UML state machines


The Unified Modeling Language has a notation for describing state machines. UML state machines overcome the limitations[citation needed] of traditional finite-state machines while retaining their main benefits. UML state machines introduce the new concepts of hierarchically nested states and orthogonal regions, while extending the notion of actions. UML state machines have the characteristics of both Mealy machines and Moore machines. They support actions that depend on both the state of the system and the triggering event, as in Mealy machines, as well as entry and exit actions, which are associated with states rather than transitions, as in Moore machines.[citation needed]

SDL state machines


The Specification and Description Language is a standard from ITU that includes graphical symbols to describe actions in the transition:

  • send an event
  • receive an event
  • start a timer
  • cancel a timer
  • start another concurrent state machine
  • decision

SDL embeds basic data types called "Abstract Data Types", an action language, and an execution semantic in order to make the finite-state machine executable.[citation needed]

Other state diagrams


There are many other variants for representing an FSM, such as the one in figure 3.

Usage


In addition to their use in modeling reactive systems presented here, finite-state machines are significant in many different areas, including electrical engineering, linguistics, computer science, philosophy, biology, mathematics, video game programming, and logic. Finite-state machines are a class of automata studied in automata theory and the theory of computation. In computer science, finite-state machines are widely used in modeling of application behavior (control theory), design of hardware digital systems, software engineering, compilers, network protocols, and computational linguistics.

Classification


Finite-state machines can be subdivided into acceptors, classifiers, transducers and sequencers.[6]

Acceptors

Fig. 4: Acceptor FSM: parsing the string "nice".
Fig. 5: Representation of an acceptor; this example shows one that determines whether a binary number has an even number of 0s, where S1 is an accepting state and S2 is a non accepting state.

Acceptors (also called detectors or recognizers) produce binary output, indicating whether or not the received input is accepted. Each state of an acceptor is either accepting or non accepting. Once all input has been received, if the current state is an accepting state, the input is accepted; otherwise it is rejected. As a rule, input is a sequence of symbols (characters); actions are not used. The start state can also be an accepting state, in which case the acceptor accepts the empty string. The example in figure 4 shows an acceptor that accepts the string "nice". In this acceptor, the only accepting state is state 7.

A (possibly infinite) set of symbol sequences, called a formal language, is a regular language if there is some acceptor that accepts exactly that set.[7] For example, the set of binary strings with an even number of zeroes is a regular language (cf. Fig. 5), while the set of all strings whose length is a prime number is not.[8]

An acceptor could also be described as defining a language that would contain every string accepted by the acceptor but none of the rejected ones; that language is accepted by the acceptor. By definition, the languages accepted by acceptors are the regular languages.

The problem of determining the language accepted by a given acceptor is an instance of the algebraic path problem—itself a generalization of the shortest path problem to graphs with edges weighted by the elements of an (arbitrary) semiring.[9][10][jargon]

An example of an accepting state appears in Fig. 5: a deterministic finite automaton (DFA) that detects whether the binary input string contains an even number of 0s.

S1 (which is also the start state) indicates the state at which an even number of 0s has been input. S1 is therefore an accepting state. This acceptor will finish in an accept state, if the binary string contains an even number of 0s (including any binary string containing no 0s). Examples of strings accepted by this acceptor are ε (the empty string), 1, 11, 11..., 00, 010, 1010, 10110, etc.
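The Fig. 5 acceptor can be encoded directly, with S1 and S2 as the two states and acceptance checked once the whole string has been read. This is a minimal C sketch of that DFA, not tied to any particular library:

```c
#include <stdbool.h>
#include <stdio.h>

/* DFA of Fig. 5: S1 = even number of 0s seen so far (accepting),
 * S2 = odd number of 0s seen so far. A '1' never changes the state. */
static bool accepts_even_zeros(const char *input) {
    enum { S1, S2 } state = S1;            /* S1 is also the start state */
    for (const char *p = input; *p; p++) {
        if (*p == '0')
            state = (state == S1) ? S2 : S1;
        /* '1' leaves the state unchanged */
    }
    return state == S1;                     /* accept iff we end in S1 */
}

int main(void) {
    const char *samples[] = { "", "1", "010", "10110", "0" };
    for (size_t i = 0; i < sizeof samples / sizeof samples[0]; i++)
        printf("%-6s -> %s\n", samples[i],
               accepts_even_zeros(samples[i]) ? "accept" : "reject");
    return 0;
}
```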

Classifiers


Classifiers are a generalization of acceptors that produce n-ary output where n is strictly greater than two.[11]

Transducers

Fig. 6 Transducer FSM: Moore model example
Fig. 7 Transducer FSM: Mealy model example

Transducers produce output based on a given input and/or a state using actions. They are used for control applications and in the field of computational linguistics.

In control applications, two types are distinguished:

Moore machine
The FSM uses only entry actions, i.e., output depends only on state. The advantage of the Moore model is a simplification of the behaviour. Consider an elevator door. The state machine recognizes two commands: "command_open" and "command_close", which trigger state changes. The entry action (E:) in state "Opening" starts a motor opening the door, the entry action in state "Closing" starts a motor in the other direction closing the door. States "Opened" and "Closed" stop the motor when fully opened or closed. They signal to the outside world (e.g., to other state machines) the situation: "door is open" or "door is closed".
Mealy machine
The FSM also uses input actions, i.e., output depends on input and state. The use of a Mealy FSM leads often to a reduction of the number of states. The example in figure 7 shows a Mealy FSM implementing the same behaviour as in the Moore example (the behaviour depends on the implemented FSM execution model and will work, e.g., for virtual FSM but not for event-driven FSM). There are two input actions (I:): "start motor to close the door if command_close arrives" and "start motor in the other direction to open the door if command_open arrives". The "opening" and "closing" intermediate states are not shown.

Sequencers


Sequencers (also called generators) are a subclass of acceptors and transducers that have a single-letter input alphabet. They produce only one sequence, which can be seen as an output sequence of acceptor or transducer outputs.[6]

Determinism


A further distinction is between deterministic (DFA) and non-deterministic (NFA, GNFA) automata. In a deterministic automaton, every state has exactly one transition for each possible input. In a non-deterministic automaton, an input can lead to one, more than one, or no transition for a given state. The powerset construction algorithm can transform any nondeterministic automaton into a (usually more complex) deterministic automaton with identical functionality.

A finite-state machine with only one state is called a "combinatorial FSM". It only allows actions upon transition into a state. This concept is useful in cases where a number of finite-state machines are required to work together, and when it is convenient to consider a purely combinatorial part as a form of FSM to suit the design tools.[12]

Alternative semantics


There are other sets of semantics available to represent state machines. For example, there are tools for modeling and designing logic for embedded controllers.[13] They combine hierarchical state machines (which usually have more than one current state), flow graphs, and truth tables into one language, resulting in a different formalism and set of semantics.[14] These charts, like Harel's original state machines,[15] support hierarchically nested states, orthogonal regions, state actions, and transition actions.[16]

Mathematical model


In accordance with the general classification, the following formal definitions are found.

A deterministic finite-state machine or deterministic finite-state acceptor is a quintuple (Σ, S, s0, δ, F), where:

  • Σ is the input alphabet (a finite non-empty set of symbols);
  • S is a finite non-empty set of states;
  • s0 ∈ S is an initial state, an element of S;
  • δ is the state-transition function: δ: S × Σ → S (in a nondeterministic finite automaton it would be δ: S × Σ → P(S), i.e. δ would return a set of states);
  • F is the set of final states, a (possibly empty) subset of S.

For both deterministic and non-deterministic FSMs, it is conventional to allow δ to be a partial function, i.e. δ(s, x) does not have to be defined for every combination of s ∈ S and x ∈ Σ. If an FSM M is in a state s, the next symbol is x and δ(s, x) is not defined, then M can announce an error (i.e. reject the input). This is useful in definitions of general state machines, but less useful when transforming the machine. Some algorithms in their default form may require total functions.

A finite-state machine has the same computational power as a Turing machine that is restricted such that its head may only perform "read" operations, and always has to move from left to right. That is, each formal language accepted by a finite-state machine is accepted by such a kind of restricted Turing machine, and vice versa.[17]

A finite-state transducer is a sextuple (Σ, Γ, S, s0, δ, ω), where:

  • Σ is the input alphabet (a finite non-empty set of symbols);
  • Γ is the output alphabet (a finite non-empty set of symbols);
  • S is a finite non-empty set of states;
  • s0 ∈ S is the initial state;
  • δ is the state-transition function: δ: S × Σ → S;
  • ω is the output function.

If the output function depends on the state and input symbol (ω: S × Σ → Γ) that definition corresponds to the Mealy model, and can be modelled as a Mealy machine. If the output function depends only on the state (ω: S → Γ) that definition corresponds to the Moore model, and can be modelled as a Moore machine. A finite-state machine with no output function at all is known as a semiautomaton or transition system.

If we disregard the first output symbol of a Moore machine, ω(s0), then it can be readily converted to an output-equivalent Mealy machine by setting the output function of every Mealy transition (i.e. labeling every edge) with the output symbol of the destination Moore state. The converse transformation is less straightforward because a Mealy machine state may have different output labels on its incoming transitions (edges). Every such state needs to be split in multiple Moore machine states, one for every incident output symbol.[18]
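That conversion is mechanical when the machine is stored as tables: the Mealy output for a transition is simply the Moore output of the state the transition enters. A minimal C sketch, using a hypothetical two-state machine:

```c
#include <stdio.h>

/* Hypothetical 2-state Moore machine over a binary input alphabet:
 * delta[q][a] is the next state; moore_out[q] is the output emitted in state q. */
enum { NUM_STATES = 2, NUM_SYMBOLS = 2 };

static const int  delta[NUM_STATES][NUM_SYMBOLS] = { {0, 1}, {1, 0} };
static const char moore_out[NUM_STATES]          = { 'a', 'b' };

int main(void) {
    /* Derive an output-equivalent Mealy machine (ignoring the Moore machine's
     * initial output): label each transition with the output of the state it enters. */
    char mealy_out[NUM_STATES][NUM_SYMBOLS];
    for (int q = 0; q < NUM_STATES; q++)
        for (int a = 0; a < NUM_SYMBOLS; a++)
            mealy_out[q][a] = moore_out[delta[q][a]];

    for (int q = 0; q < NUM_STATES; q++)
        for (int a = 0; a < NUM_SYMBOLS; a++)
            printf("delta(%d,%d) = %d, output %c\n", q, a, delta[q][a], mealy_out[q][a]);
    return 0;
}
```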

Optimization


Optimizing an FSM means finding a machine with the minimum number of states that performs the same function. The fastest known algorithm doing this is the Hopcroft minimization algorithm.[19][20] Other techniques include using an implication table, or the Moore reduction procedure.[21] Additionally, acyclic FSAs can be minimized in linear time.[22]

Implementation


Hardware applications

Fig. 9 The circuit diagram for a 4-bit TTL counter, a type of state machine

In a digital circuit, an FSM may be built using a programmable logic device, a programmable logic controller, logic gates and flip flops or relays. More specifically, a hardware implementation requires a register to store state variables, a block of combinational logic that determines the state transition, and a second block of combinational logic that determines the output of an FSM.

In a Medvedev machine, the output is directly connected to the state flip-flops minimizing the time delay between flip-flops and output.[23][24]

Through state encoding for low power, state machines may be optimized to minimize power consumption.

Software applications


Concepts commonly used to build software applications with finite-state machines include automata-based programming, event-driven finite-state machines, virtual finite-state machines, and the state design pattern.

Finite-state machines and compilers


Finite automata are often used in the frontend of programming language compilers. Such a frontend may comprise several finite-state machines that implement a lexical analyzer and a parser. Starting from a sequence of characters, the lexical analyzer builds a sequence of language tokens (such as reserved words, literals, and identifiers) from which the parser builds a syntax tree. The lexical analyzer and the parser handle the regular and context-free parts of the programming language's grammar.[25]
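As a small illustration of DFA-based scanning, the sketch below groups characters into identifier and integer tokens with a three-state hand-written machine; the token names and structure are illustrative, not taken from any particular compiler:

```c
#include <ctype.h>
#include <stdio.h>

/* A hand-written scanner DFA with three states: between tokens, inside an
 * identifier, and inside an integer literal. Each completed run of characters
 * is reported as one token. */
static void scan(const char *src) {
    enum { START, IN_IDENT, IN_NUMBER } state = START;
    const char *tok = src;

    for (const char *p = src; ; p++) {
        char c = *p;
        switch (state) {
        case START:
            if (isalpha((unsigned char)c))      { state = IN_IDENT;  tok = p; }
            else if (isdigit((unsigned char)c)) { state = IN_NUMBER; tok = p; }
            break;
        case IN_IDENT:
            if (!isalnum((unsigned char)c)) {   /* identifier ended */
                printf("IDENT  %.*s\n", (int)(p - tok), tok);
                state = START;
                p--;                            /* re-examine c in START */
            }
            break;
        case IN_NUMBER:
            if (!isdigit((unsigned char)c)) {   /* number ended */
                printf("NUMBER %.*s\n", (int)(p - tok), tok);
                state = START;
                p--;
            }
            break;
        }
        if (c == '\0') break;
    }
}

int main(void) {
    scan("x1 = 42 + y");
    return 0;
}
```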

from Grokipedia
A finite-state machine (FSM), also known as a finite automaton, is an abstract model of computation consisting of a finite set of states, an input alphabet, a transition function that maps states and inputs to next states, a start state, and a set of accepting states. It processes input strings sequentially, changing states based on each symbol, and accepts the string if it ends in an accepting state, thereby recognizing languages in the Chomsky hierarchy's regular class. This model captures systems with limited memory, where behavior depends only on the current state and input, without retaining unbounded history.

The origins of finite-state machines trace back to early 20th-century efforts in modeling neural activity and logical computation. In 1943, Warren McCulloch and Walter Pitts proposed finite automata as a simplified model of neural networks in their paper "A Logical Calculus of the Ideas Immanent in Nervous Activity," laying foundational work for cybernetics and automata theory. The concept was formalized in automata theory during the 1950s, with George H. Mealy and Edward F. Moore independently developing output-producing variants in 1955–1956, known as Mealy and Moore machines, which extend basic FSMs to produce outputs alongside state transitions. Michael Rabin and Dana Scott provided a rigorous mathematical framework in 1959, proving the equivalence between deterministic finite automata and nondeterministic ones, earning them the 1976 Turing Award for contributions to automata theory.

Finite-state machines come in several variants tailored to specific needs. Deterministic finite automata (DFAs) have a unique transition for each state-input pair, ensuring predictable behavior, while nondeterministic finite automata (NFAs) allow multiple or ε (no-input) transitions, offering more concise representations but equivalent computational power to DFAs via subset construction. Transducer variants like Mealy machines generate outputs on transitions, and Moore machines on states, enabling modeling of reactive systems. FSMs underpin regular expressions, as shown by Stephen Kleene in 1956, linking them to pattern matching in compilers and lexical analyzers.

In practice, finite-state machines are fundamental to computing, modeling everything from simple protocols and vending machines to complex hardware controllers and software parsers. They enable efficient design of digital circuits via state minimization techniques, such as those from the Myhill-Nerode theorem (1957–1958), which equates language distinguishability to state count. Despite their simplicity, FSMs form the basis for more powerful models like pushdown automata, highlighting their role in the hierarchy of computational models.

Introduction

Definition and Motivation

A finite-state machine is an abstract model of computation that represents systems capable of exhibiting sequential behavior through a limited set of conditions known as states. At any given moment, the machine resides in exactly one state, which encapsulates the relevant history of prior inputs, and transitions to another state based on the current input. This structure allows the machine to produce outputs dependent on both the current state and input, providing a framework for modeling processes with finite memory where the system's response to future inputs is determined solely by its present condition rather than an unlimited record of the past.

The concept traces back to Warren McCulloch and Walter Pitts, who in 1943 modeled neural activity using finite automata in their paper "A Logical Calculus of the Ideas Immanent in Nervous Activity," laying foundational work for cybernetics and automata theory; significant formalization followed in the 1950s as part of early efforts in automata theory to model digital circuits and recognize patterns in sequences, such as those encountered in language processing. Stephen Kleene's 1956 work formalized finite automata, building on McCulloch and Pitts' models, and introduced regular expressions as a means to represent events in nerve nets, linking them to the computational capabilities of simple neural models and laying the groundwork for understanding sequential logic in hardware design. Building on this, Michael Rabin and Dana Scott's 1959 paper formalized finite automata for classifying input tapes and solving decision problems, emphasizing their role in theoretical computer science and circuit synthesis.

Key developments in automata theory during the 1950s and 1960s further solidified the finite-state machine's foundations, particularly through its connection to regular languages, the class of patterns recognizable by such machines. Kleene's introduction of regular expressions in his 1956 paper provided an algebraic notation for describing these languages, enabling precise specifications of state transitions and influencing compiler design and text processing. This era's advancements highlighted the model's utility in abstracting real-world systems like communication protocols and control devices, where exhaustive memory is neither necessary nor feasible.

The emphasis on finiteness stems from the model's inherent limitation to a fixed number of states, which bounds the memory and computational power compared to universal models like the Turing machine. Unlike Turing machines, which simulate arbitrary computation via an infinite tape for unbounded storage, finite-state machines are constrained to recognize only regular languages and cannot handle computations requiring indefinite memory, such as those involving nested structures or unbounded counting. This restriction makes them ideal for efficient, predictable implementations in resource-limited environments like embedded systems.

Turnstile Example

A coin-operated turnstile provides a classic illustration of a finite-state machine in operation, modeling access control in environments such as subway stations or amusement parks. The turnstile begins in a locked state, preventing passage until a valid input is received. Inserting a coin unlocks the mechanism, allowing a user to push through and advance to the next position, after which it relocks to require another payment for subsequent entries. This setup demonstrates how the machine's behavior depends on its current state and incoming events, ensuring controlled and predictable responses.

The turnstile FSM features two primary states: locked, where entry is barred, and unlocked, where passage is permitted. The inputs consist of two events: coin, representing payment insertion, and push, simulating the physical attempt to rotate the arms. Outputs are implicit and state-dependent; in the locked state, no passage occurs, while in the unlocked state, the turnstile rotates to allow one person through, effectively granting access. These elements capture the machine's reactive nature without explicit signaling beyond mechanical feedback.

The transitions define how inputs alter states, with invalid inputs typically resulting in no change to maintain security. A coin input in the locked state triggers a transition to unlocked, enabling entry. Conversely, a push input in the unlocked state returns it to locked after allowing passage. If a push occurs in the locked state, the machine remains locked, ignoring the attempt. A coin input in the unlocked state also leaves it unchanged, as payment is unnecessary post-unlock. These rules ensure the FSM only advances on valid sequences, rejecting unauthorized actions.

To handle error conditions, such as repeated invalid pushes or forced entry attempts, the model can be extended with a jammed or violation state for robustness. In this augmented FSM, abnormal events like pushing a locked turnstile or applying excessive force trigger a transition to the violation state, where an alarm activates and the mechanism halts. Recovery occurs via a reset event, returning to the locked state and deactivating the alarm, thus incorporating error handling without disrupting the core logic.

The following table summarizes the transitions for the basic turnstile FSM:
Current State   Input   Next State   Action/Output
Locked          coin    Unlocked     Unlock
Locked          push    Locked       No action (ignore)
Unlocked        push    Locked       Allow passage, relock
Unlocked        coin    Unlocked     No action (ignore)
This representation highlights the deterministic flow, where each state-input pair yields a unique outcome.

Fundamental Concepts

States, Inputs, and Outputs

A finite-state machine (FSM) consists of a finite set of states, denoted Q, which represent the distinct conditions or modes in which the machine can exist at any given time. Exactly one state is active at any moment, capturing the system's configuration or "memory" of past inputs. Among these, there is a designated initial state q0 ∈ Q, from which the machine begins operation upon receiving its first input. For FSMs functioning as acceptors or recognizers, a subset F ⊆ Q identifies the accepting states, where the machine halts after processing an input string if it determines the input is valid according to the defined language.

The inputs to an FSM are drawn from a finite input alphabet Σ, a set of symbols or events that can trigger changes in the machine's state. These inputs serve as the stimuli processed sequentially, with each representing a discrete event or signal that the machine responds to. The term "event" is often used more broadly to encompass any input that causes a state change, including external signals in reactive systems. Synonymous terminology includes "symbol" for individual elements of Σ and "conditions" interchangeably with states in descriptive contexts.

Outputs are relevant primarily in transducer models of FSMs, where the machine not only changes states but also produces responses from an output alphabet Γ. In the Mealy model, outputs are generated as a total function λ: Q × Σ → Γ, depending on both the current state and the input, allowing immediate reaction during transitions. This approach, introduced by George H. Mealy, enables outputs to vary dynamically with incoming stimuli. Conversely, in the Moore model, outputs are determined solely by the current state via a total function λ: Q → Γ, producing consistent responses for each state regardless of the triggering input. This model, proposed by Edward F. Moore, simplifies output logic by associating it directly with states.

Some FSM designs distinguish active states, where outputs or actions occur, from idle states, in which the machine awaits input without generating responses. For illustration, consider a turnstile FSM with states "locked" and "unlocked," where inputs from Σ = {coin, push} drive transitions, and outputs indicate access granted or denied.

Transitions and Events

In finite-state machines (FSMs), transitions define the dynamic behavior by specifying how the machine moves from one state to another in response to inputs. A transition is a mapping that, given the current state and an input, determines the next state and may also produce an output. This mechanism ensures that the FSM evolves predictably based on its configuration, with transitions often triggered synchronously at clock edges in hardware implementations.

In event-driven or reactive systems, transitions are initiated by events, which serve as the inputs prompting state changes. Events can include signals, user actions, or environmental stimuli, and the FSM responds by firing a valid transition if the event is applicable to the current state; otherwise, the event is typically ignored to maintain stability. For instance, in a turnstile system, a "coin" event in the locked state triggers a transition to the unlocked state, while a "push" event in the same state is ignored, preventing unauthorized passage without altering the state.

Nondeterminism introduces the possibility of multiple transitions from the same state for a given input, allowing the FSM to branch to several potential next states, which models uncertainty or parallelism in concurrent systems. This contrasts with deterministic FSMs, where each input uniquely determines the next state, and is explored further in classifications of FSM types.

Sequences of transitions form paths through the state space, enabling complex behaviors such as cycles, where the machine loops back to a previous state to repeat actions. In the turnstile example, a cycle emerges from the unlocked state via a "push" event back to locked, allowing repeated use after each payment and modeling ongoing operation. These paths capture the FSM's response to input sequences, ensuring consistent handling of repetitive or iterative scenarios.

Dead states act as sinks in the FSM, representing error conditions where no further valid transitions are possible, trapping the machine indefinitely regardless of subsequent inputs. They are commonly used for error handling, such as in protocol implementations where invalid sequences lead to a fault state requiring external reset. In design, dead states help isolate failures without disrupting the overall model.

Mathematical Model

Formal Components

A finite-state machine, in its basic form as an acceptor or recognizer, is formally defined as a quintuple M = (Q, Σ, δ, q0, F), where Q is a finite set of states, Σ is a finite input alphabet, δ: Q × Σ → Q is the transition function that maps a current state and an input to a next state, q0 ∈ Q is the initial state, and F ⊆ Q is the set of accepting states. This structure assumes that both Q and Σ are finite, ensuring the machine's computational resources remain bounded, and that δ is a total function, defined for every pair in Q × Σ.

For machines that produce outputs, known as finite-state transducers, the basic tuple is extended with a finite output alphabet Γ and an output function λ. In a Moore machine, outputs depend solely on the current state, with λ: Q → Γ; the full tuple is thus (Q, Σ, Γ, δ, q0, λ), where accepting states F may be omitted if the machine is not used as an acceptor. This model ensures that the output is determined immediately upon entering a state, independent of the triggering input.

In contrast, a Mealy machine associates outputs with transitions, so λ: Q × Σ → Γ; the tuple becomes (Q, Σ, Γ, δ, q0, λ), or equivalently, the transition function can incorporate outputs as δ: Q × Σ → Q × Γ. Here, the output is produced based on both the current state and the input symbol, allowing for more immediate response to inputs but potentially leading to different output timing compared to Moore machines. As with the basic FSM, finiteness of Q, Σ, and Γ is required, and the functions are total for completeness.
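These tuples map directly onto data structures. A minimal C sketch (the type and field names are illustrative, not from any standard library) of a deterministic acceptor and a Mealy-style transducer with dense transition tables:

```c
#include <stdbool.h>

/* Deterministic acceptor M = (Q, Σ, δ, q0, F), with states and symbols
 * numbered 0..n-1 so that δ can be stored as a dense table. */
typedef struct {
    int   num_states;     /* |Q| */
    int   num_symbols;    /* |Σ| */
    int  *delta;          /* δ as a num_states x num_symbols table */
    int   start;          /* q0 */
    bool *accepting;      /* characteristic vector of F */
} Dfa;

/* Mealy transducer: δ and λ share the same (state, symbol) indexing. */
typedef struct {
    int  num_states;
    int  num_symbols;
    int *delta;           /* next state for each (state, symbol) */
    int *lambda;          /* output symbol for each (state, symbol) */
    int  start;
} MealyMachine;

/* One step of δ for the acceptor. */
static inline int dfa_step(const Dfa *m, int state, int symbol) {
    return m->delta[state * m->num_symbols + symbol];
}
```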

Transition Function and Semantics

The transition function, denoted δ, is a central component of a finite-state machine (FSM) that specifies how the machine changes states in response to inputs. For a deterministic FSM (DFA), δ is a total function δ: Q × Σ → Q, mapping each pair of a current state q ∈ Q and an input symbol σ ∈ Σ to a unique next state in Q. In a nondeterministic FSM (NFA), δ is a partial function δ: Q × Σ → 2^Q, where 2^Q is the power set of Q, allowing the machine to transition to a set of possible next states (or none, if partial).

The semantics of an FSM are defined by its computation or run on an input w = σ1 σ2 … σn ∈ Σ*, starting from the initial state q0 ∈ Q. The run proceeds iteratively: the state after the first symbol is q1 = δ(q0, σ1), then q2 = δ(q1, σ2), and so on, yielding a final state qn = δ(q(n−1), σn) after all symbols. For a DFA, the run produces a unique sequence of states; for an NFA, it may produce a set of possible state sequences, with acceptance if at least one path reaches a state in the accepting set F ⊆ Q. The machine accepts w if qn ∈ F.

To handle outputs, FSMs are classified into Moore and Mealy types, which differ in how outputs are generated during a run. In a Moore machine, outputs are associated solely with states, producing an output sequence where each output depends only on the current state (e.g., the output at step k is a function of qk). In a Mealy machine, outputs are associated with state-input pairs, so the output at each step depends on both the current state and the input symbol (e.g., the output for σk is a function of q(k−1) and σk).

The language recognized by an FSM M, denoted L(M), consists of all strings w ∈ Σ* that lead to an accepting state under the extended transition function δ̂: Q × Σ* → 2^Q (or to Q for DFAs), defined recursively as δ̂(q, ε) = {q} and δ̂(q, wσ) = ⋃ { δ(q', σ) : q' ∈ δ̂(q, w) }, with w ∈ L(M) if δ̂(q0, w) ∩ F ≠ ∅. Kleene's theorem establishes the equivalence between the languages recognized by FSMs and those described by regular expressions, showing that FSMs capture exactly the regular languages.

Representations

State Diagrams and Visual Notations

State diagrams provide a graphical representation of finite-state machines (FSMs), visually depicting the state space: states are shown as nodes, typically circles or rounded rectangles, and transitions between states are illustrated as directed edges or arrows. These edges are labeled with the inputs that trigger the transition, and in output-producing FSMs, such as Mealy machines, the labels may also include corresponding outputs, for example, "coin/unlock" to denote an input event and its resulting action. This notation allows designers to intuitively map the sequence of states and events, grounding the abstract components of states, transitions, and the transition function in a comprehensible visual form.

In object-oriented modeling, UML state machines extend traditional state diagrams with advanced features to handle more complex behaviors. States in UML are represented as rounded rectangles, which can contain nested substates for hierarchy, and transitions are arrows annotated with event triggers, guard conditions (Boolean expressions like [x > 0]), and actions (e.g., /setFlag). Entry and exit actions are specified within state compartments, executed automatically upon entering or leaving a state, while history states (denoted by a circle with an "H" for shallow history or "H*" for deep history) allow the machine to resume from the most recent active substate configuration after interruption. These extensions make UML state machines suitable for modeling reactive systems in software engineering.

SDL (Specification and Description Language) state machines, standardized by the ITU for telecommunication systems, employ a graphical notation where states are depicted as rectangular boxes within process diagrams, and transitions are flow lines triggered by input signals or spontaneous events. Channels, represented as lines connecting blocks or processes, define communication paths with unidirectional or bidirectional signal routes, while signals (discrete events with optional parameters) are the primary inputs, consumed from input queues to fire transitions. This structure supports concurrent, distributed systems by modeling agents (e.g., processes) as extended FSMs with explicit interfaces via gates and channels.

Other visual notations related to FSMs include Harel statecharts, which extend state diagrams with hierarchy and orthogonality to manage complexity in reactive systems; states can nest substates, and orthogonal regions allow parallel independent state machines within a superstate, with transitions broadcast across components. Petri nets, while distinct, model concurrency using places (circles), transitions (bars), and tokens, offering a graph-based alternative to FSMs for systems with multiple simultaneous states but differing in their emphasis on token flow rather than strict sequential transitions.

The primary advantages of state diagrams and their variants lie in their intuitiveness for design and review: they facilitate visual inspection of state sequences and transitions, enabling early detection of inconsistencies like conflicting inputs or unreachable states, which simplifies verification in complex FSMs compared to purely textual or tabular forms.

Tabular Representations

Tabular representations provide a structured, non-graphical method to specify finite-state machines (FSMs) by enumerating all possible state transitions in a matrix format, facilitating precise implementation and verification. Unlike visual diagrams, these tables offer an exhaustive listing of behaviors, making them suitable for automated processing and formal analysis.

The state-transition table is the primary tabular form, with rows representing the current states of the FSM and columns corresponding to possible inputs. Each cell in the table specifies the next state and, for output-producing FSMs, the associated output produced upon transition. This format encodes the FSM's transition function directly, allowing for straightforward translation into code or hardware logic, such as ROM-based implementations where the table serves as a lookup mechanism. For deterministic FSMs, each input-state pair yields a unique next state, ensuring the table is complete without ambiguity.

A variant known as the state/event table emphasizes events, discrete occurrences that trigger transitions, over continuous inputs, which is particularly useful in reactive systems where behaviors respond to asynchronous signals like user actions or sensor triggers. In this representation, columns are labeled with events rather than inputs, and cells indicate the next state or no change (often denoted by a dash or a self-reference), along with any actions. This approach highlights event-driven dynamics, making it easier to model systems with sporadic inputs, such as embedded control software.

Tabular representations offer several advantages, including an exhaustiveness that ensures all transitions are explicitly defined, reducing the risk of overlooked behaviors compared to graphical notations. They are compact for FSMs with few states and inputs, enabling easy automation in tools for simulation, verification, and code generation, and they support efficient hardware realization through direct mapping to ROM or lookup tables. Additionally, tables clarify ignored events by including entries for non-effecting cases, promoting clarity in design.

Consider the classic turnstile example, a coin-operated gate with two states: Locked and Unlocked. Inputs are coin insertion and push attempts, with outputs controlling the lock mechanism. The state-transition table for this turnstile is as follows:
Current State   Input   Next State   Output
Locked          coin    Unlocked     unlock
Locked          push    Locked       none
Unlocked        coin    Unlocked     none
Unlocked        push    Locked       lock
This table fully captures the turnstile's behavior: a coin unlocks from Locked, a push locks from Unlocked, and extraneous actions yield no change. Despite their strengths, tabular representations face limitations in scalability for large state spaces, as the table size grows with the product of the number of states and inputs, potentially leading to unwieldy matrices that hinder manual review. Unused states in encoded representations may also require additional logic to handle invalid entries, complicating implementation. For complex FSMs, complementary views like state diagrams may be needed for initial intuition, though tables excel in precision.

Classifications

Deterministic versus Nondeterministic

Finite-state machines can be classified as deterministic or nondeterministic based on how they handle transitions in response to inputs. In a deterministic finite-state machine (DFSM), the transition function δ: Q × Σ → Q is a total function, ensuring that for each state q ∈ Q and input symbol σ ∈ Σ, there is exactly one next state δ(q, σ). This property guarantees a unique computation path for any input sequence, making DFSMs predictable and suitable for implementations requiring unambiguous behavior.

In contrast, a nondeterministic finite-state machine (NFSM) allows multiple possible next states for a given state and input. Its transition function is defined as δ: Q × Σ → 2^Q, where 2^Q is the power set of Q, meaning δ(q, σ) yields a set of possible next states, potentially empty, a singleton, or larger. This nondeterminism models choice or parallelism, where the machine is considered to accept an input if at least one path leads to an accepting state. An extension, the ε-NFSM (or NFSM with epsilon transitions), further allows spontaneous transitions without consuming input, via δ: Q × (Σ ∪ {ε}) → 2^Q, where ε denotes the empty string and enables state changes for modeling spontaneity or optional steps.

Despite these differences, NFSMs and ε-NFSMs are equivalent to DFSMs in expressive power: every NFSM recognizes the same class of languages as some DFSM. This equivalence is established through the powerset construction (or subset construction) algorithm, which converts an NFSM with n states into a DFSM whose states are subsets of the original state set, resulting in up to 2^n states. The construction proceeds by defining the DFSM's initial state as the set of all states reachable from the NFSM's start via ε-closures (if applicable), and its transitions by taking the union of next states under δ followed by ε-closures. While effective, this can lead to exponential state growth in the worst case, highlighting a trade-off between nondeterminism's conciseness in design and determinism's efficiency in simulation.
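For small machines, the subset construction can be implemented compactly by representing each DFA state as a bitmask of NFA states. The C sketch below ignores ε-transitions for brevity, assumes at most 32 NFA states, and assumes the resulting DFA fits in a fixed array; it is illustrative rather than a production implementation:

```c
#include <stdint.h>

enum { MAX_NFA_STATES = 32, NUM_SYMBOLS = 2, MAX_DFA_STATES = 256 };

/* nfa_delta[q][a] is a bitmask of the NFA states reachable from q on symbol a. */
typedef struct {
    int      num_states;
    uint32_t delta[MAX_NFA_STATES][NUM_SYMBOLS];
    uint32_t start;                          /* bitmask containing the start state(s) */
} Nfa;

/* Subset construction: each discovered subset of NFA states becomes one DFA state.
 * Returns the number of DFA states built (assumes it stays below MAX_DFA_STATES). */
static int determinize(const Nfa *nfa, uint32_t dfa_states[MAX_DFA_STATES],
                       int dfa_delta[MAX_DFA_STATES][NUM_SYMBOLS]) {
    int count = 0;
    dfa_states[count++] = nfa->start;

    for (int i = 0; i < count; i++) {
        for (int a = 0; a < NUM_SYMBOLS; a++) {
            uint32_t next = 0;
            for (int q = 0; q < nfa->num_states; q++)
                if (dfa_states[i] & (1u << q))
                    next |= nfa->delta[q][a];   /* union over all member states */

            int j = -1;
            for (int k = 0; k < count; k++)     /* already seen this subset? */
                if (dfa_states[k] == next) { j = k; break; }
            if (j < 0) {                        /* new subset: allocate a DFA state */
                dfa_states[count] = next;
                j = count++;
            }
            dfa_delta[i][a] = j;
        }
    }
    return count;
}
```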

Acceptors and Recognizers

In automata theory, an acceptor, also known as a finite-state acceptor, is a finite-state machine that determines whether a given input string belongs to a specified language by processing the string sequentially and deciding acceptance or rejection based solely on the final state reached. Upon consuming the entire input, the machine accepts the string if it ends in one of a designated set of accepting states; otherwise, it rejects the string. This binary decision mechanism positions acceptors as fundamental models for recognizing patterns in input sequences without producing additional outputs.

The term recognizer is often used synonymously with acceptor, referring to a finite-state machine that computes the language it recognizes, denoted L(M) ⊆ Σ*, where Σ is the input alphabet and M is the machine. Both deterministic and nondeterministic finite-state machines can function as acceptors or recognizers, as the acceptance criterion depends on the structure of states and transitions rather than the nature of the transition function. The language L(M) consists precisely of those strings that lead to an accepting state after processing.

The class of languages recognized by finite-state acceptors is known as the regular languages, which are characterized by their ability to be described by regular expressions or generated by regular grammars. Regular languages exhibit closure properties under key operations: they are closed under union, meaning if L1 and L2 are regular, then L1 ∪ L2 is regular; under intersection, so L1 ∩ L2 is regular; and under complement, so the complement Σ* \ L is regular for any regular L. These properties enable the construction of acceptors for complex languages by combining simpler ones via product constructions or expression manipulations.

A generalization of acceptors known as classifiers extends the binary accept/reject decision to produce n-ary outputs, where n > 2, assigning input strings to one of multiple categories or class labels rather than a simple membership verdict. For instance, in classification tasks, a classifier might label strings as positive or negative based on final states, generalizing the acceptor's role in pattern recognition.

A minimal example of an acceptor is the DFA that recognizes binary strings of even length over the alphabet Σ = {0, 1}. This machine uses two states: an initial state q_e representing even length (accepting) and q_o representing odd length (non-accepting). Transitions alternate between these states on either input 0 or 1: from q_e to q_o, and from q_o to q_e. The accepted language is the set of all even-length strings, such as ε, 01, 10, while odd-length strings like 0 or 11 are rejected.

Transducers and Sequencers

Finite-state transducers extend finite-state machines by associating outputs with transitions, enabling them to map input strings over an input alphabet Σ to output strings over an output alphabet Γ. Formally, a finite-state transducer (FST) is defined as a 5-tuple M = (Q, Σ, Γ, δ, s), where Q is a finite set of states, s ∈ Q is the start state, and δ: Q × Σ → Q × Γ* is the transition function that specifies both the next state and an output string (possibly empty) for each input symbol. This structure allows FSTs to read an input string while simultaneously translating it into an output string, distinguishing them from acceptors, which produce no output beyond acceptance.

FSTs are particularly valuable in computational linguistics for morphological analysis, where they model the mapping between surface forms of words and their underlying structures. For instance, they can parse inflected words by relating input letter sequences to output analyses of stems and affixes, supporting bidirectional operations like recognition and generation. This application traces to foundational work by Kaplan and Kay, who demonstrated that phonological rewrite rules can be compiled into equivalent FSTs, enabling efficient processing of morphological rules as regular relations. Pioneering implementations, such as Koskenniemi's two-level morphology, further leveraged FSTs to handle constraints between lexical and surface representations in finite-state frameworks.

Two prominent subtypes of transducers are Mealy and Moore machines, which differ primarily in the timing of output generation relative to state changes. In a Mealy machine, outputs are produced directly on transitions, depending on both the current state and the input symbol, allowing immediate response to inputs but potentially requiring more complex synchronization in hardware implementations. This model, introduced by Mealy in 1955, facilitates compact designs where outputs reflect transitional behavior. Conversely, a Moore machine generates outputs solely based on the current state, independent of the immediate input, which simplifies output logic but introduces a one-step delay since outputs update only after transitions complete. Moore's 1956 formulation emphasized this state-centric approach for analyzing sequential machine experiments. The choice between them often balances responsiveness against design simplicity; Mealy machines typically require fewer states for equivalent functionality, while Moore machines enhance stability by decoupling outputs from inputs.

Sequencers represent another variant of output-generating finite-state machines, operating autonomously without external inputs to produce predefined output sequences over time or on internal events. These are essentially cyclic or autonomous FSMs where transitions occur based on state alone or fixed timing, generating periodic patterns such as in counters or behavior sequencing in control systems. For example, a simple sequencer might cycle through states to output a repeating binary pattern like 1010, functioning as a Moore machine with an empty input alphabet.

A practical illustration of a transducer is a vending machine that accepts coin inputs and selections to dispense products while returning change. Consider a machine offering peanuts (P, 65¢) or Doritos (D, 45¢), with inputs including quarters (q, 25¢), half-dollars (h, 50¢), and return (R). Starting in state s, input P followed by q, q, h (total 100¢) transitions through states tracking selection and accumulated value, ultimately outputting P upon R while issuing +35 (35¢ change in nickels).
This Mealy-style mapping ensures outputs like product dispensation and change occur on relevant transitions, modeling real-time input-output relations.
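The same transition/output pairing can be seen in a much smaller Mealy-style transducer. The C sketch below revisits the turnstile, attaching the outputs "unlock", "lock", and "none" to transitions exactly as in the transition tables earlier in the article; the names are illustrative:

```c
#include <stdio.h>

typedef enum { LOCKED, UNLOCKED } State;
typedef enum { COIN, PUSH } Input;

/* Mealy-style step: the output depends on both the current state and the
 * input, and is produced on the transition itself. */
static State step(State s, Input in, const char **out) {
    if (s == LOCKED && in == COIN)   { *out = "unlock"; return UNLOCKED; }
    if (s == UNLOCKED && in == PUSH) { *out = "lock";   return LOCKED;   }
    *out = "none";                     /* ignored events leave the state unchanged */
    return s;
}

int main(void) {
    State s = LOCKED;
    Input seq[] = { PUSH, COIN, COIN, PUSH };
    for (size_t i = 0; i < sizeof seq / sizeof seq[0]; i++) {
        const char *out;
        s = step(s, seq[i], &out);
        printf("state %s, output %s\n", s == LOCKED ? "Locked" : "Unlocked", out);
    }
    return 0;
}
```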

Applications

Software and Compiler Design

Finite-state machines are integral to lexical analysis in compiler design, where they facilitate the tokenization of source code by recognizing patterns such as identifiers, literals, and keywords defined via regular expressions. The standard approach begins with converting regular expressions into nondeterministic finite automata (NFAs) using Thompson's construction, which builds the NFA through recursive composition of subexpressions with ε-transitions for concatenation, union, and Kleene star operations. This NFA is then transformed into a deterministic finite automaton (DFA) via the subset construction algorithm, enabling linear-time scanning of the input stream to produce tokens without backtracking.

In syntax analysis, shift-reduce parsers leverage finite-state machines to perform bottom-up parsing, maintaining a stack of symbols and using an underlying DFA to determine actions (shift the next input symbol onto the stack, or reduce by applying a production rule) based on the current state and lookahead token. This FSM-driven control ensures efficient recognition of context-free grammars in parser classes like LR(0) and LALR(1), as implemented in tools such as Yacc or Bison.

Modern software libraries and tools continue to rely on FSMs for text processing in compilers. For instance, Flex, a widely used lexer generator, automates the creation of efficient C-based scanners by compiling user-specified regular expressions into DFAs, optimizing for speed through techniques like table-driven state transitions. An illustrative example is a DFA for keyword recognition in a programming language like C, where states represent partial matches (e.g., after reading 'i' for "if" or "int"), with distinct accepting states triggering token output upon full pattern completion, such as transitioning to accept "if" while rejecting invalid extensions like "iff".
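The "if"/"int" keyword DFA described above can be written out explicitly. The following C sketch is illustrative; states track the longest keyword prefix matched so far, and anything else falls through to an identifier state:

```c
#include <stdio.h>

/* Tiny DFA distinguishing the keywords "if" and "int" from identifiers.
 * Any character that breaks the keyword pattern sends the machine to IDENT. */
typedef enum { START, SAW_I, SAW_IF, SAW_IN, SAW_INT, IDENT } KwState;

static const char *classify(const char *word) {
    KwState s = START;
    for (const char *p = word; *p; p++) {
        char c = *p;
        switch (s) {
        case START:  s = (c == 'i') ? SAW_I : IDENT; break;
        case SAW_I:  s = (c == 'f') ? SAW_IF : (c == 'n') ? SAW_IN : IDENT; break;
        case SAW_IN: s = (c == 't') ? SAW_INT : IDENT; break;
        case SAW_IF:                    /* "iff", "ifx", ... are identifiers */
        case SAW_INT:
            s = IDENT; break;
        case IDENT:  break;
        }
    }
    if (s == SAW_IF)  return "keyword if";
    if (s == SAW_INT) return "keyword int";
    return "identifier";
}

int main(void) {
    const char *words[] = { "if", "int", "iff", "i", "index" };
    for (size_t i = 0; i < sizeof words / sizeof words[0]; i++)
        printf("%-6s -> %s\n", words[i], classify(words[i]));
    return 0;
}
```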

Hardware and Control Systems

Finite-state machines form the basis of sequential circuits in digital hardware, where the current state is stored in flip-flops and transitions are computed using combinational logic gates based on inputs and the present state. Synchronous designs synchronize these elements with a clock signal to ensure stable operation. The number of flip-flops required equals the logarithm base two of the number of states, with combinational logic deriving next-state signals for the flip-flop inputs and output signals.

The standard Huffman model encapsulates this hardware architecture, featuring flip-flops for state memory, combinational logic for next-state and output generation, and a clock for synchronization, often including reset circuitry for initialization. For instance, a four-state controller might use two flip-flops to encode states in binary or Gray code, minimizing logic complexity through adjacent state assignments that share similar transitions. This model supports both Moore machines, where outputs depend solely on the current state for enhanced stability, and Mealy machines, where outputs also incorporate inputs for faster response.

In control systems, finite-state machines orchestrate embedded hardware behaviors, such as in vending machines that transition through states like user selection, waiting for money insertion, product delivery, and servicing to manage multi-item dispensing and auto-billing. Traffic light controllers similarly employ FSMs to alternate between states, for example, north-south green and east-west green, triggered by sensors detecting vehicles and timed to ensure safe intervals of 30 seconds per phase. Moore-style implementations in these controllers promote output stability by tying light signals directly to states, avoiding glitches from input changes.

Network protocols leverage FSMs for reliable communication, as exemplified by the Transmission Control Protocol (TCP), which uses a state machine with phases including LISTEN for incoming connections, SYN-SENT and SYN-RECEIVED for the handshake, ESTABLISHED for data transfer, and closing states like FIN-WAIT-1, CLOSE-WAIT, LAST-ACK, and TIME-WAIT to manage termination and lingering packets. Transitions occur on events such as SYN or FIN segments, user calls like OPEN or CLOSE, or timeouts, ensuring ordered and error-checked octet streams.

Finite-state machines are synthesized into field-programmable gate arrays (FPGAs) and application-specific integrated circuits (ASICs) using hardware description languages like Verilog, where state encodings and transition logic are described in modules that synthesis tools infer into flip-flops and gates during the design flow. This process supports efficient implementation by modeling states, inputs, and outputs in behavioral or structural code, optimized for the target hardware's timing and area constraints.

A practical example is an elevator modeled as an FSM with states such as idle (waiting for requests), moving (ascending or descending), and doors open (loading/unloading passengers), where transitions respond to floor sensors, button presses, and safety interlocks to sequence motor activation and door operations safely. In a simplified two-floor variant, the FSM shifts from a ground state (red light on, green off) to a first-floor state (red off, green on) on an up input, and reverses on a down input, using state-based outputs to drive the indicators.
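The simplified two-floor variant just described can be sketched in software with Moore-style outputs, i.e. the indicator lights depend only on the current state. The state and input names below are illustrative:

```c
#include <stdio.h>

typedef enum { GROUND, FIRST } Floor;
typedef enum { UP, DOWN } Request;

/* Moore-style outputs: the indicator lights depend only on the current state. */
static void show_lights(Floor f) {
    printf("red=%s green=%s\n", f == GROUND ? "on" : "off",
                                f == GROUND ? "off" : "on");
}

static Floor step(Floor f, Request r) {
    if (f == GROUND && r == UP)   return FIRST;
    if (f == FIRST  && r == DOWN) return GROUND;
    return f;                       /* other requests are ignored */
}

int main(void) {
    Floor f = GROUND;
    show_lights(f);
    f = step(f, UP);   show_lights(f);
    f = step(f, DOWN); show_lights(f);
    return 0;
}
```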

Implementation and Optimization

Practical Implementation Techniques

Finite-state machines (FSMs) can be implemented in software using procedural constructs like , which directly encode the current state and transitions based on inputs. In this approach, a variable represents the current state, and a evaluates it to execute state-specific logic and determine the next state. This method is straightforward for simple FSMs, as seen in code where states are enumerated constants and transitions are handled within clock or event loops. For example, in applications, nested manage sequential behaviors like LED blinking patterns. For more modular software implementations, especially in , the State design encapsulates state-specific behaviors in separate classes, allowing the context object to delegate actions and transitions dynamically. This avoids monolithic conditional logic by defining an abstract State interface with methods for handling inputs, which concrete state classes implement. In , enums can extend this by associating each state with transition logic and actions, enabling type-safe FSMs where enum values override methods to process events and return the next state. For instance, an enum-based FSM for order processing might define states like PENDING and SHIPPED, each with a processPayment method that triggers transitions based on conditions. This approach enhances for medium-complexity systems. In hardware design, FSMs are realized using hardware description languages like or , where state registers store the current state as a binary-encoded value, and computes the next state and outputs. The typically involves a synchronous always block sensitive to the clock edge, using a case statement to map current states and inputs to next states, while output logic is derived separately as functions of the state and inputs. For example, in , a 2-bit register holds up to four states, with transitions encoded as assignments within the case to ensure glitch-free operation. This Moore or Mealy model is synthesized into flip-flops and logic gates for FPGA or ASIC deployment. follows a similar with processes for sequential and combinational elements. Specialized tools facilitate FSM implementation in domain-specific contexts, such as parser generation with or , which produce code for shift-reduce parsers based on LALR(1) grammars. These tools construct an underlying where states represent parser configurations, and tables dictate shifts (pushing tokens) or reduces (applying rules) on input symbols, effectively implementing the FSM for syntactic analysis. 's output includes state tables that drive the parsing loop, resolving conflicts via precedence declarations. Similarly, Stateflow in enables graphical FSM modeling for control systems, where charts define hierarchical states, transitions with guards, and actions in or code, integrated with blocks for simulation and code generation. States are drawn with entry/exit/do actions, and handles timing, supporting reusable subcharts for complex behaviors like mode switching in automotive systems. In embedded systems, hybrid FSM implementations combine polling-based state evaluation with interrupt-driven to handle asynchronous inputs efficiently. Here, the main loop polls for state transitions in a super-loop architecture, while interrupts populate event queues that trigger state changes, reducing latency for real-time responses without busy-waiting. This approach suits resource-constrained microcontrollers, where FSMs manage peripherals like sensors, using message queues to differentiate and avoid conflicts. 
In hardware design, FSMs are realized using hardware description languages such as Verilog or VHDL, where a state register stores the current state as a binary-encoded value and combinational logic computes the next state and the outputs. A typical Verilog description uses a synchronous always block sensitive to the clock edge, with a case statement mapping current states and inputs to next states, while the output logic is derived separately as a function of the state and inputs. For example, a 2-bit register holds up to four states, with transitions encoded as assignments within the case statement to ensure glitch-free operation. The resulting Moore or Mealy model is synthesized into flip-flops and logic gates for FPGA or ASIC deployment; VHDL follows a similar structure, with separate processes for the sequential and combinational elements.

Specialized tools facilitate FSM implementation in domain-specific contexts, such as parser generation with yacc or bison, which produce code for shift-reduce parsers based on LALR(1) grammars. These tools construct an underlying automaton whose states represent parser configurations, and tables dictate shifts (pushing tokens) or reductions (applying rules) on input symbols, effectively implementing the FSM that drives syntactic analysis. The generated output includes state tables that drive the parsing loop, with conflicts resolved via precedence declarations. Similarly, Stateflow in MATLAB/Simulink enables graphical FSM modeling for control systems, where charts define hierarchical states, transitions with guards, and actions written in MATLAB or C code, integrated with Simulink blocks for simulation and code generation. States are drawn with entry/exit/do actions, the simulation engine handles timing, and reusable subcharts support complex behaviors such as mode switching in automotive systems.

In embedded systems, hybrid FSM implementations combine polling-based state evaluation with interrupt-driven event handling to process asynchronous inputs efficiently. The main loop polls for state transitions in a super-loop architecture, while interrupts populate event queues that trigger state changes, reducing latency for real-time responses without busy-waiting. This approach suits resource-constrained microcontrollers, where FSMs manage peripherals such as sensors, using message queues to distinguish event sources and avoid conflicts. For instance, timer interrupts can simulate clock ticks for the FSM, blending procedural code with hardware timing (a minimal sketch of this queue-driven dispatch appears at the end of this section).

A key challenge in practical FSM implementations, particularly for large designs, is state explosion, in which the number of states grows exponentially because of concurrent components or intricate interactions, rendering exhaustive enumeration computationally infeasible. This occurs even in modest systems with multiple independent processes, since the state space multiplies combinatorially, complicating verification and synthesis. Mitigation involves hierarchical decomposition or abstraction techniques, but these limit full behavioral analysis.
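The queue-driven hybrid approach can be sketched as follows. Real firmware would typically be written in C with an actual interrupt service routine; in this Java sketch a background thread stands in for the timer interrupt, and the event names and modes are illustrative only.

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.TimeUnit;

public class QueueDrivenFsm {

    /** Hypothetical events; in firmware these would be posted by interrupt handlers. */
    enum Event { TIMER_TICK, STOP }

    enum Mode { IDLE, SAMPLING }

    public static void main(String[] args) throws InterruptedException {
        // The queue decouples asynchronous event sources from the FSM's main loop.
        BlockingQueue<Event> events = new ArrayBlockingQueue<>(16);

        // A thread stands in for a timer interrupt service routine:
        // it posts a tick every 100 ms, then asks the loop to stop.
        Thread timerIsr = new Thread(() -> {
            try {
                for (int i = 0; i < 4; i++) {
                    TimeUnit.MILLISECONDS.sleep(100);
                    events.put(Event.TIMER_TICK);
                }
                events.put(Event.STOP);
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });
        timerIsr.start();

        // Super-loop: take the next event and run the transition function.
        Mode state = Mode.IDLE;
        while (true) {
            Event e = events.take();   // blocks instead of busy-waiting
            if (e == Event.STOP) break;
            state = next(state, e);
            System.out.println(e + " -> " + state);
        }
        timerIsr.join();
    }

    /** Transition function: each timer tick toggles between the two modes. */
    static Mode next(Mode s, Event e) {
        if (e == Event.TIMER_TICK) {
            return (s == Mode.IDLE) ? Mode.SAMPLING : Mode.IDLE;
        }
        return s;
    }
}
```

The blocking queue plays the role of the message queue described above: interrupt-side code only enqueues events, and all state changes happen in one place, which keeps the FSM free of race conditions.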

State Minimization Methods

State minimization in finite-state machines (FSMs) seeks to construct an equivalent machine with the minimal number of states while preserving the recognized language or input–output behavior. This reduces complexity, memory usage, and implementation cost without altering functionality, making it essential for optimizing designs in both software and hardware.

For deterministic finite-state machines (DFSMs), the Myhill–Nerode theorem provides the theoretical foundation, establishing that the minimal DFSM recognizing a given regular language is unique up to isomorphism. The theorem defines state equivalence via the Nerode right congruence: two states p and q are indistinguishable if, for every input string w, the machine started in p accepts w exactly when the machine started in q does. This congruence partitions the states into equivalence classes, each of which becomes a single state of the minimal machine.

The partition refinement algorithm realizes this theorem for DFSMs by iteratively merging indistinguishable states. Originally proposed by Moore, it begins with a coarse partition—typically separating accepting from non-accepting states—and refines it by examining transitions: states in the same block are split if they lead to different blocks under some input symbol. Refinement continues until no further splits occur, yielding the minimal partition; the algorithm runs in O(n²) time for n states (with a fixed alphabet), since each iteration scans the transitions to find distinguishing symbols.

Hopcroft's algorithm improves on this with a divide-and-conquer strategy, achieving O(n log n) running time. It maintains the current partition together with a work list of splitter sets (blocks that can distinguish others) and processes them so that blocks are split only when a non-trivial splitter is found, avoiding exhaustive pairwise comparisons. The improvement is especially significant for large automata, as shown by analyses establishing tight worst-case bounds.

For nondeterministic finite-state machines (NDFSMs), direct minimization is computationally harder (PSPACE-complete), so the standard method is to determinize the NDFSM via the subset construction—potentially incurring an exponential blow-up—and then apply a DFSM minimization technique such as Hopcroft's algorithm. This two-step process produces the minimal deterministic machine rather than a minimal NDFSM, since the resulting DFA's states correspond to subsets of the original states, but it guarantees equivalence with respect to the recognized language.
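A minimal Java sketch of the Moore-style partition refinement described above follows. It assumes a complete DFA given as a transition table over states 0..n−1 and symbols 0..k−1 with only reachable states; the example automaton, state numbering, and output format are illustrative.

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

/** Partition-refinement (Moore-style) DFA minimization sketch. */
public class DfaMinimizer {

    /**
     * Returns block[s] = index of the equivalence class of state s.
     * delta[s][a] is the successor of state s on symbol a; accepting[s] marks accepting states.
     */
    static int[] minimize(int[][] delta, boolean[] accepting) {
        int n = delta.length;
        int k = delta[0].length;

        // Initial partition: accepting vs. non-accepting states.
        int[] block = new int[n];
        for (int s = 0; s < n; s++) block[s] = accepting[s] ? 1 : 0;

        boolean changed = true;
        while (changed) {
            changed = false;
            // Signature of a state: its block plus the blocks of its successors.
            Map<List<Integer>, Integer> signatureToBlock = new HashMap<>();
            int[] newBlock = new int[n];
            for (int s = 0; s < n; s++) {
                List<Integer> sig = new ArrayList<>();
                sig.add(block[s]);
                for (int a = 0; a < k; a++) sig.add(block[delta[s][a]]);
                Integer id = signatureToBlock.get(sig);
                if (id == null) {
                    id = signatureToBlock.size();
                    signatureToBlock.put(sig, id);
                }
                newBlock[s] = id;
            }
            if (!Arrays.equals(block, newBlock)) {
                block = newBlock;
                changed = true;   // states were split; refine again
            }
        }
        return block;
    }

    public static void main(String[] args) {
        // Hypothetical 3-state DFA over {0, 1} accepting strings that end in '1';
        // states 0 and 1 are behaviorally equivalent and get merged.
        int[][] delta = {
            {1, 2},   // state 0
            {0, 2},   // state 1 (redundant copy of state 0)
            {0, 2},   // state 2 (accepting)
        };
        boolean[] accepting = {false, false, true};
        System.out.println(Arrays.toString(minimize(delta, accepting)));  // prints [0, 0, 1]
    }
}
```

Each pass computes a signature per state (its own block and its successors' blocks) and splits blocks whose members disagree, which is exactly the refinement step described above; Hopcroft's algorithm reaches the same partition but chooses its splitters more carefully to reach O(n log n).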

Extensions and Alternatives

Alternative Semantics

Alternative semantics of finite-state machines (FSMs) extend the classical deterministic or nondeterministic models by incorporating probability, timing, or partial membership, thereby relaxing the crisp, all-or-nothing character of transitions while maintaining a finite state space. These variants address limitations in modeling real-world systems whose inputs may be probabilistic, time-dependent, or imprecise, allowing more flexible representations in applications such as pattern recognition and real-time control. Unlike classical FSMs, which assume crisp transitions based on exact input symbols, alternative semantics introduce quantitative measures to handle variability.

Probabilistic finite-state machines (PFSMs), also known as probabilistic automata, assign probabilities to transitions rather than deterministic outcomes, enabling the modeling of stochastic processes. Introduced by Rabin in 1963, PFSMs generalize finite automata by allowing each transition from a state on an input symbol to lead to a next state with an associated probability, where the probabilities from any state on a given input sum to one. These models are foundational to Markov chains and hidden Markov models (HMMs), in which states may be unobserved and transitions reflect probabilistic dependencies; for instance, PFSMs underpin sequence modeling in speech recognition and bioinformatics by estimating the likelihood of input sequences. A key difference from classical FSMs is the shift from acceptance based on the existence of an accepting path to acceptance via probability thresholds, such as deeming a string accepted if the total probability of reaching an accepting state exceeds a predefined value.

Timed automata represent another extension, integrating clock variables that constrain transitions based on elapsed time, making them suitable for specifying real-time behavior in embedded systems. Formulated by Alur and Dill in 1994, a timed automaton augments a standard FSM with dense real-valued clocks that evolve uniformly and can be reset or tested against integer constants during transitions. This semantics relaxes the purely input-driven timing of classical FSMs by enforcing temporal guards, such as requiring a transition to fire only after a clock exceeds 5 units but before it reaches 10, enabling verification of properties like bounded response times in protocols. Unlike purely discrete classical models, timed automata capture continuous time while preserving decidability of reachability through region graphs.

Fuzzy finite-state machines (FFSMs) incorporate fuzzy set theory to give states and transitions degrees of membership between 0 and 1, accommodating imprecise or uncertain environments. Pioneered by Wee and Fu in 1969, FFSMs define transition functions that output fuzzy subsets of states, reflecting partial activations rather than binary choices. This differs from classical FSMs by replacing deterministic or probabilistic selections with graded possibilities, often combined via max–min composition of fuzzy relations, which is useful in control systems for handling noisy sensor data. For example, an FFSM might assign membership degrees to states based on partial matches in noisy input, such as identifying distorted handwriting where classical models fail because they require exact symbols. Overall, these semantics retain finiteness but broaden applicability by softening the boundaries of state changes.
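The threshold-acceptance semantics of probabilistic FSMs can be illustrated with a short sketch that propagates a distribution over states through per-symbol transition matrices. The two-state machine, its transition probabilities, and the 0.5 threshold below are made-up values chosen purely for illustration.

```java
/** Minimal probabilistic FSM sketch: compute the probability that a string is accepted. */
public class ProbabilisticFsm {

    /**
     * transition[a][i][j] = probability of moving from state i to state j on symbol a;
     * each row transition[a][i] must sum to 1. The result is the probability mass on
     * accepting states after processing the whole input.
     */
    static double acceptanceProbability(double[][][] transition, double[] initial,
                                        boolean[] accepting, int[] input) {
        double[] dist = initial.clone();
        for (int symbol : input) {
            double[] next = new double[dist.length];
            for (int i = 0; i < dist.length; i++)
                for (int j = 0; j < dist.length; j++)
                    next[j] += dist[i] * transition[symbol][i][j];
            dist = next;
        }
        double p = 0.0;
        for (int s = 0; s < dist.length; s++) if (accepting[s]) p += dist[s];
        return p;
    }

    public static void main(String[] args) {
        // Hypothetical two-state PFSM over the alphabet {0, 1}; state 1 is accepting.
        double[][][] transition = {
            { {0.9, 0.1}, {0.4, 0.6} },   // matrices for symbol 0
            { {0.2, 0.8}, {0.1, 0.9} },   // matrices for symbol 1
        };
        double[] initial = {1.0, 0.0};    // start in state 0 with certainty
        boolean[] accepting = {false, true};

        double p = acceptanceProbability(transition, initial, accepting, new int[]{1, 0, 1});
        System.out.println("P(accept) = " + p + "; accepted at threshold 0.5: " + (p > 0.5));
    }
}
```

The string is deemed accepted when the computed probability exceeds the chosen cut-point, which is exactly the threshold acceptance described above; with a cut-point of 0 and 0/1 probabilities the model degenerates to an ordinary nondeterministic FSM.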

Beyond Finite States

Finite-state machines (FSMs) occupy the lowest level of the Chomsky hierarchy, corresponding to Type-3 grammars, which generate the regular languages. These models are limited to recognizing patterns without memory beyond a finite number of states, as formalized in early work on linguistic structures.

A key limitation of FSMs is their inability to recognize non-regular languages such as {a^n b^n | n ≥ 0}, which requires matching equal numbers of symbols and therefore unbounded counting. This is proven with the pumping lemma, which states that any sufficiently long string of a regular language can be divided into segments such that a middle segment can be repeated any number of times without leaving the language; applying it to a^n b^n with n larger than the pumping length produces strings outside the set, contradicting regularity (a worked version of this argument is sketched below).

To overcome such limitations, pushdown automata extend FSMs with an unbounded stack, enabling recognition of context-free languages (Type-2 in the Chomsky hierarchy) such as balanced parentheses or programming-language syntax. The stack allows the model to handle nested or recursive structure that exceeds finite memory. Hierarchical state machines, exemplified by Statecharts, introduce nested states and concurrency to manage complexity in large systems, allowing substates and concurrent behaviors without an exponential explosion of the flat state space; developed as a visual extension of traditional FSMs, they facilitate modeling of reactive systems such as user interfaces or embedded controllers. In stochastic settings, infinite-state Markov chains generalize FSMs by permitting countably infinite state spaces, modeling processes such as queueing systems or random walks in which the state space grows without bound while the Markov property of memoryless transitions is preserved; they serve as limits of finite models when analyzing the long-term behavior of probabilistic systems.

Extensions beyond finite states become necessary when applications involve nested dependencies, unbounded memory requirements, or infinitely many configurations, for example when parsing context-free grammars or simulating unbounded stochastic phenomena.
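The pumping-lemma argument referred to above can be made explicit. The following sketch, written in LaTeX notation, fixes a pumping length p and derives the contradiction; the choice of witness string a^p b^p is the standard one.

```latex
% Sketch: L = { a^n b^n : n >= 0 } is not regular, by the pumping lemma.
% Suppose L were regular with pumping length p, and take s = a^p b^p \in L.
% The lemma gives a decomposition
\[
  s = xyz, \qquad |xy| \le p, \qquad |y| \ge 1,
\]
% with x y^i z \in L for every i \ge 0. Since |xy| \le p, the segment y lies
% entirely within the leading a's, so y = a^k for some k \ge 1. Pumping once more gives
\[
  x y^{2} z \;=\; a^{\,p+k}\, b^{\,p} \notin L \qquad \text{because } p + k \ne p,
\]
% which contradicts the lemma. Hence L is not regular, and no FSM recognizes it.
```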
