Generator (computer programming)
from Wikipedia

In computer science, a generator is a routine that can be used to control the iteration behaviour of a loop. All generators are also iterators.[1] A generator is very similar to a function that returns an array, in that a generator has parameters, can be called, and generates a sequence of values. However, instead of building an array containing all the values and returning them all at once, a generator yields the values one at a time, which requires less memory and allows the caller to get started processing the first few values immediately. In short, a generator looks like a function but behaves like an iterator.

Generators can be implemented in terms of more expressive control flow constructs, such as coroutines or first-class continuations.[2] Generators, also known as semicoroutines,[3] are a special case of (and weaker than) coroutines, in that they always yield control back to the caller (when passing a value back), rather than specifying a coroutine to jump to; see comparison of coroutines with generators.

Uses

Generators are usually invoked inside loops.[4] The first time that a generator invocation is reached in a loop, an iterator object is created that encapsulates the state of the generator routine at its beginning, with arguments bound to the corresponding parameters. The generator's body is then executed in the context of that iterator until a special yield action is encountered; at that time, the value provided with the yield action is used as the value of the invocation expression. The next time the same generator invocation is reached in a subsequent iteration, the execution of the generator's body is resumed after the yield action, until yet another yield action is encountered. In addition to the yield action, execution of the generator body can also be terminated by a finish action, at which time the innermost loop enclosing the generator invocation is terminated. In more complicated situations, a generator may be used manually outside of a loop to create an iterator, which can then be used in various ways.
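
As a concrete illustration of this protocol, here is a minimal Python sketch (the generator and names are illustrative, not part of the cited description):

def naturals_up_to(limit):
    n = 1
    while n <= limit:
        yield n      # the yield action: suspend and hand a value to the caller
        n += 1

# Used inside a loop: each iteration resumes the generator after its last yield.
for value in naturals_up_to(3):
    print(value)     # prints 1, 2, 3

# Used manually outside a loop:
it = naturals_up_to(3)   # creates the iterator; the body has not run yet
print(next(it))          # runs to the first yield: 1
print(next(it))          # resumes after that yield: 2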

Because generators compute their yielded values only on demand, they are useful for representing streams, such as sequences that would be expensive or impossible to compute at once. Examples include infinite sequences and live data streams.

When eager evaluation is desirable (primarily when the sequence is finite, as otherwise evaluation will never terminate), one can either convert to a list, or use a parallel construction that creates a list instead of a generator. For example, in Python a generator g can be evaluated to a list l via l = list(g), while in F# the sequence expression seq { ... } evaluates lazily (a generator or sequence) but [ ... ] evaluates eagerly (a list).
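
For instance, in Python the conversion looks like this (a small sketch; the particular generator is illustrative):

g = (n * n for n in range(5))  # a generator expression: lazy
l = list(g)                    # forces eager evaluation: [0, 1, 4, 9, 16]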

In the presence of generators, loop constructs of a language – such as for and while – can be reduced into a single loop ... end loop construct; all the usual loop constructs can then be comfortably simulated by using suitable generators in the right way. For example, a ranged loop like for x = 1 to 10 can be implemented as iteration through a generator, as in Python's for x in range(1, 11) (the upper bound of range is exclusive). Further, break can be implemented as sending finish to the generator and then using continue in the loop.
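
A minimal Python sketch of that break simulation, using the generator's close() method as the finish action (mapping close() to "finish" is an assumption made for illustration):

def count(lo, hi):
    i = lo
    while i <= hi:
        yield i
        i += 1

g = count(1, 10)
for x in g:
    print(x)
    if x == 5:
        g.close()  # "finish": the generator raises StopIteration on the next
                   # advance, so the loop terminates as if by break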

Languages providing generators

Generators first appeared in CLU (1975),[5] were a prominent feature in the string manipulation language Icon (1977), and are now available in Python (2001),[6] C#,[7] Ruby, PHP,[8] ECMAScript (as of ES6/ES2015), and other languages. In CLU and C#, generators are called iterators; in Ruby, enumerators.

Lisp

The final Common Lisp standard does not natively provide generators, yet various library implementations exist, such as SERIES documented in CLtL2 or pygen.

CLU

A yield statement is used to implement iterators over user-defined data abstractions.[9]

string_chars = iter (s: string) yields (char);
  index: int := 1;
  limit: int := string$size (s);
  while index <= limit do
    yield (string$fetch(s, index));
    index := index + 1;
    end;
end string_chars;

for c: char in string_chars(s) do
   ...
end;

Icon

Every expression (including loops) is a generator. The language has many generators built-in and even implements some of the logic semantics using the generator mechanism (logical disjunction or "OR" is done this way).

Printing squares from 0 to 20 can be achieved using a co-expression by writing:

   local squares, j
   squares := create (seq(0) ^ 2)
   every j := |@squares do
      if j <= 20 then
         write(j)
      else
         break

However, most of the time custom generators are implemented with the "suspend" keyword which functions exactly like the "yield" keyword in CLU.

C

C does not have generator functions as a language construct, but, as they are a subset of coroutines, it is simple to implement them using any framework that implements stackful coroutines, such as libdill.[10] On POSIX platforms, when the cost of context switching per iteration is not a concern, or full parallelism rather than merely concurrency is desired, a very simple generator function framework can be implemented using pthreads and pipes.

C++

It is possible to introduce generators into C++ using pre-processor macros. The resulting code may have aspects that differ markedly from native C++, but the generator syntax can be very uncluttered.[11] The set of pre-processor macros defined in the cited source allows generators to be defined with syntax as in the following example:

Generator.hpp:

#pragma once

$generator(Descent) {
    // place the constructor of our generator, e.g. 
    // Descent(int minv, int maxv) { ... }
   
    // from $emit to $stop is a body of our generator:
    
    $emit(int) // will emit int values. Start of body of the generator.
        for (int i = 10; i > 0; --i) {
            $yield(i); // similar to yield in Python, returns next number in [1..10], reversed.
        }
    $stop; // stop, end of sequence. End of body of the generator.
};

This can then be iterated using:

import std;
import "Generator.hpp";

int main(int argc, char* argv[]) {
    Descent gen;
    // "get next" generator invocation
    for (int n; gen(n);) {
        std::println("next number is {}", n);
    }
    return 0;
}

Moreover, C++11's range-based for loops can be applied to any class that provides the begin and end functions. It is then possible to write generator-like classes by defining both the iterable methods (begin and end) and the iterator methods (operator!=, operator++ and operator*) in the same class. For example, it is possible to write the following program:

import std;

int main() {
    for (int i: Range(10)) {
        std::println("{}", i);
    }
    return 0;
}

A basic range implementation looks like this:

class Range {
private:
    int last;
    int iter;
public:
    explicit Range(int end):
        last{end}, iter{0} {}

    // Iterable functions
    const Range& begin() const { 
        return *this; 
    }

    const Range& end() const { 
        return *this; 
    }

    // Iterator functions
    bool operator!=(const Range&) const { 
        return iter < last; 
    }

    void operator++() { 
        ++iter; 
    }

    int operator*() const { 
        return iter; 
    }
};

Furthermore, C++20 formally introduced support for coroutines,[12] which can be used to implement generators.[13] C++23 introduced std::generator[14] in the standard library, making it much easier to implement generators. For example, a basic range generator can be implemented as:

import std;

using std::generator;

generator<int> range(int n) {
    for (int i = 0; i < n; ++i) {
        co_yield i;
    }
}

Using the range generator defined above, it can be iterated with a range-based for loop:

int main() {
    for (int i: range(10)) {
        std::println("{}", i);
    }
    return 0;
}

C#

An example C# generator (yield has been available since C# 2.0). Both of the examples below use generics, but this is not required. The yield keyword also helps in implementing custom stateful iteration over a collection.[15]

// Method that takes an iterable input (possibly an array)
// and returns all even numbers.
public static IEnumerable<int> GetEven(IEnumerable<int> numbers)
{
    foreach (int number in numbers)
    {
        if ((number % 2) == 0)
        {
            yield return number;
        }
    }
}

It is possible to use multiple yield return statements; they are executed in sequence as the enumerator advances:

public class CityCollection : IEnumerable<string>
{
    public IEnumerator<string> GetEnumerator()
    {
        yield return "New York";
        yield return "Paris";
        yield return "London";
    }

    // Non-generic counterpart required by IEnumerable<string>
    // (needs a using System.Collections; directive)
    IEnumerator IEnumerable.GetEnumerator() => GetEnumerator();
}

Perl

Perl does not natively provide generators, but support is provided by the Coro::Generator module which uses the Coro co-routine framework. Example usage:

use strict;
use warnings;
# Enable generator { BLOCK } and yield
use Coro::Generator;
# Array reference to iterate over
my $chars = ['A'..'Z'];

# New generator which can be called like a coderef.
my $letters = generator {
    my $i = 0;
    for my $letter (@$chars) {
        # get next letter from $chars
        yield $letter;
    }
};

# Call the generator 16 times (once per iteration over 0..15).
print $letters->(), "\n" for (0..15);

Raku

The following example, parallel to the Icon example above, uses the Raku (formerly Perl 6) Range class as one of several ways to achieve generators in the language.

Printing squares from 0 to 20 can be achieved by writing:

for (0 .. *).map(* ** 2) -> $i {
    last if $i > 20;
    say $i
}

However, most of the time custom generators are implemented with "gather" and "take" keywords in a lazy context.

Tcl

In Tcl 8.6, the generator mechanism is founded on named coroutines.

proc generator {body} {
    coroutine gen[incr ::disambiguator] apply {{script} {
        # Produce the result of [generator], the name of the generator
        yield [info coroutine]
        # Do the generation
        eval $script
        # Finish the loop of the caller using a 'break' exception
        return -code break
    }} $body
}

# Use a simple 'for' loop to do the actual generation
set count [generator {
    for {set i 10} {$i <= 20} {incr i} {
        yield $i
    }
}]

# Pull values from the generator until it is exhausted
while 1 {
    puts [$count]
}

Haskell

In Haskell, with its lazy evaluation model, every datum created with a non-strict data constructor is generated on demand. For example,

countFrom :: Integer -> [Integer]
countFrom n = n : countFrom (n + 1)

from10to20 :: [Integer]
from10to20 = takeWhile (<= 20) $ countFrom 10

primes :: [Integer]
primes = 2 : 3 : nextPrime 5
  where
    nextPrime n
        | notDivisible n = n : nextPrime (n + 2)
        | otherwise = nextPrime (n + 2)
    notDivisible n =
        all ((/= 0) . (rem n)) $ takeWhile ((<= n) . (^ 2)) $ tail primes

where (:) is a non-strict list constructor, cons, and $ is just a "called-with" operator, used for parenthesization. This uses the standard adaptor function,

takeWhile p [] = []
takeWhile p (x:xs) | p x = x : takeWhile p xs
                   | otherwise = []

which walks down the list and stops on the first element that does not satisfy the predicate. If the list has already been traversed up to that point, the evaluated prefix is just a strict data structure; any part that has not yet been traversed is generated on demand. List comprehensions can be freely used:

squaresUnder20 = takeWhile (<= 20) [x * x | x <- countFrom 10]
squaresForNumbersUnder20 = [x * x | x <- takeWhile (<= 20) $ countFrom 10]

Racket

Racket provides several related facilities for generators. First, its for-loop forms work with sequences, which are a kind of producer:

(for ([i (in-range 10 20)])
  (printf "i = ~s\n" i))

and these sequences are also first-class values:

(define 10-to-20 (in-range 10 20))
(for ([i 10-to-20])
  (printf "i = ~s\n" i))

Some sequences are implemented imperatively (with private state variables) and some are implemented as (possibly infinite) lazy lists. Also, new struct definitions can have a property that specifies how they can be used as sequences.

But more directly, Racket comes with a generator library for a more traditional generator specification. For example,

#lang racket
(require racket/generator)
(define (ints-from from)
  (generator ()
    (for ([i (in-naturals from)]) ; infinite sequence of integers from 0
      (yield i))))
(define g (ints-from 10))
(list (g) (g) (g)) ; -> '(10 11 12)

Note that the Racket core implements powerful continuation features, providing general (re-entrant) continuations that are composable, and also delimited continuations. Using this, the generator library is implemented in Racket.

PHP

[Figure: UML class diagram of the inheritance chain of the Generator class in PHP]

Generators were added to PHP in version 5.5. Details can be found in the original Request for Comments: Generators.

Infinite Fibonacci sequence:

function fibonacci(): Generator
{
    $last = 0;
    $current = 1;
    yield 1;
    while (true) {
        $current = $last + $current;
        $last = $current - $last;
        yield $current;
    }
}

foreach (fibonacci() as $number) {
    echo $number, "\n";
}

Fibonacci sequence with limit:

function fibonacci(int $limit): Generator 
{
    yield $a = $b = $i = 1;
 
    while (++$i < $limit) {
        yield $a = ($b = $a + $b) - $a;
    }
}

foreach (fibonacci(10) as $number) {
    echo "$number\n";
}

Any function which contains a yield statement is automatically a generator function.

Ruby

Ruby supports generators (starting from version 1.9) in the form of the built-in Enumerator class.

# Generator from an existing array
# (Enumerator.new(object) is deprecated; use to_enum instead)
chars = ['A', 'B', 'C', 'Z'].to_enum

4.times { puts chars.next }

# Generator from a block
count = Enumerator.new do |yielder|
  i = 0
  loop { yielder.yield i += 1 }
end

100.times { puts count.next }

Java

Java has had a standard interface for implementing iterators since its early days, and since Java 5 the "foreach" construction makes it easy to loop over objects that provide the java.lang.Iterable interface. (The Java collections framework and other collections frameworks typically provide iterators for all collections.)

record Pair(int a, int b) {};

Iterable<Integer> myIterable = Stream.iterate(new Pair(1, 1), p -> new Pair(p.b, p.a + p.b))
        .limit(10)
        .map(p -> p.a)::iterator;

myIterable.forEach(System.out::println);

Alternatively, an Iterator can be obtained from BaseStream, the Java 8 super-interface of the Stream interface.

record Pair(int a, int b) {};

// Save the iterator of a stream that generates fib sequence
Iterator<Integer> myGenerator = Stream
        // Generates Fib sequence
        .iterate(new Pair(1, 1), p -> new Pair(p.b, p.a + p.b))
        .map(p -> p.a).iterator();

// Print the first 5 elements
for (int i = 0; i < 5; i++) {
    System.out.println(myGenerator.next());
}

System.out.println("done with first iteration");

// Print the next 5 elements
for (int i = 0; i < 5; i++) {
    System.out.println(myGenerator.next());
}

Output:

1
1
2
3
5
done with first iteration
8
13
21
34
55

XL

In XL, iterators are the basis of 'for' loops:

import IO = XL.UI.CONSOLE

iterator IntegerIterator (var out Counter : integer; Low, High : integer) written Counter in Low..High is
    Counter := Low
    while Counter <= High loop
        yield
        Counter += 1

// Note that I need not be declared, because it is declared 'var out' in the iterator.
// An implicit declaration of I as an integer is therefore made here.
for I in 1..5 loop
    IO.WriteLn "I=", I

F#

F# provides generators via sequence expressions, since version 1.9.1.[16] These can define a sequence (lazily evaluated, sequential access) via seq { ... }, a list (eagerly evaluated, sequential access) via [ ... ] or an array (eagerly evaluated, indexed access) via [| ... |] that contain code that generates values. For example,

seq { for b in 0 .. 25 do
          if b < 15 then
              yield b * b }

forms a sequence of squares of numbers from 0 to 14 by filtering out numbers from the range of numbers from 0 to 25.

Python

Generators were added to Python in version 2.2 in 2001.[6] An example generator:

from typing import Iterator

def countfrom(n: int) -> Iterator[int]:
    while True:
        yield n
        n += 1

# Example use: printing out the integers from 10 to 20.
# Note that this iteration terminates normally, despite
# countfrom() being written as an infinite loop.

for i in countfrom(10):
    if i <= 20:
        print(i)
    else:
        break

# Another generator, which produces prime numbers indefinitely as needed.
import itertools

def primes() -> Iterator[int]:
    """Generate prime numbers indefinitely as needed."""
    yield 2
    n = 3
    p = [2]
    while True:
        # If dividing n by all the numbers in p, up to and including sqrt(n),
        # produces a non-zero remainder then n is prime.
        if all(n % f > 0 for f in itertools.takewhile(lambda f: f * f <= n, p)):
            yield n
            p.append(n)
        n += 2

In Python, a generator can be thought of as an iterator that contains a frozen stack frame. Whenever next() is called on the iterator, Python resumes the frozen frame, which executes normally until the next yield statement is reached. The generator's frame is then frozen again, and the yielded value is returned to the caller.
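
This frozen-frame view can be observed directly with the standard inspect module (a small sketch; the generator is illustrative):

import inspect

def gen():
    yield 1
    yield 2

g = gen()
print(inspect.getgeneratorstate(g))  # GEN_CREATED: the frame exists but has not started
next(g)
print(inspect.getgeneratorstate(g))  # GEN_SUSPENDED: frozen at the first yield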

PEP 380 (implemented in Python 3.3) adds the yield from expression, allowing a generator to delegate part of its operations to another generator or iterable.[17]
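
A brief sketch of yield from delegation (the functions are illustrative):

def inner():
    yield 1
    yield 2

def outer():
    yield 0
    yield from inner()   # delegates: values from inner() pass straight through
    yield 3

print(list(outer()))     # [0, 1, 2, 3]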

Generator expressions

Python has a syntax modeled on that of list comprehensions, called a generator expression, which aids in the creation of generators. The following extends the first example above by using a generator expression to compute squares from the itertools.count() generator function:

import itertools
from typing import Generator

squares: Generator[int, None, None] = (n * n for n in itertools.count(2))

for j in squares:
    if j <= 20:
        print(j)
    else:
        break

ECMAScript

ECMAScript 6 (a.k.a. Harmony) introduced generator functions.

An infinite Fibonacci sequence can be written using a generator function:

function* fibonacci(limit) {
    let [prev, curr] = [0, 1];
    while (!limit || curr <= limit) {
        yield curr;
        [prev, curr] = [curr, prev + curr];
    }
}

// bounded by upper limit 10
for (const n of fibonacci(10)) {
    console.log(n);
}

// generator without an upper bound limit
for (const n of fibonacci()) {
    console.log(n);
    if (n > 10000) break;
}

// manually iterating
let fibGen = fibonacci();
console.log(fibGen.next().value); // 1
console.log(fibGen.next().value); // 1
console.log(fibGen.next().value); // 2
console.log(fibGen.next().value); // 3
console.log(fibGen.next().value); // 5
console.log(fibGen.next().value); // 8

// picks up from where you stopped
for (const n of fibGen) {
    console.log(n);
    if (n > 10000) break;
}

R

The iterators package can be used for this purpose.[18][19]

library(iterators)

# Example ------------------
abc <- iter(c('a','b','c'))
nextElem(abc)

Smalltalk

Example in Pharo Smalltalk:

The golden ratio generator below returns, on each invocation of 'goldenRatio next', a better approximation to the golden ratio.

goldenRatio := Generator on: [ :g | | x y z r | 
	x := 0.
	y := 1.
	[  
		z := x + y.
		r := (z / y) asFloat.
		x := y.
		y := z.
		g yield: r
	] repeat	
].

goldenRatio next.

The expression below returns the next 10 approximations.

Character cr join: ((1 to: 10) collect: [ :dummy | goldenRatio next ]).

See more in A hidden gem in Pharo: Generator.

from Grokipedia
In computer programming, a generator is a special type of function that produces a sequence of values lazily, yielding one value at a time upon request rather than computing and returning an entire collection upfront, which facilitates memory-efficient processing and supports potentially infinite sequences. This mechanism relies on suspending and resuming the function's execution at designated yield points, preserving its state between calls, and typically implements an iterator protocol to integrate with loops and other constructs.

The concept of generators originated in the Icon programming language, developed in the late 1970s at the University of Arizona by Ralph E. Griswold and colleagues, where expressions can produce multiple results through goal-directed evaluation and built-in generators like i to j for integer ranges or !x for iterating over collections. Icon's generators were deeply integrated, allowing any expression to act as a generator and supporting backtracking for multiple solutions, influencing subsequent languages.

Generators gained widespread adoption starting with Python in version 2.2 (released December 2001), introduced via PEP 255 as a simpler alternative to manual iterator classes, using the yield keyword to create generator functions that return iterator objects. In Python, calling a generator function produces a generator object that advances via next(), suspending at each yield while maintaining local variables, and raising StopIteration when exhausted. This feature simplifies writing custom iterables for tasks like data streaming or infinite series computation.

Other major languages adopted similar constructs: JavaScript introduced generators in ECMAScript 2015 (ES6) with function* syntax and yield, enabling on-demand value production and integration with for...of loops, with broad browser support by 2016. C# added yield return in version 2.0 (2005) for creating enumerators in iterators, supporting lazy evaluation in LINQ queries. PHP implemented generators in version 5.5 (2013) using yield for traversable objects. These implementations emphasize efficiency for large datasets, avoiding the overhead of building full lists in memory.

Beyond memory savings, generators promote lazy evaluation and composability, allowing pipelines of transformations (e.g., via Python's itertools module for iterator algebra) and handling asynchronous flows in extensions like JavaScript's async generators. They are particularly valuable in stream-oriented paradigms for tasks involving data streams, simulations, or paginated API responses, though they require careful handling of state and exceptions to avoid subtle bugs.

Fundamentals

Definition

In computer programming, a generator is a special type of function that can pause its execution and resume from the point of interruption, allowing it to produce a sequence of values over multiple invocations rather than returning a single result immediately. Generators enable the creation of iterable objects that yield values on demand, typically through a mechanism like the yield keyword, which suspends the function's execution while preserving its local state for future resumptions. This on-demand yielding supports lazy computation, where values are generated only when requested, often via iterator protocols in languages that implement them. Unlike regular functions, which execute fully upon invocation and terminate after returning a value, generators maintain internal state across suspensions and resumptions, effectively turning the function into a stateful iterator that can produce multiple values sequentially. This distinction allows generators to handle ongoing computations without recomputing from scratch each time. For example, a simple generator function that yields successive integers starting from 0 can be expressed in pseudocode as follows:

function infinite_integers():
    i = 0
    while true:
        yield i
        i = i + 1

Calling this generator produces values like 0, 1, 2, and so on, one at a time upon each request.

Key Characteristics

Generators exhibit several defining behavioral traits that distinguish them from conventional functions, primarily through their ability to pause and resume execution. Unlike standard functions, which run to completion upon invocation and return a single value, generators produce a sequence of values over time via the yield mechanism, where execution suspends at a yield point and resumes from that exact location on subsequent calls.

One fundamental characteristic is lazy evaluation, whereby values are computed and yielded only when explicitly requested by the consumer, such as through a call to next(). This on-demand evaluation prevents the upfront computation of entire result sets, thereby avoiding unnecessary processing for sequences that may not be fully traversed.

Generators preserve internal state across suspensions and resumptions, maintaining the values of local variables, the current execution position, and control flow from the point of yielding. When a generator is invoked again, it picks up precisely where it left off, enabling the creation of stateful iterators without explicit management of state persistence.

They integrate seamlessly with iterator protocols in supporting languages, automatically implementing the necessary interfaces to function as iterables. For instance, a generator object responds to methods like next(), allowing it to be used directly in looping constructs without additional wrapper code.

Execution in generators is inherently one-way and forward-progressing, with no built-in capability to rewind or revisit prior yields. Once a value is produced, the generator advances to the next yield point or terminates upon reaching a return or end, enforcing a linear consumption model.

A key advantage is memory efficiency, particularly with large or unbounded data, as generators produce values incrementally rather than storing an entire sequence in memory. This makes them suitable for handling large-scale or potentially infinite data streams without risking exhaustion of system resources.
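
A minimal Python sketch of these traits, using the built-in next() protocol (the function name is illustrative):

def first_two():
    yield 1    # execution suspends here after producing 1
    yield 2    # resumes here on the next request

g = first_two()
print(next(g))  # 1: runs the body up to the first yield
print(next(g))  # 2: resumes after the first yield
try:
    next(g)     # the body is exhausted; the protocol signals termination
except StopIteration:
    print("exhausted")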

Relation to Other Constructs

Generators serve as a specialized form of iterator, enabling the sequential production of values on demand while adhering to the iterator protocol. Unlike traditional iterators, which often require explicit class implementations to manage state across method calls, generators achieve the same functionality through functions that incorporate suspension points, resulting in more concise and readable code for defining iterable sequences.

Generators exhibit significant overlap with coroutines, as both mechanisms support pausing and resuming execution to preserve local state. However, generators are primarily designed for unidirectional data production in iterative contexts, whereas coroutines facilitate bidirectional communication, allowing values and exceptions to be sent back to the suspended routine, which extends their utility to scenarios like asynchronous programming.

In contrast to threads, generators operate as cooperative, single-threaded constructs that avoid the overhead of OS-level scheduling and synchronization primitives. Threads enable preemptive concurrency with potential for race conditions and locking, while generators rely on explicit yield points for control transfer, making them non-preemptive and suitable for user-level multitasking without parallelism.

Conceptually, generators evolved from continuations, constructs that capture and manipulate execution context in functional paradigms. This lineage allows generators to be modeled as one-shot delimited continuations, where suspension corresponds to capturing a portion of the continuation for later resumption.
Aspect | Generators | Iterators | Coroutines | Threads
State management | Pauses/resumes via yield; preserves local variables | Maintains state across next() calls, often via explicit objects | Bidirectional pause/resume with value/exception passing | Independent stacks; preemptive context switches
Concurrency model | Cooperative, single-threaded | None (sequential iteration) | Cooperative, supports multitasking | Preemptive; true parallelism possible
Overhead | Low; no OS involvement | Varies by implementation | Low; user-level | Higher, due to kernel scheduling and synchronization
Primary use cases | Lazy sequence generation, memory-efficient iteration | Traversing collections | Async I/O, simulations, bidirectional protocols | Parallel computation, shared resource access

History

Early Concepts

The concept of generators traces its roots to early ideas in coroutines and continuations developed during the 1950s and 1960s, particularly within the context of early artificial-intelligence research aimed at enabling flexible control flow for problem-solving systems. Coroutines, a foundational precursor to generators, were first conceptualized by Melvin E. Conway in 1958 and formally described in his 1963 paper on designing a separable transition-diagram compiler for COBOL, where they were used to modularize lexical and syntactic analysis by allowing autonomous subroutines to pause execution and transfer control to adjacent modules along one-way communication paths. In this framework, coroutines facilitated pause-resume behavior through explicit control transfers, such as a "read" operation yielding control to another coroutine and resuming upon a corresponding "write," enabling cooperative control flow without a dominant master routine.

Building on this, continuations emerged in the early 1960s as a mathematical abstraction for capturing and manipulating program control points, heavily influenced by the formal semantics of the lambda calculus and its impact on AI-oriented languages like Lisp. Key early proposals included Peter Landin's 1964 SECD machine model, which used a "J operator" to represent jumps as functional applications, effectively allowing programs to resume from arbitrary points and inspiring pause-resume mechanisms in implementations for symbolic computation in AI tasks. Adriaan van Wijngaarden's 1964 continuation-passing style (CPS) transformation for ALGOL 60 further formalized this by appending continuation parameters to procedures, enabling explicit control over suspension and resumption, which drew inspiration from LISP 1.5's pattern-matching functions for AI applications like theorem proving.

These ideas gained practical traction in goal-directed programming paradigms during the late 1960s, notably through Carl Hewitt's Planner language, introduced in 1969 at the International Joint Conference on Artificial Intelligence. Planner, designed for theorem proving and robotic manipulation in AI systems, incorporated backtracking as a core primitive whereby processes could suspend execution, explore alternative paths, and resume to generate multiple solutions on demand, closely resembling the yielding behavior later seen in generators. This backtracking mechanism allowed Planner programs to iteratively produce values through failure-driven search, tying into broader AI efforts to handle nondeterministic computation without exhaustive enumeration.

A pivotal milestone in the evolution toward explicit generators occurred with the introduction of the Icon programming language in 1977 by Ralph E. Griswold and colleagues at the University of Arizona, building on experiences with SNOBOL and SL5 for string processing. Icon integrated generators as a built-in feature for goal-directed expression evaluation, where expressions could produce zero, one, or multiple results iteratively upon demand, enabling efficient backtracking and resumption for non-numerical applications like pattern matching and search.

Despite these advances, early systems implementing coroutine-like and continuation-based mechanisms faced significant limitations, including a lack of standardization across languages, which resulted in varied syntax and control-transfer semantics, and insufficient hardware support for efficient suspension, often relying on software-simulated stack manipulation that hindered performance in resource-constrained environments.

Late 20th Century Developments

In 1975, the programming language CLU, developed by Barbara Liskov and her team at MIT, introduced iterators as a core feature for abstract data types, providing yield-like semantics that allowed functions to produce sequences of values on demand without full computation upfront. These iterators enabled modular iteration over data structures, emphasizing data abstraction by encapsulating generation logic within the type definition, which influenced later designs for memory-efficient traversal. CLU's approach marked an early practical integration of generator concepts into a typed, imperative language, facilitating cleaner separation of iteration from representation.

During the 1970s and 1980s, Lisp variants explored generators through macros and continuations, particularly in Scheme and Common Lisp implementations. In Scheme, developed from the mid-1970s Lambda Papers, first-class continuations via call-with-current-continuation allowed simulation of generator behavior by capturing and resuming computation states, enabling non-local control for iterative production of values. Common Lisp, standardized in the 1980s, supported similar patterns through macros like LOOP or custom constructs that mimicked yielding via closures and dynamic environments, though these were often library-based rather than language primitives. These techniques in Lisp dialects provided flexible, user-defined generators for symbolic processing and AI applications, leveraging the languages' homoiconicity for meta-programming extensions.

The Icon programming language, released in 1977 by Ralph and Madge Griswold at the University of Arizona, advanced generators for goal-directed computation, particularly in string scanning and pattern matching with backtracking. Icon's generators produced multiple values successively upon demand, suspending and resuming execution to support failure-driven search, where unsuccessful paths automatically backtracked to alternatives. This mechanism, integral to expressions like string scans, allowed concise implementations of non-deterministic algorithms, such as finding all occurrences in text without explicit recursion or loops. Icon's influence extended to academic explorations of declarative programming, highlighting generators' role in simplifying complex iteration over search spaces.

In the 1980s, scripting languages like Tcl, introduced by John Ousterhout in 1988, began experimenting with constructs resembling generators through command extensions and string processing, though full coroutine support emerged later in extensions. Early Tcl focused on embeddable scripting with proc definitions that could simulate iterative yielding via uplevel calls and variable scoping, aiding tool integration but lacking native suspension. Concurrently, late-1980s Perl, created by Larry Wall in 1987, incorporated lazy-like behavior in some core list operations, which deferred expansion in scalar contexts, foreshadowing generator patterns for text processing without building full intermediate lists. These features in Tcl and Perl provided practical, lightweight approximations of generators for string handling and data manipulation in Unix environments.

The 1990s saw a shift toward formalizing generators in language designs, exemplified by Perl 6 (later Raku), whose planning began in the late 1990s under Larry Wall's leadership to address Perl 5's limitations. Perl 6's specification emphasized lazy evaluation through sequences and the gather/take construct, allowing blocks to yield values incrementally for infinite or large datasets, integrated with hyperoperators for parallelizable operations. This design drew from functional influences, prioritizing on-demand computation to enhance expressiveness in list processing while maintaining Perl's pragmatic syntax.

Implementations of generators in this era faced challenges including portability across diverse hardware and compilers, as non-native simulations via continuations or macros often required platform-specific tweaks. Performance overhead arose from repeated state saving and restoration in suspension mechanisms, particularly in interpreted environments without optimized runtime support, limiting adoption in performance-critical systems. These issues prompted ongoing research into native integration to balance flexibility with efficiency.

Modern Expansion

The integration of generators into mainstream programming languages accelerated in the 2000s, driven by the need for more efficient iteration over large datasets and support for asynchronous operations, as hardware advancements like multicore processors and the rise of big-data processing demanded memory-efficient constructs that avoided loading entire collections into memory.

Python pioneered this expansion with the introduction of native generators in version 2.2 in December 2001, via the yield keyword, which allowed functions to produce iterators seamlessly and boosted their adoption for lazy evaluation in data-intensive tasks. This feature transformed generator functions into iterable objects, enabling on-demand value production without the overhead of full list materialization.

Building on this momentum, C# incorporated iterators in version 2.0, released in November 2005 alongside the .NET Framework 2.0, through the yield return statement, which simplified the implementation of IEnumerable interfaces for yielding collections incrementally. This made it easier to create custom iterators for streaming data, aligning with growing demands for scalable applications. Similarly, Ruby introduced fiber-based generators in version 1.9 in December 2007, providing lightweight coroutines that extended generator capabilities for concurrent-like programming without full threading overhead. These fibers later underpinned more versatile enumerator constructs, supporting generator patterns in scripting workflows.

JavaScript followed with generator support in ECMAScript, initially proposed around 2006 and formally standardized in ECMAScript 2015 (ES6), using function* declarations and the yield operator to enable on-demand iteration over iterables. This standardization facilitated non-blocking code patterns, crucial for web development's async needs. PHP advanced the trend in version 5.5 in June 2013 by adding generators with the send() method, allowing bidirectional communication for coroutine-style execution and enabling efficient handling of large result sets in web applications. Post-2015 enhancements in PHP 7 further refined this bidirectional flow, solidifying generators' role in memory-constrained environments.

In the 2010s and 2020s, generators expanded into further languages: F# has included sequence expressions with yield since its 1.0 release in 2005, optimizing lazy sequences for functional data processing. R adopted generator-like functionality through the coro package in 2020, providing iterators and adaptive generators for statistical computing on voluminous datasets. Experimental support emerged in Rust by 2025, with an unstable built-in implementation of generators as iterators, reflecting ongoing efforts to enhance safe concurrency in systems programming. Overall, this proliferation stemmed from hardware improvements enabling parallel processing, the explosion of big data requiring lazy iteration, and the shift toward asynchronous paradigms in distributed systems.

Uses and Applications

Lazy Evaluation

In the context of generators, lazy evaluation is an evaluation strategy that postpones the computation of values until they are explicitly requested, enabling the production of elements on demand rather than eagerly computing and storing an entire sequence in advance. This approach leverages the generator's ability to suspend and resume execution at yield points, ensuring that only the necessary portions of the computation are performed when iterating over the generator.

Generators exemplify lazy evaluation through on-demand generation of sequences, such as the Fibonacci numbers or prime numbers, where each value is computed and yielded only as needed without preallocating memory for the full list. For the Fibonacci sequence, a generator can produce terms indefinitely by maintaining state between yields, allowing access to subsequent numbers without recomputing or storing prior ones in a list. Similarly, a lazy prime generator might use an algorithm like the sieve of Eratosthenes adapted for incremental yielding, filtering composites as primes are requested.

A key benefit of lazy evaluation in generators is the ability to handle infinite sequences, such as all natural numbers, without causing memory exhaustion, as values are generated sequentially and discarded after use. This makes it feasible to work with conceptually unbounded data structures in practice, supporting applications like mathematical explorations or simulations that require arbitrary prefixes of infinite series. However, generators typically do not incorporate memoization by default, which can lead to repeated computations if the same value is required multiple times across different parts of a program, unlike call-by-need strategies in languages like Haskell that cache results to avoid redundancy.

The following illustrates a simple infinite generator for powers of 2 using pseudocode:

function powers_of_two():
    power = 1
    while true:
        yield power
        power = power * 2

This generator yields 1, 2, 4, 8, and so on, each value produced only when requested via next().
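
A short Python illustration of the single-pass behavior noted above: a generator does not replay earlier values, so consuming it twice continues from where it left off (the function name is illustrative):

import itertools

def powers_of_two():
    power = 1
    while True:
        yield power
        power *= 2

g = powers_of_two()
print(list(itertools.islice(g, 5)))  # [1, 2, 4, 8, 16]
print(list(itertools.islice(g, 5)))  # [32, 64, 128, 256, 512]: earlier values are not replayed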

Memory-Efficient Iteration

Generators facilitate memory-efficient iteration over large finite datasets by yielding individual elements on demand, enabling streaming from sources like files, databases, or network streams without requiring the entire collection to reside in memory at once. This mechanism pauses execution after each yield, resuming only when the next element is requested, thus maintaining a constant memory footprint regardless of input size.

A practical example is parsing a large CSV file, where a generator function opens the file, reads it line by line, and yields each parsed row as a list or dictionary. This approach processes terabyte-scale files row-by-row, applying transformations or aggregations incrementally and avoiding out-of-memory errors that would occur with full-file loading into data structures like lists.

Generators excel in integration with processing pipelines, where they can be chained to perform sequential transformations such as mapping values, filtering by criteria, or partial aggregation without materializing intermediate results. For instance, a pipeline might chain a file-reading generator with a filtering generator to select relevant records and a mapping generator to transform them, composing an efficient map-filter-reduce sequence that operates in constant space. This promotes reusable components for complex data flows.

Performance-wise, generators drastically lower peak memory usage (often by orders of magnitude compared to eager-loading alternatives), making them suitable for datasets exceeding available RAM, though they introduce minor overhead from function call suspensions and resumptions. In real-world scenarios, such as crawling vast web sites or analyzing server log files that surpass gigabytes, generators enable incremental extraction and processing, ensuring scalability for data volumes that would otherwise overwhelm system resources.
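
A minimal Python sketch of such a pipeline, assuming a CSV file named events.csv with a numeric "size" column (the file name and column are illustrative assumptions):

import csv

def read_rows(path):
    # Yields one parsed row (a dict) at a time; the whole file is never in memory.
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            yield row

def large_events(rows, threshold):
    # A filtering stage: passes through only rows exceeding the threshold.
    for row in rows:
        if int(row["size"]) > threshold:
            yield row

# Chain the stages; nothing is read until sum() consumes the pipeline.
total = sum(int(row["size"])
            for row in large_events(read_rows("events.csv"), 1024))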

Asynchronous and Concurrent Programming

Generators function as lightweight coroutines in asynchronous and concurrent programming, enabling functions to suspend execution and yield control back to an event loop during non-CPU-intensive operations such as I/O waits. This cooperative model allows multiple tasks to progress without the overhead of full context switches, making it suitable for handling concurrent operations in a single-threaded environment. A representative example involves simulating asynchronous I/O operations, where a generator yields during network waits to allow the event loop to process other tasks, avoiding the need for separate threads. For instance, in Python, a generator-based coroutine might look like this:

def async_network_fetch():
    # Simulate starting a network request
    yield 'waiting_for_response'
    # Assume response received; in practice, driven by event loop
    data = 'received_data'
    yield data

# Usage in event loop
gen = async_network_fetch()
print(next(gen))  # Output: waiting_for_response
# Event loop would resume here after wait
print(next(gen))  # Output: received_data

This approach enables non-blocking behavior for I/O-bound tasks. In Python, generators provide the foundational implementation for coroutines in the asyncio library, which underpins the async/await syntax introduced in version 3.5 for more readable asynchronous code. Similarly, in JavaScript, generators underpin async generators that conform to the async iterable protocol, allowing seamless integration with promises for handling asynchronous data streams.

Generators offer several advantages over traditional threading for concurrency: they incur lower memory overhead by sharing the call stack rather than allocating one per thread, simplify synchronization since cooperative yielding prevents race conditions without locks, and avoid global interpreter lock contention in languages like Python. These benefits make them particularly effective for I/O-intensive applications where true parallelism is not required.

Despite these strengths, generators in this role have limitations inherent to their cooperative design: they require explicit yield points to relinquish control, and omitting them can block the entire event loop, halting all concurrent tasks. Additionally, as they operate within a single thread, they do not achieve true CPU parallelism, limiting their use to I/O-bound rather than compute-intensive workloads.

Post-2020, generators and the coroutines they enable have seen increased adoption in web servers for efficiently managing high volumes of concurrent requests, as seen in Python frameworks that leverage asyncio to handle thousands of simultaneous connections with minimal resource use. This trend supports scalable, non-blocking architectures in modern cloud-native applications.
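
As a concrete modern counterpart to the sketch above, here is a minimal asyncio async generator (the names are illustrative):

import asyncio

async def ticker(n):
    for i in range(n):
        await asyncio.sleep(0.1)  # non-blocking wait; control returns to the event loop
        yield i                   # produce a value asynchronously

async def main():
    async for t in ticker(3):
        print(t)

asyncio.run(main())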

Implementation

Core Mechanics

At the implementation level, a generator embodies the duality of being both a function-like construct and an object. When a generator function is invoked, it does not execute its body immediately; instead, it returns an object that encapsulates the generator's potential execution state. This separation allows the generator to be treated uniformly as an iterable in loop constructs or protocols across languages that support it.

The core execution flow of a generator begins only upon the first invocation of its iterator's advancement method, commonly named next(). At this point, the generator's body starts executing from the beginning, proceeding until it encounters a suspension point, typically a yield expression that produces a value. Upon yielding, execution suspends, returning the value to the caller while preserving the generator's internal state. Subsequent calls to next() resume execution precisely from the suspension point, continuing until the next yield or the end of the function body. This on-demand progression enables lazy computation, where values are generated iteratively rather than all at once.

Central to this model is the preservation of the generator's execution state, which retains local variables and the current execution position between suspensions and resumptions. Upon suspension, this state is preserved, allowing seamless resumption as if the pause never occurred, maintaining continuity in the computation. In languages like Python, this state preservation allows generators to track iterative progress, such as positions in data structures, across multiple value productions.

Generators implicitly expose an iterator interface, typically including methods like next() to advance and retrieve the next value, hasNext() or an equivalent to check availability (often inferred via exceptions or return values), and close() to terminate early if needed. The next() method drives the core loop of suspension and resumption, raising a termination signal, such as StopIteration, when no further values can be produced.

The lifecycle of a generator can be traced step-by-step as follows, and is sketched in code after the list:
  1. Creation: Invoking the generator function allocates and returns the iterator object in a suspended, uninitialized state, with no execution having occurred.
  2. First Advancement: The initial next() call initializes the execution and executes the body until the first yield, suspending and returning the yielded value. Local state is now preserved at this point.
  3. Subsequent Advancements: Each further next() resumes from the prior suspension, executes to the next yield (or end), and either returns a value or signals exhaustion via an exception if the body completes without yielding. State persists across these cycles, allowing cumulative computation.
  4. Exhaustion and Termination: Upon reaching the end of the body or an explicit termination, the final next() raises the exhaustion exception, closing the generator. The state is discarded, and no further resumption is possible. Early closure via close() may trigger cleanup without full execution.
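
A small Python sketch tracing these four stages (the countdown function is illustrative):

def countdown(n):
    try:
        while n > 0:
            yield n
            n -= 1
    finally:
        print("cleanup")          # runs on exhaustion or early close()

g = countdown(2)                  # 1. Creation: no body code has run yet
print(next(g))                    # 2. First advancement: runs to the first yield, prints 2
print(next(g))                    # 3. Subsequent advancement: resumes, prints 1
try:
    next(g)                       # 4. Exhaustion: body ends, cleanup runs, StopIteration raised
except StopIteration:
    print("exhausted")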

Syntax and Yielding

Generators in programming languages are typically defined using a specialized function declaration that distinguishes them from ordinary functions, often incorporating a modifier or keyword to signify their ability to produce a sequence of values lazily. Common abstract syntax patterns include forms like def gen(): yield value in some languages and function* gen() { yield value; } in others, where the yield keyword suspends execution and returns a value to the caller while preserving the function's state for resumption.

The yield statement serves as the core mechanism for producing values from a generator. In its basic form, yield expression evaluates the expression and yields its result to the caller, pausing the generator until the next invocation resumes execution from that point. This enables lazy evaluation, where values are computed only when requested, as opposed to generating an entire sequence upfront.

Advanced variants of yield support delegation and composition. The yield from construct (or an equivalent like yield*) allows a generator to delegate iteration to another iterable or sub-generator, effectively composing sequences by forwarding values from the delegate until it is exhausted, after which execution continues in the parent generator. This facilitates modular generator design without manual looping over sub-iterables.

Generators often support bidirectional communication, enabling the caller to inject values back into the suspended generator. Upon resumption via a send(value) method (or similar), the yielded expression in the generator receives the sent value as if it were the result of the prior yield, allowing dynamic interaction such as parameter passing or coroutine-like behavior.

To illustrate, consider an abstract generator producing even numbers:

generator even_numbers() {
    let i = 0;
    while (true) {
        yield i;
        i += 2;
    }
}

Here, each call to the generator's next method yields the next even number, suspending after each yield. For delegation, a composite generator might chain multiple sub-generators:

generator all_numbers() {
    yield from even_numbers();
    yield from odd_numbers(); // assuming odd_numbers() is another generator
}

This yields all even numbers first, then odds, composing the sequences seamlessly without explicit iteration in the parent.
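
The bidirectional send() mechanism described above can be sketched concretely in Python (the accumulator is illustrative):

def accumulator():
    total = 0
    while True:
        value = yield total   # receives the argument passed to send()
        if value is not None:
            total += value

acc = accumulator()
next(acc)             # prime the generator: run to the first yield
print(acc.send(10))   # 10
print(acc.send(5))    # 15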

Error Handling and Termination

In programming languages that support generators, such as Python, exceptions raised within the generator function during execution, such as in code between yield statements, propagate outward to the caller when the next value is requested via methods like next() or send(). This allows the consumer code to catch and handle errors at the suspension point without disrupting the overall iteration protocol. For instance, if a division by zero occurs inside the generator, the exception travels up the call stack to the iterating code, terminating the generator unless explicitly caught internally. Similarly, in JavaScript, the throw() method on a generator object injects an exception at the current yield position, enabling propagation if not handled within the generator body.

Generators provide mechanisms for explicit termination and cleanup to manage resources properly, even if the iteration is not fully exhausted. In Python, the close() method raises a GeneratorExit exception at the most recent suspension point, triggering any associated finally blocks for resource release, such as closing files or connections, before the generator is finalized. This ensures deterministic cleanup during garbage collection or manual termination, and since Python 3.13, close() can return a final value if the generator explicitly returns one after handling GeneratorExit. In contrast, JavaScript generators lack a dedicated close() but achieve similar effects through the return() method, which terminates the generator and executes cleanup code in try...finally constructs. These features prevent resource leaks in scenarios like early abortion of long-running iterations.

Generators integrate with context managers to enhance resource safety, particularly in languages like Python where the with statement automates setup and teardown. A generator can implement a context manager using the @contextlib.contextmanager decorator, splitting the function at a yield point: code before the yield handles entry (e.g., opening a resource), and code after it handles exit (e.g., closing it), with the yielded value serving as the context. This pattern ensures exceptions during iteration trigger proper cleanup without manual intervention, making generators suitable for managing temporary resources like database connections during iteration. When consuming a generator within a with block, external context managers can wrap it to guarantee invocation of close() on exit, further safeguarding against partial execution states.

The exhaustion of a generator is conventionally signaled by raising StopIteration in iterator protocols, as seen in Python, where this built-in exception is automatically thrown when the generator function completes without further yields, such as via a return statement or the natural end of the body. This exception carries any return value from the generator (accessible via StopIteration.value since Python 3.3), allowing consumers to distinguish normal termination from errors without additional checks. In iterator loops like for, StopIteration is implicitly caught to end iteration cleanly. Languages without an explicit StopIteration, such as JavaScript, use a done: true flag in the returned object to indicate completion, maintaining compatibility with iterable protocols.

Best practices for error handling in generators emphasize robustness and resource safety to mitigate risks from partial or interrupted execution. Developers should wrap yield points in try...finally blocks to guarantee cleanup of open resources, such as files or locks, regardless of exceptions or early termination. Explicitly calling close() (or an equivalent) after partial consumption prevents dangling resources, especially in memory-constrained or long-lived applications. When propagating exceptions, provide descriptive messages or types to aid debugging, and avoid silently swallowing errors by re-raising after handling; for instance, catch transient errors inside the generator to retry yields while letting fatal ones propagate. Testing should cover edge cases like exceptions mid-yield and abrupt closure to ensure consistent state, prioritizing these over exhaustive error enumeration to maintain generator simplicity and efficiency.
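
A minimal sketch of the @contextlib.contextmanager pattern described above (the file name and process function are illustrative placeholders):

from contextlib import contextmanager

@contextmanager
def managed_file(path):
    f = open(path)     # entry: runs before the yield
    try:
        yield f        # the yielded value becomes the 'as' target
    finally:
        f.close()      # exit: runs even if the with-body raises

# Usage:
# with managed_file("data.txt") as f:
#     for line in f:
#         process(line)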

Language Support

Early and Academic Languages

In C, native support for generators has never been part of the standard; instead, developers have long relied on library-based implementations to achieve coroutine-like behavior for generator functionality in performance-critical code. These extensions often utilized mechanisms such as setjmp/longjmp for context switching or manual stack management to simulate yielding, enabling memory-efficient iteration without full threads. Early examples include discussions and prototypes from the late 1990s into the 2000s, where coroutines were implemented for embedded or low-level applications. Such library approaches prioritized efficiency in resource-constrained environments, avoiding the overhead of operating system threads.

C++ extended this foundation with template-based generators well before native coroutine support arrived. Prior to C++20, libraries like the Embedded Template Library (ETL) provided generator classes using C++ templates to implement iterable sequences, suitable for embedded systems where memory and performance are paramount. These template-driven solutions allowed developers to create resumable functions mimicking generators through state machines or iterator patterns, though they required manual management of the yielding logic. With the ratification of C++20 in December 2020, the language introduced built-in coroutines featuring the co_yield keyword, which suspends execution and returns a value to the caller, directly enabling generator patterns as a core language feature. This addition transformed generator support from library hacks into a standardized mechanism, enhancing support for asynchronous and lazy computation in high-performance code.

Perl's approach to generators evolved gradually, with early versions emphasizing list-processing functions that laid the groundwork for generator concepts. In Perl 5.005, released in 1998, functions like map and grep facilitated transformations on lists, which could be composed to approximate lazy lists when paired with infinite series or recursive definitions, though eager list evaluation limited true laziness. This period saw prototypes and discussions of more advanced yielding, culminating in modules like Coro (introduced in 2004) that supported proper generators using continuation-passing styles. These library extensions were geared toward scripting tasks requiring efficient data streaming, bridging imperative code with functional-style processing without native keywords.

Tcl introduced native coroutine support in version 8.6, released on December 20, 2012, via the coroutine command paired with yield and yieldto for generator-style suspension and resumption. The yield command allows a coroutine to produce a value and pause, returning control to the caller, while yieldto enables targeted resumption, making it well suited to implementing generators in event-driven scripting. This built-in feature, part of Tcl's shift toward stackless execution models, supports practical applications in GUI and network programming, where low-overhead yielding improves responsiveness without external libraries.

Raku, formerly known as Perl 6 and first released in 2015, integrates generators through supply blocks and hyperoperators, emphasizing lazy and parallel evaluation in its imperative-functional hybrid design. Supply blocks, written as supply { ... emit $value ... }, create asynchronous data streams that emit values on demand, functioning as generators for reactive streams with multiple subscribers. Hyperoperators, such as >>+<< for element-wise addition on lists, apply operations lazily and can parallelize across threads, enabling scalable generator patterns for large datasets. These features, native to the language, target performance-critical scripting and data processing, building on Perl's legacy while providing keyword-driven yielding for concurrency.
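
To illustrate the C++20 mechanism, the sketch below pairs co_yield with a minimal hand-rolled wrapper (here named Generator, an illustrative choice); C++20 standardizes the coroutine machinery but ships no generator class until C++23's std::generator, so user code or a library must supply one:

cpp

#include <coroutine>
#include <iostream>
#include <utility>

// Minimal generator wrapper: co_yield talks to this promise_type.
template <typename T>
struct Generator {
    struct promise_type {
        T current;
        Generator get_return_object() {
            return Generator{std::coroutine_handle<promise_type>::from_promise(*this)};
        }
        std::suspend_always initial_suspend() noexcept { return {}; }
        std::suspend_always final_suspend() noexcept { return {}; }
        std::suspend_always yield_value(T v) noexcept { current = v; return {}; }
        void return_void() noexcept {}
        void unhandled_exception() { std::terminate(); }
    };

    explicit Generator(std::coroutine_handle<promise_type> h) : handle(h) {}
    Generator(Generator&& other) noexcept : handle(std::exchange(other.handle, {})) {}
    Generator(const Generator&) = delete;
    ~Generator() { if (handle) handle.destroy(); }

    bool next() { handle.resume(); return !handle.done(); }  // advance to next co_yield
    T value() const { return handle.promise().current; }

    std::coroutine_handle<promise_type> handle;
};

Generator<int> counter(int limit) {
    for (int i = 0; i < limit; ++i)
        co_yield i;  // suspend here and hand i back to the caller
}

int main() {
    auto gen = counter(3);
    while (gen.next())
        std::cout << gen.value() << '\n';  // prints 0, 1, 2
}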

Imperative and Scripting Languages

In imperative and scripting languages, generators facilitate efficient, on-demand computation, particularly suited to web development, data processing, and automation tasks where memory constraints and iterative workflows are common. These languages emphasize procedural control flow, integrating generators as lightweight alternatives to full iterators or coroutines, often with syntax that pauses execution to yield values while preserving state. Python, PHP, Ruby, and ECMAScript exemplify this approach, prioritizing practicality in production environments like server-side scripting and client-side interactivity.

Python introduced generators in version 2.2, released in December 2001, through PEP 255, which added the yield keyword to ordinary functions defined with def. Using yield transforms a function into a generator that produces values sequentially upon iteration, suspending execution at each yield and resuming from that point on the next call, enabling lazy evaluation without building entire collections in memory. For example, a simple generator for Fibonacci numbers might use:

python

def fibonacci():
    a, b = 0, 1
    while True:
        yield a
        a, b = b, a + b

Generator expressions, proposed in PEP 289 and implemented in Python 2.4 (November 2004), extend this conciseness further, allowing inline creation of generators similar to list comprehensions but using parentheses for laziness. The syntax (x for x in range(10)) produces an iterable that yields integers from 0 to 9 on demand, consuming minimal memory compared to [x for x in range(10)] and well suited to data-scripting pipelines. These features have made Python a staple for memory-efficient tasks in scientific computing and web backends.

PHP added generators in version 5.5.0, released on June 20, 2013, via the yield keyword in functions, which then return a built-in Generator object implementing the Iterator interface. This allows pausing execution to yield values, supporting web-oriented streaming of large datasets such as database results without buffering. The send() method on the Generator class enables bidirectional communication, passing values back into the generator for interactive control, as in:

php

function gen() {
    $input = yield 'start';
    yield $input * 2;
}

$generator = gen();
echo $generator->current(); // Outputs: start
$generator->send(5);        // Next yield: 10

Such capabilities enhance PHP's role in server-side scripting for dynamic content generation.

Ruby, while lacking native generator functions, supports generator-like behavior through the Fiber class introduced in version 1.9.0 (December 25, 2007), which provides lightweight, cooperative concurrency primitives. Fibers, created with Fiber.new and controlled via Fiber.yield and resume, allow block-based pausing and yielding of values, simulating generators for tasks such as streaming data in web applications. An example implements a simple range generator:

ruby

def range_generator(start, end_val)
  Fiber.new do
    i = start
    while i <= end_val
      Fiber.yield i
      i += 1
    end
  end
end

gen = range_generator(1, 5)
puts gen.resume # 1
puts gen.resume # 2

Additionally, Ruby 2.0 (February 24, 2013) introduced Enumerator.lazy, which wraps enumerables for deferred execution, complementing fibers in data scripting by avoiding immediate computation of infinite or large sequences. This block-centric design aligns with Ruby's emphasis on readable, imperative code in web frameworks like Rails.

ECMAScript incorporated generators in the 6th edition (ES6, published June 2015), using function* declarations and the yield operator to create iterable objects that pause at yield points. The yield* expression supports delegation to sub-generators, enabling composable iteration for client-side data flows. For instance:

javascript

function* idGenerator() {
  let id = 0;
  while (true) {
    yield id++;
  }
}

const gen = idGenerator();
console.log(gen.next().value); // 0

Async generators, added in ES2018, extend this with async function* declarations whose bodies can await promises between yields, facilitating asynchronous iteration (via for await...of) over streams such as API responses without callbacks. This has boosted their adoption in web scripting for handling real-time data, underscoring ECMAScript's evolution toward efficient, non-blocking concurrency.
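
A brief sketch of this form (the endpoint URL and the items/next fields below are placeholders, not a real API):

javascript

async function* pages(url) {
  while (url) {
    const res = await fetch(url);   // await is allowed inside the generator body
    const data = await res.json();
    yield* data.items;              // delegate: yield each item from this page
    url = data.next;                // placeholder pagination field
  }
}

(async () => {
  // for await...of drives the async generator one promise at a time
  for await (const item of pages("https://api.example.com/items")) {
    console.log(item);
  }
})();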

Object-Oriented and Functional Languages

In object-oriented languages, generators are often integrated through interfaces that enable iterable sequences, allowing methods to produce values on demand without loading entire collections into memory. This approach aligns with encapsulation and polymorphism principles, in that generator methods implement standard iteration protocols. For instance, in C#, introduced with version 2.0 in 2005, the yield return statement enables iterator methods that return IEnumerable<T>, pausing execution upon each yield and resuming from that point on subsequent calls to MoveNext(). This facilitates lazy evaluation of sequences, such as generating Fibonacci numbers incrementally, and integrates seamlessly with LINQ for querying without intermediate materialization. Similarly, F#, a functional-first language with object-oriented features, has supported generators via sequence expressions since version 2.0 in 2010. The seq { yield expr } syntax produces lazy sequences implementing IEnumerable, leveraging the seq builder to compose yields with filters, maps, and iterations in a declarative style. This allows efficient handling of large or infinite data streams, such as yielding primes from a sieve, while maintaining type safety through .NET's generic interfaces.

Java, as of November 2025, lacks native generator support with a yield keyword, relying instead on the Stream API introduced in Java 8 to mimic generator behavior through methods like generate() and iterate(). These create infinite or finite lazy streams from supplier functions, as in Stream.generate(Math::random).limit(10), which produces values on demand via spliterators without explicit yielding. The Project Loom initiative, finalized in JDK 21 with virtual threads and advanced in JDK 25 via JEP 505 for structured concurrency, previews coroutine-like constructs but does not introduce generators.

In functional languages like Haskell, generators emerge implicitly through lazy evaluation of lists, a core feature since the language's standardization in the 1990s. Infinite lists, such as [x*x | x <- [1..]] for squares, act as generators by computing elements only when demanded, often composed using do-notation in the list monad for sequencing yields in a generator-like flow. This preserves referential transparency and purity, avoiding side effects in generation, as seen in monadic parsers or non-deterministic computations where do-blocks chain yields recursively.

Across these paradigms, generators in object-oriented contexts emphasize interface conformance for interoperability, such as C#'s IEnumerator or F#'s seq, enabling polymorphic iteration over class hierarchies. In functional settings like Haskell, they uphold purity and laziness, treating sequences as first-class values in monadic pipelines without mutable state.
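
A minimal C# sketch of the iterator-method pattern described above:

csharp

using System;
using System.Collections.Generic;
using System.Linq;

class Program
{
    // Iterator method: the compiler turns this into a state machine that
    // pauses at each yield return and resumes on the next MoveNext() call.
    static IEnumerable<long> Fibonacci()
    {
        long a = 0, b = 1;
        while (true)
        {
            yield return a;
            (a, b) = (b, a + b);
        }
    }

    static void Main()
    {
        // LINQ's Take(8) pulls only eight values from the infinite sequence.
        Console.WriteLine(string.Join(", ", Fibonacci().Take(8)));
        // Output: 0, 1, 1, 2, 3, 5, 8, 13
    }
}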

Recent and Niche Implementations

In some variants of Smalltalk, generators are implemented using blocks to create lazy streams of values: the block receives a generator object and yields values on demand via the #yield: message, allowing iterative computation without full materialization. This approach, inspired by earlier concepts in languages such as Scheme, enables blocks to act as coroutines for producing sequences lazily, with resumption triggered by messages like #next.

The niche XL programming language, an extension of Java for rule-based graph transformation developed in the 2000s, supports native generator expressions that successively yield results in constructs such as for-loops, facilitating rule-based modelling through declarative rule application. These generators integrate with XL's relational operators and parallel rewriting features, allowing efficient handling of multi-scale simulations without explicit management of iteration state.

In R, version 3.3 (released in 2016) enhanced support for lazy evaluation via promise objects in function arguments, enabling generator-like behavior through closures that delay computation until values are accessed, which is particularly useful in statistical modeling and data pipelines. This mechanism, combined with functions like delayedAssign for explicit laziness, allows domain-specific applications in statistics to simulate iterative yielding without native generator syntax.

Rust has provided unstable generators since 2016 via the generators feature gate (RFC 2033), primarily focused on async programming, where they underpin coroutine state machines for efficient, stackless execution in systems-level code. These generators yield values through yield expressions, supporting traits like Iterator in experimental contexts, though stabilization remains pending due to integration with async/await.

Go lacks native generators but commonly implements them using goroutines and channels, where a producer goroutine yields values over a channel to a consumer, enabling concurrent, lazy iteration patterns akin to pipelines in data processing. This approach leverages Go's concurrency primitives for domain-specific uses in systems programming, such as streaming computations, without requiring explicit yield keywords.
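
A common Go sketch of this channel-based pattern:

go

package main

import "fmt"

// fib launches a producer goroutine that "yields" Fibonacci numbers over a
// channel; closing the channel signals exhaustion to the consumer.
func fib(n int) <-chan int {
    ch := make(chan int)
    go func() {
        defer close(ch)
        a, b := 0, 1
        for i := 0; i < n; i++ {
            ch <- a // blocks until the consumer receives: values are produced on demand
            a, b = b, a+b
        }
    }()
    return ch
}

func main() {
    for v := range fib(8) {
        fmt.Println(v) // 0 1 1 2 3 5 8 13
    }
}

Note that if the consumer abandons the loop early, the producer goroutine blocks forever on its send; real implementations typically add a done channel or a context for cancellation.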
