Rastrigin function
from Wikipedia
[Figure: the Rastrigin function of two variables, shown as a 3D surface and as a contour plot]

In mathematical optimization, the Rastrigin function is a non-convex function used as a performance test problem for optimization algorithms. It is a typical example of a non-linear multimodal function. It was first proposed in 1974 by Rastrigin[1] as a 2-dimensional function and has been generalized by Rudolph.[2] The generalized version was popularized by Hoffmeister & Bäck[3] and Mühlenbein et al.[4] Finding the minimum of this function is a fairly difficult problem due to its large search space and its large number of local minima.

On an n-dimensional domain it is defined by:

f(\mathbf{x}) = A n + \sum_{i=1}^{n} \left[ x_i^2 - A \cos(2\pi x_i) \right]

where A = 10 and x_i \in [-5.12, 5.12]. There are many extrema:

  • The global minimum is at \mathbf{x} = \mathbf{0}, where f(\mathbf{x}) = 0.
  • The maximum function value for x_i \in [-5.12, 5.12] is located around x_i = \pm 4.52299366:
Number of dimensions    Maximum value at x_i ≈ ±4.52299366
1                        40.35329019
2                        80.70658039
3                       121.0598706
4                       161.4131608
5                       201.7664509
6                       242.1197412
7                       282.4730314
8                       322.8263216
9                       363.1796117

Here are the values of the 2D Rastrigin function sampled at 0.5 intervals (plus the domain boundary 5.12) for x, y ∈ [0, 5.12]; values for negative coordinates follow by symmetry:

 x \ y    0      0.5    1      1.5    2      2.5    3      3.5    4      4.5    5      5.12
 0        0      20.25   1     22.25   4     26.25   9     32.25  16     40.25  25     28.92
 0.5     20.25   40.5   21.25  42.5   24.25  46.5   29.25  52.5   36.25  60.5   45.25  49.17
 1        1      21.25   2     23.25   5     27.25  10     33.25  17     41.25  26     29.92
 1.5     22.25   42.5   23.25  44.5   26.25  48.5   31.25  54.5   38.25  62.5   47.25  51.17
 2        4      24.25   5     26.25   8     30.25  13     36.25  20     44.25  29     32.92
 2.5     26.25   46.5   27.25  48.5   30.25  52.5   35.25  58.5   42.25  66.5   51.25  55.17
 3        9      29.25  10     31.25  13     35.25  18     41.25  25     49.25  34     37.92
 3.5     32.25   52.5   33.25  54.5   36.25  58.5   41.25  64.5   48.25  72.5   57.25  61.17
 4       16      36.25  17     38.25  20     42.25  25     48.25  32     56.25  41     44.92
 4.5     40.25   60.5   41.25  62.5   44.25  66.5   49.25  72.5   56.25  80.5   65.25  69.17
 5       25      45.25  26     47.25  29     51.25  34     57.25  41     65.25  50     53.92
 5.12    28.92   49.17  29.92  51.17  32.92  55.17  37.92  61.17  44.92  69.17  53.92  57.85

The abundance of local minima underlines the need for a global optimization algorithm when the global minimum must be found: local optimization algorithms are likely to get stuck in a local minimum.
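The definition above is straightforward to implement. The short Python sketch below (the function name is mine, not from any library) evaluates the generalized function and checks the global minimum and the two-dimensional maximum listed in the table:

```python
import math

def rastrigin(x, A=10.0):
    """Generalized Rastrigin function at point x (any dimension)."""
    return A * len(x) + sum(xi * xi - A * math.cos(2 * math.pi * xi) for xi in x)

# Global minimum: f(0, ..., 0) = 0 in every dimension.
print(rastrigin([0.0, 0.0, 0.0]))  # -> 0.0
# Maximum over [-5.12, 5.12]^2 occurs near x_i = +/-4.52299366
# (compare the value 80.70658039 for n = 2 in the table above):
print(rastrigin([4.52299366, -4.52299366]))
```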

from Grokipedia
The Rastrigin function is a non-convex, multimodal mathematical function commonly employed as a benchmark test problem to evaluate the performance of optimization algorithms. It is defined over an n-dimensional domain as
f(\mathbf{x}) = 10n + \sum_{i=1}^{n} \left( x_i^2 - 10 \cos(2\pi x_i) \right),
where \mathbf{x} = (x_1, \dots, x_n) and the constant A = 10 scales the function to emphasize its periodic structure. The function features a global minimum of 0 at \mathbf{x} = \mathbf{0}, surrounded by numerous local minima arranged in a highly regular, lattice-like pattern that challenges algorithms to escape deceptive traps.
Originally proposed in 1974 by Latvian mathematician Leonid A. Rastrigin as a two-dimensional problem in his book Systems of Extreme Control, the function was later generalized to higher dimensions, notably by Günter Rudolph in the context of evolution strategies. This generalization has made it a staple of evolutionary computation and optimization research since the 1990s, with its complex landscape—combining quadratic terms for convexity near the origin and cosine oscillations for multimodality—serving to test an algorithm's ability to navigate rugged search spaces without relying on gradients. Typically evaluated over the bounded domain x_i \in [-5.12, 5.12] for all i, the Rastrigin function's deceptive simplicity belies its difficulty, as the number of local minima grows exponentially with dimensionality. In practice, the function's properties have been leveraged in diverse applications, from validating evolution strategies and genetic algorithms to assessing hybrid metaheuristics, with empirical studies confirming its utility in revealing convergence issues in population-based methods. Its periodic nature also allows for analytical insights into optimization dynamics, such as progress rates in evolution strategies, underscoring its enduring role in advancing non-convex optimization research.

Definition

Mathematical Formulation

The Rastrigin function is a multidimensional test function commonly used in optimization studies. For an n-dimensional input vector \mathbf{x} = (x_1, \dots, x_n) \in \mathbb{R}^n, it is defined mathematically as

f(\mathbf{x}) = 10n + \sum_{i=1}^n \left( x_i^2 - 10 \cos(2\pi x_i) \right).

This standard form was originally proposed by Leonid A. Rastrigin in his 1974 book on extremal control systems. The function is typically considered over a bounded search space, such as [-5.12, 5.12]^n, which aligns with the period of the cosine oscillations and ensures the presence of numerous local optima within the domain. Each term in the sum combines a quadratic component x_i^2, which contributes a convex parabolic shape centered at the origin, with a cosine perturbation -10 \cos(2\pi x_i), whose amplitude and periodicity introduce regular oscillations that disrupt the global convexity. This structure arises from augmenting a simple sphere function—known for its unimodal, easy-to-optimize nature—with trigonometric terms to simulate periodic barriers, thereby creating a highly multimodal surface suitable for testing optimization algorithms' ability to escape local traps.
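Because each summand regroups as x_i^2 + 10(1 - cos(2πx_i)), the function is exactly the sphere function plus a non-negative periodic perturbation. A small Python check (function names are mine) makes that decomposition concrete:

```python
import math

def sphere(x):
    return sum(xi * xi for xi in x)

def perturbation(x):
    # 10 * (1 - cos(2*pi*xi)) per coordinate: non-negative, period 1
    return sum(10 * (1 - math.cos(2 * math.pi * xi)) for xi in x)

def rastrigin(x):
    return 10 * len(x) + sum(xi * xi - 10 * math.cos(2 * math.pi * xi) for xi in x)

# Rastrigin = sphere + periodic perturbation, which is exactly what
# makes the otherwise unimodal landscape multimodal:
x = [0.3, -1.7, 2.2]
print(abs(rastrigin(x) - (sphere(x) + perturbation(x))))  # ~0, up to rounding
```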

Variants and Generalizations

The Rastrigin function was originally proposed by L. A. Rastrigin in 1974 as a two-dimensional test problem for extremal control systems. This initial formulation focused on capturing multimodal behavior in low dimensions to evaluate optimization methods. It was subsequently generalized to arbitrary n dimensions by Mühlenbein, Schomisch, and Born in 1991, extending its applicability to higher-dimensional search spaces while preserving the characteristic oscillatory landscape. Common variants of the Rastrigin function include shifted versions, where the location of the global minimum is displaced from the origin via an input transformation, thereby testing algorithms' ability to escape origin bias; this form appears as function F9 in the CEC 2005 benchmark suite. Rotated variants introduce non-separability by premultiplying the input vector with an orthogonal matrix, which couples variables and challenges coordinate-descent-based approaches; a combined shifted and rotated form is defined as F10 in the same CEC 2005 suite. These modifications enhance the function's utility in assessing robustness across diverse problem characteristics. Parameter variations typically adjust the amplitude coefficient A (standard value 10) or the angular frequency ω (standard value 2π) within the cosine term, altering the depth and frequency of oscillations to create landscapes with varying numbers of local minima. Such adjustments allow for tailored benchmarking of optimization techniques sensitive to oscillation scale. Higher-dimensional adaptations scale the generalized n-dimensional form to dimensions such as 10, 30, 50, or 100, as implemented in CEC benchmark suites to probe algorithmic performance in large-scale global optimization. These extensions are particularly prevalent in competitions like CEC 2005 and later, where the function's multimodality intensifies with dimensionality.
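As a toy illustration of these transformations (the shift point, rotation angle, and helper names below are made up for the sketch; the official CEC suites use fixed shift vectors and precomputed orthogonal matrices), a 2D shifted-and-rotated variant can be built from a plain rotation matrix:

```python
import math

def rastrigin(x, A=10.0, omega=2 * math.pi):
    return A * len(x) + sum(xi * xi - A * math.cos(omega * xi) for xi in x)

def shifted_rotated_rastrigin(x, shift, theta):
    """2D shifted-and-rotated variant: z = R(theta) @ (x - shift), then f(z)."""
    dx = [xi - si for xi, si in zip(x, shift)]
    c, s = math.cos(theta), math.sin(theta)
    z = [c * dx[0] - s * dx[1], s * dx[0] + c * dx[1]]
    return rastrigin(z)

shift = [1.5, -0.75]
# The global minimum moves to the shift point and keeps the value 0:
print(shifted_rotated_rastrigin(shift, shift, theta=0.6))  # -> 0.0
# The old minimizer (the origin) is no longer optimal:
print(shifted_rotated_rastrigin([0.0, 0.0], shift, theta=0.6))
```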

Properties

Multimodality

The Rastrigin function exhibits strong multimodality, featuring numerous local minima generated by cosine oscillations overlaid on a quadratic surface. This oscillatory component introduces deceptive basins that complicate the identification of the global optimum. The local minima are regularly distributed throughout the search space, positioned near integer values of the variables x_i, which results in a rugged landscape resembling alternating hills and valleys. In the typical bounded domain [-5.12, 5.12]^n, the count of these local minima grows exponentially with the problem dimension n. This structure renders the function particularly challenging for optimization, as it frequently ensnares gradient-based and local search methods in suboptimal traps, thereby serving as a rigorous benchmark for evaluating the efficacy of global optimization algorithms.

Global and Local Minima

The Rastrigin function features a unique global minimum at the origin, \mathbf{x} = (0, 0, \dots, 0), where f(\mathbf{x}) = 0. This location achieves the minimum value because the oscillatory cosine terms reach their maximum of 1 at integer coordinates, particularly zero, while the quadratic terms vanish. Analytically, the global minimum can be confirmed by bounding the function. For the standard formulation f(\mathbf{x}) = 10d + \sum_{i=1}^d [x_i^2 - 10 \cos(2\pi x_i)], each component satisfies x_i^2 - 10 \cos(2\pi x_i) + 10 \geq 0, since x_i^2 \geq 0 and -10 \cos(2\pi x_i) \geq -10, with equality only when x_i = 0 (where \cos(0) = 1). Thus, f(\mathbf{x}) \geq 0, with equality solely at the origin. This structure ensures the global minimum is isolated and verifiable without numerical search. In addition to the global minimum, the function exhibits numerous local minima located approximately at points where each coordinate x_i is near a non-zero integer, such as \pm 1, \pm 2, due to the periodic nature of the cosine terms aligning near these values. For instance, in one dimension, local minima occur near x = \pm 1, \pm 2, \dots, with corresponding function values strictly greater than 0, often close to the global minimum in magnitude but separated in the search space. These local minima arise as solutions to the critical-point equation x_i = -10\pi \sin(2\pi x_i) for each coordinate, which cluster near non-zero integers. The basin of attraction for the global minimum is relatively large, driven by the dominant quadratic terms that funnel trajectories toward the origin over broad regions, whereas the local minima possess shallow basins that are numerous but narrow, scaling exponentially with dimensionality. This contrast highlights the function's deceptive landscape, where local minima trap optimizers but the global basin prevails in wider exploratory searches.
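The critical-point equation can be solved numerically for the 1D case. The sketch below (helper names mine) uses plain bisection on the derivative 2x + 20π sin(2πx) to locate the local minimum nearest x = 1, confirming that it sits just inside the integer with a value just under 1:

```python
import math

def f(x):   # 1D Rastrigin
    return x * x - 10 * math.cos(2 * math.pi * x) + 10

def df(x):  # derivative; critical points solve 2x + 20*pi*sin(2*pi*x) = 0
    return 2 * x + 20 * math.pi * math.sin(2 * math.pi * x)

def bisect(g, lo, hi, tol=1e-12):
    """Find a root of g in [lo, hi] by bisection (g changes sign there)."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if g(lo) * g(mid) <= 0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

# df(0.9) < 0 and df(1.1) > 0, so a critical point lies between them.
# The quadratic term pulls the minimum slightly toward the origin:
x_star = bisect(df, 0.9, 1.1)
print(round(x_star, 5), round(f(x_star), 5))  # both come out near 0.995
```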

Differentiability and Continuity

The Rastrigin function, defined as f(\mathbf{x}) = 10n + \sum_{i=1}^n \left( x_i^2 - 10 \cos(2\pi x_i) \right) for \mathbf{x} \in \mathbb{R}^n, is continuous on the entire domain \mathbb{R}^n. This property arises from its construction as a finite sum of continuous components: quadratic terms x_i^2, which are polynomials, and cosine functions \cos(2\pi x_i), both of which are continuous everywhere. Continuity ensures that small changes in the input vector \mathbf{x} result in correspondingly small changes in the function value, a fundamental requirement for many theoretical analyses in optimization. Beyond continuity, the Rastrigin function is infinitely differentiable, classifying it as a C^\infty (smooth) function across \mathbb{R}^n. This smoothness stems from the fact that polynomials and the cosine function are themselves infinitely differentiable, and sums and compositions of such functions preserve this property. The C^\infty nature allows for the computation of higher-order derivatives if needed, though first-order derivatives suffice for most gradient-based techniques. In benchmark contexts, this regularity contrasts with the function's multimodal landscape, where smooth local irregularities arise from the oscillatory cosine modulation. The gradient of the Rastrigin function is separable, with partial derivatives given by \frac{\partial f}{\partial x_i}(\mathbf{x}) = 2x_i + 20\pi \sin(2\pi x_i) for each i = 1, \dots, n. Thus, the full gradient vector is \nabla f(\mathbf{x}) = (2x_1 + 20\pi \sin(2\pi x_1), \dots, 2x_n + 20\pi \sin(2\pi x_n)). These expressions are derived directly from the chain rule applied to the quadratic and trigonometric terms, confirming the function's differentiability.
The availability of an explicit, smooth gradient makes the Rastrigin function amenable to derivative-based optimization algorithms, such as gradient descent or quasi-Newton methods, which rely on local information to navigate the search space. However, the periodic oscillations introduced by the sine terms in the gradient can produce misleading descent directions, often trapping optimizers in local minima despite the underlying smoothness. This duality—smooth yet deceptive—highlights the function's utility as a test case for assessing the robustness of gradient-utilizing solvers against multimodal challenges.
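This trapping behavior is easy to reproduce. The sketch below (a minimal fixed-step gradient descent, not a tuned optimizer) uses the analytic gradient above and, started near (3.2, 3.2), settles into the local minimum near (2.985, 2.985) rather than the global one at the origin:

```python
import math

def rastrigin(x):
    return 10 * len(x) + sum(xi * xi - 10 * math.cos(2 * math.pi * xi) for xi in x)

def grad(x):
    # separable gradient: df/dx_i = 2*x_i + 20*pi*sin(2*pi*x_i)
    return [2 * xi + 20 * math.pi * math.sin(2 * math.pi * xi) for xi in x]

def gradient_descent(x, lr=1e-3, steps=5000):
    for _ in range(steps):
        g = grad(x)
        x = [xi - lr * gi for xi, gi in zip(x, g)]
    return x

# Started inside the basin around (3, 3), plain gradient descent converges
# to the nearby local minimum, far above the global minimum value of 0:
x = gradient_descent([3.2, 3.2])
print([round(v, 3) for v in x], round(rastrigin(x), 3))
```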

Visualization and Analysis

Graphical Representations

The Rastrigin function is commonly visualized in two dimensions using contour plots, which reveal a grid-like arrangement of local minima near integer coordinates, with closed level curves around each minimum superimposed on the overall parabolic shape, illustrating the periodic oscillatory behavior due to the cosine terms. In three dimensions, surface plots provide a more immersive view, portraying the function as a rugged, wavy surface with a prominent central basin representing the global minimum, surrounded by alternating ridges and depressions that correspond to local maxima and minima. This visualization emphasizes the deceptive landscape that challenges optimization algorithms, as the surface undulates with increasing height away from the origin. For dimensions beyond three, direct plotting becomes infeasible, so heatmaps of two-dimensional projections or fixed slices are employed to capture the multimodal structure. These representations show dense patterns of hot and cool spots, highlighting clusters of local minima arranged in a lattice-like fashion across the projected space, which underscores the function's regular yet numerous deceptive attractors. Such graphical representations make the Rastrigin function's multimodal properties immediately apparent, aiding in the intuitive understanding of its optimization challenges. To generate these plots, tools like MATLAB provide built-in functions such as contour for 2D level curves and surf for 3D surfaces, often within optimization toolboxes. Similarly, Python's matplotlib library enables comparable visualizations through contourf and plot_surface methods, integrated in frameworks like PyMOO for benchmark functions.
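Even without a plotting library, the lattice of minima is visible in a coarse text rendering. The dependency-free sketch below (grid size and character ramp chosen arbitrarily) samples the 2D function over [-5.12, 5.12]^2 and prints an ASCII "heatmap" where light characters mark low values; real figures would use matplotlib's contourf/plot_surface or MATLAB's contour/surf instead:

```python
import math

def rastrigin2d(x, y):
    """2D Rastrigin function."""
    return 20 + x * x + y * y - 10 * math.cos(2 * math.pi * x) - 10 * math.cos(2 * math.pi * y)

shades = " .:-=+*#%@"   # light = low value, dark = high value
n = 21
lo, hi = -5.12, 5.12
vals = [[rastrigin2d(lo + (hi - lo) * i / (n - 1), lo + (hi - lo) * j / (n - 1))
         for i in range(n)] for j in range(n)]
vmax = max(max(row) for row in vals)
rows = ["".join(shades[int(v / vmax * (len(shades) - 1))] for v in row)
        for row in vals]
print("\n".join(rows))   # blank cell in the center marks the global minimum
```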

Behavior in Low Dimensions

In the one-dimensional case, the Rastrigin function simplifies to f(x) = x^2 - 10 \cos(2\pi x) + 10, defined over the standard search interval [-5.12, 5.12]. This reduction highlights the function's inherent multimodality, with roughly a dozen local minima distributed regularly across the interval, creating a series of oscillatory "valleys" superimposed on the parabolic x^2 trend. The global minimum occurs at x = 0, where f(0) = 0, while the local minima are positioned near integer values, slightly displaced toward the origin by the influence of the quadratic term that dominates far from it. These local minima arise from the balance between the smooth quadratic growth and the rapid oscillations of the cosine term, which has a period of 1, leading to roughly one potential minimum per unit interval within the bounded domain. This structure makes the 1D case a foundational example for understanding how the function traps gradient-based optimizers in suboptimal points, yet the limited number of extrema allows for relatively straightforward enumeration or visualization to locate the global optimum. The two-dimensional formulation represents the original proposal by Rastrigin in 1974, where the function was introduced as f(x, y) = 20 + x^2 + y^2 - 10 \cos(2\pi x) - 10 \cos(2\pi y) to test extremal control systems. Despite being separable—composed additively of independent univariate terms—the 2D landscape features axis-aligned minima arranged on a Cartesian grid near integer pairs (k, l), resulting in a product of the 1D minima counts and yielding over a hundred local optima within the domain [-5.12, 5.12]^2. This grid-like pattern facilitates intuitive visualization of search dynamics, such as how trajectories might follow coordinate axes toward deceptive basins, but the sheer volume of traps already poses significant challenges for exhaustive search.
Behavior in these low dimensions provides critical intuition for scalability in higher dimensions, foreshadowing the curse of dimensionality: the exponential proliferation of local minima, approximately proportional to m^n where m is the 1D count (around a dozen) and n is the dimension, transforms the landscape into an increasingly rugged and deceptive terrain that overwhelms local search strategies. In contrast to higher-dimensional instances, low-dimensional versions of the function permit manual optimization or simple techniques like grid sampling, as the finite and predictable arrangement of minima enables systematic coverage of the space without excessive computational cost, underscoring the function's role in demonstrating dimensionality-dependent difficulty.
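The regular placement of the 1D minima can be confirmed by brute force. The scan below (a sketch; the grid step is an arbitrary choice) marks interior grid points that are lower than both neighbors, finding one minimum near each integer in [-5, 5]:

```python
import math

def f1(x):  # 1D Rastrigin
    return x * x - 10 * math.cos(2 * math.pi * x) + 10

# Fine grid over [-5.12, 5.12]; a strict local minimum is an interior
# point lower than both of its neighbors.
n = 10241
xs = [-5.12 + 10.24 * i / (n - 1) for i in range(n)]
ys = [f1(x) for x in xs]
minima = [xs[i] for i in range(1, n - 1)
          if ys[i] < ys[i - 1] and ys[i] < ys[i + 1]]
print(len(minima))  # -> 11: one minimum near each integer from -5 to 5
```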

Applications

Benchmark Function in Optimization

The Rastrigin function serves as a prominent benchmark in optimization research, designed to assess algorithms' proficiency in navigating highly multimodal landscapes to escape local minima and reach the global optimum. Its structure, featuring numerous regularly distributed local optima, challenges methods to balance exploration and exploitation effectively. This function is incorporated into key benchmark suites, such as the IEEE Congress on Evolutionary Computation (CEC) 2005 and 2013 competitions on real-parameter optimization, where it functions as a core test problem among 25 and 28 functions, respectively. It also appears in the Black-Box Optimization Benchmarking (BBOB) suite as function f_15, tailored for evaluating derivative-free and black-box optimizers across scalable dimensions. Algorithm performance on the Rastrigin function is commonly measured by success rate (the proportion of runs reaching the global optimum within a tolerance), convergence speed (the rate of error reduction over iterations), and computational efficiency via the number of function evaluations needed. These metrics highlight the function's utility in comparing optimizer robustness under controlled conditions. In comparison to other benchmarks, the Rastrigin function presents greater difficulty than the unimodal Sphere function, which lacks local minima and allows straightforward gradient-based convergence, but it is less challenging than the Griewank function due to the latter's non-separability arising from its product term, which introduces deceptive couplings and irregular optima placement. The Rastrigin function's separable, cosine-modulated landscape maintains a predictable global structure, aiding visibility of the optimum amid local traps.
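A minimal measurement harness illustrates these metrics. The sketch below (a toy protocol with arbitrary budget, tolerance, and run count; the official CEC/BBOB protocols use fixed per-dimension budgets and much tighter target precisions) estimates the success rate of uniform random search on the 2D function:

```python
import math
import random

def rastrigin(x):
    return 10 * len(x) + sum(xi * xi - 10 * math.cos(2 * math.pi * xi) for xi in x)

def random_search(dim, budget, rng):
    """Uniform sampling in [-5.12, 5.12]^dim; returns the best value found
    and the evaluation count at which it was first reached."""
    best, best_eval = float("inf"), 0
    for k in range(1, budget + 1):
        v = rastrigin([rng.uniform(-5.12, 5.12) for _ in range(dim)])
        if v < best:
            best, best_eval = v, k
    return best, best_eval

# Success rate at tolerance 1.0 over independent runs with a fixed budget:
rng = random.Random(0)
runs, tol, budget = 20, 1.0, 2000
successes = sum(random_search(2, budget, rng)[0] < tol for _ in range(runs))
print(f"success rate: {successes}/{runs}")
```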

Role in Evolutionary Algorithms

The Rastrigin function serves as a challenging benchmark for evaluating the performance of evolution strategies (ES), particularly in assessing convergence rates and progress toward the global minimum amid its numerous local optima. In analyses of the intermediate multi-recombinative (μ/μ_I, λ)-ES, researchers have derived progress rates to quantify the algorithm's advancement on this function, revealing linear convergence in initial phases followed by slowdowns near local minima and acceleration closer to the global optimum. These studies, using large population sizes such as μ=1500 and λ=3000, demonstrate that theoretical approximations align closely with simulations when population fluctuations are minimized, highlighting the function's utility in probing ES dynamics over multiple generations. Further investigations into the (μ/μ_I, λ)-ES on the Rastrigin function have focused on convergence properties, introducing aggregated progress-rate measures that depend on the residual distance to the optimizer. For moderate mutation strengths, a distance-dependent progress rate emerges, complicating global search, while small mutation strengths lead to stagnation conditions that trap solutions in suboptimal regions. To counter these challenges, scaling population sizes proportionally to the dimensionality—such as increasing μ and λ—enhances global convergence rates, as validated through experimental runs on 100-dimensional instances. A 2022 analysis at the Parallel Problem Solving from Nature (PPSN) conference specifically examined mutation strengths, showing that normalized mutation strengths (σ*) in the range [0, 1] yield varying progress depending on the distance to the optimum, with optimal regimes avoiding premature stagnation. In genetic algorithms (GAs), the Rastrigin function is commonly employed to demonstrate and tune optimization solvers, such as MATLAB's ga() function, which minimizes the two-dimensional variant through selection, crossover, and mutation.
Typical implementations use default population sizes of around 50-100 individuals, achieving solutions near the global minimum (e.g., objective values ≈2.5 after 100 generations in sample runs), though multiple executions are needed due to variability. This setup illustrates a GA's ability to navigate the function's oscillatory landscape, with parameters like the crossover fraction (0.8) and mutation rate (0.01) directly influencing escape from local minima. Addressing premature convergence in GAs on the Rastrigin function often involves tuning mutation and crossover rates to balance exploration and exploitation, as explored in foundational studies using 10-dimensional instances with string lengths of 130 bits. Experiments varying crossover probabilities (p_c from 0.0 to 0.9) and mutation rates (p_m from 0.1/l to 1.0/l, where l is the string length) across population sizes up to 2000 reveal that higher mutation aids diversity to overcome traps, while moderate crossover promotes effective recombination, improving overall metrics like absorption time to the optimum. These interactions underscore the function's role in refining GA parameter strategies for robust handling of deceptive multimodal landscapes.
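The ingredients above can be sketched in a minimal real-coded GA. This is a schematic (not MATLAB's ga() nor any published configuration; the tournament size, blend crossover, and Gaussian mutation scale are my choices), with the crossover fraction and mutation rate echoing the values quoted above:

```python
import math
import random

def rastrigin(x):
    return 10 * len(x) + sum(xi * xi - 10 * math.cos(2 * math.pi * xi) for xi in x)

def ga_minimize(dim=2, pop_size=60, generations=100, crossover_frac=0.8,
                mutation_rate=0.01, sigma=0.5, seed=1):
    """Toy real-coded GA: tournament selection, blend crossover,
    per-gene Gaussian mutation, and 2-elitism."""
    rng = random.Random(seed)
    pop = [[rng.uniform(-5.12, 5.12) for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(generations):
        elite = sorted(pop, key=rastrigin)[:2]      # best two survive unchanged
        children = list(elite)
        while len(children) < pop_size:
            p1 = min(rng.sample(pop, 3), key=rastrigin)   # tournament of 3
            p2 = min(rng.sample(pop, 3), key=rastrigin)
            if rng.random() < crossover_frac:             # blend crossover
                a = rng.random()
                child = [a * u + (1 - a) * v for u, v in zip(p1, p2)]
            else:
                child = list(p1)
            child = [xi + rng.gauss(0, sigma) if rng.random() < mutation_rate
                     else xi for xi in child]             # rare Gaussian mutation
            child = [min(5.12, max(-5.12, xi)) for xi in child]  # clamp to domain
            children.append(child)
        pop = children
    return min(pop, key=rastrigin)

best = ga_minimize()
print([round(v, 3) for v in best], round(rastrigin(best), 3))
```

Elitism makes the best objective value non-increasing across generations, so repeated runs differ only in how close the final elite gets to the global basin.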

History

Origin

The Rastrigin function was introduced by Leonid A. Rastrigin, a Soviet scientist renowned for his contributions to optimization and random search methods. Born on July 23, 1929, Rastrigin specialized in extremal control and adaptive systems during his career at research institutions in Latvia. His work emphasized stochastic approaches to solving complex optimization problems in engineering. Rastrigin proposed the function in 1974 as part of his research into multimodal optimization landscapes. Initially formulated as a two-dimensional test case, it served to illustrate challenges in locating global extrema amid numerous local minima. This proposal appeared in his book Systems of Extreme Control, published by Nauka in 1974, where it was used to analyze the performance of search strategies in high-dimensional search spaces. The original context centered on theoretical studies of extremal systems, particularly how randomized and adaptive algorithms navigate rugged fitness surfaces in optimization theory. Rastrigin's development drew from his earlier explorations of random search techniques, aiming to model real-world problems in engineering where deterministic methods often fail. The publication, written in Russian, remained primarily within Soviet-era literature until English translations and citations in Western journals facilitated its adoption in the late 1980s and beyond.

Subsequent Developments

Following its initial proposal as a two-dimensional test problem, the Rastrigin function was generalized to arbitrary dimensions by Heinz Mühlenbein, Michael Schomisch, and Johannes Born in 1991, enabling its use for evaluating optimization algorithms in higher-dimensional spaces up to 400 dimensions. This extension, detailed in their work on parallel genetic algorithms, transformed the function into a standard multimodal benchmark by introducing scalable complexity through summed sinusoidal terms across dimensions, facilitating broader testing of global search capabilities. In the 1990s, the generalized Rastrigin function gained prominence in benchmark libraries for evolutionary computation, appearing in early test suites for genetic algorithms and evolution strategies to assess performance on deceptive landscapes. Its adoption accelerated with inclusions in competitions like the IEEE Congress on Evolutionary Computation (CEC), starting from special sessions in the early 2000s, such as the 2008 large-scale benchmark where rotated and shifted variants were used to evaluate scalability. By 2025, the function has been referenced in over 10,000 optimization papers, underscoring its enduring role in algorithm validation across CEC events and beyond. Recent theoretical advancements have focused on progress-rate analyses for evolution strategies on the Rastrigin function, providing insights into convergence dynamics on its multimodal landscape. In 2022, Omeradžić and Beyer derived a first-order progress rate for intermediate multi-recombinative evolution strategies, incorporating noisy order statistics to approximate mutation-induced variance and comparing theoretical predictions with empirical simulations. Building on this, subsequent 2023 studies extended the analysis to population sizing models, revealing that larger populations mitigate trapping in local minima by enhancing exploration rates on highly multimodal instances.
Additionally, explorations of fractal-like properties in Rastrigin variants have emerged in 2024 frameworks that exploit self-similar structures for improved optimization in continuous problems.

References
