Jump point search
In computer science, jump point search (JPS) is an optimization to the A* search algorithm for uniform-cost grids. It reduces symmetries in the search procedure by means of graph pruning,[1] eliminating certain nodes in the grid based on assumptions that can be made about the current node's neighbors, as long as certain conditions relating to the grid are satisfied. As a result, the algorithm can consider long "jumps" along straight (horizontal, vertical and diagonal) lines in the grid, rather than the small steps from one grid position to the next that ordinary A* considers.[2]
Jump point search preserves A*'s optimality, while potentially reducing its running time by an order of magnitude.[1]
History
Harabor and Grastien's original publication provides algorithms for neighbor pruning and identifying successors.[1] The original algorithm for neighbor pruning allowed corner-cutting to occur, which meant the algorithm could only be used for moving agents with zero width, limiting its application to either real-life agents (e.g., robotics) or simulations (e.g., many games).
The following year, the authors presented modified pruning rules for applications where corner-cutting is not allowed.[3] This paper also presents an algorithm for pre-processing a grid in order to minimize online search times.
A number of further optimizations were published by the authors in 2014.[4] These optimizations include exploring columns or rows of nodes instead of individual nodes, pre-computing "jumps" on the grid, and stronger pruning rules.
Future work
Although jump point search is limited to uniform-cost grids and homogeneously sized agents, the authors indicate that future research will examine combining JPS with existing grid-based speed-up techniques such as hierarchical grids.[4][5]
References
1. D. Harabor; A. Grastien (2011). Online Graph Pruning for Pathfinding on Grid Maps (PDF). 25th National Conference on Artificial Intelligence. AAAI.
2. Witmer, Nathan (5 May 2013). "Jump Point Search Explained". zerowidth positive lookahead. Archived from the original on 2014-03-10. Retrieved 10 March 2014.
3. D. Harabor; A. Grastien (2012). The JPS Pathfinding System. 26th National Conference on Artificial Intelligence. AAAI. Archived from the original on 2020-11-09. Retrieved 2015-07-11.
4. Harabor, Daniel; Grastien, Alban. "Improving Jump Point Search" (PDF). Australian National University College of Engineering and Computer Science. Association for the Advancement of Artificial Intelligence (www.aaai.org). Retrieved 11 July 2015.
5. Adi Botea; Martin Muller (2004). "Near Optimal Hierarchical Path-Finding" (PDF). University of Alberta.
Jump point search
Overview
Definition and purpose
Jump Point Search (JPS) is an optimized pathfinding algorithm designed for uniform-cost grid environments, serving as a symmetry-reducing enhancement to the A* search algorithm by expanding only specific nodes known as "jump points" rather than all neighboring cells.[1] This approach identifies and prunes redundant paths that arise due to the symmetric structure of grids, where multiple equivalent routes exist between points, thereby focusing the search on directionally significant nodes that could lead to optimal paths.[1] The primary purpose of JPS is to efficiently compute optimal paths in obstacle-filled rectangular grids, minimizing computational overhead while preserving the optimality and completeness guarantees of A*.[1] By exploiting the inherent symmetries in grid maps—such as straight-line traversals where intermediate nodes offer no unique routing advantages—JPS skips over these redundant expansions, potentially accelerating searches by an order of magnitude in large, open spaces compared to standard A*.[1]

It assumes a uniform-cost model where cardinal movements (up, down, left, right) cost 1 unit and diagonal movements cost √2 units, operating on undirected grids with cells designated as either traversable or obstacles, and allowing up to eight possible neighbors per cell.[1]

For instance, consider a simple 2D grid scenario with no obstacles, where the start point is at the bottom-left and the goal is at the top-right, forming a straight diagonal line. In standard A*, the algorithm would expand every intermediate cell along this path. JPS, however, identifies the direction of travel and "jumps" directly to the next jump point—such as the goal itself or a point where a deviation becomes possible—bypassing the in-between nodes and adding only the jump point as a successor.[1] This illustrates how JPS reduces the search space in symmetric, linear traversals without missing the optimal route.

Relation to A* search
Jump point search (JPS) is an optimized variant of the A* algorithm specifically designed for uniform-cost grid maps, retaining A*'s core best-first search framework, including the use of open and closed sets to manage explored nodes and an admissible heuristic such as the Euclidean or Manhattan distance to guide the search toward the goal.[1] This integration allows JPS to inherit A*'s guarantees of optimality and completeness when the heuristic is consistent, without altering the fundamental evaluation function f(n) = g(n) + h(n), where g(n) is the path cost from the start to node n and h(n) is the estimated cost from n to the goal.[1]

The primary modification in JPS lies in the successor generation phase, where instead of expanding all neighboring nodes (typically up to 8 in a grid), the algorithm applies symmetry-breaking pruning rules to identify only "jumpable" directions and advances directly to the next relevant node, known as a jump point, thereby skipping intermediate symmetric nodes that would not lead to optimal paths.[1] This jumping mechanism dramatically reduces the number of node expansions compared to standard A*, often achieving speedups of an order of magnitude or more in open terrains by eliminating redundant searches along straight-line or diagonal paths.[1] Heuristic consistency is preserved in JPS by ensuring that the admissible heuristic remains unchanged and that pruned nodes do not affect the monotonicity required for optimality, allowing the algorithm to find the shortest path without relaxation or approximation.[1]

At a high level, JPS integrates with A* by replacing the standard successor function with a specialized one that incorporates pruning and jumping, as illustrated in the following pseudocode outline:

function JPS(start, goal):
    open_set = PriorityQueue()              // prioritized by f(n) = g(n) + h(n)
    closed_set = Set()
    came_from = Map()
    g_scores = {start: 0}
    open_set.insert(start, h(start))
    while open_set is not empty:
        current = open_set.extract_min()
        if current == goal:
            return reconstruct_path(came_from, current)
        closed_set.add(current)
        for successor in identify_successors(current, goal):    // JPS-modified: prunes and jumps
            tentative_g = g_scores[current] + distance(current, successor)
            if successor in closed_set or (successor in g_scores and tentative_g >= g_scores[successor]):
                continue
            came_from[successor] = current
            g_scores[successor] = tentative_g
            f_score = tentative_g + h(successor)
            open_set.insert_or_update(successor, f_score)

function identify_successors(node, goal):   // Key JPS modification
    successors = []
    for direction in possible_directions(node):     // pruned to natural + forced directions
        jump_point = jump(node, direction, goal)
        if jump_point is not null:
            successors.append(jump_point)
    return successors
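For concreteness, the skeleton above can also be rendered as runnable code. The following is a minimal Python sketch of the baseline that JPS modifies: plain A* on an 8-connected uniform-cost grid with the octile heuristic. The names (`astar`, `octile`) and the encoding of the grid as a set of blocked (x, y) cells are illustrative assumptions; JPS would replace the inner loop over the eight neighbors with an `identify_successors`-style routine, leaving everything else unchanged.

```python
import heapq
import math

SQRT2 = math.sqrt(2)

def octile(a, b):
    # Octile distance: admissible and consistent on 8-connected uniform grids.
    dx, dy = abs(a[0] - b[0]), abs(a[1] - b[1])
    return max(dx, dy) + (SQRT2 - 1) * min(dx, dy)

def astar(blocked, width, height, start, goal):
    """Plain A* over an 8-connected grid; `blocked` is a set of (x, y) cells."""
    def walkable(x, y):
        return 0 <= x < width and 0 <= y < height and (x, y) not in blocked

    if not walkable(*start) or not walkable(*goal):
        return None
    # Heap entries: (f-score, g-score, node, parent); ties break on g, then node.
    open_heap = [(octile(start, goal), 0.0, start, None)]
    parent = {}                 # filled on expansion; doubles as the closed set
    best_g = {start: 0.0}
    while open_heap:
        f, g, cur, par = heapq.heappop(open_heap)
        if cur in parent:       # stale entry: already expanded via a cheaper path
            continue
        parent[cur] = par
        if cur == goal:
            path = [cur]
            while parent[path[-1]] is not None:
                path.append(parent[path[-1]])
            return g, path[::-1]
        for dx in (-1, 0, 1):
            for dy in (-1, 0, 1):
                if dx == 0 and dy == 0:
                    continue
                nxt = (cur[0] + dx, cur[1] + dy)
                if not walkable(*nxt):
                    continue
                ng = g + (SQRT2 if dx != 0 and dy != 0 else 1.0)
                if nxt not in best_g or ng < best_g[nxt]:
                    best_g[nxt] = ng
                    heapq.heappush(open_heap, (ng + octile(nxt, goal), ng, nxt, cur))
    return None                 # open set exhausted: no path exists
```

On an empty 5×5 grid, for example, the optimal (0, 0) → (4, 4) path consists of four diagonal steps costing 4√2, and standard A* expands every intermediate cell to find it.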
Background
Grid-based pathfinding
Grid-based pathfinding refers to the process of finding the shortest path between two points in a discrete 2D lattice composed of cells, where certain cells are designated as obstacles that cannot be traversed.[3] This approach models the environment as a graph, with each free cell (or grid point) serving as a node and possible movements between adjacent cells as edges.[4] It is commonly applied in domains such as robotics, video games, and simulations to enable navigation around barriers while minimizing path length or cost.[3]

Movement in grid-based pathfinding is governed by specific models that define allowable directions and associated costs. The cardinal (or 4-neighbor) model permits movement only in four orthogonal directions—north, south, east, and west—resulting in uniform costs for horizontal and vertical steps.[3] The 8-directional (or octile) model extends this by including four diagonal directions, allowing for more natural, shorter paths in open spaces, with costs of 1 for orthogonal moves and √2 for diagonal moves.[1] These models assume a regular lattice structure where each cell connects to its immediate neighbors based on the chosen connectivity.

Grids are typically represented using 2D arrays to store cell states, with free spaces marked as traversable and obstacles as impassable to prevent expansion into blocked areas.[5] Adjacency can be implicitly defined through array indexing for efficiency, or explicitly via lists that enumerate neighbors according to the movement model, facilitating graph search operations.[3] This representation simplifies implementation but requires careful handling of boundary conditions and obstacle integration.
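As an illustrative sketch of this representation (the grid contents and helper name here are hypothetical), a small grid can be stored as a 2D boolean array, with neighbors enumerated under either the 4-neighbor or the 8-neighbor movement model and the corresponding step costs of 1 and √2:

```python
import math

WIDTH, HEIGHT = 5, 5
# True = traversable, False = obstacle; indexed as grid[y][x].
grid = [[True] * WIDTH for _ in range(HEIGHT)]
grid[2][1] = False          # a single obstacle at (x=1, y=2)

def neighbors(x, y, diagonal=True):
    """Yield ((nx, ny), step_cost) for each traversable neighbor of (x, y)."""
    for dx in (-1, 0, 1):
        for dy in (-1, 0, 1):
            if dx == 0 and dy == 0:
                continue
            if not diagonal and dx != 0 and dy != 0:
                continue    # 4-neighbor (cardinal) model: skip diagonal moves
            nx, ny = x + dx, y + dy
            if 0 <= nx < WIDTH and 0 <= ny < HEIGHT and grid[ny][nx]:
                yield (nx, ny), (math.sqrt(2) if dx != 0 and dy != 0 else 1.0)
```

The bounds check plays the role of the boundary handling mentioned above, and the `grid[ny][nx]` lookup prevents expansion into blocked cells.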
A key challenge in grid-based pathfinding arises from the potentially high node count in large environments, such as expansive maps with thousands of cells, which can generate exponential search spaces during exploration without targeted optimizations.[5] Obstacles further complicate this by fragmenting the space and increasing the branching factor, demanding efficient pruning or heuristic guidance to maintain computational feasibility.[3] A* search serves as a standard informed method for addressing these issues in grid environments.[6]

Limitations of standard A* on grids
In uniform-cost grid maps, the standard A* algorithm encounters substantial inefficiencies stemming from the inherent symmetry of the grid structure. Numerous paths between the start and goal nodes are equivalent in cost but differ merely in the sequence of moves, prompting A* to generate and evaluate a large number of redundant states. This symmetry arises because uniform edge weights make detours and alternative routings indistinguishable in terms of path length until later in the search, causing the algorithm to explore unnecessary branches that do not contribute to the optimal solution.[1]

A key manifestation of this issue occurs during straight-line pathfinding, where A* expands intermediate nodes along the route while checking all possible symmetric continuations, such as diagonal or orthogonal deviations that could be pruned if grid topology were better exploited. In an empty grid without obstacles, for instance, A* generates a diamond-shaped expansion front (under Manhattan distance) or approximately elliptical front (under octile distance), evaluating every node within the bounding ellipse defined by the goal's initial f-value and wasting effort on backtracking or circuitous paths that symmetry renders suboptimal. This leads to excessive node generations, as dominated states—those reachable via multiple equivalent routes—are not inherently identified and discarded.[1]

The expansion overhead exacerbates these problems, with A* routinely examining up to 8 neighbors per node in 8-connected grids, resulting in a worst-case time complexity of O(b^d), where b is the branching factor and d is the solution depth.
Even when employing consistent and admissible heuristics like the octile distance—which computes max(Δx, Δy) + (√2 − 1) · min(Δx, Δy), where Δx and Δy are the absolute coordinate differences to the goal, to provide a tight lower bound on costs in grids allowing diagonal traversal with costs of 1 and √2—A* fails to prune symmetric successors, as the heuristic does not leverage the grid's regular structure to eliminate equivalent paths on the fly.[1][7][8] These limitations motivate symmetry-breaking techniques, such as Jump Point Search, which address redundant expansions by pruning successors based on grid-specific rules.[1]

Core Algorithm
Successor pruning
Successor pruning in Jump Point Search (JPS) is a symmetry-reduction technique applied during node expansion to eliminate redundant successor candidates in grid-based pathfinding, focusing only on "natural" neighbors that cannot be reached via equally optimal symmetric paths from the node's parent.[9] This pruning leverages the uniform cost structure of grids, where intermediate nodes along straight, obstacle-free paths are unnecessary to expand individually, as optimal paths would not deviate symmetrically.[10] By discarding these dominated successors, JPS significantly reduces the branching factor from up to 8 neighbors in standard A* to as few as 4, enhancing efficiency without sacrificing optimality.[9]

Natural neighbors are the direct adjacent cells that remain after applying pruning rules, representing the minimal set required to preserve all possible optimal paths; these are typically the cells in the four cardinal directions (up, down, left, right) relative to the expansion direction, unless symmetry allows further reduction.[10] In contrast, forced neighbors are additional successors that must be considered due to obstacles blocking alternative symmetric paths; these arise when the path through the current node is strictly shorter than any detour around obstacles, ensuring no optimal path is overlooked.[9] For instance, in an open grid, forced neighbors are rare, but near walls or barriers, they can appear in up to four positions, such as when a diagonal move is compelled by an adjacent blockage.[10]

Pruning rules differ by movement type.
For horizontal or vertical (cardinal) moves, a neighbor is pruned if an alternative path from the parent to that neighbor exists with length less than or equal to the path via the current node, which occurs in straight-line scenarios without obstacles; thus, only the immediate neighbor in the direction away from the parent typically survives as natural.[9] For diagonal moves, pruning is stricter: a neighbor is eliminated if an alternative path of equal length includes an earlier diagonal step, or if a shorter path exists; this often leaves up to three natural diagonal neighbors, but forced ones may be retained if cardinal paths adjacent to the diagonal are obstructed.[10]

Algorithmically, during the expansion of a node x with parent p(x), JPS first identifies the incoming direction from p(x) and applies the rules to generate only natural and forced successors in the eight possible directions, skipping pruned ones entirely.[9] This step checks for obstacle-induced forces by verifying whether adjacent cells block symmetric alternatives, ensuring the search proceeds only to viable candidates before identifying jump points as the endpoints of these pruned paths.[10]

Jump point identification
Jump points in Jump Point Search (JPS) are defined as nodes from which an optimal path can first deviate from the straight-line direction inherited from its parent node, typically due to obstacles, the goal, or branching opportunities that force a turn.[9] Specifically, a node y is a jump point from a node x in direction ~d if y = x + k~d for the smallest positive integer k such that y satisfies one of the following: it is the goal node; it has a forced neighbor (a neighbor not reachable via a natural extension of the incoming direction); or, if ~d is diagonal, a jump in one of the cardinal component directions from y leads to another jump point.[9] This definition ensures that only these pivotal nodes are considered for expansion, allowing the search to skip intermediate symmetric nodes along straight-line paths.[9]

The identification of jump points relies on a recursive scanning function called jump, which traverses the grid in a specified direction until a jump point is encountered or the scan terminates without finding one.[9] The function takes as input the current node x, the direction vector ~d (one of eight possible moves: four cardinal or four diagonal), the start node s, and the goal g.[9] Directions are represented by unit vectors, such as (1, 0) for right or (1, 1) for northeast.[9] For diagonal directions, the function also considers the two orthogonal cardinal components ~d1 and ~d2 (e.g., right and up for northeast) to check for potential deviations.[9] Pruning ensures that only these identified jump points are added to the open set during search.[9]
Step-by-step, the jump function proceeds as follows: from the current node x, it advances to the next node n by stepping in direction ~d; if n is an obstacle or outside the grid boundaries, it returns null to indicate no jump point; if n is the goal, it returns n; if n has any forced neighbor (a neighbor that cannot be reached symmetrically from the parent direction), it returns n as the jump point; for diagonal ~d, it recursively calls jump on n in the cardinal directions ~d1 and ~d2, returning n if either yields a non-null result indicating a deviation; otherwise, it recurses on n in the original direction ~d.[9] This process continues linearly until a termination condition is met, effectively "jumping" over non-branching cells.[9]
For example, consider a grid with a wall blocking a straight path; starting from node x moving northeast (~d = (1, 1)), the scan proceeds diagonally until reaching a node adjacent to the wall, where a forced neighbor (e.g., a cell around the wall's corner) exists; that node becomes the jump point, just before the obstacle forces a turn.[9]
The pseudocode for the jump function is as follows:
function jump(x, ~d, s, g):
    n ← step(x, ~d)                          // advance one step in direction ~d
    if n is an obstacle or outside the grid then
        return null
    if n = g then
        return n
    if ∃ n′ ∈ neighbors(n) such that n′ is forced then
        return n
    if ~d is diagonal then
        for i ∈ {1, 2} do
            if jump(n, ~d_i, s, g) ≠ null then
                return n
    return jump(n, ~d, s, g)
Here, step computes the adjacent node, neighbors enumerates the eight possible adjacent cells, and a neighbor is "forced" if it lies in a direction not aligned with the incoming direction ~d.[9]
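The scan described above translates almost line-for-line into runnable code. The following Python sketch (the encoding of the grid as a set of blocked cells and the helper names are illustrative; corner-cutting is permitted, as in the original 2011 formulation) implements the recursive jump together with a forced-neighbor test:

```python
def make_jump(blocked, width, height, goal):
    """Build a jump(x, y, dx, dy) scanner for one fixed grid and goal."""
    def walkable(x, y):
        return 0 <= x < width and 0 <= y < height and (x, y) not in blocked

    def forced(x, y, dx, dy):
        # A neighbor is forced when an adjacent obstacle makes the path
        # through (x, y) the only optimal way to reach a cell beside it.
        if dx != 0 and dy != 0:     # diagonal travel
            return ((not walkable(x - dx, y) and walkable(x - dx, y + dy)) or
                    (not walkable(x, y - dy) and walkable(x + dx, y - dy)))
        if dx != 0:                 # horizontal travel
            return ((not walkable(x, y + 1) and walkable(x + dx, y + 1)) or
                    (not walkable(x, y - 1) and walkable(x + dx, y - 1)))
        # vertical travel
        return ((not walkable(x + 1, y) and walkable(x + 1, y + dy)) or
                (not walkable(x - 1, y) and walkable(x - 1, y + dy)))

    def jump(x, y, dx, dy):
        nx, ny = x + dx, y + dy
        if not walkable(nx, ny):
            return None             # blocked or off-grid: this scan fails
        if (nx, ny) == goal or forced(nx, ny, dx, dy):
            return (nx, ny)
        if dx != 0 and dy != 0:     # diagonal: probe both cardinal components
            if jump(nx, ny, dx, 0) is not None or jump(nx, ny, 0, dy) is not None:
                return (nx, ny)
        return jump(nx, ny, dx, dy)

    return jump
```

On an empty 5×5 grid, a northeast scan from (0, 0) jumps directly to the goal at (4, 4), while an obstacle beside a horizontal scan stops the scan at the node that acquires a forced neighbor.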
Overall search procedure
Jump Point Search (JPS) operates as a variant of the A* algorithm, where the core search loop remains similar—maintaining an open set prioritized by the evaluation function f(n) = g(n) + h(n), where g(n) is the cost from the start to node n and h(n) is a heuristic estimate to the goal—but the successor generation is replaced by a specialized procedure that identifies and jumps to relevant jump points rather than enumerating all immediate neighbors.[9] This integration allows JPS to prune symmetric paths on-the-fly during the search, expanding only nodes that represent necessary turning points or the goal.[9]

The high-level steps begin with initializing the open set to contain the start node, setting its g-score to 0 and parent to null. While the open set is not empty, the node with the lowest f-score is dequeued as the current node. If the current node is the goal, the search terminates successfully, and the path is reconstructed by backtracking through parent pointers. Otherwise, successors are generated by first pruning the eight possible neighbors of the current node based on the direction from the parent to avoid revisiting symmetric paths, then, for each unpruned neighbor direction, invoking a jump function to identify the next jump point in that direction, which may span multiple grid cells. These identified jump points are then evaluated: if they offer a better path cost and have not already been processed with a lower cost, they are added to or updated in the open set with updated g-scores, f-scores, and parents pointing back to the current node.[9]

In 8-directional grids, the procedure explicitly handles both cardinal (up, down, left, right) and diagonal movements, with diagonal steps costing √2 times the cardinal cost.
Pruning ensures that only promising directions are explored, and the jump function checks for forced neighbors or goal convergence separately for diagonal cases by briefly scanning perpendicular cardinal directions to detect potential jump points that would force a deviation.[9] This maintains consistency across movement types without altering the underlying A* priority mechanism. The search terminates upon dequeuing the goal node, yielding an optimal path via reconstruction, or when the open set empties, indicating no path exists.

Edge cases include scenarios where the start coincides with the goal, resulting in an immediate trivial path of length zero; where the start or goal is an obstacle, in which case the search either fails immediately or cannot reach the goal, respectively; and grids with no feasible path due to obstacles, exhausting the open set without success.[9] The complete procedure can be expressed through the following key components integrated into A*, as detailed in the original formulation:[9]

Algorithm 1: IdentifySuccessors(x, s, g)
Input: Current node x, start s, goal g
Output: Set of jump point successors of x
successors(x) ← ∅
PruneNeighbours(x)
for each neighbour n of x do
    jp ← Jump(x, Direction(x, n), s, g)
    if jp ≠ null then
        successors(x) ← successors(x) ∪ {jp}
return successors(x)
Algorithm 2: Jump(x, ~d, s, g)
Input: Node x, direction ~d, start s, goal g
Output: Nearest jump point from x in direction ~d, or null
n ← Step(x, ~d)
if n is an obstacle or outside the grid then
    return null
if n = g then
    return n
if HasForcedNeighbour(n, ~d) then
    return n
if ~d is diagonal then
    if Jump(n, ~d1, s, g) ≠ null or Jump(n, ~d2, s, g) ≠ null then
        return n                             // ~d1, ~d2 are the cardinal components of ~d
return Jump(n, ~d, s, g)
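Assembling the pieces, the following Python sketch combines the pruning rules and the Jump scan above into a complete search. All names are illustrative, the grid is encoded as a set of blocked (x, y) cells, and corner-cutting is permitted as in the original 2011 formulation (later variants tighten the pruning rules for agents that may not cut corners):

```python
import heapq
import math

SQRT2 = math.sqrt(2)

def jps(blocked, width, height, start, goal):
    """Jump point search on an 8-connected uniform-cost grid (sketch)."""
    def walkable(x, y):
        return 0 <= x < width and 0 <= y < height and (x, y) not in blocked

    def forced(x, y, dx, dy):
        # Forced-neighbor test for a node reached with direction (dx, dy).
        if dx != 0 and dy != 0:
            return ((not walkable(x - dx, y) and walkable(x - dx, y + dy)) or
                    (not walkable(x, y - dy) and walkable(x + dx, y - dy)))
        if dx != 0:
            return ((not walkable(x, y + 1) and walkable(x + dx, y + 1)) or
                    (not walkable(x, y - 1) and walkable(x + dx, y - 1)))
        return ((not walkable(x + 1, y) and walkable(x + 1, y + dy)) or
                (not walkable(x - 1, y) and walkable(x - 1, y + dy)))

    def jump(x, y, dx, dy):
        nx, ny = x + dx, y + dy
        if not walkable(nx, ny):
            return None
        if (nx, ny) == goal or forced(nx, ny, dx, dy):
            return (nx, ny)
        if dx != 0 and dy != 0:     # diagonal: probe both cardinal components
            if jump(nx, ny, dx, 0) is not None or jump(nx, ny, 0, dy) is not None:
                return (nx, ny)
        return jump(nx, ny, dx, dy)

    def pruned_directions(node, par):
        # Natural neighbors of the incoming direction, plus any forced ones.
        if par is None:
            return [(dx, dy) for dx in (-1, 0, 1) for dy in (-1, 0, 1)
                    if (dx, dy) != (0, 0)]
        x, y = node
        dx = (x > par[0]) - (x < par[0])
        dy = (y > par[1]) - (y < par[1])
        dirs = []
        if dx != 0 and dy != 0:         # arrived diagonally
            dirs += [(dx, 0), (0, dy), (dx, dy)]
            if not walkable(x - dx, y):
                dirs.append((-dx, dy))  # forced
            if not walkable(x, y - dy):
                dirs.append((dx, -dy))  # forced
        elif dx != 0:                   # arrived horizontally
            dirs.append((dx, 0))
            if not walkable(x, y + 1):
                dirs.append((dx, 1))    # forced
            if not walkable(x, y - 1):
                dirs.append((dx, -1))   # forced
        else:                           # arrived vertically
            dirs.append((0, dy))
            if not walkable(x + 1, y):
                dirs.append((1, dy))    # forced
            if not walkable(x - 1, y):
                dirs.append((-1, dy))   # forced
        return dirs

    def h(n):                           # octile heuristic
        dx, dy = abs(n[0] - goal[0]), abs(n[1] - goal[1])
        return max(dx, dy) + (SQRT2 - 1) * min(dx, dy)

    if not walkable(*start) or not walkable(*goal):
        return None
    open_heap = [(h(start), 0.0, start)]
    g = {start: 0.0}
    parent = {start: None}
    closed = set()
    while open_heap:
        _, gc, cur = heapq.heappop(open_heap)
        if cur == goal:
            path = [cur]
            while parent[path[-1]] is not None:
                path.append(parent[path[-1]])
            return gc, path[::-1]
        if cur in closed:
            continue
        closed.add(cur)
        for dx, dy in pruned_directions(cur, parent[cur]):
            jp = jump(cur[0], cur[1], dx, dy)
            if jp is None:
                continue
            steps = max(abs(jp[0] - cur[0]), abs(jp[1] - cur[1]))
            cost = g[cur] + (SQRT2 * steps if dx != 0 and dy != 0 else steps)
            if jp not in g or cost < g[jp]:
                g[jp] = cost
                parent[jp] = cur
                heapq.heappush(open_heap, (cost + h(jp), cost, jp))
    return None
```

Because only jump points enter the open set, the returned path lists turning points rather than every traversed cell; intermediate cells can be recovered by interpolating along the straight segments between consecutive jump points.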
