AA tree
from Wikipedia

An AA tree in computer science is a form of balanced tree used for storing and retrieving ordered data efficiently. AA trees are named after their originator, Swedish computer scientist Arne Andersson.[1]

AA trees are a variation of the red–black tree, a form of binary search tree which supports efficient addition and deletion of entries. Unlike red–black trees, red nodes on an AA tree can only be added as a right subchild. In other words, no red node can be a left sub-child. This results in the simulation of a 2–3 tree instead of a 2–3–4 tree, which greatly simplifies the maintenance operations. The maintenance algorithms for a red–black tree need to consider seven different shapes to properly balance the tree:

An AA tree on the other hand only needs to consider two shapes due to the strict requirement that only right links can be red:

Balancing rotations


Whereas red–black trees require one bit of balancing metadata per node (the color), AA trees require O(log(log(N))) bits of metadata per node, in the form of an integer "level". The following invariants hold for AA trees:

  1. The level of every leaf node is one.
  2. The level of every left child is exactly one less than that of its parent.
  3. The level of every right child is equal to or one less than that of its parent.
  4. The level of every right grandchild is strictly less than that of its grandparent.
  5. Every node of level greater than one has two children.

A link where the child's level is equal to that of its parent is called a horizontal link, and is analogous to a red link in the red–black tree. Individual right horizontal links are allowed, but consecutive ones are forbidden; all left horizontal links are forbidden. These are more restrictive constraints than the analogous ones on red–black trees, with the result that re-balancing an AA tree is procedurally much simpler than re-balancing a red–black tree.
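The invariants above can be checked mechanically. The following Python sketch is illustrative only (the `Node` record and `check_aa` function are hypothetical names, not part of any library):

```python
# Sketch of a validator for the five AA-tree invariants listed above.

class Node:
    def __init__(self, key, level, left=None, right=None):
        self.key, self.level = key, level
        self.left, self.right = left, right

def check_aa(t):
    """Return True iff the subtree rooted at t satisfies the AA invariants."""
    if t is None:
        return True
    if t.left is None and t.right is None and t.level != 1:
        return False                      # 1. leaves are at level one
    if t.left is not None and t.left.level != t.level - 1:
        return False                      # 2. left child exactly one lower
    if t.right is not None and t.right.level not in (t.level, t.level - 1):
        return False                      # 3. right child equal or one lower
    if (t.right is not None and t.right.right is not None
            and t.right.right.level >= t.level):
        return False                      # 4. right grandchild strictly lower
    if t.level > 1 and (t.left is None or t.right is None):
        return False                      # 5. internal nodes have two children
    return check_aa(t.left) and check_aa(t.right)

valid = Node(2, 2, Node(1, 1), Node(3, 1))    # legal three-node tree
invalid = Node(2, 2, Node(1, 2), Node(3, 1))  # forbidden left horizontal link
print(check_aa(valid), check_aa(invalid))     # True False
```

Note how the forbidden left horizontal link is caught by the second rule: a left child must sit exactly one level below its parent.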

Insertions and deletions may transiently cause an AA tree to become unbalanced (that is, to violate the AA tree invariants). Only two distinct operations are needed for restoring balance: "skew" and "split". Skew is a right rotation to replace a subtree containing a left horizontal link with one containing a right horizontal link instead. Split is a left rotation and level increase to replace a subtree containing two or more consecutive right horizontal links with one containing two fewer consecutive right horizontal links. Implementation of balance-preserving insertion and deletion is simplified by relying on the skew and split operations to modify the tree only if needed, instead of making their callers decide whether to skew or split.

function skew is
    input: T, a node representing an AA tree that needs to be rebalanced.
    output: Another node representing the rebalanced AA tree.

    if nil(T) then
        return Nil
    else if nil(left(T)) then
        return T
    else if level(left(T)) == level(T) then
        Swap the pointers of horizontal left links.
        L = left(T)
        left(T) := right(L)
        right(L) := T
        return L
    else
        return T
    end if
end function

function split is
    input: T, a node representing an AA tree that needs to be rebalanced.
    output: Another node representing the rebalanced AA tree.

    if nil(T) then
        return Nil
    else if nil(right(T)) or  nil(right(right(T))) then
        return T
    else if level(T) == level(right(right(T))) then
        We have two horizontal right links.  Take the middle node, elevate it, and return it.
        R = right(T)
        right(T) := left(R)
        left(R) := T
        level(R) := level(R) + 1
        return R
    else
        return T
    end if
end function
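The two routines above translate directly into runnable Python. This is a sketch assuming a minimal hypothetical `Node` record; the shapes constructed below are transient, mid-rebalance states, not valid AA trees:

```python
# Python transcription of the skew and split pseudocode above.

class Node:
    def __init__(self, key, level=1, left=None, right=None):
        self.key, self.level = key, level
        self.left, self.right = left, right

def skew(t):
    # Right rotation: replace a left horizontal link with a right one.
    if t is not None and t.left is not None and t.left.level == t.level:
        l = t.left
        t.left, l.right = l.right, t
        return l
    return t

def split(t):
    # Left rotation plus level increase: break two consecutive
    # right horizontal links.
    if (t is not None and t.right is not None and t.right.right is not None
            and t.right.right.level == t.level):
        r = t.right
        t.right, r.left = r.left, t
        r.level += 1
        return r
    return t

# A left horizontal link (transient state) ...
t = skew(Node(5, 2, left=Node(3, 2)))
print(t.key, t.right.key)          # 3 5

# ... and two consecutive right horizontal links.
s = split(Node(5, 1, right=Node(7, 1, right=Node(9, 1))))
print(s.key, s.level)              # 7 2
```

Both functions return the subtree unchanged when no violation is present, which is what lets insertion and deletion call them unconditionally.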


Insertion


Insertion begins with the normal binary tree search and insertion procedure. Then, as the call stack unwinds (assuming a recursive implementation of the search), it's easy to check the validity of the tree and perform any rotations as necessary. If a horizontal left link arises, a skew will be performed, and if two horizontal right links arise, a split will be performed, possibly incrementing the level of the new root node of the current subtree. Note, in the code as given above, the increment of level(T). This makes it necessary to continue checking the validity of the tree as the modifications bubble up from the leaves.

function insert is
    input: X, the value to be inserted, and T, the root of the tree to insert it into.
    output: A balanced version T including X.

    Do the normal binary tree insertion procedure. Set the result of the
    recursive call to the correct child in case a new node was created or the
    root of the subtree changes.
    if nil(T) then
        Create a new leaf node with X.
        return node(X, 1, Nil, Nil)
    else if X < value(T) then
        left(T) := insert(X, left(T))
    else if X > value(T) then
        right(T) := insert(X, right(T))
    end if
    Note that the case of X == value(T) is unspecified. As given, an insert
    will have no effect. The implementor may desire different behavior.

    Perform skew and then split. The conditionals that determine whether or
    not a rotation will occur or not are inside of the procedures, as given
    above.
    T := skew(T)
    T := split(T)

    return T
end function
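The pseudocode above can be sketched as runnable Python. The `Node`, `skew`, and `split` helpers below are minimal hypothetical implementations, not a library API:

```python
class Node:
    def __init__(self, key, level=1, left=None, right=None):
        self.key, self.level = key, level
        self.left, self.right = left, right

def skew(t):
    # Right rotation on a left horizontal link.
    if t is not None and t.left is not None and t.left.level == t.level:
        l = t.left
        t.left, l.right = l.right, t
        return l
    return t

def split(t):
    # Left rotation plus promotion on two right horizontal links.
    if (t is not None and t.right is not None and t.right.right is not None
            and t.right.right.level == t.level):
        r = t.right
        t.right, r.left = r.left, t
        r.level += 1
        return r
    return t

def insert(x, t):
    if t is None:
        return Node(x)              # new leaf at level 1
    if x < t.key:
        t.left = insert(x, t.left)
    elif x > t.key:
        t.right = insert(x, t.right)
    return split(skew(t))           # rebalance while unwinding

def inorder(t):
    return [] if t is None else inorder(t.left) + [t.key] + inorder(t.right)

root = None
for k in range(1, 8):
    root = insert(k, root)
print(root.key, root.level, inorder(root))   # 4 3 [1, 2, 3, 4, 5, 6, 7]
```

Inserting the sorted keys 1 through 7 yields a perfectly balanced tree rooted at 4 with level 3, illustrating how the unconditional skew/split calls keep the tree flat even under adversarial insertion order.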

Deletion


As in most balanced binary trees, the deletion of an internal node can be turned into the deletion of a leaf node by swapping the internal node with either its closest predecessor or successor, depending on which are in the tree or on the implementor's whims. Retrieving a predecessor is simply a matter of following one left link and then all of the remaining right links. Similarly, the successor can be found by going right once and left until a null pointer is found. Because of the AA property of all nodes of level greater than one having two children, the successor or predecessor node will be in level 1, making their removal trivial.

To re-balance a tree, there are a few approaches. The one described by Andersson in his original paper is the simplest, and it is described here, although actual implementations may opt for a more optimized approach. After a removal, the first step to maintaining tree validity is to lower the level of any nodes whose children are two levels below them, or who are missing children. Then, the entire level must be skewed and split. This approach was favored, because when laid down conceptually, it has three easily understood separate steps:

  1. Decrease the level, if appropriate.
  2. Skew the level.
  3. Split the level.

However, we have to skew and split the entire level this time instead of just a node, complicating our code.

function delete is
    input: X, the value to delete, and T, the root of the tree from which it should be deleted.
    output: T, balanced, without the value X.
   
    if nil(T) then
        return T
    else if X > value(T) then
        right(T) := delete(X, right(T))
    else if X < value(T) then
        left(T) := delete(X, left(T))
    else
        If we're a leaf, easy, otherwise reduce to leaf case. 
        if leaf(T) then
            return right(T)
        else if nil(left(T)) then
            L := successor(T)
            right(T) := delete(value(L), right(T))
            value(T) := value(L)
        else
            L := predecessor(T)
            left(T) := delete(value(L), left(T))
            value(T) := value(L)
        end if
    end if

    Rebalance the tree. Decrease the level of all nodes in this level if
    necessary, and then skew and split all nodes in the new level.
    T := decrease_level(T)
    T := skew(T)
    right(T) := skew(right(T))
    if not nil(right(T)) then
        right(right(T)) := skew(right(right(T)))
    end if
    T := split(T)
    right(T) := split(right(T))
    return T
end function
function decrease_level is
    input: T, a tree for which we want to remove links that skip levels.
    output: T with its level decreased.

    should_be = min(level(left(T)), level(right(T))) + 1
    if should_be < level(T) then
        level(T) := should_be
        if should_be < level(right(T)) then
            level(right(T)) := should_be
        end if
    end if
    return T
end function
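Putting the pieces together, the delete and decrease_level pseudocode above can be sketched in Python, alongside the insert, skew, and split helpers it relies on (all names are illustrative):

```python
class Node:
    def __init__(self, key, level=1, left=None, right=None):
        self.key, self.level = key, level
        self.left, self.right = left, right

def skew(t):
    if t is not None and t.left is not None and t.left.level == t.level:
        l = t.left
        t.left, l.right = l.right, t
        return l
    return t

def split(t):
    if (t is not None and t.right is not None and t.right.right is not None
            and t.right.right.level == t.level):
        r = t.right
        t.right, r.left = r.left, t
        r.level += 1
        return r
    return t

def insert(x, t):
    if t is None:
        return Node(x)
    if x < t.key:
        t.left = insert(x, t.left)
    elif x > t.key:
        t.right = insert(x, t.right)
    return split(skew(t))

def level_of(t):
    return 0 if t is None else t.level

def decrease_level(t):
    # Remove links that skip levels, as in the pseudocode above.
    should_be = min(level_of(t.left), level_of(t.right)) + 1
    if should_be < t.level:
        t.level = should_be
        if t.right is not None and should_be < t.right.level:
            t.right.level = should_be
    return t

def delete(x, t):
    if t is None:
        return t
    if x > t.key:
        t.right = delete(x, t.right)
    elif x < t.key:
        t.left = delete(x, t.left)
    else:
        if t.left is None and t.right is None:   # leaf: just drop it
            return None
        if t.left is None:                       # use the in-order successor
            s = t.right
            while s.left is not None:
                s = s.left
            t.key = s.key
            t.right = delete(s.key, t.right)
        else:                                    # use the in-order predecessor
            p = t.left
            while p.right is not None:
                p = p.right
            t.key = p.key
            t.left = delete(p.key, t.left)
    # Rebalance: demote, then skew/split the whole level.
    t = decrease_level(t)
    t = skew(t)
    t.right = skew(t.right)
    if t.right is not None:
        t.right.right = skew(t.right.right)
    t = split(t)
    t.right = split(t.right)
    return t

def inorder(t):
    return [] if t is None else inorder(t.left) + [t.key] + inorder(t.right)

root = None
for k in range(1, 8):
    root = insert(k, root)
root = delete(4, root)
print(root.key, root.level, inorder(root))   # 3 2 [1, 2, 3, 5, 6, 7]
```

Deleting the root key 4 replaces it with its predecessor 3, and the demotion pass lowers the root from level 3 to level 2 while keeping the order intact.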

A good example of deletion by this algorithm is present in the Andersson paper.

Performance


The performance of an AA tree is equivalent to the performance of a red–black tree. While an AA tree makes more rotations than a red–black tree, the simpler algorithms tend to be faster, and all of this balances out to result in similar performance. A red–black tree is more consistent in its performance than an AA tree, but an AA tree tends to be flatter, which results in slightly faster search times.[2]

from Grokipedia
An AA tree is a self-balancing binary search tree that uses a single integer level attribute on each node to maintain balance, ensuring that the height remains logarithmic in the number of elements for efficient search, insertion, and deletion operations. Invented by Swedish computer scientist Arne Andersson in 1993, it is a simplified variant of the red–black tree, achieving the same O(log n) bounds while requiring fewer balancing rules. AA trees enforce balance through levels satisfying five invariants: every leaf is at level 1; every left child is exactly one level below its parent; every right child is at the same level as, or one level below, its parent; every right grandchild is strictly below its grandparent's level; and every node above level 1 has two children. These properties are equivalent to those of red–black trees when levels are mapped to colors (same-level links correspond to red nodes, level decreases to black), but AA trees eliminate the color attribute in favor of explicit levels, making the structure a strict subset of valid red–black trees in which red nodes appear only as right children. This equivalence means that AA trees model 2–3 trees in binary form, with levels corresponding to 2–3 tree heights. Insertion in an AA tree begins with a standard binary search tree insertion, assigning the new leaf level 1, followed by a bottom-up pass applying two balancing operations: skew (a right rotation on a node whose left child is at the same level) to eliminate left horizontal links, and split (a left rotation, plus a level increase, on a node whose right grandchild is at the same level) to break chains of right horizontal links. Deletion removes the target node, decreases levels along the search path to propagate the change, and then reapplies skew and split operations to restore the invariants. Both operations complete in O(log n) time due to the bounded height.
The primary advantage of AA trees lies in their simplicity: implementations require only three core procedures—skew, split, and level updates—compared to the more complex case analysis in red-black trees, making them easier to code and verify while preserving strong balance guarantees. They are particularly useful in educational contexts and systems where straightforward balanced tree semantics are preferred over optimized but intricate alternatives.

Introduction

Definition and Motivation

An AA tree is a self-balancing binary search tree designed as a variant of the red–black tree, where each node is assigned an integer level (starting at 1 for leaves) instead of a color to enforce structural balance. The primary motivation for AA trees is to simplify the implementation of balanced search trees compared to red–black trees, which rely on intricate color-based rules that can complicate coding; by using levels, AA trees achieve equivalent balance guarantees with fewer cases to handle during maintenance operations, while ensuring O(log n) time for insertions, deletions, and searches, and maintaining a height bounded by 2 log2(n+1). Like standard binary search trees, AA trees maintain the ordering invariant that for any node, all keys in its left subtree are less than the node's key, and all keys in its right subtree are greater than the node's key. A key structural invariant in AA trees prohibits two consecutive horizontal (same-level) links along any path from the root to a leaf, analogous to the red-red prohibition in red–black trees, which prevents imbalance. For illustration, a minimal AA tree with a single node has that node at level 1, serving as both root and leaf. Adding a second node keeps both nodes at level 1, joined by a single right horizontal link; only when a further insertion would create two consecutive horizontal links does a split promote the middle node to level 2. A three-node tree, such as keys 2 (root, level 2), 1 (left child, level 1), and 3 (right child, level 1), exemplifies a balanced structure at the smallest scale.

History and Development

The AA tree was invented by Swedish computer scientist Arne Andersson in 1993 while he was at the Department of Computer Systems, Uppsala University. Andersson introduced the data structure in his paper titled "Balanced Search Trees Made Simple," which was presented at the Third International Workshop on Algorithms and Data Structures (WADS '93), held in Montréal, Canada, from August 11–13, 1993, and published in the proceedings by Springer-Verlag (Lecture Notes in Computer Science, vol. 709, pp. 60–71). This work marked a key milestone in the evolution of balanced binary search trees, offering a streamlined approach amid the established adoption of structures like AVL trees, developed in 1962 by G. M. Adelson-Velsky and E. M. Landis, and red–black trees, formalized by L. J. Guibas and R. Sedgewick in 1978. Andersson's primary motivation was to simplify the rules governing red–black trees, which were efficient but often cumbersome to implement and explain due to their color-based balancing invariants. By replacing colors with integer levels and restricting certain configurations, AA trees achieved comparable O(log n) performance for insertions, deletions, and searches while reducing the number of cases to handle during rebalancing. This design responded directly to the context of the early 1990s, when balanced trees were increasingly essential for practical algorithms but accessible implementations were needed for teaching and engineering practice. Since its introduction, AA trees have found adoption primarily in educational settings for their elegance and ease of coding, appearing in algorithms textbooks and course materials post-1993. For example, Mark Allen Weiss refined and named the structure in the second edition of one of his C++ data structures textbooks (2000), emphasizing its pedagogical value as a simplified red–black variant.
Implementations have appeared in various open-source libraries and programming resources, such as Rust's aatree crate and examples from Project Nayuki, though no significant updates to the core design have emerged. The structure's enduring appeal lies in its balance of simplicity and theoretical guarantees, making it a staple example of self-balancing trees without the full complexity of red–black variants.

Structure and Properties

Node Components

In an AA tree, each node consists of a key for maintaining ordering, pointers to left and right child nodes, and an integer level field that is at least 1. The key field stores the comparable value that dictates the node's position relative to others, ensuring that all keys in the left subtree are less than the node's key and all in the right subtree are greater. The left and right pointers reference child nodes, or null for leaves, forming the tree's hierarchical structure. The level field serves a dual purpose, acting both as a measure of the node's structural position and as a balancing mechanism equivalent to the color bit in red–black trees, without requiring a separate color attribute. Leaves are fixed at level 1, while internal nodes occupy higher levels, creating a spine-like backbone along the right edges with left subtrees branching off at lower levels. This design permits a horizontal link at each level via a right child of equal level, while prohibiting horizontal links on the left to maintain balance. The node's components uphold key invariants for structural integrity: the level of every leaf is 1 (AA1); the level of every left child is strictly less than its parent's (AA3); the level of a child is generally one less than its parent's, except that the right child of a node with two children may equal the parent's level (AA2); and every node at level greater than 1 must have two children (AA4). These rules ensure that each link is either vertical (child one level lower) or a horizontal right link (child at the same level), preventing unbalanced configurations such as left horizontal links or chains of consecutive right horizontal links. A representative node structure in pseudocode (Python style) illustrates these components:

python

class Node:
    def __init__(self, key):
        self.key = key      # Search key for ordering
        self.left = None    # Left child pointer
        self.right = None   # Right child pointer
        self.level = 1      # Balancing level (>= 1)

This minimal representation supports the tree's operations while keeping implementation simple.

Level Assignment Rules

In AA trees, the level assignment rules form the core invariants that maintain balance without requiring explicit height computations during operations. Each node stores an integer level value, starting from 1 at the leaves and increasing toward the root. The primary rule stipulates that all leaf nodes must have level 1, ensuring a uniform base for the structure. Additionally, no left child can have the same level as its parent; a left child's level is always exactly one less than its parent's, while a right child's level is either equal to or one less than its parent's. These constraints eliminate horizontal links on the left, allowing them only for right children at the same level, promoting a skewed but controlled right-leaning orientation. A crucial path invariant further enforces balance: no two consecutive horizontal links may occur on any path from the root to a leaf, meaning a node and its right grandchild may never share a level. This prevents chains of horizontal links that could lead to unbalanced paths. For validation, consider an attempted configuration where a level-2 node has a right child also at level 2, and that child has another right child at level 2; this forms two consecutive horizontal links on the path, directly breaching the path invariant and rendering the tree invalid, as it mimics a prohibited red-red sequence in the red–black analogy. Such violations are corrected through rotations during insertions and deletions to restore the invariants. These rules collectively guarantee a balanced tree with height at most 2 log2(n+1) − 1, where n is the number of nodes, achieved by alternating levels that form a "spine" of decreasing levels interspersed with horizontal right links. This bound arises from the structure's equivalence to a red–black tree, where AA tree levels correspond to the black-height (the count of black nodes along a path), and horizontal right links represent red nodes, with consecutive reds disallowed. As a result, the minimum path length equals the root's level, while the maximum is roughly twice that due to interleaved red (horizontal) links, ensuring logarithmic performance.
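The height bound can be spot-checked empirically. This sketch builds a tree with a standard AA insertion (the helper names are illustrative) and compares the realized height against 2 log2(n+1) − 1:

```python
import math

class Node:
    def __init__(self, key, level=1, left=None, right=None):
        self.key, self.level = key, level
        self.left, self.right = left, right

def skew(t):
    # Right rotation on a left horizontal link.
    if t is not None and t.left is not None and t.left.level == t.level:
        l = t.left
        t.left, l.right = l.right, t
        return l
    return t

def split(t):
    # Left rotation plus promotion on two right horizontal links.
    if (t is not None and t.right is not None and t.right.right is not None
            and t.right.right.level == t.level):
        r = t.right
        t.right, r.left = r.left, t
        r.level += 1
        return r
    return t

def insert(x, t):
    if t is None:
        return Node(x)
    if x < t.key:
        t.left = insert(x, t.left)
    elif x > t.key:
        t.right = insert(x, t.right)
    return split(skew(t))

def height(t):
    # Number of nodes on the longest root-to-leaf path.
    return 0 if t is None else 1 + max(height(t.left), height(t.right))

n = 100
root = None
for k in range(1, n + 1):        # sorted insertions, a classic worst case
    root = insert(k, root)
bound = 2 * math.log2(n + 1) - 1
print(height(root), height(root) <= bound)
```

Even for fully sorted input, which degenerates an unbalanced BST into a linked list, the AA tree's height stays within the logarithmic bound.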

Balancing Mechanisms

Single and Double Rotations

In AA trees, balancing is achieved through two single-rotation operations: skew and split. These locally restructure the tree to enforce the invariants after insertions or deletions. Skew is a right rotation that eliminates a horizontal left link (where a node and its left child have the same level), while split is a left rotation that eliminates consecutive horizontal right links (where a node, its right child, and that child's right child all have the same level), followed by a level promotion. Both operations preserve the binary search property and in-order traversal order. Levels remain unchanged during the rotations themselves, with the promotion handled at the end of split. The skew operation is performed on a node x that has a left child y with level(y) = level(x). A right rotation is applied: y becomes the subtree root and x becomes y's right child, with x.left = y.right and y.right = x. This resolves the left horizontal link without level changes. The split operation is the counterpart for right spines. It is applied to a node x whose right child y has level(y) = level(x) and whose right grandchild z = y.right has level(z) = level(y). A left rotation is performed: y becomes the subtree root and x becomes y's left child, with x.right = y.left and y.left = x. Then level(y) is incremented by 1, promoting y and preventing consecutive horizontal rights. Note that child levels never exceed the parent's; violations only take the form of equal levels in disallowed positions. AA trees do not use double rotations. Instead, applying skew followed by split on the same node handles zig-zag patterns effectively, as skew may create a right spine that split then resolves.
To illustrate skew (a right rotation), consider an insertion creating a left horizontal link: x = 5 (lv 2) with left child y = 3 (lv 2). Before:

  5 (lv2)
 /
3 (lv2)

After the right rotation (skew):

3 (lv2)
 \
  5 (lv2)

The search property holds (no keys lie below 3, between 3 and 5, or above 5 in this subtree), and the left horizontal link is eliminated. For split (a left rotation with promotion), suppose x = 5 (lv 2), right child y = 7 (lv 2), and y.right = 9 (lv 2). Before:

5 (lv2)
 \
  7 (lv2)
   \
    9 (lv2)

After the left rotation, y = 7 becomes the subtree root with left child x = 5 and right child 9; then level(7) is set to 3. After:

      7 (lv3)
     /     \
  5 (lv2)  9 (lv2)

This breaks the right chain and promotes 7; the in-order sequence (5, 7, 9) is preserved. For a zig-zag case: parent p = 10 (lv 2) with right child r = 15 (lv 2), where r.left = s = 12 (lv 2). First, skew at r: a right rotation makes r's subtree 12 (lv 2) with right child 15 (lv 2), so p's right child is now 12 (lv 2). Then split at p: p = 10 (lv 2) has right child 12 (lv 2) whose right child is 15 (lv 2), so a left rotation at p makes 12 the root with left child 10 (lv 2) and right child 15 (lv 2); then level(12) = 3. Final:

      12 (lv3)
     /      \
 10 (lv2)  15 (lv2)

This achieves balance without explicit double rotation. Level promotion occurs as part of split, as detailed below.
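The zig-zag example above can be replayed in code. This Python sketch uses a minimal hypothetical `Node` record plus the skew/split routines described above, starting from the transient invalid shape and applying skew at the right child, then split at the parent:

```python
class Node:
    def __init__(self, key, level=1, left=None, right=None):
        self.key, self.level = key, level
        self.left, self.right = left, right

def skew(t):
    # Right rotation on a left horizontal link.
    if t is not None and t.left is not None and t.left.level == t.level:
        l = t.left
        t.left, l.right = l.right, t
        return l
    return t

def split(t):
    # Left rotation plus promotion on two right horizontal links.
    if (t is not None and t.right is not None and t.right.right is not None
            and t.right.right.level == t.level):
        r = t.right
        t.right, r.left = r.left, t
        r.level += 1
        return r
    return t

# Transient zig-zag: p = 10 (lv2), right r = 15 (lv2), r.left = 12 (lv2).
p = Node(10, 2, right=Node(15, 2, left=Node(12, 2)))
p.right = skew(p.right)    # now 12 (lv2) -right-> 15 (lv2)
p = split(p)               # 12 promoted to lv3, children 10 and 15
print(p.key, p.level, p.left.key, p.right.key)   # 12 3 10 15
```

The two plain rotations, applied in this fixed order, reach the same shape a double rotation would.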

Level Promotion and Demotion

In AA trees, level promotion occurs during insertion as part of the split operation to maintain balance. After performing the left rotation in split to resolve a chain of two horizontal right links, the level of the new subtree root (the original right child) is increased by 1. This ensures that the promoted node now has two children at the previous level, simulating the promotion of a red node to black in the red–black equivalent, preventing further horizontal links at that level. The promotion is always applied as part of split, since the rotation leaves both children at the same (lower) level. The promotion may propagate upward if the promoted node now creates a horizontal link with its parent, triggering additional skew and split operations, recursively or iteratively, until the invariants hold. This explicit level adjustment simplifies the rules compared to recoloring in red–black trees. Consider inserting nodes 3, 2, 1 into an empty AA tree (new nodes at level 1):
  • Insert 3: root 3 (lv1)
  • Insert 2: left of 3 (lv1); skew on 3: right rotate, root 2 (lv1) right 3 (lv1)
  • Insert 1: left of 2 (lv1); skew on 2: right rotate, root 1 (lv1) right 2 (lv1) with 2 right 3 (lv1)
  • Now split on 1: left rotate, new root 2 (lv1) left 1 (lv1) right 3 (lv1); promote level(2) to 2.
Tree:

      2 (lv2)
     /    \
 1 (lv1)  3 (lv1)

Insert 0: it becomes the left child of 1 (lv1), creating a horizontal left link; skew at 1 right-rotates, leaving 0 (lv1) with right child 1 (lv1) as the left subtree of 2. No split is needed. Final:

      2 (lv2)
     /    \
 0 (lv1)  3 (lv1)
   \
    1 (lv1)

This demonstrates how skew and split with promotion balance the tree. Level demotion occurs during deletion to shrink the tree after node removal. After deleting a node, levels are decreased along the path from the deletion point upward. For each node on the path, if its level is more than one greater than the minimum level of its children (treating null as level 0), decrease the node's level to one more than that minimum. If the right child had the same level as the node before the demotion, decrease the right child's level as well. Both the "easy" and "hard" deletion cases use these adjustments, followed by skew and split, to restore the invariants. Demotion ensures no links skip levels, maintaining right child level at most the parent's and left child level strictly below it. Through skew, split, promotions, and demotions, AA trees enforce no horizontal left links and at most single horizontal right links, keeping the height O(log n) with simpler rules than red–black trees.
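The 3, 2, 1, then 0 insertion sequence above can be reproduced with a standard AA insertion. A Python sketch with illustrative helper names:

```python
class Node:
    def __init__(self, key, level=1, left=None, right=None):
        self.key, self.level = key, level
        self.left, self.right = left, right

def skew(t):
    # Right rotation on a left horizontal link.
    if t is not None and t.left is not None and t.left.level == t.level:
        l = t.left
        t.left, l.right = l.right, t
        return l
    return t

def split(t):
    # Left rotation plus promotion on two right horizontal links.
    if (t is not None and t.right is not None and t.right.right is not None
            and t.right.right.level == t.level):
        r = t.right
        t.right, r.left = r.left, t
        r.level += 1
        return r
    return t

def insert(x, t):
    if t is None:
        return Node(x)
    if x < t.key:
        t.left = insert(x, t.left)
    elif x > t.key:
        t.right = insert(x, t.right)
    return split(skew(t))

root = None
for k in (3, 2, 1, 0):
    root = insert(k, root)
# Expect: 2 (lv2) with left subtree 0 -right-> 1 and right child 3.
print(root.key, root.level, root.left.key,
      root.left.right.key, root.right.key)   # 2 2 0 1 3
```

The resulting shape matches the hand trace: the split during the third insertion promotes 2 to level 2, and the final skew handles the left horizontal link introduced by 0.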

Core Operations

Insertion Algorithm

The insertion algorithm for an AA tree commences with a standard binary search tree (BST) insertion, where the new node is placed as a leaf following the key ordering rules and its level is initialized to 1. This ensures the node adheres to the BST property: all keys in the left subtree are less than the node's key, and all in the right subtree are greater. The algorithm assumes unique keys, treating duplicates as invalid or ignoring them to maintain tree integrity. Following the initial placement, a post-insertion fix-up phase traverses upward from the new node to restore the AA tree balancing properties, specifically eliminating left horizontal links and consecutive right horizontal links. This fix-up is typically implemented recursively, integrating the balancing steps during the return from the recursive insertion call. At each node on the path, two operations are applied in sequence: first, a skew to address any left horizontal link (a parent and its left child at the same level), and second, a split to handle a pair of consecutive right horizontal links involving the node's right subtree. The skew performs a single right rotation if the left child has the same level as the parent, promoting the left child upward. The split performs a single left rotation if the right child's right child matches the parent's level, after which the level of the former right child (now the new subtree root) is incremented by one to separate the horizontal links. These operations are repeated up the path until no violations remain, ensuring no two consecutive horizontal links persist. The recursive insertion can be outlined in pseudocode as follows, where insert handles the BST placement and fix-up:

Node* skew(Node* node);
Node* split(Node* node);

Node* insert(Node* root, int key) {
    if (root == nullptr) {
        return new Node(key, 1);  // New leaf at level 1
    }
    if (key < root->key) {
        root->left = insert(root->left, key);
    } else if (key > root->key) {
        root->right = insert(root->right, key);
    }
    // Fix-up: skew then split
    root = skew(root);
    root = split(root);
    return root;
}

Node* skew(Node* node) {
    if (node->left != nullptr && node->left->level == node->level) {
        // Right rotation
        Node* temp = node->left;
        node->left = temp->right;
        temp->right = node;
        return temp;
    }
    return node;
}

Node* split(Node* node) {
    if (node->right != nullptr && node->right->right != nullptr &&
        node->right->right->level == node->level) {
        // Left rotation on right child
        Node* temp = node->right;
        node->right = temp->left;
        temp->left = node;
        temp->level++;  // Promote level of new root
        return temp;
    }
    return node;
}


This structure propagates the fix-up naturally through recursion, with each call performing local adjustments. For illustration, consider inserting keys 10, 20, and 30 into an empty AA tree. First, 10 becomes the root at level 1. Inserting 20 places it as the right child of 10 at level 1, forming a single horizontal link; skew on 10 finds no left child, and split finds no right-right grandchild, so no changes occur. Inserting 30 places it as the right child of 20 at level 1, creating consecutive horizontal links (10, 20, and 30 all at level 1). During fix-up at 20, skew does nothing (no left child), and split does nothing (no right-right grandchild). At 10, skew does nothing, but split detects that 20's right-right grandchild 30 at level 1 matches 10's level, triggering a left rotation: 20 becomes the new root at level 2, with 10 as left child (level 1) and 30 as right child (level 1). Subsequent skew and split on 20 confirm no further violations. The final tree maintains balance with height 2. A special case is inserting into an empty tree, where the new node directly becomes the root at level 1 with no rebalancing needed. The process ensures O(log n) worst-case time per insertion, as the path length is bounded by the tree height and each step performs at most a constant number of rotations.
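The 10, 20, 30 walkthrough can likewise be checked in code. A Python sketch using the same minimal hypothetical helpers (not the C++ routines above):

```python
class Node:
    def __init__(self, key, level=1, left=None, right=None):
        self.key, self.level = key, level
        self.left, self.right = left, right

def skew(t):
    # Right rotation on a left horizontal link.
    if t is not None and t.left is not None and t.left.level == t.level:
        l = t.left
        t.left, l.right = l.right, t
        return l
    return t

def split(t):
    # Left rotation plus promotion on two right horizontal links.
    if (t is not None and t.right is not None and t.right.right is not None
            and t.right.right.level == t.level):
        r = t.right
        t.right, r.left = r.left, t
        r.level += 1
        return r
    return t

def insert(x, t):
    if t is None:
        return Node(x)
    if x < t.key:
        t.left = insert(x, t.left)
    elif x > t.key:
        t.right = insert(x, t.right)
    return split(skew(t))

root = None
for k in (10, 20, 30):
    root = insert(k, root)
print(root.key, root.level, root.left.key, root.right.key)   # 20 2 10 30
```

As in the hand trace, only the final split at the old root fires, promoting 20 to level 2.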

Deletion Algorithm

The deletion algorithm for AA trees follows the standard (BST) procedure for locating and removing the target node, followed by a rebalancing phase to restore the level invariants and maintain the tree's balance properties. To delete a key xx, first search for the node containing xx. If the node has zero or one child, it is removed directly or replaced by its child, respectively. For a node with two children, the key xx is replaced by the key of its inorder successor (the minimum key in the right subtree), and the successor node—which has at most one child—is then deleted recursively. This ensures the deletion always reduces to removing a or a node with one child. After removal, the parent of the deleted node (or the node where the successor was removed) may violate the AA tree rules, particularly if a subtree height decreases, potentially creating a node whose level exceeds 1 plus the minimum level of its children. The rebalancing begins at this parent node and proceeds upward toward the along the search path. At each node pp on this path, the level is first updated: pp's level is set to 1 plus the minimum of its children's levels (treating missing children as level 0). If this decreases pp's level and its right child previously matched the old level, the right child's level is also decreased by 1 to preserve the no-horizontal-left-links invariant. This demotion may cascade upward if levels propagate changes. To resolve any resulting structural violations, such as consecutive horizontal (same-level) links forming a "spine," up to three skew operations followed by two split operations are applied at each node pp. A skew is a right on pp if its left child has the same level as pp, promoting the left subtree and eliminating a left horizontal link. A split is a left on pp if its right child and right grandchild both have the same level as pp, which decreases the level of the middle node and restructures the right spine. 
These operations, applied in the order skew(p), skew(p.right), skew(p.right.right), split(p), split(p.right), correct local imbalances without increasing the overall height, ensuring the tree remains balanced. Note that skew never changes levels; level changes occur through the update step and through split, which promotes the new subtree root by one level. If the root's level decreases during rebalancing, the tree's height shrinks, and a rotation at the top may install a new root. This process handles edge cases like deleting a leaf (simple removal with a potential demotion cascade), deleting the root (re-rooting after fix-up), or deleting from a long horizontal spine (resolved by rotations that merge subtrees). The following pseudocode outlines the deletion process (adapted from standard implementations based on the original description):

function delete(key x, node t):
    if t is null: return null
    if x < t.key:
        t.left = delete(x, t.left)
    else if x > t.key:
        t.right = delete(x, t.right)
    else:
        // Found the node to delete
        if t.left is null: return t.right
        else if t.right is null: return t.left
        else:
            // Two children: replace with successor
            succ = findMin(t.right)
            t.key = succ.key
            t.right = delete(succ.key, t.right)
    // Rebalance on the way back up
    t = fixAfterDelete(t)
    return t

function fixAfterDelete(node p):
    if p is null: return null
    updateLevel(p)
    p = skew(p)
    p.right = skew(p.right)
    if p.right is not null:
        p.right.right = skew(p.right.right)
    p = split(p)
    p.right = split(p.right)
    return p

function updateLevel(node p):
    oldLevel = p.level
    p.level = 1 + min(getLevel(p.left), getLevel(p.right))   // getLevel(null) = 0
    if p.level < oldLevel and p.right is not null and p.right.level == oldLevel:
        p.right.level = oldLevel - 1

function skew(node p):
    // Right rotation when the left child is on the same level; no level change
    if p is not null and p.left is not null and p.left.level == p.level:
        p = rightRotate(p)
    return p

function split(node p):
    // Left rotation when the right grandchild is on the same level;
    // the new subtree root is promoted one level
    if p is not null and p.right is not null and p.right.right is not null and p.right.right.level == p.level:
        p = leftRotate(p)
        p.level = p.level + 1
    return p

Note that the rotations (rightRotate and leftRotate) are standard BST rotations preserving key order; apart from the promotion in split, levels are adjusted only in updateLevel. The fix-up is applied at every node along the search path as the recursion unwinds, and these local operations suffice because horizontal spines are bounded. Consider a five-node AA tree with root 2 at level 2, left child 1 (level 1), and right child 4 at level 2 (a horizontal link), where 4 has children 3 and 10 at level 1. Deleting key 3 (a leaf) removes it directly; updateLevel at 4 sets its level to 1 + min(0, 1) = 1, and at the root 2 the minimum child level is still 1, so no further demotion or rotation is needed. Deleting key 1 instead is more involved: the root 2 loses its left child, updateLevel demotes 2 to level 1 and its horizontal right child 4 to level 1 as well, and the subsequent skews and splits restructure the resulting spine, promoting 3 to level 2 as the new root with left child 2 and right child 4 (which keeps 10 as its right child). Deleting the root follows the successor-replacement rule before the same fix-up. These steps run in O(log n) worst-case time, even for spine deletions where multiple demotions occur.
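The worked example can be checked with a small sketch of the deletion routine. Node, decrease_level, and the other helper names are my own; the fix-up follows the sequence of one level update, three skews, and two splits described above:

```python
class Node:
    """Minimal AA-tree node (assumed record type for illustration)."""
    def __init__(self, key, level=1, left=None, right=None):
        self.key, self.level, self.left, self.right = key, level, left, right

def level(t):
    return t.level if t else 0            # missing children count as level 0

def skew(t):
    if t and t.left and t.left.level == t.level:
        l = t.left
        t.left, l.right = l.right, t
        return l
    return t

def split(t):
    if t and t.right and t.right.right and t.right.right.level == t.level:
        r = t.right
        t.right, r.left = r.left, t
        r.level += 1                      # promote the new subtree root
        return r
    return t

def decrease_level(t):
    # Demote t (and a horizontal right child) if a subtree got shorter.
    should = 1 + min(level(t.left), level(t.right))
    if should < t.level:
        t.level = should
        if t.right and t.right.level > should:
            t.right.level = should

def delete(t, key):
    if t is None:
        return None
    if key < t.key:
        t.left = delete(t.left, key)
    elif key > t.key:
        t.right = delete(t.right, key)
    else:
        if t.left is None:
            return t.right
        if t.right is None:
            return t.left
        succ = t.right                    # two children: swap in successor
        while succ.left:
            succ = succ.left
        t.key = succ.key
        t.right = delete(t.right, succ.key)
    decrease_level(t)                     # fix-up on the way back up
    t = skew(t)
    t.right = skew(t.right)
    if t.right:
        t.right.right = skew(t.right.right)
    t = split(t)
    t.right = split(t.right)
    return t

# The five-node tree from the example: 2 (level 2) with a horizontal link to 4.
root = Node(2, 2, Node(1), Node(4, 2, Node(3), Node(10)))
root = delete(root, 1)
# Demotions plus one split leave 3 as the new level-2 root.
```

Tracing delete(root, 1) reproduces the prose: 2 and 4 are demoted to level 1, a skew lifts 3, and the split promotes it to level 2 as the new root.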

Search Procedure

The search procedure in an AA tree follows the standard binary search tree (BST) traversal algorithm, leveraging the ordered structure of the tree without any modifications specific to the AA balancing mechanism. To search for a key, start at the root node and compare the key to the current node's value: if equal, the node is returned as the result; if the key is smaller, recurse on the left subtree; if larger, recurse on the right subtree. This process continues until the key is found or a null child is reached, indicating the key is absent from the tree. The search operation is strictly read-only, involving no rotations, level promotions, or demotions, as these are reserved for insertion and deletion to maintain balance. Due to the AA tree's balance properties, which ensure a height of at most 2 log₂(n + 1) for n nodes, the time complexity is O(h) in the worst case, where h is the tree height, yielding O(log n) performance; the average case is also O(log n). The following pseudocode illustrates the recursive search implementation, which can be adapted to iterative form if preferred:

function search(node, key):
    if node is null: return null      // Key not found
    if key == node.key: return node   // Key found
    if key < node.key:
        return search(node.left, key)
    else:
        return search(node.right, key)

Invoke it as search(root, key) to initiate the traversal from the tree's root. An extension of the search procedure enables efficient enumeration of all keys in sorted order via an inorder traversal, which visits nodes in left-root-right sequence; the balanced height ensures this completes in O(n) time overall. For example, consider searching for key 6 in the AA tree over the keys 1, 2, 3, 4, 5, 7, 9 whose root is 4 at level 3, with left child 2 (level 2, children 1 and 3 at level 1) and right child 7 (level 2, children 5 and 9 at level 1). Starting at 4, 6 > 4 leads right to 7; at 7, 6 < 7 leads left to 5; at 5, 6 > 5 leads right to a null child, so the key is not present. The path length of 3 demonstrates the logarithmic search cost for 7 nodes.
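The lookup above can be sketched directly; the Node record is an assumed illustrative type, and the tree is built by hand to match the seven-node example:

```python
class Node:
    """Minimal AA-tree node (assumed record type for illustration)."""
    def __init__(self, key, level=1, left=None, right=None):
        self.key, self.level, self.left, self.right = key, level, left, right

def search(node, key):
    # Standard read-only BST descent; no rotations or level changes.
    if node is None:
        return None                                # key not found
    if key == node.key:
        return node                                # key found
    return search(node.left if key < node.key else node.right, key)

# Perfect AA tree over keys 1, 2, 3, 4, 5, 7, 9: root 4 at level 3,
# internal nodes at level 2, leaves at level 1.
root = Node(4, 3,
            Node(2, 2, Node(1), Node(3)),
            Node(7, 2, Node(5), Node(9)))
```

Calling search(root, 6) walks the path 4 → 7 → 5 and returns None, while search(root, 5) returns the node holding 5.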

Analysis and Comparisons

Performance Characteristics

AA trees achieve worst-case O(log n) time for search, insertion, and deletion operations, where n is the number of nodes in the tree. This bound arises from the tree's height restriction, which ensures that the longest path from the root to any leaf is logarithmic. Specifically, the height h of an AA tree satisfies h ≤ 2 log₂(n + 1). The derivation stems from the level invariants: each node can have at most one horizontal (same-level) child, and that child must be a right child, so the number of nodes at least doubles for every two units of height; this mimics the balancing of red-black trees but is enforced through levels rather than colors. In terms of amortized performance, insertions and deletions perform O(1) rotations on average, leading to a total time of O(m log n) for m operations. Each rotation (skew or split) takes O(1) time, and while up to O(log n) rotations may occur in the worst case per operation, the limited depth to which changes typically propagate yields the amortized bound. Search operations, being standard traversals without rebalancing, also run in O(log n) time with no amortized considerations. Space complexity is O(n), as each node stores a fixed number of fields: a key, left and right child pointers, and a level integer, with no auxiliary structures beyond the tree itself. Empirically, AA trees are simpler to implement than AVL trees because balancing reduces to the two operations skew and split, resulting in less code and easier maintenance, while offering performance comparable to red-black trees. However, AA trees can be slightly taller than AVL trees, with a height bound of approximately 2 log₂ n versus 1.44 log₂ n for AVL trees, though this difference is negligible for practical applications with n up to millions.
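The height bound can be checked empirically with a small sketch: insert keys in sorted order (the adversarial case for an unbalanced BST) and compare the measured height against 2 log₂(n + 1). Node, insert, and height are illustrative names of my own, not a library API:

```python
import math

class Node:
    """Minimal AA-tree node (assumed record type for illustration)."""
    def __init__(self, key, level=1, left=None, right=None):
        self.key, self.level, self.left, self.right = key, level, left, right

def skew(t):
    # Right rotation removing a left horizontal link.
    if t and t.left and t.left.level == t.level:
        l = t.left
        t.left, l.right = l.right, t
        return l
    return t

def split(t):
    # Left rotation removing consecutive right horizontal links.
    if t and t.right and t.right.right and t.right.right.level == t.level:
        r = t.right
        t.right, r.left = r.left, t
        r.level += 1
        return r
    return t

def insert(t, key):
    if t is None:
        return Node(key)
    if key < t.key:
        t.left = insert(t.left, key)
    elif key > t.key:
        t.right = insert(t.right, key)
    return split(skew(t))

def height(t):
    # Height counted in nodes along the longest root-to-leaf path.
    return 0 if t is None else 1 + max(height(t.left), height(t.right))

n = 1000
root = None
for k in range(1, n + 1):          # sorted-order insertion
    root = insert(root, k)
# The measured height stays within the 2*log2(n + 1) bound.
```

Despite the sorted insertion order, the resulting tree's height remains below 2 log₂(1001) ≈ 20.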

Relation to Red-Black Trees

AA trees maintain a structural correspondence with red-black trees, ensuring equivalent balance properties and logarithmic height guarantees. In this mapping, the level of a node in an AA tree corresponds directly to the black-height in a red-black tree: horizontal links (connecting nodes at the same level) represent red edges, and vertical links (to lower levels) represent black edges. This equivalence arises because both data structures enforce that all paths from the root to a leaf contain the same number of "black" steps, preventing degeneration into linear chains. The primary simplification in AA trees lies in eliminating explicit node colors, replacing them with a single level attribute per node. Rotations in AA trees, specifically skew (a right rotation that removes a left horizontal link) and split (a left rotation that resolves consecutive right horizontal links), combine the roles of recoloring and restructuring that red-black trees handle separately through color flips and several distinct rotation cases. This unified approach reduces the insertion and deletion algorithms from the many color-specific rules of red-black trees to just two balancing operations. To convert an AA tree to a red-black tree, color every edge between nodes at the same level red, and every edge to a node at a lower level black; the resulting structure satisfies all red-black invariants, including the absence of two adjacent red nodes (since same-level links are single and right-leaning) and equal black-height on all paths. Conversely, not every red-black tree can be directly represented as an AA tree, because AA trees require that "red" (same-level) nodes appear only as right children, forbidding left-leaning red edges. The equivalence therefore holds exactly for the subset of red-black trees with right-leaning red links, preserving O(log n) performance while simplifying enforcement.
AA trees offer implementation advantages over red-black trees, particularly in code simplicity and maintainability: without color attributes, developers avoid tracking and updating multiple states, roughly halving the number of cases in the algorithms and minimizing bugs from color violations. This makes AA trees especially suitable for pedagogical purposes and quick implementation, as the balancing logic focuses solely on level consistency rather than intertwined color and structure rules. However, the coarser granularity of levels can lead to slightly more rotations during balancing than the finer control of colors in red-black trees, resulting in marginally higher constant factors in practice. Historically, AA trees were developed by Arne Andersson in 1993 as a deliberate simplification of red-black trees (introduced by Guibas and Sedgewick in 1978, building on the B-trees of Bayer and McCreight from 1972), prioritizing ease of implementation over absolute efficiency to make balanced search trees more accessible. The design draws on earlier B-trees and 2-3 trees but streamlines the binary representation to avoid the complexity that often deters adoption of red-black trees. For illustration, consider inserting keys 10, 20, and 30 into an empty AA tree (with leaves at level 1). The resulting AA tree has 20 at level 2 as the root, 10 as its left child at level 1 (a vertical link), and 30 as its right child at level 1 (also a vertical link). Mapping to red-black form, both edges go to a lower level and are therefore colored black; all three nodes are black, and every root-to-leaf path has the same black-height, satisfying the red-black rules. This example demonstrates the direct correspondence, where the AA level structure implicitly enforces red-black balance without explicit coloring.
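The level-to-color mapping can be sketched directly. Node and edge_colors are hypothetical illustrative names; an edge to a same-level child maps to red, and an edge to a lower-level child maps to black:

```python
class Node:
    """Minimal AA-tree node (assumed record type for illustration)."""
    def __init__(self, key, level=1, left=None, right=None):
        self.key, self.level, self.left, self.right = key, level, left, right

def edge_colors(t, colors=None):
    # Collect (parent_key, child_key) -> "red"/"black" for every edge,
    # applying the level-to-color rule from the text.
    if colors is None:
        colors = {}
    for child in (t.left, t.right):
        if child:
            colors[(t.key, child.key)] = (
                "red" if child.level == t.level else "black")
            edge_colors(child, colors)
    return colors

# The three-node tree from the example: 20 at level 2, children at level 1.
root = Node(20, 2, Node(10), Node(30))
colors = edge_colors(root)            # both edges map to black

# A single right horizontal link maps to a red edge.
spine = Node(5, 1, None, Node(8, 1))
spine_colors = edge_colors(spine)
```

This makes the correspondence concrete: the example tree has only black edges, while a single horizontal link yields exactly one red, right-leaning edge.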