AA tree
An AA tree in computer science is a form of balanced tree used for storing and retrieving ordered data efficiently. AA trees are named after their originator, Swedish computer scientist Arne Andersson.[1]
AA trees are a variation of the red–black tree, a form of binary search tree which supports efficient addition and deletion of entries. Unlike in red–black trees, red nodes on an AA tree can only appear as a right child; no red node can be a left child. This results in the simulation of a 2–3 tree instead of a 2–3–4 tree, which greatly simplifies the maintenance operations. The maintenance algorithms for a red–black tree need to consider seven different shapes to properly balance the tree.
An AA tree, on the other hand, only needs to consider two shapes, due to the strict requirement that only right links can be red.
Balancing rotations
Whereas red–black trees require one bit of balancing metadata per node (the color), AA trees require O(log(log(N))) bits of metadata per node, in the form of an integer "level". The following invariants hold for AA trees:
- The level of every leaf node is one.
- The level of every left child is exactly one less than that of its parent.
- The level of every right child is equal to or one less than that of its parent.
- The level of every right grandchild is strictly less than that of its grandparent.
- Every node of level greater than one has two children.
A link where the child's level is equal to that of its parent is called a horizontal link, and is analogous to a red link in the red–black tree. Individual right horizontal links are allowed, but consecutive ones are forbidden; all left horizontal links are forbidden. These are more restrictive constraints than the analogous ones on red–black trees, with the result that re-balancing an AA tree is procedurally much simpler than re-balancing a red–black tree.
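These invariants can be checked mechanically, which is useful when testing an implementation. The following Python sketch is only an illustration and is not part of the article's pseudocode; the Node class and its field names are assumptions made for this example.

class Node:
    # Hypothetical node layout used only for this sketch.
    def __init__(self, value, level=1, left=None, right=None):
        self.value = value
        self.level = level
        self.left = left
        self.right = right

def is_valid_aa(node):
    """Return True if the subtree rooted at `node` satisfies the AA invariants."""
    if node is None:
        return True
    left, right = node.left, node.right
    # 1. The level of every leaf node is one.
    if left is None and right is None and node.level != 1:
        return False
    # 2. The level of every left child is exactly one less than that of its parent.
    if left is not None and left.level != node.level - 1:
        return False
    # 3. The level of every right child is equal to or one less than that of its parent.
    if right is not None and right.level not in (node.level, node.level - 1):
        return False
    # 4. The level of every right grandchild is strictly less than that of its grandparent.
    if right is not None and right.right is not None and right.right.level >= node.level:
        return False
    # 5. Every node of level greater than one has two children.
    if node.level > 1 and (left is None or right is None):
        return False
    return is_valid_aa(left) and is_valid_aa(right)

Calling is_valid_aa(root) after every insertion or deletion is a convenient way to exercise an implementation against the invariants above.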
Insertions and deletions may transiently cause an AA tree to become unbalanced (that is, to violate the AA tree invariants). Only two distinct operations are needed for restoring balance: "skew" and "split". Skew is a right rotation to replace a subtree containing a left horizontal link with one containing a right horizontal link instead. Split is a left rotation and level increase to replace a subtree containing two or more consecutive right horizontal links with one containing two fewer consecutive right horizontal links. Implementation of balance-preserving insertion and deletion is simplified by relying on the skew and split operations to modify the tree only if needed, instead of making their callers decide whether to skew or split.
function skew is
    input: T, a node representing an AA tree that needs to be rebalanced.
    output: Another node representing the rebalanced AA tree.

    if nil(T) then
        return Nil
    else if nil(left(T)) then
        return T
    else if level(left(T)) == level(T) then
        Swap the pointers of horizontal left links.
        L = left(T)
        left(T) := right(L)
        right(L) := T
        return L
    else
        return T
    end if
end function
function split is
    input: T, a node representing an AA tree that needs to be rebalanced.
    output: Another node representing the rebalanced AA tree.

    if nil(T) then
        return Nil
    else if nil(right(T)) or nil(right(right(T))) then
        return T
    else if level(T) == level(right(right(T))) then
        We have two horizontal right links. Take the middle node, elevate it, and return it.
        R = right(T)
        right(T) := left(R)
        left(R) := T
        level(R) := level(R) + 1
        return R
    else
        return T
    end if
end function
Insertion
Insertion begins with the normal binary tree search and insertion procedure. Then, as the call stack unwinds (assuming a recursive implementation of the search), it's easy to check the validity of the tree and perform any rotations as necessary. If a horizontal left link arises, a skew will be performed, and if two horizontal right links arise, a split will be performed, possibly incrementing the level of the new root node of the current subtree. Note the level increment performed by split on the new root of the subtree in the code given above; this makes it necessary to continue checking the validity of the tree as the modifications bubble up from the leaves.
function insert is
    input: X, the value to be inserted, and T, the root of the tree to insert it into.
    output: A balanced version of T including X.

    Do the normal binary tree insertion procedure. Set the result of the
    recursive call to the correct child in case a new node was created or the
    root of the subtree changes.
    if nil(T) then
        Create a new leaf node with X.
        return node(X, 1, Nil, Nil)
    else if X < value(T) then
        left(T) := insert(X, left(T))
    else if X > value(T) then
        right(T) := insert(X, right(T))
    end if

    Note that the case of X == value(T) is unspecified. As given, an insert
    will have no effect. The implementor may desire different behavior.

    Perform skew and then split. The conditionals that determine whether a
    rotation will occur are inside the procedures, as given above.
    T := skew(T)
    T := split(T)
    return T
end function
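For readers who prefer runnable code, the following Python sketch transcribes the skew, split, and insert routines above and finishes with a small usage example. The Node class, its field names, and the treatment of duplicate keys (silently ignored, as in the pseudocode) are assumptions made for illustration, not part of the original presentation.

class Node:
    def __init__(self, value, level=1, left=None, right=None):
        self.value = value
        self.level = level
        self.left = left
        self.right = right

def skew(t):
    # Right rotation when a left horizontal link exists.
    if t is not None and t.left is not None and t.left.level == t.level:
        l = t.left
        t.left = l.right
        l.right = t
        return l
    return t

def split(t):
    # Left rotation and level promotion when two consecutive right horizontal links exist.
    if (t is not None and t.right is not None and t.right.right is not None
            and t.right.right.level == t.level):
        r = t.right
        t.right = r.left
        r.left = t
        r.level += 1
        return r
    return t

def insert(value, t):
    if t is None:
        return Node(value)                # new leaf at level 1
    if value < t.value:
        t.left = insert(value, t.left)
    elif value > t.value:
        t.right = insert(value, t.right)
    # duplicates fall through unchanged
    return split(skew(t))                 # rebalance on the way back up

# Example: inserting 1..7 in ascending order yields key 4 at the root, at level 3.
root = None
for key in range(1, 8):
    root = insert(key, root)
print(root.value, root.level)             # prints: 4 3

The final split in this example is the point where the level of the new subtree root is incremented, illustrating how promotions bubble up from the leaves.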
Deletion
As in most balanced binary trees, the deletion of an internal node can be turned into the deletion of a leaf node by swapping the internal node with either its closest predecessor or successor, depending on which are in the tree or on the implementor's whims. Retrieving a predecessor is simply a matter of following one left link and then all of the remaining right links. Similarly, the successor can be found by going right once and left until a null pointer is found. Because of the AA property of all nodes of level greater than one having two children, the successor or predecessor node will be in level 1, making their removal trivial.
To re-balance a tree, there are a few approaches. The one described by Andersson in his original paper is the simplest, and it is described here, although actual implementations may opt for a more optimized approach. After a removal, the first step to maintaining tree validity is to lower the level of any nodes whose children are two levels below them, or who are missing children. Then, the entire level must be skewed and split. This approach was favored, because when laid down conceptually, it has three easily understood separate steps:
- Decrease the level, if appropriate.
- Skew the level.
- Split the level.
However, we have to skew and split the entire level this time instead of just a node, complicating our code.
function delete is
    input: X, the value to delete, and T, the root of the tree from which it should be deleted.
    output: T, balanced, without the value X.

    if nil(T) then
        return T
    else if X > value(T) then
        right(T) := delete(X, right(T))
    else if X < value(T) then
        left(T) := delete(X, left(T))
    else
        If we're a leaf, easy, otherwise reduce to leaf case.
        if leaf(T) then
            return right(T)
        else if nil(left(T)) then
            L := successor(T)
            right(T) := delete(value(L), right(T))
            value(T) := value(L)
        else
            L := predecessor(T)
            left(T) := delete(value(L), left(T))
            value(T) := value(L)
        end if
    end if

    Rebalance the tree. Decrease the level of all nodes in this level if
    necessary, and then skew and split all nodes in the new level.
    T := decrease_level(T)
    T := skew(T)
    right(T) := skew(right(T))
    if not nil(right(T)) then
        right(right(T)) := skew(right(right(T)))
    end if
    T := split(T)
    right(T) := split(right(T))
    return T
end function
function decrease_level is
    input: T, a tree for which we want to remove links that skip levels.
    output: T with its level decreased.

    should_be = min(level(left(T)), level(right(T))) + 1
    if should_be < level(T) then
        level(T) := should_be
        if should_be < level(right(T)) then
            level(right(T)) := should_be
        end if
    end if
    return T
end function
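The delete pseudocode above calls successor and predecessor without defining them; they are the ordinary binary-search-tree helpers described at the start of this section. A minimal Python-style sketch, assuming node objects with left and right fields as in the earlier examples:

def successor(t):
    # Smallest node of the right subtree: one step right, then left as far as possible.
    node = t.right
    while node.left is not None:
        node = node.left
    return node

def predecessor(t):
    # Largest node of the left subtree: one step left, then right as far as possible.
    node = t.left
    while node.right is not None:
        node = node.right
    return node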
A good example of deletion by this algorithm is present in the Andersson paper.
Performance
The performance of an AA tree is equivalent to the performance of a red–black tree. While an AA tree makes more rotations than a red–black tree, the simpler algorithms tend to be faster, and all of this balances out to result in similar performance. A red–black tree is more consistent in its performance than an AA tree, but an AA tree tends to be flatter, which results in slightly faster search times.[2]
References
1. Andersson, Arne (1993). "Balanced search trees made simple" (PDF). In Dehne, Frank K. H. A.; Sack, Jörg-Rüdiger; Santoro, Nicola; Whitesides, Sue (eds.). Algorithms and Data Structures, Third Workshop, WADS '93, Montréal, Canada, August 11–13, 1993, Proceedings. Lecture Notes in Computer Science. Vol. 709. Springer. pp. 60–71. doi:10.1007/3-540-57155-8_236.
2. Heger, Dominique A. (October 2004). "A disquisition on the performance behavior of binary search tree data structures" (PDF). Upgrade. 5 (5). Council of European Professional Informatics Societies: 67–75. Archived from the original (PDF) on 2014-03-27.
AA tree
Introduction
Definition and Motivation
An AA tree is a self-balancing binary search tree designed as a variant of red-black trees, where each node is assigned an integer level, starting at 1 for leaves, instead of a color to enforce structural balance.[1] The primary motivation for AA trees is to simplify the implementation of balanced search trees compared to red-black trees, which rely on intricate color-based rules that can complicate coding; by using levels, AA trees achieve equivalent balance guarantees with fewer cases to handle during maintenance operations, while ensuring O(log n) time complexity for insertions, deletions, and searches, and maintaining a height bounded by 2 log(n+1).[1] Like standard binary search trees, AA trees maintain the ordering invariant that for any node, all keys in its left subtree are less than the node's key, and all keys in its right subtree are greater than the node's key. A key structural invariant in AA trees prohibits two consecutive horizontal links on any path from the root to a leaf (a node, its right child, and that child's right child may never all share the same level), analogous to the red-red prohibition in red-black trees; in addition, no left child may share its parent's level.[1]
For illustration, a minimal AA tree with a single node has that node at level 1, serving as both root and leaf. Adding a second node yields a root at level 1 with a right child at level 1, joined by a single, permitted horizontal link; a left child at the same level is not allowed and would be rotated to the right side. A three-node tree, such as keys 2 (root, level 2), 1 (left child, level 1), and 3 (right child, level 1), exemplifies a balanced structure at the smallest scale.[1]
History and Development
The AA tree was invented by Swedish computer scientist Arne Andersson in 1993 while he was at the Department of Computer Systems, Uppsala University.[1] Andersson introduced the data structure in his paper "Balanced Search Trees Made Simple", presented at the Third Workshop on Algorithms and Data Structures (WADS '93) held in Montreal, Canada, from August 11–13, 1993, and published in the proceedings by Springer Verlag (Lecture Notes in Computer Science, vol. 709, pp. 60–71).[3] This work marked a key milestone in the evolution of balanced binary search trees, offering a streamlined approach alongside established structures such as AVL trees (developed in 1962 by G. M. Adelson-Velsky and E. M. Landis) and red-black trees (formalized by L. J. Guibas and R. Sedgewick in 1978). Andersson's primary motivation was to simplify the rules governing red-black trees, which were efficient but often cumbersome to implement and explain due to their color-based balancing invariants.[1] By replacing colors with integer levels and restricting certain configurations, AA trees achieved comparable O(log n) performance for insertions, deletions, and searches while reducing the number of cases to handle during rebalancing.[3] This design responded directly to the early 1990s context, where balanced trees were increasingly essential for practical algorithms but required accessible implementations for education and software development.[1]
Since its introduction, AA trees have found adoption primarily in educational settings for their elegance and ease of coding, appearing in algorithms textbooks and course materials.[4] For example, Mark Allen Weiss refined and named the structure in the second edition of his book Data Structures and Problem Solving Using C++ (2000), emphasizing its pedagogical value as a simplified variant of the red-black tree.[4] Implementations have appeared in various open-source libraries and programming resources, such as Rust's aatree crate and Java examples from Project Nayuki, though no significant updates to the core design have emerged.[5][6] The structure's enduring appeal lies in its balance of simplicity and theoretical guarantees, making it a staple for teaching self-balancing trees without the full complexity of red-black variants.
Structure and Properties
Node Components
In an AA tree, each node consists of a key for maintaining binary search tree ordering, pointers to left and right child nodes, and an integer level field that is at least 1.[1] The key field stores the comparable value that dictates the node's position relative to others, ensuring that all keys in the left subtree are less than the node's key and all in the right subtree are greater.[1] The left and right pointers reference child nodes or null for leaves, forming the tree's hierarchical structure.[1] The level field serves a dual purpose, acting both as a measure of the node's structural position and as a balancing mechanism equivalent to the color in red-black trees, without requiring a separate color attribute.[1]
Leaves are fixed at level 1, while internal nodes occupy higher levels to create a spine-like backbone along the right edges, with left subtrees branching off at decreasing levels.[1] This design enforces a horizontal spine at each level via right children of equal level, while prohibiting horizontal links on the left to maintain balance.[1] The node's components uphold key invariants for structural integrity: the level of every leaf is 1 (AA1); the level of a left child is strictly less than its parent's (AA3); the level of a node is generally one more than its children's, except the right child of a node with two children, which may equal the parent's level (AA2); and every node at level greater than 1 must have two children (AA4).[1] These rules ensure child levels are either one less than the parent (vertical link) or equal (horizontal right link only), preventing unbalanced configurations like consecutive horizontal links on non-right paths.[1]
A representative node structure in pseudocode (Python style) illustrates these components:

class Node:
    def __init__(self, key):
        self.key = key       # Search key for ordering
        self.left = None     # Left child pointer
        self.right = None    # Right child pointer
        self.level = 1       # Balancing level (≥1)
Level Assignment Rules
In AA trees, the level assignment rules form the core invariants that maintain balance without requiring explicit height computations during operations. Each node stores an integer level value, starting from 1 at the leaves and increasing toward the root. The primary rule stipulates that all leaf nodes must have level 1, ensuring a uniform base for the structure. Additionally, no left child can have the same level as its parent; a left child's level is always exactly one less than its parent's, while a right child's level is either equal to or one less than its parent's. These constraints eliminate horizontal links on the left, allowing them only for right children at the same level, promoting a skewed but controlled right-leaning orientation.[1]
A crucial path invariant further enforces balance: on any path from the root to a leaf, no two consecutive horizontal links may occur; equivalently, a right grandchild may never share the level of its grandparent. This prevents chains of same-level nodes that would otherwise let paths grow unchecked. For validation, consider an attempted configuration where a level-2 node has a right child also at level 2, and that child has another right child at level 2; this forms two consecutive horizontal links on the path, directly breaching the path invariant and rendering the tree invalid, as it mimics a prohibited red-red sequence in the red-black analogy. Such violations are corrected through rotations during insertions and deletions to restore the invariants.[1]
These rules collectively guarantee a balanced tree with height at most 2 log(n+1), where n is the number of nodes, achieved by alternating levels that form a "spine" of decreasing levels interspersed with horizontal right links. This bound arises from the structure's equivalence to a red-black tree, where AA tree levels correspond to the black-height (counting black nodes along paths), and horizontal right links represent red nodes without allowing consecutive reds. As a result, the minimum path length equals the root's level, while the maximum is roughly twice that due to inserted red links, ensuring logarithmic performance.[1]
Balancing Mechanisms
Single and Double Rotations
In AA trees, balancing is achieved through two single rotation operations: skew and split. These locally restructure the tree to enforce the invariants after insertions or deletions. Skew is a right rotation that eliminates a horizontal left link (where a node and its left child have the same level), while split is a left rotation that eliminates consecutive horizontal right links (where a node, its right child, and the right child's right child all have the same level), followed by a level promotion. Both operations preserve the binary search tree property and in-order traversal order. Levels remain unchanged during the rotations themselves, with promotion handled in split.[1]
The skew operation is performed on a node T whose left child L satisfies level(L) == level(T). A right rotation is applied: L becomes the subtree root, T becomes L's right child, and L's former right child becomes T's new left child (left(T) := right(L); right(L) := T). This resolves the left horizontal link without level changes.[2]
The split operation is the counterpart for right spines. It is applied to a node T whose right child R and right grandchild right(R) both have the same level as T. A left rotation is performed: R becomes the subtree root and T becomes R's left child (right(T) := left(R); left(R) := T). Then, to maintain balance, level(R) is incremented by 1. This promotes R and prevents consecutive horizontal right links. Note that child levels never exceed the parent's; the only permitted equality is a single horizontal right link.[1]
AA trees do not use double rotations. Instead, applying skew followed by split on the same node handles zig-zag patterns, as skew may create a right spine that split then resolves.[2]
To illustrate skew (a right rotation), consider an insertion that creates a left horizontal link between a parent 5 (level 2) and its left child 3 (level 2). Before:

      5 (lv2)
     /
  3 (lv2)

After the skew, 3 becomes the subtree root and the same-level link points to the right:

  3 (lv2)
     \
      5 (lv2)

To illustrate split (a left rotation with promotion), consider two consecutive horizontal right links among 5, 7, and 9, all at level 2. Before:

  5 (lv2)
     \
      7 (lv2)
         \
          9 (lv2)

After the split, the middle node 7 is promoted to level 3 and becomes the parent of 5 and 9:

         7 (lv3)
        /      \
  5 (lv2)       9 (lv2)

An analogous post-split configuration, with 12 promoted to level 3 above 10 and 15:

        12 (lv3)
        /      \
  10 (lv2)      15 (lv2)
Level Promotion and Demotion
In AA trees, level promotion occurs during insertion as part of the split operation to maintain balance. After performing the left rotation in split to resolve a chain of two horizontal right links, the level of the new subtree root (the original right child) is increased by 1. This ensures that the node now has two children at the previous level, simulating a promotion from red to black in the red-black equivalent and preventing further horizontal links at that level. The promotion is performed as the final step of split: after the rotation, both children of the new root sit at the level the root itself held, so raising the root by one restores vertical links to both of them. The promotion may propagate upward if the promoted node now creates a horizontal link with its parent, triggering additional skew and split operations, recursively or iteratively, until the invariants hold. This explicit level adjustment simplifies the rules compared to recoloring in red-black trees.[1]
Consider inserting nodes 3, 2, 1 into an empty AA tree (new nodes at level 1):
- Insert 3: root 3 (lv1)
- Insert 2: left of 3 (lv1); skew on 3: right rotate, root 2 (lv1) right 3 (lv1)
- Insert 1: left of 2 (lv1); skew on 2: right rotate, root 1 (lv1) right 2 (lv1) with 2 right 3 (lv1)
- Now split on 1: left rotate, new root 2 (lv1) left 1 (lv1) right 3 (lv1); promote level(2) to 2.
Core Operations
Insertion Algorithm
The insertion algorithm for an AA tree commences with a standard binary search tree (BST) insertion procedure, where the new node is placed as a leaf following the key ordering rules, and its level is initialized to 1.[1] This ensures the node adheres to the BST property: all keys in the left subtree are less than the node's key, and all in the right subtree are greater.[1] The algorithm assumes unique keys, treating duplicates as invalid or ignoring them to maintain tree integrity.[1]
Following the initial placement, a post-insertion fix-up phase works back up from the new node to restore the AA tree balancing properties, eliminating left horizontal links and consecutive right horizontal links.[1] This fix-up is typically implemented recursively, integrating the balancing steps during the return from the recursive insertion call.[1] At each node on the path, two operations are applied in sequence: first, a skew to address any left horizontal link (a parent and its left child at the same level), and second, a split to handle a pair of consecutive right horizontal links in the node's right subtree.[1] The skew performs a single right rotation if the left child has the same level as the parent, promoting the left child upward.[1] The split performs a single left rotation if the right child's right child matches the parent's level, after which the level of the former right child (now the new subtree root) is incremented by one to separate the horizontal links.[1] These operations are repeated up the path until no violations remain, ensuring that no node is followed by two consecutive horizontal links.[1]
The recursive insertion can be outlined as follows, where insert handles the BST placement and fix-up:
// Node layout assumed by the routines below.
struct Node {
    int key;
    int level;
    Node *left, *right;
    Node(int k, int lvl) : key(k), level(lvl), left(nullptr), right(nullptr) {}
};

Node* skew(Node* node);
Node* split(Node* node);

Node* insert(Node* root, int key) {
    if (root == nullptr) {
        return new Node(key, 1); // New leaf at level 1
    }
    if (key < root->key) {
        root->left = insert(root->left, key);
    } else if (key > root->key) {
        root->right = insert(root->right, key);
    }
    // Fix-up: skew then split
    root = skew(root);
    root = split(root);
    return root;
}

Node* skew(Node* node) {
    if (node->left != nullptr && node->left->level == node->level) {
        // Right rotation
        Node* temp = node->left;
        node->left = temp->right;
        temp->right = node;
        return temp;
    }
    return node;
}

Node* split(Node* node) {
    if (node->right != nullptr && node->right->right != nullptr &&
        node->right->right->level == node->level) {
        // Left rotation on the right child
        Node* temp = node->right;
        node->right = temp->left;
        temp->left = node;
        temp->level++; // Promote level of new root
        return temp;
    }
    return node;
}
Deletion Algorithm
The deletion algorithm for AA trees follows the standard binary search tree (BST) procedure for locating and removing the target node, followed by a rebalancing phase to restore the level invariants and maintain the tree's balance properties. To delete a key x, first search for the node containing x. If the node has zero or one child, it is removed directly or replaced by its child, respectively. For a node with two children, the key is replaced by the key of its inorder successor (the minimum key in the right subtree), and the successor node, which has at most one child, is then deleted recursively. This ensures the deletion always reduces to removing a leaf or a node with one child.[1]
After removal, the parent of the deleted node (or the node where the successor was removed) may violate the AA tree rules, particularly if a subtree shrinks and leaves a node whose level exceeds 1 plus the minimum level of its children. The rebalancing begins at this parent node and proceeds upward toward the root along the search path. At each node p on this path, the level is first updated: p's level is lowered to 1 plus the minimum of its children's levels (treating missing children as level 0) whenever that value is smaller than the current level. If this decreases p's level and the right child previously shared the old level, the right child's level is lowered as well, so that no child ends up above its parent. This demotion may cascade upward as the changes propagate.[1]
To resolve any resulting structural violations, such as consecutive horizontal (same-level) links forming a spine, up to three skew operations followed by two split operations are applied at each node p on the way back up: skew(p), skew(p.right), skew(p.right.right), split(p), split(p.right). A skew is a right rotation applied when a node's left child has the same level, eliminating a left horizontal link. A split is a left rotation applied when a node's right child and right grandchild share its level; the new subtree root is then promoted by one level, exactly as during insertion. These operations correct local imbalances without increasing the overall height, ensuring the tree remains balanced. If the old root's level decreases, the final skews and splits may leave a different node as the root. This process handles edge cases such as deleting a leaf (simple removal with a possible demotion cascade), deleting the root (re-rooting after the fix-up), or deleting from a long horizontal spine (resolved by rotations that merge subtrees).[1]
The following pseudocode outlines the deletion process (adapted from standard implementations based on the original description):

function delete(key x, node t):
    if t is null:
        return null
    if x < t.key:
        t.left = delete(x, t.left)
    else if x > t.key:
        t.right = delete(x, t.right)
    else:
        // Found the node to delete
        if t.left is null:
            return t.right
        else if t.right is null:
            return t.left
        else:
            // Two children: replace with the inorder successor
            succ = findMin(t.right)
            t.key = succ.key
            t.right = delete(succ.key, t.right)
    // Rebalance upward from t
    t = fixAfterDelete(t)
    return t

function fixAfterDelete(node p):
    if p is null:
        return null
    updateLevel(p)
    p = skew(p)
    p.right = skew(p.right)
    if p.right != null:
        p.right.right = skew(p.right.right)
    p = split(p)
    p.right = split(p.right)
    return p

function updateLevel(node p):
    shouldBe = 1 + min(getLevel(p.left), getLevel(p.right))   // getLevel(null) = 0
    if shouldBe < p.level:
        p.level = shouldBe
        if p.right != null and p.right.level > shouldBe:
            p.right.level = shouldBe

function skew(node p):
    if p != null and p.left != null and p.left.level == p.level:
        // Right rotation; levels are unchanged
        return rightRotate(p)
    return p

function split(node p):
    if p != null and p.right != null and p.right.right != null and p.right.right.level == p.level:
        // Left rotation; the new subtree root is promoted one level
        p = leftRotate(p)
        p.level = p.level + 1
    return p
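The listing relies on a findMin helper that is not spelled out; it simply follows left links to the smallest node of a non-empty subtree. A minimal Python-style sketch, assuming the same node fields as above:

def findMin(node):
    # The leftmost node of a non-empty subtree holds its smallest key.
    while node.left is not None:
        node = node.left
    return node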
Search Procedure
The search procedure in an AA tree follows the standard binary search tree (BST) traversal algorithm, leveraging the ordered structure of the tree without any modifications specific to the AA balancing mechanism. To search for a key, start at the root node and compare the key to the current node's value: if equal, the node is returned as the result; if the key is smaller, recurse on the left subtree; if larger, recurse on the right subtree. This process continues until the key is found or a null child is reached, indicating the key is absent from the tree.[1] The search operation is strictly read-only, involving no rotations, level promotions, or demotions, as these are reserved for insertion and deletion to maintain balance.[2] Due to the AA tree's balance properties, which ensure a height of at most 2 log(n+1) for n nodes, the time complexity is O(h) in the worst case, where h is the tree height, yielding O(log n) performance; the average case is also O(log n).[1]
The following pseudocode illustrates the recursive search implementation, which can be adapted to iterative form if preferred:

function search(node, key):
    if node is null:
        return null           // Key not found
    if key == node.key:
        return node           // Key found
    if key < node.key:
        return search(node.left, key)
    else:
        return search(node.right, key)
The function is invoked as search(root, key) to initiate the traversal from the tree's root.[1]
An extension of the search procedure enables efficient enumeration of all keys in sorted order via an inorder traversal, which visits nodes in left-root-right sequence; this completes in O(n) time overall.[2]
For example, consider searching for key 6 in the AA tree produced by inserting the keys 1, 2, 3, 4, 5, 7, 9 in order (after balancing): the root is 4 (level 3), its left child is 2 (level 2) with children 1 and 3 (level 1), and its right child is 7 (level 2) with children 5 and 9 (level 1). The search starts at 4 (6 > 4, go right to 7), continues at 7 (6 < 7, go left to 5), then at 5 (6 > 5, go right to a null pointer), concluding that the key is not present; the path length of 3 demonstrates logarithmic efficiency for 7 nodes.[1]
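The sorted enumeration mentioned above is an ordinary inorder traversal. A minimal Python sketch, assuming a node type with key, left, and right fields as in the class shown earlier:

def inorder(node):
    # Yield keys in ascending order; O(n) for the whole tree.
    if node is not None:
        yield from inorder(node.left)
        yield node.key
        yield from inorder(node.right)

# Usage: sorted_keys = list(inorder(root))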
