AVL Trees - Trees that keep themselves balanced by rotating

An AVL tree (named after inventors Adelson-Velsky and Landis) is a self-balancing binary search tree.

Before we delve into AVL trees, we need to learn a few things.


Height of a tree

The height of a tree is the length of the longest path from the root node (the top node) down to a leaf. In the image above, the tree has a height of 3.

This tree is called balanced. Balance is an important topic in AVL trees. First, let’s look at why balance is important. Searching a balanced tree takes $\Theta(\log n)$ time. Searching a non-balanced tree takes $\Theta(n)$ time.
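To put rough numbers on that: with $n = 1{,}000{,}000$ nodes, a balanced tree has height around $\log_2 n \approx 20$, so a search walks past about 20 nodes. A completely unbalanced tree can have height $999{,}999$, so a search may have to walk past every single node.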

This tree isn’t balanced, it’s a sad tree 😢. Formally, balance is defined as:

The tree is considered balanced if its height is $\Theta(\log n)$.

This sucks as a definition. So, every time you make a tree you have to check this weird property? Balance can be explained better.

Balance is where the nodes are spread evenly, with each node having both a left and a right child wherever possible. In the above example, no node had a left child, so the tree is unbalanced.

Height of a node

The depth of a node is the distance from it to the root. The height of a node is the length of the longest path from that node down to a leaf.

Here is a tree with heights of each node listed next to them.

Each leaf has two child pointers, both pointing to NULL (which we give a height of -1).

We often do not include the null pointers in diagrams, but they’re important as they make our lives easier later on.
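Here is a minimal sketch of how a node might be represented in Python. The class name, fields, and helper are illustrative assumptions for this article, not from any particular library; a missing child is simply None, and the helper gives it the height of -1 from above.

```python
class Node:
    def __init__(self, key):
        self.key = key
        self.left = None    # child pointers; a missing child is None (the NULL pointer)
        self.right = None
        self.height = 0     # a brand-new node is a leaf, so its height is 0


def height(node):
    """Height of a subtree; an empty subtree (a NULL pointer) counts as -1."""
    return node.height if node is not None else -1
```

Storing the height on every node means we never have to walk a whole subtree just to ask how tall it is; the insertion code later is responsible for keeping it up to date.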


AVL Trees

AVL trees require the heights of the left and right children of every node to differ by at most ±1. In fact, anything in the range $-1 \le x \le 1$ is fine. Preferably, we’d like them to differ by 0, that is, perfectly matched on left and right. This isn’t realistic: a tree that is perfectly matched at every node is a perfect binary tree, which only exists for certain node counts ($n = 2^k - 1$).

$$|\text{left} - \text{right}| \le 1$$

Now, we know we can’t always make them 0. And it’s tiresome to calculate $\text{left} - \text{right}$ for every single node. So we just make sure the right is at most 1 higher than the left. A very short proof of this working is:

$$\text{left} = n$$

$$\text{right} = n + 1$$

$$H = \text{right} - \text{left} = (n + 1) - n = 1$$
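In code, the check is short. This is a brute-force sketch that recomputes heights from scratch, so it runs on any nodes with left/right fields such as the Node sketch above; real AVL code would read the stored heights instead. The function names are mine, not standard API.

```python
def subtree_height(node):
    """Recompute the height of a subtree; an empty subtree has height -1."""
    if node is None:
        return -1
    return 1 + max(subtree_height(node.left), subtree_height(node.right))


def is_avl(node):
    """True if every node's left and right subtree heights differ by at most 1."""
    if node is None:
        return True
    if abs(subtree_height(node.left) - subtree_height(node.right)) > 1:
        return False
    return is_avl(node.left) and is_avl(node.right)
```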

Inserting in AVL

  1. Simple Binary Search Tree Insert
  2. Fix AVL balance issues

Let’s start out with this tree. If you look at the root node, the right side has height 2. The left side has height 3. 2 - 3 = -1, which is fine.

Now, let’s insert 23.

Right is 2, left is 4. 2 - 4 = -2. Uh oh! It’s a sad tree now because it’s not balanced. We can turn this tree into a balanced tree by rotating it.

Rotations

Our tree is left heavy. We want to rotate it right. We can do it like this:

Rotations take O(1) time.
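Here is a sketch of the two single rotations, reusing the Node and height helpers sketched earlier (again, illustrative names, not a standard API). Only a handful of pointers and two stored heights change, which is why a rotation takes O(1) time.

```python
def update_height(node):
    """Recompute a node's stored height from its (already correct) children."""
    node.height = 1 + max(height(node.left), height(node.right))


def rotate_left(x):
    """Lift x's right child above x. Returns the new root of this subtree."""
    y = x.right
    x.right = y.left     # y's old left subtree slides over to become x's right
    y.left = x           # x drops down to become y's left child
    update_height(x)     # x is now lower, so fix its height first
    update_height(y)
    return y


def rotate_right(x):
    """Mirror image of rotate_left: lift x's left child above x."""
    y = x.left
    x.left = y.right
    y.right = x
    update_height(x)
    update_height(y)
    return y
```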

We rotate the tree we had above, now it is a happy tree. A colleague told me that 26 is between 23 and 29, therefore 26 goes on top. While this isn’t official advice in any textbooks, it seems to work for most of the problems I’m going to present in this article.

We insert(55). Now it’s a sad tree again, what a disaster! What we have here is a zig-zag:

If we rotate the zig-zag we get:

That’s… no better than before. Let’s quickly rotate our tree back to the zig-zag pattern:

Now, the solution to the zig-zag pattern is to rotate twice: a double rotation. First we rotate the bottom of the zig-zag so it becomes a straight line, then we rotate the top. 55 is in between 50 and 65, so it should go on top.

Now it’s a happy tree! 🌴
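In code, the double rotation is just the two single rotations from the sketch above, applied in the right order. This assumes the picture is the left-right flavour of zig-zag (55 inserted below 50, which hangs below 65); the mirror case just swaps every left and right.

```python
def rotate_left_right(x):
    """Fix a left-right zig-zag rooted at x. Returns the new subtree root."""
    x.left = rotate_left(x.left)   # straighten the kink into a left-left line
    return rotate_right(x)         # one more rotation lifts the middle key to the top
```

Called on the top of the zig-zag, this lifts the middle key (55 here) to the top, with 50 and 65 as its children.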

Now that we understand rotations, let’s take a look at AVL insertion one more time.

AVL Insertion

  1. BST insert
  2. Fix AVL property from changed node up.

Suppose x is the lowest node in the now-not-AVL tree, i.e. the lowest node that breaks the AVL property. Assume right(x) is the higher (heavier) side.

  • If x’s right child is right-heavy (or balanced), a single left rotation of x fixes it.
  • Else (x’s right child is left-heavy, the zig-zag), right rotate right(x), then left rotate x, as in the sketch below.
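Putting both steps together, here is a minimal sketch of AVL insertion. It reuses the Node, height, update_height, rotate_left and rotate_right helpers from the earlier sketches, and handles both the right-heavy case above and its mirror; the names are illustrative rather than textbook-standard.

```python
def balance_factor(node):
    """Right height minus left height, matching the right - left convention used above."""
    return height(node.right) - height(node.left)


def rebalance(x):
    """Restore the AVL property at x, assuming both of its subtrees are already AVL."""
    update_height(x)
    if balance_factor(x) > 1:                 # right-heavy
        if balance_factor(x.right) < 0:       # right child leans left: a zig-zag
            x.right = rotate_right(x.right)   # straighten the kink first
        return rotate_left(x)                 # a single left rotation finishes either way
    if balance_factor(x) < -1:                # left-heavy (mirror of the above)
        if balance_factor(x.left) > 0:
            x.left = rotate_left(x.left)
        return rotate_right(x)
    return x                                  # already balanced here


def insert(root, key):
    """Step 1: ordinary BST insert. Step 2: rebalance every node on the way back up."""
    if root is None:
        return Node(key)
    if key < root.key:
        root.left = insert(root.left, key)
    else:
        root.right = insert(root.right, key)
    return rebalance(root)
```

For example, inserting 1 through 7 in increasing order with this sketch gives a tree of height 2 (rooted at 4), where a plain BST would produce a height-6 chain.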

AVL Deletion

AVL Deletion is the same as insertion, but instead of inserting we delete. The same 2 principles apply:

  • Delete the node
  • Rebalance the tree from the changed node up, just as with insertion (see the sketch below)
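A minimal sketch of deletion along those lines, reusing the Node and rebalance helpers from the insertion sketch above (again, illustrative names):

```python
def delete(root, key):
    """Ordinary BST delete, then rebalance every node on the way back up."""
    if root is None:
        return None                      # key not found; nothing to do
    if key < root.key:
        root.left = delete(root.left, key)
    elif key > root.key:
        root.right = delete(root.right, key)
    else:
        # Found it. With zero or one child we can splice the node out directly.
        if root.left is None:
            return root.right
        if root.right is None:
            return root.left
        # Two children: copy in the in-order successor, then delete that key below.
        successor = root.right
        while successor.left is not None:
            successor = successor.left
        root.key = successor.key
        root.right = delete(root.right, successor.key)
    return rebalance(root)
```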