Table of contents
1. Introduction
2. What are AVL Trees?
3. AVL Tree Examples
3.1. Example 1: Insertion
3.2. Example 2: Deletion
4. Balance Factor in AVL Trees
5. Operations in AVL tree
5.1. Insertion
5.2. Deletion
6. AVL Rotations
6.1. Left Rotation (LL - Single Rotation)
6.2. Right Rotation (RR - Single Rotation)
6.3. Left-Right Rotation (LR - Double Rotation)
6.4. Right-Left Rotation (RL - Double Rotation)
7. Implementation of AVL Trees
7.1. C
8. Advantages of AVL Trees
9. Disadvantages of AVL Trees
10. Application of AVL Tree in Data Structure
11. Frequently Asked Questions
11.1. What is AVL tree and its application?
11.2. What is AVL tree in data structure complexity?
11.3. Which is better: AVL tree or Red-Black Tree?
12. Conclusion
AVL Tree Data Structure

Author Yukti Kumari
Last Updated: Nov 19, 2024

Introduction

Trees are one of the most helpful Data Structures that are often used to perform operations like search, insertion, and deletion efficiently. In this article, we will learn about AVL tree in data structure from scratch. 

AVL Tree in Data Structure

This article will address all the important points and terms related to AVL trees, starting with their definition, why we need them, and their advantages and properties. 

What are AVL Trees?

AVL trees are nothing but self-balancing binary search trees. 

What do you mean by a self-balancing tree?

The height of the left and the right subtrees of every node differs by at most 1, which implies the difference between the heights will always be either -1, 0, or 1.

So, AVL trees are an optimized version of binary search trees, named after their inventors G.M. Adelson-Velsky and E.M. Landis, who introduced them in 1962 as an attempt to improve the performance of binary search trees.

Examples

 

The tree in figure (a) is an AVL tree. As you can see, for every node, the height difference between its left and right subtrees does not exceed 1 in absolute value.

While in figure (b), if you check the node with value 3, its left subtree has height 2 while its right subtree has height 0. So, the difference is |2 - 0| = 2. Hence, the AVL property is not satisfied, and it is not an AVL tree.

AVL Tree Examples

Example 1: Insertion

Insert the keys 10, 20, 30 into an empty AVL tree.

Insert 10:
Tree:

10

Insert 20:
Tree:

   10
     \
      20

Insert 30: (Unbalanced, right-heavy)
Perform a left rotation at 10.
Resulting AVL Tree:

   20
  /  \
 10   30

Example 2: Deletion

Start with this AVL tree:

      30
     /  \
   20    40
  /
 10

Delete 10:
Resulting AVL Tree (already balanced):

      30
     /  \
   20    40

Delete 40 (from the original tree, with 10 still present):
Node 30 now has a balance factor of 2, so the tree is unbalanced. Perform a right rotation at 30.
Resulting AVL Tree:

      20
     /  \
   10    30

Balance Factor in AVL Trees

AVL trees use the balance factor to get a height-balanced tree.

BALANCE FACTOR = height(left subtree) - height(right subtree)


Let’s look at an example tree that shows the balance factor of each node - 

 

In the above example, the balance factor of every node is between -1 and +1. So, it is an AVL tree.

Properties of Balance Factor

  • If a subtree has a balance factor > 0, then the subtree is called left-heavy, as the height of its left subtree is greater than the height of its right subtree.

 

  • If a subtree has a balance factor < 0, then the subtree is called right-heavy, as the height of its left subtree is smaller than the height of its right subtree.

 

  • If the balance factor=0, the subtree is perfectly balanced, with both left and right subtrees having equal heights.

Operations in AVL tree

The AVL tree performs two major balance-affecting operations, namely insert and delete.

Insertion

Insertion into an AVL tree follows the same procedure as insertion into a binary search tree: we first place the new node in its proper position according to the BST ordering.

After the insertion, we calculate the balance factor (BF) of each node along the path back to the root and check whether it equals -1, 0, or 1. If it does not, we perform "rotations" on the tree to balance it and prevent violation of the AVL tree property.

Deletion

Deletion in the AVL Trees takes place in three different cases:

1. Deletion from a leaf node

If the node to be deleted is a leaf node, it is deleted without any replacement as it does not disturb the binary search tree property. However, the balance factor may get disturbed, so rotations are applied to restore it.

2. Deleting a node with one child

If the node to be removed has a single child, replace the node's value with the value of its child and then remove the child node. If the balance factor gets disturbed, perform rotations as required.

3. Deleting a node with two children

If the node to be removed has two children, locate the in-order successor of that node and replace the node's value with the successor's value. Then delete the in-order successor node. If any node's balance factor falls outside the range -1 to 1 after the deletion, apply the appropriate rotations to rebalance the tree.

AVL Rotations

Different operations which we can perform on an AVL tree are:

  • Search
  • Insertion
  • Deletion
  • Traversal

Among these operations, the search and traversal operations do not affect the height of the AVL tree. But when we insert a new node in the tree or, say, delete a node, it may change the height of a subtree. This, in turn, can change the balance factor of a node and violate the height-balanced property of AVL trees.

Let us see when each of these rotations is used -

Left Rotation (LL - Single Rotation) 

Left rotation is performed when the imbalance is created due to the insertion of a new node in the right subtree of a right subtree or when a subtree becomes right heavy.

Consider this example - 

 

Node 6 has a balance factor of -2 after the insertion of node 10. After a single rotation towards the left, the right child of node 6 becomes the root of the subtree, and node 6 becomes its left child. The BST property remains intact, and the tree becomes height-balanced.

Right Rotation (RR - Single Rotation)

Right rotation is needed when the imbalance is created due to the insertion of a node in the left subtree of a left subtree or when a subtree becomes left heavy.

See this figure:

 

The root node has a balance factor of 2, so it is unbalanced. After a single rotation towards the right, its left child, node 2, becomes the root, and the old root becomes the right child of node 2. The new root has a balance factor of 0, hence the tree becomes balanced.

Left-Right Rotation (LR - Double Rotation) 

It is a double rotation in which a left rotation is followed by a right rotation. It is needed when a new node is inserted into the right subtree of the left subtree of a node.

 

In the figure, the new node is inserted into the right subtree of the left child of node 5, and the balance factor of node 5 becomes 2 after the insertion. Firstly, we do a left rotation on the left child, but the tree is still not balanced. If you look carefully, the tree is still left-heavy after this step, so we need a right rotation at node 5, after which the newly inserted node becomes the root.

Right-Left Rotation (RL - Double Rotation)

In this, a right rotation is followed by a left rotation. It is needed when a node is inserted in the left subtree of the right subtree of a node. 

The insertion of node 6 creates an imbalance. Firstly, we perform a right rotation on the right child of the unbalanced node, which moves node 6 up so that its former parent becomes the right child of node 6. Next, we perform a left rotation on the resulting right-heavy tree, and finally the tree becomes balanced.

Implementation of AVL Trees

  • C

C

#include <stdio.h>
#include <stdlib.h>

struct Node
{
    int key;
    struct Node *left;
    struct Node *right;
    int height;
};

// Height of a node; an empty subtree has height 0.
int get_Height(struct Node *n){
    if (n == NULL)
        return 0;
    return n->height;
}

struct Node *create_Node(int key){
    struct Node *node = (struct Node *) malloc(sizeof(struct Node));
    node->key = key;
    node->left = NULL;
    node->right = NULL;
    node->height = 1;
    return node;
}

int max(int a, int b){
    return (a > b) ? a : b;
}

// Balance factor = height(left subtree) - height(right subtree).
int getBalanceFactor(struct Node *n){
    if (n == NULL){
        return 0;
    }
    return get_Height(n->left) - get_Height(n->right);
}

struct Node* right_Rotate(struct Node* y){
    struct Node* x = y->left;
    struct Node* T2 = x->right;

    x->right = y;
    y->left = T2;

    // y is now below x, so update y's height before x's.
    y->height = max(get_Height(y->right), get_Height(y->left)) + 1;
    x->height = max(get_Height(x->right), get_Height(x->left)) + 1;

    return x;
}

struct Node* left_Rotate(struct Node* x){
    struct Node* y = x->right;
    struct Node* T2 = y->left;

    y->left = x;
    x->right = T2;

    // x is now below y, so update x's height before y's.
    x->height = max(get_Height(x->right), get_Height(x->left)) + 1;
    y->height = max(get_Height(y->right), get_Height(y->left)) + 1;

    return y;
}

struct Node *insert(struct Node* node, int key){
    if (node == NULL)
        return create_Node(key);

    if (key < node->key)
        node->left = insert(node->left, key);
    else if (key > node->key)
        node->right = insert(node->right, key);
    // Equal keys are not inserted again.

    node->height = 1 + max(get_Height(node->left), get_Height(node->right));
    int bf = getBalanceFactor(node);

    // Left Left Case
    if (bf > 1 && key < node->left->key){
        return right_Rotate(node);
    }
    // Right Right Case
    if (bf < -1 && key > node->right->key){
        return left_Rotate(node);
    }
    // Left Right Case
    if (bf > 1 && key > node->left->key){
        node->left = left_Rotate(node->left);
        return right_Rotate(node);
    }
    // Right Left Case
    if (bf < -1 && key < node->right->key){
        node->right = right_Rotate(node->right);
        return left_Rotate(node);
    }
    return node;
}

void preOrder(struct Node *root)
{
    if (root != NULL)
    {
        printf("%d ", root->key);
        preOrder(root->left);
        preOrder(root->right);
    }
}

int main(){
    struct Node *root = NULL;

    root = insert(root, 11);
    root = insert(root, 22);
    root = insert(root, 24);
    root = insert(root, 56);
    root = insert(root, 63);
    root = insert(root, 35);
    printf("%s \n", "The preorder traversal of AVL tree is: ");
    preOrder(root);
    return 0;
}

 

Output:

The preorder traversal of AVL tree is:
24 22 11 56 35 63

Explanation:

The C code above implements an AVL tree, a self-balancing binary search tree. It defines a node structure that holds a key, left and right pointers, and a height. The program includes functions for insertion, height computation, and the balancing rotations. The main function builds an AVL tree by inserting nodes and prints a pre-order traversal, demonstrating how the tree maintains balance for efficient operations.

Advantages of AVL Trees

  • The AVL tree is always height-balanced, and its height is always O(log N), where N is the total number of nodes in the tree.
  • The worst-case time complexity of search, insertion, and deletion is O(log N), while in a plain BST it can degrade to O(N).
  • AVL trees are self-balancing, so they never become skewed.
  • Because they are more strictly balanced, AVL trees perform search operations faster than Red-Black trees.

Disadvantages of AVL Trees

Some of the main disadvantages of AVL trees are:

  • AVL trees are challenging to implement.
  • For specific operations, AVL trees have relatively high constant factors.
  • Insertion and deletion are slower than in less strictly balanced trees because of the extra rotations.
  • If you're working on a system with frequent updates, AVL trees are often not the best choice.
  • The rotations and balance-factor bookkeeping make the AVL tree a comparatively complicated data structure.

Application of AVL Tree in Data Structure

There are many applications of AVL Trees. Some of them are:

  • They are used to implement in-memory sets and dictionaries.
  • Database systems frequently use AVL trees as indexes to allow quick searching and retrieval of data.
  • AVL trees are used in network routing algorithms.
  • AVL trees also find applications in interactive storytelling games.

Frequently Asked Questions

What is AVL tree and its application?

AVL trees are self-balancing binary search trees. They are commonly employed in database applications with few insertions and deletions but frequent data lookups, and more generally in any application that requires fast searching.

What is AVL tree in data structure complexity?

Since the AVL tree is a self-balancing binary search tree, its height in the worst case is O(log N). Each rebalancing rotation takes O(1) time, so the overall worst-case complexity of search, insertion, and deletion remains O(log N).

Which is better: AVL tree or Red-Black Tree?

AVL trees provide faster lookups than Red-Black trees because they are more strictly balanced. Red-Black trees provide faster insertion and removal operations than AVL trees, as their relatively relaxed balancing requires fewer rotations.

Conclusion

The important points we learnt in this article: AVL trees are self-balancing binary search trees; the fundamental attribute of an AVL tree is the balance factor, which is the difference between the heights of the left and right subtrees of a node; and the allowed values of the balance factor are -1, 0, and +1. We also learned that a rotation is required whenever an insertion or deletion creates an imbalance in any of the subtrees. The time complexity of the insert, delete, and search operations is O(log N).
