AVL Trees.

Hello and welcome to my website. Today I am going to share my insights and thoughts on Binary Search Trees and their self-balancing variant, AVL Trees.

AVL Trees get their name from the Russian computer scientists Georgy Adelson-Velsky and Evgenii Landis, who invented them in 1962. I like learning new data structures, and AVL Trees are an extremely good weapon to have in your data structure repertoire.

First, a foreword on Binary Search Trees for those who don’t know them. Binary Search Trees (“BSTs”) are, as their name suggests, binary trees, meaning every node has either 0, 1, or 2 children. The BST property is that for every node, every key in its left subtree is less than the node’s key and every key in its right subtree is greater. We will ignore, for the time being, the case of equal keys.
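To make the BST property concrete, here is a minimal sketch in C. The node layout and function names are my own choices for illustration, not taken from any particular implementation:

```c
#include <stdlib.h>

typedef struct Node {
    int key;
    struct Node *left, *right;
} Node;

/* Insert a key, maintaining the BST property: smaller keys go into
 * the left subtree, larger keys into the right. Equal keys are
 * ignored, as in the text above. */
Node *bst_insert(Node *root, int key) {
    if (root == NULL) {
        Node *n = malloc(sizeof *n);
        n->key = key;
        n->left = n->right = NULL;
        return n;
    }
    if (key < root->key)
        root->left = bst_insert(root->left, key);
    else if (key > root->key)
        root->right = bst_insert(root->right, key);
    return root;
}
```

With this layout, an in-order traversal visits the keys in sorted order, which is exactly what the BST property buys you.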

I won’t be covering the specifics of BSTs; they are readily available elsewhere, and this post serves as a cursory introduction to AVL trees. Searching for, inserting, or deleting a node in a BST takes O(h) time, where h is the height of the tree; when the tree is balanced, h = O(log(n)), where n is the number of nodes in the BST. The problem with BSTs is that their shape depends on the order in which the keys are inserted. It’s a well known fact that BSTs degenerate into glorified linked lists, with O(n) time operations, in three cases:

  • Keys are inserted in ascending order
  • Keys are inserted in descending order
  • Keys are inserted in alternate ascending/descending order.
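To see the degeneration concretely, the sketch below (my own illustrative code, using a plain unbalanced insert) inserts keys in ascending order and then measures the height of the resulting tree; every insert goes right, so the tree is a right-leaning chain whose height equals the number of keys:

```c
#include <stdlib.h>

typedef struct Node {
    int key;
    struct Node *left, *right;
} Node;

/* Plain, unbalanced BST insert (duplicates ignored). */
Node *insert(Node *t, int key) {
    if (t == NULL) {
        Node *n = malloc(sizeof *n);
        n->key = key;
        n->left = n->right = NULL;
        return n;
    }
    if (key < t->key)
        t->left = insert(t->left, key);
    else if (key > t->key)
        t->right = insert(t->right, key);
    return t;
}

/* Height counted in nodes: 0 for an empty tree. */
int height(Node *t) {
    if (t == NULL) return 0;
    int l = height(t->left), r = height(t->right);
    return 1 + (l > r ? l : r);
}
```

Inserting 1 through 100 in order produces a tree of height 100; every search in it is a linear scan, just like a linked list.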


AVL trees overcome this and give O(log(n)) time insertion, deletion, and search through something called balancing. AVL trees are one of many self-balancing trees. They differ from plain BSTs in that they rebalance their height after every insertion and deletion, and in the process never degenerate into linked lists.

Now, how do they balance themselves? They do this via an interesting mechanism called tree rotation. AVL Trees have, along with the BST property, a specific feature: for every node, the absolute value of the difference between the heights of the right and left subtrees is at most 1. We define the balance factor as height_of_right_subtree − height_of_left_subtree, so in an AVL tree the balance factor of every node is always −1, 0, or 1. This invariant is what keeps the height at O(log(n)) and thus maintains O(log(n)) insertion, deletion, and search at all times.
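As a sketch, the balance factor and the AVL invariant can be checked like this (illustrative names and a naive recursive height; a real implementation would cache heights in the nodes instead of recomputing them):

```c
#include <stddef.h>

typedef struct Node {
    int key;
    struct Node *left, *right;
} Node;

/* Height counted in nodes: 0 for an empty tree. */
int height(Node *t) {
    if (t == NULL) return 0;
    int l = height(t->left), r = height(t->right);
    return 1 + (l > r ? l : r);
}

/* Balance factor as defined above: height(right) - height(left). */
int balance_factor(Node *t) {
    return t ? height(t->right) - height(t->left) : 0;
}

/* A tree satisfies the AVL invariant when every node's balance
 * factor is -1, 0, or +1. */
int is_avl(Node *t) {
    if (t == NULL) return 1;
    int bf = balance_factor(t);
    return bf >= -1 && bf <= 1 && is_avl(t->left) && is_avl(t->right);
}
```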

Now back to where we left off: tree rotation. There are two types of tree rotations, left rotation and right rotation.


Attached are the images of left and right rotation. The thing you need to grasp is that upon inserting an element into an AVL Tree, the balance factor of a node can at worst become ±2. This is where tree rotation comes into the picture: as you can infer from the above two illustrations, for an unbalanced subtree, a rotation brings the balance factor back into our required range of {−1, 0, 1}. Crucially, a rotation only rearranges the links between nodes; it preserves the in-order sequence of the elements, so the BST property is never disturbed.
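The two rotations themselves are just pointer rearrangements. A minimal sketch (the node layout and function names are my own):

```c
#include <stddef.h>

typedef struct Node {
    int key;
    struct Node *left, *right;
} Node;

/* Right rotation: the left child y becomes the new subtree root,
 * and y's old right subtree is reattached as x's left subtree.
 * The in-order sequence is unchanged, so the BST property survives. */
Node *rotate_right(Node *x) {
    Node *y = x->left;
    x->left = y->right;
    y->right = x;
    return y;
}

/* Left rotation is the mirror image. */
Node *rotate_left(Node *x) {
    Node *y = x->right;
    x->right = y->left;
    y->left = x;
    return y;
}
```

Both run in O(1) time; the caller stores the returned pointer as the new root of that subtree.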

Alas, this has become too long a post. Only one more case is left; stay with me. Sometimes one rotation is just not enough, and you need two tree rotations to balance the AVL Tree. These are the cases where the left subtree becomes right heavy, or the right subtree becomes left heavy. The illustrations will make this lucid –

Left sub tree is right heavy. First left rotate around B and then right rotate around A.
Right sub tree is left heavy. First right rotate around B and then left rotate around A.
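Putting the two cases into code, here is a sketch of the double rotations, built out of the single rotations above (names and layout are again my own illustrative choices):

```c
#include <stddef.h>

typedef struct Node {
    int key;
    struct Node *left, *right;
} Node;

Node *rotate_right(Node *x) {
    Node *y = x->left;
    x->left = y->right;
    y->right = x;
    return y;
}

Node *rotate_left(Node *x) {
    Node *y = x->right;
    x->right = y->left;
    y->left = x;
    return y;
}

/* Left-Right case: A's left subtree (rooted at B) is right heavy.
 * First left rotate around B, then right rotate around A. */
Node *rotate_left_right(Node *a) {
    a->left = rotate_left(a->left);
    return rotate_right(a);
}

/* Right-Left case: A's right subtree (rooted at B) is left heavy.
 * First right rotate around B, then left rotate around A. */
Node *rotate_right_left(Node *a) {
    a->right = rotate_right(a->right);
    return rotate_left(a);
}
```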

So whenever you insert or delete an element from an AVL Tree, you need to rebalance the tree, which is done via tree rotation. With this, I have almost reached the end of my writing. This is my inaugural blog post; I hope my illustrations and explanations helped you attain some understanding, however rudimentary, of this beautiful data structure. Check this link, which explains tree rotation in a very lucid manner: Tree rotations.

Link to AVL Tree code in C/C++: take a look at my implementation of AVL Tree on GitHub. Follow this blog; more interesting stuff on Red Black Trees and Tries is to come. If you liked this, follow me on GitHub: GitHub profile. Adios Amigos and Merry Christmas.