An overview of virtualization in computer systems and software engineering – why it’s more important than you might imagine | by Kaustav Ganguly | Jul 2022
In this blog, we take a look at virtualization from a layman’s perspective… what virtualization is, why we need it, and how it is implemented.
Virtualization can be implemented at different levels and in different areas, ranging from hardware, software, and operating systems to servers, the cloud, applications, and even networks. Each has its own use cases, although in general all of them provide several benefits to the IT ecosystem. In this article, however, we look at virtualization from a software engineering perspective: why you should know about it, and how you as a developer can benefit from it.
Let’s stop and think about the concerns one may have when developing an application.
- You want your code to run on several different platforms with little or no adjustments
- You want your development flow to be fast with minimal setup and configuration required
- You want scalability in your application that automatically adapts to increased resource requirements
- You want to ensure that the server remains active all the time and accessible to users in different geolocations without interruption (deployability)
- You don’t want to worry about all the dirty maintenance work and if something goes wrong you should be able to fix it quickly.
- You also want maximum availability of your application for the end user with minimum latency
- You want a fast, easy, and streamlined deployment mechanism that can deploy code instantly without any hassle
Virtualization gives you the ability to do all of this and more.
But wait… what is virtualization in the first place?
Think of it as an abstraction layer that provides a system’s particular set of features (possibly adding some of its own) without actually being that system. Essentially, we emulate hardware or software functionality in software, without exposing the real system. The underlying system is still there and, in fact, serves as the foundation that complements the virtualization software, allowing it to run on top of or as part of it. The virtualization software simply adds one or more layers (depending on the need) that shield what runs above from the real system below.
Now, virtualization software comes in different flavors depending on what one is trying to achieve, but a common goal in all of them is to hide the underlying real system, or to abstract away parts of it. For example, there is no cloud computing without virtualization: hardware virtualization is what makes cloud computing possible, since it allows multiple applications to run on the same physical server.
General Benefits of Virtualization
- Working in virtual environments allows consistency across multiple platforms, regardless of differences in host hardware, OS version, or firmware.
- Server efficiency and availability can be dramatically increased with the dynamic load balancing made possible by virtualization.
- For developers, testing software for multiple environments is a snap with virtualization
- It also provides a stable development environment where debugging does not require reconfiguring the environment. Instead, it can simply be restored to the last stable state
- Maintainability is greatly improved when using virtualization: a broken virtual environment can simply be replaced with another image, with minimal downtime.
Let’s look at some common categories of virtualization software, what sets them apart, and how they work
Virtual machines can be thought of as independent computers that emulate and partition the physical resources of the real computer (memory, processor, storage, and so on). Each VM comes with its own operating system and its own kernel.
Virtual machines are used to increase the efficiency of hardware resource usage. The hypervisor is the software layer that coordinates the virtual machines: it serves as the interface between each VM and the underlying physical hardware, ensuring that each has access to the physical resources it needs to run. It also ensures that the virtual machines do not interfere with one another by encroaching on each other’s memory space or compute cycles.
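The hypervisor’s coordination role can be illustrated with a deliberately simplified sketch. This is a toy Python model of the resource bookkeeping described above, not how a real hypervisor such as KVM or Xen works (those operate at the hardware level); the class and method names are made up for illustration.

```python
# Toy model of a hypervisor's memory bookkeeping (illustrative only).
class Hypervisor:
    """Tracks physical memory and hands out isolated slices to VMs."""

    def __init__(self, physical_mem_mb: int):
        self.physical_mem_mb = physical_mem_mb
        self.vms: dict[str, int] = {}  # VM name -> memory reserved (MB)

    def allocated(self) -> int:
        # Total memory already promised to running VMs.
        return sum(self.vms.values())

    def create_vm(self, name: str, mem_mb: int) -> bool:
        # Refuse a VM that would encroach on memory promised to
        # other VMs -- the "no interference" guarantee.
        if self.allocated() + mem_mb > self.physical_mem_mb:
            return False
        self.vms[name] = mem_mb
        return True

    def destroy_vm(self, name: str) -> None:
        # Freed memory immediately becomes available to other VMs.
        self.vms.pop(name, None)


hv = Hypervisor(physical_mem_mb=8192)
assert hv.create_vm("web", 4096)      # fits
assert hv.create_vm("db", 2048)       # still fits
assert not hv.create_vm("big", 4096)  # would exceed physical memory
hv.destroy_vm("web")
assert hv.create_vm("big", 4096)      # fits after reclaiming memory
```

Real hypervisors do the analogous accounting for CPU time, devices, and memory pages, and enforce it with hardware support rather than a simple dictionary.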
Containers have their own way of providing virtualization to applications. They don’t have a separate kernel or operating system; instead, they run on the host’s operating system and provide their own isolated layer on which applications can run. This also allows isolated virtual networks in which multiple containers run, with fully managed setup, configuration, and communication.
There is no direct application abstraction, but rather a concept of encapsulating individual services (often a single process) in containers that run on top of the base layer. This brings the advantages highlighted above, and the independence of containers even enables microservice architectures. The containers themselves are defined by templates known as images, which can easily be pushed, pulled, cloned, and even just discarded and rebuilt if needed.
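As a concrete sketch of such a template, a hypothetical image for a single-process Python service might be defined like this (the file names and service are invented for illustration):

```dockerfile
# Hypothetical image definition for a single-process service.
# Base layer: a slim OS userland plus the Python runtime.
FROM python:3.11-slim
WORKDIR /app
# Install dependencies; each instruction adds a cached image layer.
COPY requirements.txt .
RUN pip install -r requirements.txt
COPY app.py .
# Exactly one foreground process per container.
CMD ["python", "app.py"]
```

From this one template, any number of identical containers can be built, started, discarded, and rebuilt (`docker build`, `docker run`), and the image itself can be pushed to a registry and pulled elsewhere.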
As is evident, virtualization solves many common challenges in software engineering and computer science in general, and it has applications in a wide range of fields. Deciding which type of virtualization to use for a given system, keeping in mind the benefits one is looking for, is crucial. It is also worth noting that multiple forms of virtualization are often used together, as is common in DevOps, where containers frequently run inside VMs.