How server virtualisation is evolving to cloud native environments


Server virtualisation has given businesses the ability to abstract applications and the operating system from the underlying hardware. For years organisations have used virtualisation to gain greater levels of utilisation from their hardware investments, where one big server is divided into multiple VMs (virtual machines), each of which can be configured to run a workload, comprising an operating system and a software environment.

Businesses can deploy appliances, known as hyper-converged systems, that combine CPUs, storage and a virtual server environment, all in one box. These tend to be horizontally scalable, enabling IT departments to grow their virtual server environment simply by adding another appliance. The management software in hyper-converged systems takes care of distributing workloads to make best use of the pooled compute and storage.

Today, thanks to the advent of IaaS (infrastructure as a service), organisations can choose to run their VMs locally, on-premise, or in a public cloud using IaaS from any of the major cloud providers.

Among the biggest problem areas in server virtualisation is VM sprawl. Because they are not physical servers, VMs are easy to create. Often, however, they are not removed when they are no longer required. On-premise, a dormant VM may not seem like a big issue, but it still occupies storage, and if VMs are left to run indefinitely in the background, they will continue to consume valuable CPU and memory as well. In a cloud environment, businesses can incur huge fees if they fail to administer their virtual servers properly.
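One common mitigation is a periodic audit that flags VMs which are powered on but have shown no recent activity. The sketch below illustrates the idea with a hypothetical in-memory inventory; in practice the VM list would come from the hypervisor's or cloud provider's API, and the names, fields and 90-day threshold here are illustrative assumptions, not any vendor's interface.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class VM:
    name: str
    powered_on: bool
    last_activity: datetime  # e.g. last login or last network traffic seen

def find_sprawl(vms, idle_after=timedelta(days=90), now=None):
    """Return VMs that look abandoned: powered on but idle past the threshold."""
    now = now or datetime.utcnow()
    return [vm for vm in vms if vm.powered_on and now - vm.last_activity > idle_after]

# Illustrative inventory (hypothetical data, not from a real environment)
now = datetime(2021, 2, 8)
inventory = [
    VM("web-01", True, now - timedelta(days=3)),        # active, keep
    VM("test-legacy", True, now - timedelta(days=200)), # idle but running: sprawl
    VM("archive", False, now - timedelta(days=400)),    # powered off: storage only
]
stale = find_sprawl(inventory, now=now)
print([vm.name for vm in stale])  # → ['test-legacy']
```

Candidates flagged this way would then be reviewed before being snapshotted and decommissioned, which is where the cost savings in a pay-per-use cloud environment come from.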

While server virtualisation is a well-established method to reduce IT costs by consolidating on-premise servers and running appropriate workloads in the public cloud, a new IT architecture is now emerging. Rather than virtualise existing software environments, new applications are increasingly being developed to run natively in the cloud. This approach departs from the vertically integrated software stack that applications have traditionally been built upon. Instead, the application code is divided into functional units known as microservices, each of which runs in a lightweight virtual environment known as a container. The code within the container can be an internally developed microservice, or one from an external provider. A cloud native architecture uses containers to run a set of microservices in order to perform a required function programmatically.
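To make the microservice idea concrete, the sketch below stands up a single, narrowly scoped service: a hypothetical price-lookup function exposed over HTTP using only the Python standard library. The service name, SKUs and prices are invented for illustration; in a cloud native deployment this process would be packaged into a container image and run alongside the other microservices that make up the application.

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

class PriceHandler(BaseHTTPRequestHandler):
    # Illustrative data; a real service would query its own datastore.
    PRICES = {"widget": 9.99, "gadget": 14.50}

    def do_GET(self):
        sku = self.path.lstrip("/")
        if sku in self.PRICES:
            body = json.dumps({"sku": sku, "price": self.PRICES[sku]}).encode()
            self.send_response(200)
        else:
            body = json.dumps({"error": "unknown sku"}).encode()
            self.send_response(404)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # keep the demo quiet

# Port 0 asks the OS for any free port; a container would publish a fixed one.
server = HTTPServer(("127.0.0.1", 0), PriceHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
```

Because the service does one thing behind a small HTTP contract, it can be scaled, replaced or redeployed independently of the rest of the application, which is the core of the cloud native approach described above.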

Feb 8, 2021
Jul 26, 2018
