Data centers are gaining increasing importance as an indispensable part of modern computing infrastructures. According to Research and Markets, the Global Data Center Infrastructure market is projected to grow at a CAGR of 6.79% during the forecast period, reaching a total market size of US$230.169 billion in 2025, up from US$155.201 billion in 2019.
Data centers are physical facilities where computing and networking equipment are located and centralized. They host servers and associated components, such as storage systems and network communications. The main role of a data center is to collect, store, process, and distribute large amounts of data.
Over the years, data centers have evolved from centralized on-premises facilities to public cloud services and edge deployments, and they are now critical components of modern IT infrastructure. However, this growth in importance is accompanied by increasing concern over the energy consumption of these facilities. In total, the global internet consumes 416.2 terawatt-hours of electricity per year, more than the whole of the United Kingdom. This translates into large amounts of carbon emissions. In fact, 2% of global carbon emissions come from the electricity used by the internet, which is mainly consumed by data centers.
According to widely cited energy forecasts, the total electricity demand of information and communication technology (ICT) will accelerate in the 2020s, and data centers will take a larger slice. This implies that their environmental impact will increase as well. If we add to that the rising energy costs, it becomes obvious that businesses and organizations need to efficiently address data centers’ energy consumption issues in order to cut costs and promote environmental responsibility.
On average, servers and cooling systems account for the greatest shares of direct power use in data centers as shown in the figure below.
The volume of workloads that a data center processes and the total number of servers required to process them are the main factors that determine how much energy those servers will consume. However, a great deal of energy is wasted because servers often run idle or at low utilization. Servers typically don't employ energy-saving mechanisms such as spinning down idle disks or entering standby, and a typical database server consumes 60% of its peak energy when idle.
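To put that idle-power figure in perspective, here is a rough back-of-the-envelope estimate. The 60% idle draw comes from the text above; the peak wattage and idle hours are illustrative assumptions, not measurements:

```python
# Rough estimate of the energy wasted by an idle database server.
# Assumes, per the text, that an idle server draws about 60% of its
# peak power. The peak wattage and idle hours are illustrative.

PEAK_POWER_W = 500          # assumed peak draw of one server (watts)
IDLE_FRACTION = 0.60        # idle draw as a share of peak (from the text)
HOURS_PER_YEAR = 24 * 365

def idle_energy_kwh(idle_hours: float) -> float:
    """Energy consumed while idle, in kilowatt-hours."""
    return PEAK_POWER_W * IDLE_FRACTION * idle_hours / 1000

# A server that sits idle half the year:
wasted = idle_energy_kwh(HOURS_PER_YEAR / 2)
print(f"Idle energy per server: {wasted:.0f} kWh/year")  # → 1314 kWh/year
```

Multiply that by hundreds or thousands of servers in a facility and the scale of the waste becomes clear.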
With that in mind, let’s look at some of the efficient ways that can help reduce servers’ energy consumption.
Most physical servers run at only about 5% to 15% utilization, yet still draw most of their peak power. To increase server utilization and reduce wasted energy, most companies and organizations use server consolidation and virtualization.
Server virtualization aims to reduce electrical consumption by lowering the number of physical servers. Virtualization is the process of creating and abstracting multiple virtual servers on a single physical host via software; each virtual server runs independently. This is done using a hypervisor, which serves as a platform for the virtual servers' operating systems while keeping their environments fully isolated from one another.
Virtualization and consolidation go hand in hand, because the former is the enabler of the latter. Consolidating multiple independent servers onto a single physical server allows those workloads to run more efficiently and can reduce energy costs by up to 80%.
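The arithmetic behind consolidation can be sketched as a simple sizing exercise: pack the combined load of many low-utilization servers onto hosts run at a safer target utilization. All numbers here are illustrative assumptions, not vendor data:

```python
# Sketch of consolidation sizing: how many physical hosts remain
# once low-utilization servers are virtualized? The utilization
# figures below are illustrative assumptions.

import math

def hosts_after_consolidation(n_servers: int,
                              avg_utilization: float,
                              target_utilization: float = 0.60) -> int:
    """Physical hosts needed if the combined load is packed onto
    hosts run at a target utilization (keeping capacity headroom)."""
    total_load = n_servers * avg_utilization
    return max(1, math.ceil(total_load / target_utilization))

# 100 servers at 10% utilization, consolidated onto hosts run at 60%:
before, after = 100, hosts_after_consolidation(100, 0.10)
print(f"{before} hosts -> {after} hosts "
      f"({(1 - after / before):.0%} fewer machines)")
# → 100 hosts -> 17 hosts (83% fewer machines)
```

Real consolidation planning must also account for memory, I/O, and peak-load overlap between workloads, but this simple CPU-load model illustrates why the energy savings can be so large.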