Load Balancers

Overview

When the number of requests to an application increases, it can overload a server and degrade system performance.

A single server has limited throughput and resources.

Take an online marketplace like Amazon. During Black Friday or the Christmas season, it experiences an unusual surge in traffic. A single server would be overloaded within seconds, so the system needs to scale to handle the increased demand.

Scaling can be done in two ways: vertically or horizontally. To scale horizontally, you need a load balancer.

A load balancer is a device or software component that distributes application traffic across a number of servers. By spreading requests over several servers, it improves the overall performance of the system and reduces the burden on any single server.
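To make the distribution idea concrete, here is a minimal sketch of round-robin selection, one of the simplest balancing strategies, in Go. The backend addresses are hypothetical placeholders, not part of any real deployment.

```go
package main

import (
	"fmt"
	"sync/atomic"
)

// roundRobin cycles through a fixed pool of backend addresses,
// handing out the next server for each incoming request.
type roundRobin struct {
	backends []string
	counter  uint64
}

func (rr *roundRobin) next() string {
	// Atomic increment so concurrent requests each get a distinct index.
	n := atomic.AddUint64(&rr.counter, 1)
	return rr.backends[(n-1)%uint64(len(rr.backends))]
}

func main() {
	rr := &roundRobin{backends: []string{
		"http://10.0.0.1:8080", // hypothetical backend servers
		"http://10.0.0.2:8080",
		"http://10.0.0.3:8080",
	}}
	for i := 0; i < 6; i++ {
		fmt.Println("request", i, "->", rr.next())
	}
}
```

Each request is sent to the next server in the pool, so traffic spreads evenly as long as requests are roughly uniform in cost.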

A load balancer sits between clients and servers. It routes client requests across the servers, ensuring that no single server is overworked, which could make the application unavailable or unreliable.
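The sketch below shows one way a load balancer can sit in front of a pool of backends, using Go's standard net/http/httputil reverse proxy with the round-robin idea from above. The listening port and backend addresses are assumptions for illustration, not a production configuration.

```go
package main

import (
	"log"
	"net/http"
	"net/http/httputil"
	"net/url"
	"sync/atomic"
)

func mustParse(raw string) *url.URL {
	u, err := url.Parse(raw)
	if err != nil {
		log.Fatal(err)
	}
	return u
}

func main() {
	// Hypothetical backend pool; in practice these would be real app servers.
	targets := []*url.URL{
		mustParse("http://10.0.0.1:8080"),
		mustParse("http://10.0.0.2:8080"),
	}

	var counter uint64
	proxy := &httputil.ReverseProxy{
		Director: func(req *http.Request) {
			// Pick the next backend round-robin and rewrite the request to it.
			t := targets[atomic.AddUint64(&counter, 1)%uint64(len(targets))]
			req.URL.Scheme = t.Scheme
			req.URL.Host = t.Host
		},
	}

	// The load balancer listens on a single public address and forwards
	// each client request to one of the backends.
	log.Fatal(http.ListenAndServe(":8080", proxy))
}
```

Clients only ever see the load balancer's address; the backends behind it can be added, removed, or replaced without changing anything on the client side.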
