System Design & Architecture

Refining Load Balancer

In the last article, we covered the basics of building a Layer 7 load balancer in Go, touching on routing, SSL termination, and rate limiting. Since then, the focus has been on improving performance, maintainability, and scalability.

This article highlights key upgrades like adopting clean architecture, switching to configuration files, and using connection pooling to enhance backend communication. These changes make the system more flexible and set the stage for even more optimizations, including advanced health checkers, which we’ll explore in the next article.
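The article itself implements all of this in Go; purely to illustrate the connection-pooling idea in isolation, here is a minimal Python sketch using requests, where the backend host name is a made-up placeholder. The point is that a pool of persistent connections to each backend is kept alive and reused, instead of opening a new TCP connection for every proxied request.

```python
# Minimal connection-pooling sketch (illustration only; the real load
# balancer in the article is written in Go). The backend URL below is a
# hypothetical placeholder.
import requests
from requests.adapters import HTTPAdapter

session = requests.Session()
# Cache pools for up to 10 backend hosts, keeping up to 10 persistent
# connections alive per host for reuse.
session.mount("http://", HTTPAdapter(pool_connections=10, pool_maxsize=10))

for _ in range(100):
    # All of these requests share the pooled connections rather than
    # reconnecting each time.
    session.get("http://backend-1.internal:8080/health")
```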

Read more →
System Design & Architecture

Building a Layer 7 Load Balancer

Load balancers are essential for applications or services that handle high volumes of traffic, ensuring that they can scale efficiently to meet demand. Modern load balancers have evolved into comprehensive traffic management solutions, often incorporating rate limiting, caching, DDoS protection, and more, making them indispensable for high-scale systems.

What is a load balancer?

Imagine you’re at an airport, standing in line for the security check. There is a single line of passengers, but there are five booths with agents checking boarding passes and IDs. At the front of the line, there is an agent directing passengers to available booths. This agent acts like a basic load balancer, distributing incoming requests (passengers) to backend servers (TSA agents) for processing.
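The series builds the real thing in Go, but the distribution idea behind the analogy fits in a few lines. Below is a language-agnostic round-robin sketch in Python with made-up backend addresses, showing only how incoming requests get cycled across a pool of backends.

```python
from itertools import cycle

# Hypothetical backend pool; a real load balancer would proxy the request,
# this sketch only shows the round-robin selection (the "agent at the
# front of the line").
backends = ["10.0.0.1:8080", "10.0.0.2:8080", "10.0.0.3:8080"]
next_backend = cycle(backends)

def route(request_id: int) -> str:
    backend = next(next_backend)
    print(f"request {request_id} -> {backend}")
    return backend

for i in range(6):
    route(i)
```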

Read more →
AI & ML

A guide to Neural Networks

Fig 1. Neural network architecture: input layer ∈ ℝ¹⁶, hidden layer 1 ∈ ℝ¹², hidden layer 2 ∈ ℝ¹⁰, and a single unit ∈ ℝ in the output layer

TLDR: Neural networks are computational models that stack units called neurons into layers; once trained, the network is used to make predictions. Each unit performs a linear transformation on its inputs followed by a non-linear transformation to produce its output. Forward propagation computes the network's prediction, and backward propagation computes the changes to the weights and biases that are applied in subsequent epochs. The non-linear activation functions are what allow the network to approximate essentially any mathematical function; this result is known as the Universal Approximation Theorem.
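As a minimal illustration of forward propagation for the layer sizes in Fig 1 (16 → 12 → 10 → 1), here is a NumPy sketch; the ReLU and sigmoid activations and the random weights are illustrative choices for the example, not necessarily what the full article trains with.

```python
import numpy as np

rng = np.random.default_rng(0)

# Layer sizes from Fig 1: 16 -> 12 -> 10 -> 1. Weights are random here;
# training (backpropagation) would adjust them over the epochs.
sizes = [16, 12, 10, 1]
weights = [rng.standard_normal((n_out, n_in)) for n_in, n_out in zip(sizes, sizes[1:])]
biases = [np.zeros(n_out) for n_out in sizes[1:]]

def relu(z):
    return np.maximum(0, z)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(x):
    """One forward pass: a linear transform (Wx + b) followed by a non-linearity per layer."""
    a = x
    for W, b in zip(weights[:-1], biases[:-1]):
        a = relu(W @ a + b)
    # Single output unit; sigmoid squashes it into (0, 1).
    return sigmoid(weights[-1] @ a + biases[-1])

print(forward(rng.standard_normal(16)))
```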

Introduction

Every day we rely on machine learning models to make our lives simpler and more hassle-free. Take an average day: from reading the news and the weather forecast in the morning to checking email later on, we are unconsciously depending on models that run behind the scenes. The list goes on, with applications such as image recognition in our photo galleries and speech recognition in voice assistants, and it keeps growing every day.

A few popular classical machine learning models are Logistic Regression, Linear Regression, Decision Trees, Support Vector Machines, and Naive Bayes. Although these models are less commonly used in practice nowadays, they are still worth studying, because understanding them makes the more recent models much easier to understand. Ensemble models such as Random Forests and XGBoost are among the machine learning models most commonly used today.

Read more →
AI & ML

Topic modeling with Gensim

I first started doing topic modeling when I was playing around with the NIPS papers dataset. The first time I tried it, I used scikit-learn with LDA and NMF, and I was happy with the results; in that sense, scikit-learn is one of the most convenient tools available for exploratory data science. But I had bigger plans and wanted to tackle even larger datasets. That is when I was introduced to Gensim, a Python library focused on topic modeling. Among the many features it provides are transformations such as online LDA, LSA, and HDP, as well as wrappers for other popular tools like scikit-learn, Vowpal Wabbit, and Mallet.

The code can be viewed in my GitHub repository.
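For a taste of the API, here is a minimal Gensim sketch of training an online LDA model; the toy documents below are stand-ins for the real corpus used in the repository above.

```python
from gensim import corpora
from gensim.models import LdaModel

# Toy documents standing in for the NIPS abstracts.
texts = [
    ["neural", "network", "training", "gradient"],
    ["topic", "model", "lda", "inference"],
    ["reinforcement", "learning", "policy", "reward"],
]

dictionary = corpora.Dictionary(texts)               # word <-> id mapping
corpus = [dictionary.doc2bow(doc) for doc in texts]  # bag-of-words vectors

# Online LDA: learns num_topics latent topics from the corpus.
lda = LdaModel(corpus=corpus, id2word=dictionary, num_topics=2, passes=10, random_state=42)

for topic_id, words in lda.print_topics():
    print(topic_id, words)
```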

Read more →
AI & ML

Shelter Animal Outcomes

A puppy taken in at a shelter (picture taken from SOS)

Shelter Animal Outcomes is a knowledge competition hosted by Kaggle. My goal was to learn and become familiar with the different techniques, methods, and tools required to solve classification problems. Along the way I used pandas, matplotlib, seaborn, Jupyter, and scikit-learn. The goal of the competition is to predict the outcome for a cat or dog based on its age, sex (including whether it has been spayed or neutered), breed, and color. The possible outcomes are return-to-owner, adoption, transfer, euthanasia, and died; in some cases an outcome subtype is provided as well.

All my work related to this can be found in this GitHub repository, which houses my Jupyter notebooks.
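To give a flavour of the workflow, here is a minimal scikit-learn sketch of the kind of classification pipeline involved; the tiny DataFrame and its column names are simplified stand-ins, not the actual Kaggle schema, and the full notebooks live in the repository above.

```python
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Tiny made-up sample in the spirit of the competition data; column names
# are simplified placeholders.
df = pd.DataFrame({
    "animal_type": ["Dog", "Cat", "Dog", "Cat", "Dog", "Cat"],
    "sex": ["Neutered Male", "Intact Female", "Spayed Female",
            "Intact Male", "Neutered Male", "Spayed Female"],
    "age_years": [2.0, 0.5, 5.0, 1.0, 3.0, 7.0],
    "outcome": ["Adoption", "Transfer", "Return_to_owner",
                "Adoption", "Adoption", "Euthanasia"],
})

# One-hot encode the categorical features, keep age as-is.
X = pd.get_dummies(df[["animal_type", "sex", "age_years"]])
y = df["outcome"]

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.33, random_state=0)

clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)
print(clf.predict(X_test))
```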

Read more →