In 1997, IBM’s supercomputer Deep Blue defeated the world champion Garry Kasparov in a chess match. An iconic moment in AI history, indeed! In this section, we will see that writing artificial intelligence algorithms is not nearly as interesting as it looks in the Hollywood movies.

Think about any two-player board game: chess, backgammon, whatever you prefer... You and your rival take turns, and each of you is trying to reach that definitive state in which you crush your opponent. At any point, you have a limited set of moves, and at each step, your aim is to make the best…

Gather around, kids, it’s time for a story. And this story is about a breathtaking race between the tortoise and the hare. Spoiler alert: at the end, the tortoise catches up to the hare. Well, kind of…

Okay, so I haven’t been writing for a while because… Well, I didn’t want to. Then yesterday I came across this question on **HackerRank**, which happens to be the last question of the **C++** section, and found it quite interesting. But before diving directly into the question, let’s start with a simpler problem and progressively build this thing up.

This is a classic…

What would you do if I asked you to find the k-th smallest or largest element in an unordered set of numbers? Sort the numbers? God, you are boring…

Most programmers immediately answer with sorting when they are asked to find the k-th smallest element. Why is that? Well, **because it is simple!** All you need to do is a call to a *sort* function and you are good to go. But can we do any better? *I guess I wouldn’t be writing this if we couldn’t…*

Let’s start by finding the smallest/largest element in the set. …

Okay, this one should not take long. The entire purpose of this post is to introduce the idea of utilizing randomness in order to improve the performance.

Initially, I was planning to write on a different subject, but halfway through I realized a short post on randomized algorithms might be beneficial first. So, here goes: a **randomized algorithm** is simply an algorithm that uses some source of randomness at some point during its execution, for example to permute its input uniformly at random so that no particular input ordering can consistently force its worst-case performance.

This subject is heavily based on probabilistic analysis and hence requires a strong mathematical background. However…

Did you ever wonder how data compression programs like WinRAR or WinZip work? And did you ever ask yourself: if we can compress the data, why not store and use it that way in the first place? If so, welcome to the third chapter of the series: Huffman Codes.

So, I just solved my 200th question on HackerRank after being away for nearly two years (it was incidentally about Huffman Decoding) and found myself writing this post. Let me start by asking a question: how can you modify the data in your computer to take less space? …

This story is a direct sequel to my previous post on Greedy Algorithms, so you may want to check that out first. It is still okay if you are only interested in **Dynamic Programming** and want to skip it, since I will be explaining the problem again.

When I was a university student, dynamic programming was the hardest topic in the Algorithms class for me. That was mainly because I had to read long pages of redundant explanations before understanding the actual problem. …

If someone asked me to describe what a software engineer does in a single word, my answer would be **optimization**. The fact is, most of the time we don’t look for the best solution, and we don’t look for the cheapest either; we look for the optimal one. Sacrificing your time for the best solution is a noble cause, but there may come times when the best solution is overkill, and you don’t always have the resources (time, money, etc.) to go after your ideals. In fact, you don’t always need the best solution either, just a fairly good…

Senior Software Engineer, Rock Vocalist & Guitarist