Category: Data structures and algorithms
-
Cyclic Sort Coding Pattern
1. Introduction Cyclic Sort is a highly efficient in-place sorting pattern for arrays where: The key idea is: Place each number at its correct index (e.g., number 1 at index 0, number 2 at index 1, …). Unlike general sorting algorithms like quicksort or mergesort, Cyclic Sort achieves O(n) time complexity without extra space if…
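The placement idea described above can be sketched as follows (class and method names are illustrative, not from the article; this assumes the array holds the numbers 1 through n exactly once):

```java
// Illustrative sketch of the Cyclic Sort pattern for an array containing 1..n.
public class CyclicSort {
    public static void sort(int[] nums) {
        int i = 0;
        while (i < nums.length) {
            int correct = nums[i] - 1;          // value v belongs at index v - 1
            if (nums[i] != nums[correct]) {
                int tmp = nums[i];              // swap the value into its home slot
                nums[i] = nums[correct];
                nums[correct] = tmp;
            } else {
                i++;                            // already in place; move on
            }
        }
    }
}
```

Each swap puts at least one value into its final position, so the loop performs at most n swaps overall: O(n) time, O(1) extra space.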
-
Merge Intervals Coding Pattern
1. Introduction The Merge Intervals pattern is one of the most versatile and frequently used patterns in algorithm design, especially for problems involving ranges, durations, time slots, or segments. It allows us to efficiently combine overlapping intervals into larger, non-overlapping intervals. Each interval is typically represented as: The goal is to simplify a list of intervals…
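A minimal sketch of the merging step (names are illustrative; intervals are assumed to be `[start, end]` pairs):

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

// Illustrative sketch of the Merge Intervals pattern.
public class MergeIntervals {
    public static int[][] merge(int[][] intervals) {
        // Sort by start so overlapping intervals become adjacent.
        Arrays.sort(intervals, (a, b) -> Integer.compare(a[0], b[0]));
        List<int[]> merged = new ArrayList<>();
        for (int[] current : intervals) {
            if (!merged.isEmpty() && current[0] <= merged.get(merged.size() - 1)[1]) {
                int[] last = merged.get(merged.size() - 1);
                last[1] = Math.max(last[1], current[1]); // extend the previous interval
            } else {
                merged.add(current);                     // no overlap: start a new one
            }
        }
        return merged.toArray(new int[0][]);
    }
}
```

Sorting dominates the cost, giving O(n log n) time; the single pass afterwards is O(n).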
-
Fast and Slow Pointers Coding Pattern
1. Introduction The Fast and Slow Pointers pattern (also known as the Tortoise and Hare algorithm) is a powerful technique used for solving problems that involve sequential traversal, particularly in linked lists, arrays, or circular data structures. The main idea is simple but elegant: you use two pointers that move through the sequence at different speeds…
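The classic application of this pattern is cycle detection in a linked list; a minimal sketch (the `Node` type here is a hypothetical stand-in, not from the article):

```java
// Illustrative Tortoise and Hare cycle detection.
public class CycleDetect {
    static class Node {
        int val;
        Node next;
        Node(int v) { val = v; }
    }

    public static boolean hasCycle(Node head) {
        Node slow = head, fast = head;
        while (fast != null && fast.next != null) {
            slow = slow.next;           // advances one step
            fast = fast.next.next;      // advances two steps
            if (slow == fast) return true; // pointers can only meet inside a cycle
        }
        return false;                   // fast fell off the end: no cycle
    }
}
```

If there is a cycle, the fast pointer gains one step on the slow pointer per iteration, so they must meet; otherwise the fast pointer reaches the end in O(n) steps with O(1) space.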
-
Two Pointers Coding Pattern
1. Introduction to the Two Pointers Pattern The Two Pointers pattern is a fundamental algorithmic approach used when you need to process elements in pairs or comparative sequences within a data structure such as an array, string, or linked list. It involves maintaining two distinct indices (pointers) that move across the structure in a controlled…
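A common instance of this pattern is finding a pair with a given sum in a sorted array; a minimal sketch (names are illustrative):

```java
// Illustrative Two Pointers sketch: pair with a target sum in a sorted array.
public class PairSum {
    // Returns the indices of two values summing to target, or null if none exist.
    public static int[] twoSum(int[] sorted, int target) {
        int left = 0, right = sorted.length - 1;
        while (left < right) {
            int sum = sorted[left] + sorted[right];
            if (sum == target) return new int[]{left, right};
            if (sum < target) left++;   // sum too small: move the low end up
            else right--;               // sum too large: move the high end down
        }
        return null;
    }
}
```

Because each step discards one element from consideration, the scan is O(n) time and O(1) space, versus O(n²) for checking every pair.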
-
Islands (Matrix Traversal) Coding Pattern
1. Introduction to the Islands (Matrix Traversal) Pattern The Islands pattern refers to a class of problems where you are given a 2D grid (matrix), and you need to find or count connected groups of certain cells (usually marked as 1, X, or some special symbol). Each group or cluster of connected cells is referred…
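A minimal sketch of the counting idea using depth-first traversal (names are illustrative; note that this version marks visited cells by zeroing them, so it mutates the input grid):

```java
// Illustrative Islands sketch: count connected groups of 1-cells via DFS.
public class Islands {
    public static int countIslands(int[][] grid) {
        int count = 0;
        for (int r = 0; r < grid.length; r++) {
            for (int c = 0; c < grid[0].length; c++) {
                if (grid[r][c] == 1) {
                    count++;            // found an unvisited island
                    sink(grid, r, c);   // flood-fill it so it is not counted again
                }
            }
        }
        return count;
    }

    private static void sink(int[][] grid, int r, int c) {
        if (r < 0 || r >= grid.length || c < 0 || c >= grid[0].length || grid[r][c] != 1) {
            return;                     // out of bounds or water/visited
        }
        grid[r][c] = 0;                 // mark visited
        sink(grid, r + 1, c);
        sink(grid, r - 1, c);
        sink(grid, r, c + 1);
        sink(grid, r, c - 1);
    }
}
```

Every cell is visited a constant number of times, so the traversal runs in O(rows × cols) time.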
-
Sliding Window Technique
What is Sliding Window? The Sliding Window is a powerful technique for solving problems that involve contiguous sequences or subarrays within a given list or string. Instead of recalculating the result for every possible subarray from scratch, we reuse previous computations, “sliding” the window forward and updating the result incrementally. This reduces time complexity significantly…
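The incremental update described above can be sketched with the classic fixed-size example, the maximum sum of any subarray of length k (names are illustrative; this assumes `k <= nums.length`):

```java
// Illustrative Sliding Window sketch: max sum over all subarrays of size k.
public class SlidingWindow {
    public static int maxSubarraySum(int[] nums, int k) {
        int windowSum = 0;
        for (int i = 0; i < k; i++) {
            windowSum += nums[i];               // sum of the first window
        }
        int best = windowSum;
        for (int i = k; i < nums.length; i++) {
            windowSum += nums[i] - nums[i - k]; // slide: add the new element, drop the old
            best = Math.max(best, windowSum);
        }
        return best;
    }
}
```

Recomputing each window from scratch would cost O(n·k); reusing the previous sum brings it down to a single O(n) pass.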
-
Min Stack
Problem Statement: Design a stack that supports the following operations in constant time. Approach: We need to keep track of the minimum value at all times; there are multiple ways to do this, and Method 1 uses two stacks, with a Java implementation, usage example, and time/space complexity analysis.…
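A minimal two-stack sketch along the lines the teaser mentions (the full article's implementation may differ; this version pairs the main stack with a stack of running minimums):

```java
import java.util.ArrayDeque;
import java.util.Deque;

// Illustrative Min Stack sketch: all operations in O(1) time.
public class MinStack {
    private final Deque<Integer> stack = new ArrayDeque<>();
    private final Deque<Integer> mins = new ArrayDeque<>();  // minimum so far, per level

    public void push(int x) {
        stack.push(x);
        mins.push(mins.isEmpty() ? x : Math.min(x, mins.peek()));
    }

    public int pop() {
        mins.pop();            // the running minimum shrinks in lockstep
        return stack.pop();
    }

    public int top() { return stack.peek(); }

    public int getMin() { return mins.peek(); }
}
```

The trade-off is O(n) extra space for the auxiliary stack in exchange for constant-time `getMin`.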
-
Space Complexity
What Is Space Complexity? As discussed earlier, algorithm complexity includes two main components: In this section, we’ll focus on space complexity, which is usually denoted as the function S(n). All the rules for evaluating space complexity using Big-O notation are identical to those used for time complexity. Example 1: Simple Function Call Let’s consider…
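The article's own examples are truncated here, but the distinction can be illustrated with two hypothetical functions that compute the same result with different auxiliary space:

```java
// Illustrative sketch: same computation, different space complexity.
public class SpaceExamples {
    // O(1) auxiliary space: a fixed number of variables regardless of n.
    public static long sumConstantSpace(int n) {
        long sum = 0;
        for (int i = 1; i <= n; i++) sum += i;
        return sum;
    }

    // O(n) auxiliary space: allocates an array whose size grows with n.
    public static long sumLinearSpace(int n) {
        long[] values = new long[n];         // n extra memory cells
        for (int i = 0; i < n; i++) values[i] = i + 1;
        long sum = 0;
        for (long v : values) sum += v;
        return sum;
    }
}
```

Both run in O(n) time, but only the second needs memory proportional to the input size, which is exactly what the space-complexity function measures.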
-
Amortized Analysis
Amortized analysis is a powerful technique used to analyze the time complexity of algorithms, especially when an occasional expensive operation is offset by many cheap ones. Let’s explore this concept step-by-step with a concrete example. Understanding the Problem Suppose we have a fully filled array of elements, and we want to insert one more element.…
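The insert-into-a-full-array scenario is the textbook example: a dynamic array that doubles its capacity when full. A minimal sketch (names are illustrative):

```java
// Illustrative dynamic array: occasional O(n) resize, amortized O(1) append.
public class DynamicArray {
    private int[] data = new int[1];
    private int size = 0;

    public void add(int x) {
        if (size == data.length) {                 // rare expensive step: copy everything
            int[] bigger = new int[data.length * 2];
            System.arraycopy(data, 0, bigger, 0, size);
            data = bigger;
        }
        data[size++] = x;                          // usual cheap step: O(1)
    }

    public int get(int i) { return data[i]; }
    public int size() { return size; }
    public int capacity() { return data.length; }
}
```

Doubling means n appends trigger copies of 1 + 2 + 4 + … + n/2 < n elements in total, so the average (amortized) cost per append is O(1) even though a single append can cost O(n).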
-
Understanding Recursive Function Complexity in Algorithms
In this section, we’ll explore how to analyze the complexity of recursive functions, a topic often perceived as challenging, but highly important in computer science. To determine the complexity of a recursive function, we follow these general steps: Example 1: A Simple Recursive Chain Consider the following function: Here’s how to evaluate its complexity: As…
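The article's Example 1 is truncated here, but a simple recursive chain of the kind it names might look like this (the function is a hypothetical illustration, not the article's own):

```java
// Illustrative recursive chain: one call per level, constant work per call.
public class RecursiveChain {
    // f(n) makes a single recursive call on n - 1 and does O(1) work per call,
    // so there are n levels: O(n) time, and O(n) space for the call stack.
    public static int f(int n) {
        if (n <= 0) return 0;   // base case: recursion bottoms out
        return 1 + f(n - 1);    // one frame per level of the chain
    }
}
```

The general recipe is to count the number of recursive calls and multiply by the work done in each; here that is n calls × O(1) work = O(n).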
