// THE COMPLETE REFERENCE

Data Structures
& Algorithms

Big O complexity · Core data structures · Every LeetCode pattern you need to know

7 Complexities · 10 Data Structures · 18 LC Patterns

Big O Notation

A mathematical notation describing how runtime or space requirements grow relative to input size as n → ∞

The Core Idea

We drop constants and lower-order terms. O(2n + 5) → O(n). We care about growth rate, not exact counts. Best case uses Ω (Omega), average case uses Θ (Theta), worst case uses O (Big O). In interviews, always default to worst case unless asked otherwise.

O(1)
Constant Time

Runtime does not change with input size: the operation takes the same number of steps no matter how large n is.

// Array index access
arr[5] → always 1 step
HashMap.get(key) → always ~1 step
O(log n)
Logarithmic Time

Input is halved at each step. Incredibly efficient — doubling n only adds one more operation.

// Binary search
// Balanced BST operations
// Divide and conquer (no merge)
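To see why doubling n adds only one more operation, count the halvings. A minimal sketch (the function name is illustrative):

```python
def halving_steps(n):
    """How many times n can be halved before hitting 1 -- about log2(n)."""
    steps = 0
    while n > 1:
        n //= 2
        steps += 1
    return steps

print(halving_steps(1024))  # 10
print(halving_steps(2048))  # 11 -- doubling n added just one step
```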
O(n)
Linear Time

Runtime scales directly with input size. Every element must be visited at least once.

// Single loop over array
// Linear search
// Building a hash map from array
O(n log n)
Linearithmic Time

The sweet spot for comparison-based sorting. Each of n elements takes O(log n) work.

// Merge sort, Heap sort
// Quicksort (avg case)
// Divide & conquer with merge
O(n²)
Quadratic Time

Nested loops over the same data. Doubling n means 4× the work. Avoid for n > 10,000.

// Bubble sort, Insertion sort
// Brute-force pair search
// Comparing all pairs in array
O(2ⁿ)
Exponential Time

Work doubles with each additional input element. Each element has 2 choices: include or exclude.

// Recursive Fibonacci (naive)
// Power set generation
// Traveling salesman (brute)
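The naive recursive Fibonacci listed above, as a runnable sketch: each call branches into two, giving roughly 2ⁿ calls for input n.

```python
def fib_naive(n):
    """Exponential time: two recursive calls per level, no caching."""
    if n < 2:
        return n
    return fib_naive(n - 1) + fib_naive(n - 2)

print(fib_naive(10))  # 55
```

Memoizing the results collapses this to O(n), which is why it is the standard warm-up for dynamic programming.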
O(n!)
Factorial Time

Generating all permutations. Completely impractical beyond n=12. The worst growth rate.

// All permutations
// Brute force TSP
// Bogosort

Growth Order (slowest → fastest)

O(1) < O(log n) < O(√n) < O(n) < O(n log n) < O(n²) < O(n³) < O(2ⁿ) < O(n!)

Core Data Structures

The building blocks. Know these deeply — every algorithm lives on top of one of these.

🗂

Array

Contiguous block of memory storing elements of the same type. O(1) access by index, but insertion/deletion mid-array is costly due to shifting.

Operation Avg Worst
Access O(1) O(1)
Search O(n) O(n)
Insert (end) O(1) O(n)
Insert (mid) O(n) O(n)
Delete O(n) O(n)
Two Pointers · Sliding Window · Prefix Sum · Kadane's
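Kadane's algorithm, named in the tags above, finds the maximum subarray sum in one O(n) pass over an array. A minimal sketch:

```python
def max_subarray(nums):
    """At each element: extend the running sum or restart from here.

    O(n) time, O(1) space; assumes nums is non-empty.
    """
    best = cur = nums[0]
    for x in nums[1:]:
        cur = max(x, cur + x)   # restart if the running sum is a liability
        best = max(best, cur)
    return best

print(max_subarray([-2, 1, -3, 4, -1, 2, 1, -5, 4]))  # 6
```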
🗃

Hash Map / Hash Set

Key-value store using a hash function to compute array index. O(1) average for all operations. Handle collisions via chaining or open addressing.

Operation Avg Worst
Access O(1) O(n)
Search O(1) O(n)
Insert O(1) O(n)
Delete O(1) O(n)
Frequency Count · Anagram · Two Sum · Memoization
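A quick illustration of the Frequency Count and Anagram tags above, using Python's Counter (one hash map of character counts per string):

```python
from collections import Counter

def is_anagram(s, t):
    """Two strings are anagrams iff their character-frequency maps match.

    O(n) time versus O(n log n) for the sort-and-compare approach.
    """
    return Counter(s) == Counter(t)

print(is_anagram("listen", "silent"))  # True
```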
🔗

Linked List

Nodes linked by pointers. No contiguous memory. O(1) insert/delete at known position, O(n) search. Doubly-linked lists allow O(1) deletion given node reference.

head → 1 → 2 → 3 → null
Operation Avg Worst
Access O(n) O(n)
Search O(n) O(n)
Insert (head) O(1) O(1)
Delete (known) O(1) O(1)
Fast & Slow Pointer · Reverse · LRU Cache
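In-place reversal, one of the tagged patterns, as a minimal sketch. ListNode here is an illustrative node class, not a standard library type:

```python
class ListNode:
    def __init__(self, val, nxt=None):
        self.val, self.next = val, nxt

def reverse(head):
    """Iteratively reverse the list: O(n) time, O(1) extra space."""
    prev = None
    while head:
        # flip one pointer per step; tuple assignment avoids a temp variable
        head.next, prev, head = prev, head, head.next
    return prev

def to_list(node):
    """Collect values into a Python list for easy checking."""
    out = []
    while node:
        out.append(node.val)
        node = node.next
    return out

print(to_list(reverse(ListNode(1, ListNode(2, ListNode(3))))))  # [3, 2, 1]
```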
📚

Stack

LIFO — Last In, First Out. Only access the top element. Implemented with array or linked list. Core to DFS, backtracking, expression parsing.

Operation Avg Worst
Push O(1) O(1)
Pop O(1) O(1)
Peek/Top O(1) O(1)
Search O(n) O(n)
DFS · Parentheses · Monotonic Stack · Backtracking
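The Parentheses tag above is the canonical LIFO exercise: every closing bracket must match the most recently opened one. A minimal sketch:

```python
def valid_parens(s):
    """Push openers; each closer must match the top of the stack."""
    pairs = {')': '(', ']': '[', '}': '{'}
    stack = []
    for ch in s:
        if ch in pairs:
            if not stack or stack.pop() != pairs[ch]:
                return False
        else:
            stack.append(ch)
    return not stack  # leftover openers mean an unclosed bracket

print(valid_parens("()[]{}"))  # True
print(valid_parens("([)]"))    # False
```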
🎫

Queue / Deque

FIFO — First In, First Out. A deque (double-ended queue) allows O(1) push/pop from both ends. Used in BFS, sliding window maximum.

Operation Avg Worst
Enqueue O(1) O(1)
Dequeue O(1) O(1)
Peek O(1) O(1)
Search O(n) O(n)
BFS · Sliding Window Max · Level Order

Heap (Priority Queue)

Complete binary tree satisfying heap property. Min-heap: parent ≤ children. O(log n) insert/delete, O(1) peek at min/max. Built on array — children of i are at 2i+1 and 2i+2.

Operation Avg Worst
Peek Min/Max O(1) O(1)
Insert O(log n) O(log n)
Delete Min/Max O(log n) O(log n)
Heapify O(n) O(n)
Top K · Dijkstra · Merge K Lists · Median
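A sketch using Python's heapq (a binary min-heap stored in a plain list), with an assertion checking the 2i+1 / 2i+2 array layout described above. The function name is illustrative:

```python
import heapq

def top_k_smallest(nums, k):
    """Heapify in O(n), then pop k times at O(log n) each."""
    h = list(nums)
    heapq.heapify(h)  # bottom-up heapify: O(n)
    # heap property on the array: parent h[i] <= children h[2i+1], h[2i+2]
    assert all(h[i] <= h[c] for i in range(len(h))
               for c in (2 * i + 1, 2 * i + 2) if c < len(h))
    return [heapq.heappop(h) for _ in range(k)]

print(top_k_smallest([5, 1, 8, 3, 2], 3))  # [1, 2, 3]
```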
🌳

Binary Tree / BST

BST: left < parent < right. Enables O(log n) search/insert on balanced trees. Unbalanced degrades to O(n). DFS traversals: inorder, preorder, postorder.

Operation Avg (balanced) Worst
Access O(log n) O(n)
Search O(log n) O(n)
Insert O(log n) O(n)
Delete O(log n) O(n)
DFS · BFS · LCA · Path Sum · Serialize
🕸

Graph

Set of vertices (nodes) connected by edges. Directed or undirected, weighted or unweighted. Represented as adjacency list O(V+E) or adjacency matrix O(V²).

Operation Adj. List Adj. Matrix
Add vertex O(1) O(V²)
Add edge O(1) O(1)
DFS/BFS O(V+E) O(V²)
Check edge O(V) O(1)
DFS/BFS · Topological Sort · Union-Find · Dijkstra
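A common adjacency-list construction from an edge list, sketched with a defaultdict (the function name is illustrative):

```python
from collections import defaultdict

def build_adj(edges, directed=False):
    """Adjacency list: O(V + E) space, O(1) per edge insertion."""
    adj = defaultdict(list)
    for u, v in edges:
        adj[u].append(v)
        if not directed:
            adj[v].append(u)  # undirected: store the edge both ways
    return adj

print(build_adj([(0, 1), (1, 2)])[1])  # [0, 2]
```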
🔤

Trie (Prefix Tree)

Tree where each node represents a character. Enables O(L) search/insert where L = word length. Perfect for prefix searches, autocomplete, word dictionaries.

class TrieNode:
    def __init__(self):
        self.children = {}
        self.is_end = False
Autocomplete · Word Search II · Prefix Match
🔀

Union-Find (DSU)

Disjoint Set Union — tracks connected components. With path compression + union by rank, both union and find run in O(α(n)) ≈ O(1) amortized.

def find(parent, x):
    if parent[x] != x:
        parent[x] = find(parent, parent[x])  # path compression
    return parent[x]

def union(parent, rank, x, y):
    px, py = find(parent, x), find(parent, y)
    if px == py:
        return
    if rank[px] < rank[py]:
        px, py = py, px
    parent[py] = px  # attach the shorter tree under the taller one
    if rank[px] == rank[py]:
        rank[px] += 1
Number of Islands · MST (Kruskal) · Redundant Connection

Algorithm Complexity Table

Quick-reference for time and space complexity of the most common algorithms

Algorithm Best Average Worst Space
Bubble Sort O(n) O(n²) O(n²) O(1)
Selection Sort O(n²) O(n²) O(n²) O(1)
Insertion Sort O(n) O(n²) O(n²) O(1)
Merge Sort O(n log n) O(n log n) O(n log n) O(n)
Quick Sort O(n log n) O(n log n) O(n²) O(log n)
Heap Sort O(n log n) O(n log n) O(n log n) O(1)
Counting Sort O(n+k) O(n+k) O(n+k) O(k)
Radix Sort O(nk) O(nk) O(nk) O(n+k)
Linear Search O(1) O(n) O(n) O(1)
Binary Search O(1) O(log n) O(log n) O(1)
BFS O(1) O(V+E) O(V+E) O(V)
DFS O(1) O(V+E) O(V+E) O(V)
Dijkstra O((V+E) log V) O((V+E) log V) O((V+E) log V) O(V)
Bellman-Ford O(VE) O(VE) O(VE) O(V)
Floyd-Warshall O(V³) O(V³) O(V³) O(V²)
Kruskal's MST O(E log E) O(E log E) O(E log E) O(V)

Every LC Pattern

Master these 18 patterns and you can solve 90%+ of LeetCode problems. Recognize the pattern → apply the template.

01 Two Pointers EASY–MED

Two indices traversing the data, often from opposite ends or at different speeds. Reduces O(n²) brute force to O(n).

USE WHEN: Sorted array/string, searching for pairs, palindrome check, removing duplicates in-place.
l, r = 0, len(arr) - 1
while l < r:
    if arr[l] + arr[r] == target:
        return [l, r]
    elif arr[l] + arr[r] < target:
        l += 1
    else:
        r -= 1
Examples
Two Sum II · 3Sum · Valid Palindrome · Container With Most Water
02 Sliding Window MEDIUM

Maintain a window of elements that expands/shrinks as you iterate. Converts O(n²) nested loops into a single O(n) pass.

USE WHEN: Contiguous subarray/substring, find max/min/longest/shortest window satisfying a condition.
l, res = 0, 0
for r in range(len(s)):
    # expand: add s[r] to the window
    while window_invalid():      # placeholder: the problem's constraint
        # shrink: remove s[l] from the window
        l += 1
    res = max(res, r - l + 1)
Examples
Longest Substring Without Repeating · Min Window Substring · Max Subarray of Size K
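Applied to the first example, longest substring without repeating characters, the window invariant is "no duplicate inside [l, r]". A minimal sketch:

```python
def longest_unique_substring(s):
    """Single O(n) pass; the window [l, r] never holds a repeated character."""
    last = {}       # char -> index of its most recent occurrence
    l = best = 0
    for r, ch in enumerate(s):
        if ch in last and last[ch] >= l:
            l = last[ch] + 1    # jump l past the previous occurrence
        last[ch] = r
        best = max(best, r - l + 1)
    return best

print(longest_unique_substring("abcabcbb"))  # 3 ("abc")
```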
03 Binary Search MEDIUM

Halve the search space each iteration. Applies not just to sorted arrays but to any monotonic search space — including on the answer itself.

USE WHEN: Sorted array, rotated sorted array, "find minimum/maximum such that condition holds," search on answer space.
lo, hi = 0, len(arr) - 1
while lo <= hi:
    mid = (lo + hi) // 2
    if arr[mid] == target:
        return mid
    elif arr[mid] < target:
        lo = mid + 1
    else:
        hi = mid - 1
Examples
Search in Rotated Array · Find Peak Element · Koko Eating Bananas · Median of Two Arrays
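The "search on answer space" variant (Koko Eating Bananas style) binary-searches over candidate answers rather than array indices, exploiting that the feasibility check is monotonic. A sketch:

```python
import math

def min_eating_speed(piles, h):
    """Smallest speed k with total hours(k) <= h; hours(k) only shrinks as k grows."""
    lo, hi = 1, max(piles)
    while lo < hi:
        mid = (lo + hi) // 2
        hours = sum(math.ceil(p / mid) for p in piles)
        if hours <= h:
            hi = mid         # mid is feasible: try a slower speed
        else:
            lo = mid + 1     # too slow: must speed up
    return lo

print(min_eating_speed([3, 6, 7, 11], 8))  # 4
```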
04 Fast & Slow Pointers MEDIUM

Floyd's cycle detection. Two pointers moving at different speeds — if there's a cycle, fast will eventually catch slow. Also finds middle of linked list in O(n).

USE WHEN: Linked list cycle detection, finding middle, linked list problems, "happy number"-style problems.
slow, fast = head, head
while fast and fast.next:
    slow = slow.next
    fast = fast.next.next
    if slow == fast:
        return True  # cycle
Examples
Linked List Cycle · Middle of Linked List · Find Duplicate Number
05 Prefix Sum EASY–MED

Precompute cumulative sums so that any range sum [i, j] can be answered in O(1): prefix[j+1] - prefix[i]. Classic tradeoff of O(n) precompute for O(1) queries.

USE WHEN: Range sum queries, subarray sum equals K, 2D matrix queries, "how many subarrays sum to target?"
prefix = [0] * (len(nums) + 1)
for i, n in enumerate(nums):
    prefix[i+1] = prefix[i] + n
# range sum [l, r]: s = prefix[r+1] - prefix[l]
Examples
Subarray Sum Equals K · Range Sum Query · Product Except Self
06 Depth-First Search (DFS) MEDIUM

Explore as deep as possible along each branch before backtracking. Implemented recursively (call stack) or iteratively (explicit stack). Key for tree traversal and graph exploration.

USE WHEN: Tree traversal, graph connectivity, pathfinding, generate all combinations/subsets, detect cycles.
def dfs(node, visited):
    if not node or node in visited:
        return
    visited.add(node)
    for neighbor in graph[node]:
        dfs(neighbor, visited)
Examples
Number of Islands · Max Area of Island · Path Sum · Clone Graph
07 Breadth-First Search (BFS) MEDIUM

Explore all neighbors at current depth before going deeper. Uses a queue. Guarantees shortest path in unweighted graphs. Level-order traversal of trees.

USE WHEN: Shortest path (unweighted), level-order tree traversal, "minimum steps to reach target," spreading processes.
from collections import deque

q = deque([start])
visited = {start}
while q:
    node = q.popleft()
    for nei in graph[node]:
        if nei not in visited:
            visited.add(nei)
            q.append(nei)
Examples
Binary Tree Level Order · Word Ladder · Rotting Oranges · Shortest Path
08 Backtracking HARD

Systematically build candidates and abandon ("backtrack") a candidate as soon as it can't lead to a valid solution. Prune the search space early.

USE WHEN: Generate all valid combinations, permutations, subsets, constraint satisfaction (Sudoku, N-Queens).
def backtrack(path, choices):
    if is_solution(path):
        results.append(path[:])
        return
    for choice in choices:
        path.append(choice)
        backtrack(path, remaining)  # choices minus `choice`
        path.pop()  # undo choice
Examples
Permutations · Subsets · Combination Sum · N-Queens · Sudoku Solver
09 Dynamic Programming HARD

Break problem into overlapping subproblems; cache results (memoization = top-down, tabulation = bottom-up). Optimal substructure + overlapping subproblems = DP.

USE WHEN: Optimization (min/max), counting ways, decision making with overlapping subproblems, strings/sequences, knapsack variants.
# 1D DP: Fibonacci (assumes n >= 1)
dp = [0] * (n + 1)
dp[1] = 1
for i in range(2, n + 1):
    dp[i] = dp[i-1] + dp[i-2]
Examples
Climbing Stairs · Coin Change · LCS · Knapsack · Edit Distance · Word Break
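Coin Change, from the examples above, as a bottom-up 1D table. A minimal sketch:

```python
def coin_change(coins, amount):
    """dp[a] = fewest coins summing to a, built up from dp[0] = 0.

    O(amount * len(coins)) time, O(amount) space.
    """
    INF = float('inf')
    dp = [0] + [INF] * amount
    for a in range(1, amount + 1):
        for c in coins:
            if c <= a:
                dp[a] = min(dp[a], dp[a - c] + 1)
    return dp[amount] if dp[amount] != INF else -1

print(coin_change([1, 2, 5], 11))  # 3 (5 + 5 + 1)
```

Note that greedy fails here for non-standard denominations (e.g. coins [1, 3, 4], amount 6), which is exactly why this is a DP problem.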
10 Monotonic Stack MEDIUM

A stack that maintains a monotonically increasing or decreasing order. Pop elements that violate the order — those pops reveal the "next greater/smaller" element.

USE WHEN: Next greater/smaller element, stock span, largest rectangle in histogram, trapping rain water.
stack, result = [], [-1] * len(nums)
for i, n in enumerate(nums):
    while stack and nums[stack[-1]] < n:
        result[stack.pop()] = n
    stack.append(i)
Examples
Next Greater Element · Daily Temperatures · Largest Rectangle · Trapping Rain Water
11 Heap / Top K Elements MEDIUM

Use a min-heap of size K to find the K largest elements in O(n log k). Avoids sorting the entire array. Also used for merging K sorted lists.

USE WHEN: "Find K largest/smallest," merge K sorted lists, K closest points, median of data stream.
import heapq

# K largest elements: keep a min-heap of size k
heap = []
for n in nums:
    heapq.heappush(heap, n)
    if len(heap) > k:
        heapq.heappop(heap)
Examples
Kth Largest Element · K Closest Points · Merge K Sorted Lists · Find Median from Stream
12 Merge Intervals MEDIUM

Sort intervals by start time. Merge overlapping ones. After sorting (so that a ≤ c), two intervals [a,b] and [c,d] overlap when c ≤ b. Works for scheduling, meeting rooms, calendar conflicts.

USE WHEN: Overlapping intervals, merge meetings, insert interval, minimum number of meeting rooms.
intervals.sort(key=lambda x: x[0])
merged = [intervals[0]]
for s, e in intervals[1:]:
    if s <= merged[-1][1]:
        merged[-1][1] = max(merged[-1][1], e)
    else:
        merged.append([s, e])
Examples
Merge Intervals · Insert Interval · Meeting Rooms II · Non-overlapping Intervals
13 Topological Sort MEDIUM

Linear ordering of vertices in a DAG such that for every edge u→v, u comes before v. Kahn's algorithm (BFS, in-degree) or DFS with a visited stack.

USE WHEN: Course scheduling, build order, dependency resolution, detect cycle in directed graph.
# Kahn's algorithm (BFS, in-degree)
q = deque([v for v in graph if indegree[v] == 0])
order = []
while q:
    v = q.popleft()
    order.append(v)
    for u in graph[v]:
        indegree[u] -= 1
        if indegree[u] == 0:
            q.append(u)
Examples
Course Schedule · Alien Dictionary · Find Order
14 Tree Traversals & BST MEDIUM

Know all four traversals cold: inorder (sorted for BST), preorder (serialize), postorder (delete), level-order (BFS). BST: validate, kth smallest, LCA.

USE WHEN: Any tree problem. Inorder gives sorted sequence. LCA via recursion. Serialize/deserialize via preorder.
# Inorder: left → root → right
def inorder(node):
    if not node:
        return
    inorder(node.left)
    result.append(node.val)
    inorder(node.right)
Examples
Kth Smallest in BST · Validate BST · LCA of BST · Invert Binary Tree
15 Greedy MEDIUM

Make the locally optimal choice at each step, hoping it leads to the global optimum. Faster than DP when it works, but requires proving correctness (exchange argument).

USE WHEN: Interval scheduling, jump game, gas station, minimum coins (with standard denominations), activity selection.
# Jump Game II: min jumps
jumps = cur_end = farthest = 0
for i in range(len(nums) - 1):
    farthest = max(farthest, i + nums[i])
    if i == cur_end:
        jumps += 1
        cur_end = farthest
Examples
Jump Game · Gas Station · Task Scheduler · Partition Labels
16 Union-Find Pattern MEDIUM

Efficiently track connected components. With path compression and union by rank, find/union are nearly O(1). Ideal for dynamic connectivity problems.

USE WHEN: Connected components, cycle detection in undirected graph, minimum spanning tree, grouping problems.
parent = list(range(n))

def find(x):
    if parent[x] != x:
        parent[x] = find(parent[x])  # path compression
    return parent[x]

def union(x, y):
    parent[find(x)] = find(y)
Examples
Number of Islands (UF) · Redundant Connection · Accounts Merge · Graph Valid Tree
17 Trie Pattern MEDIUM

Insert words into a trie and exploit prefix sharing for fast lookups. Each path from root to a marked node spells a word. Use for searching all words with a given prefix.

USE WHEN: Multiple prefix queries, autocomplete, word search in board, longest common prefix, dictionary-based DFS.
def search_prefix(self, prefix):
    node = self.root
    for ch in prefix:
        if ch not in node.children:
            return None
        node = node.children[ch]
    return node
Examples
Implement Trie · Word Search II · Design Add and Search Words
18 Bit Manipulation MED–HARD

Direct operations on binary representations. XOR tricks, bit shifting, masks. Often O(1) solutions to problems that seem to require O(n) or more.

USE WHEN: Find single number, count set bits, power of two, subset generation (bitmask DP), swap without temp variable.
# Key operations
x & (x - 1)   # clear lowest set bit
x & 1         # check if odd
x ^ x         # = 0 (XOR with self)
x ^ 0         # = x (XOR with 0)
n & -n        # isolate lowest set bit
1 << k        # 2^k
Examples
Single Number · Number of 1 Bits · Counting Bits · Missing Number · Subsets
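The XOR-with-self identity solves Single Number directly: pairs cancel, leaving the lone element. A sketch:

```python
def single_number(nums):
    """Every element appears twice except one; x ^ x == 0, so pairs cancel.

    O(n) time, O(1) space -- no hash map needed.
    """
    res = 0
    for x in nums:
        res ^= x
    return res

print(single_number([4, 1, 2, 1, 2]))  # 4
```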