Big O notation
A short reference to what Big O is, how time and space are expressed, and how common variables like n and h are used—with examples from the problems in this app.
What is Big O?
Big O describes how the time or space an algorithm needs grows as the input size grows. We ignore constant factors and lower-order terms and focus on the dominant term. For example, if something does 3n² + 100n + 2 steps, we say it is O(n²).
It answers the question: “If I double the input, roughly how much more work or memory do I need?” O(n) means roughly linear (double input → double cost); O(n²) means doubling the input roughly quadruples the cost.
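A minimal TypeScript sketch of the difference (function names are illustrative): one pass over the data is O(n), while a nested scan over the same data is O(n²).

```typescript
// O(n): a constant amount of work per element.
function sumAll(nums: number[]): number {
  let total = 0;
  for (const x of nums) total += x; // n iterations
  return total;
}

// O(n²): for each element, scan the remaining elements again.
function countPairsWithSum(nums: number[], target: number): number {
  let count = 0;
  for (let i = 0; i < nums.length; i++) {
    for (let j = i + 1; j < nums.length; j++) {
      if (nums[i] + nums[j] === target) count++; // up to n(n-1)/2 checks
    }
  }
  return count;
}
```

Doubling the array doubles the work in sumAll but roughly quadruples the number of pair checks in countPairsWithSum.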
Time vs space
- Time complexity — How many steps (operations) the algorithm does as a function of input size. We count comparisons, loops over data, recursive calls, etc.
- Space complexity — How much extra memory the algorithm uses beyond the input: stack depth for recursion, allocated arrays/maps, and output size.
You can have fast-but-heavy algorithms (e.g. O(n) time, O(n) space) or slower-but-light ones (e.g. O(n²) time, O(1) space). Stating both helps compare solutions.
How n is used
n usually means the “main” size of the input: total number of elements you process.
- Strings: n = length of the string (number of characters).
- Arrays / lists: n = number of elements.
- Trees: n = total number of nodes (every node visited once → O(n) time).
So “O(n) time” means we do a constant amount of work per element; “O(n²)” often means nested loops over the same n elements or repeated scans.
How h is used
h usually means the height of a tree (maximum depth from root to a leaf).
- In a balanced tree, h ≈ log(n), so O(h) is the same idea as O(log n) for space (recursion stack) or for operations that depend on depth.
- In a degenerate tree (e.g. a linked list), h = n, so O(h) space for recursion can be O(n).
We use h when the cost is tied to how deep we go (e.g. stack depth in a tree traversal), not to how many nodes exist in total.
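As a sketch, computing a tree's height shows why recursion costs O(h) space: each recursive call adds one stack frame per level, not per node (the TreeNode shape here is illustrative).

```typescript
type TreeNode = { value: number; children: TreeNode[] };

// Time: O(n), every node is visited once.
// Space: O(h), the recursion stack is as deep as the tree is tall.
function height(node: TreeNode): number {
  if (node.children.length === 0) return 1; // a leaf has height 1
  return 1 + Math.max(...node.children.map(height));
}
```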
Complexity prompts with answers
Practice problems with time/space analysis. Each prompt has an answer and short reasoning.
Prompt 1
Problem: You're given an array of user events sorted by timestamp. You scan once and build a map of { userId → lastSeenTimestamp }.
Answer: Time: O(n), where n is the number of events · Space: O(u), where u is the number of unique userIds
Reasoning: You scan the array once and store one entry per unique user.
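A sketch of that scan (the event shape is assumed, not the app's actual type). Because the events are sorted by timestamp, the last write for each user wins.

```typescript
type UserEvent = { userId: string; timestamp: number };

// Time: O(n), one pass. Space: O(u), one entry per unique userId.
function lastSeen(events: UserEvent[]): Map<string, number> {
  const seen = new Map<string, number>();
  for (const e of events) seen.set(e.userId, e.timestamp); // later events overwrite earlier ones
  return seen;
}
```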
Prompt 2
Problem: You're given a deeply nested JSON config. You recursively walk it and collect all keys that match a predicate.
Answer: Time: O(n), where n is the total number of keys/nodes · Space: O(h) + O(k) — h = maximum nesting depth (recursion stack), k = number of matching keys returned
Reasoning: Each key is visited once. Space includes both traversal depth and output size.
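One possible shape of that walk (collectKeys is a hypothetical helper; arrays are treated as objects keyed by index). A shared accumulator keeps the traversal O(n) overall.

```typescript
// Time: O(n) over all keys. Space: O(h) recursion stack plus O(k) matches.
function collectKeys(value: unknown, matches: (key: string) => boolean, found: string[] = []): string[] {
  if (value !== null && typeof value === "object") {
    for (const [key, child] of Object.entries(value)) {
      if (matches(key)) found.push(key);
      collectKeys(child, matches, found); // recurse, sharing one output array
    }
  }
  return found;
}
```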
Prompt 3
Problem: Given two strings s and p, return true if s contains any permutation (anagram) of p.
Answer (optimized sliding window): Time: O(n), where n = length of s · Space: O(k), where k = number of distinct characters in p
Reasoning: Each character enters and exits the window once. Frequency map size depends on p.
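A sketch of the optimized window (names are illustrative): a fixed-size window of length |p| slides over s, and a counter tracks how many required characters are still missing.

```typescript
// Time: O(n), each character of s enters and leaves the window once.
// Space: O(k), counts for the distinct characters of p.
function containsPermutation(s: string, p: string): boolean {
  if (p.length === 0 || p.length > s.length) return false;
  const need = new Map<string, number>();
  for (const c of p) need.set(c, (need.get(c) ?? 0) + 1);
  let missing = p.length;
  for (let right = 0; right < s.length; right++) {
    const rc = s[right];
    if (need.has(rc)) {
      if (need.get(rc)! > 0) missing--;   // this occurrence was still needed
      need.set(rc, need.get(rc)! - 1);    // negative counts mean surplus
    }
    const left = right - p.length;        // index sliding out of the window
    if (left >= 0) {
      const lc = s[left];
      if (need.has(lc)) {
        if (need.get(lc)! >= 0) missing++; // dropped a required character
        need.set(lc, need.get(lc)! + 1);
      }
    }
    if (missing === 0) return true;        // window is a permutation of p
  }
  return false;
}
```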
Prompt 4
Problem: You normalize text by lowercasing and stripping punctuation before matching.
Answer: Time: O(T), where T = total number of characters processed · Space: O(T) if producing new strings, otherwise O(1) auxiliary
Reasoning: Each character is processed a constant number of times.
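For example (a hypothetical normalizer; the exact punctuation rule is an assumption):

```typescript
// Time: O(T), each character handled a constant number of times.
// Space: O(T) for the new string.
function normalize(text: string): string {
  // Keep letters, digits, and whitespace; strip everything else.
  return text.toLowerCase().replace(/[^\p{L}\p{N}\s]/gu, "");
}
```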
Prompt 5
Problem: You flatten a tree (e.g., component tree or folder structure) using DFS.
Answer: Time: O(n), where n = number of nodes · Space: O(n) + O(h) — O(n) for the flattened output, O(h) for recursion stack or explicit stack, h = height of the tree
Reasoning: Each node is visited once. Stack depth depends on tree height.
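A recursive DFS sketch (the node shape is illustrative):

```typescript
type TreeItem = { id: string; children: TreeItem[] };

// Time: O(n), each node visited once. Space: O(n) output plus O(h) stack.
function flatten(node: TreeItem, out: TreeItem[] = []): TreeItem[] {
  out.push(node); // pre-order: parent before children
  for (const child of node.children) flatten(child, out);
  return out;
}
```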
Prompt 6
Problem: You scan an array once using two pointers that only move forward.
Answer: Time: O(n) · Space: O(1) or O(k), depending on auxiliary data structures
Reasoning: Each pointer advances at most n times; no backtracking.
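One common instance of this pattern is a subsequence check (a sketch, not tied to a specific problem in this app): both pointers only move forward, so the total work is bounded by the two lengths.

```typescript
// Time: O(n + m), both pointers only advance. Space: O(1).
function isSubsequence(sub: string, full: string): boolean {
  let i = 0; // pointer into sub
  for (let j = 0; j < full.length && i < sub.length; j++) {
    if (sub[i] === full[j]) i++; // match: advance the slow pointer
  }
  return i === sub.length;
}
```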
Prompt 7
Problem: You merge overlapping intervals after sorting them by start time.
Answer: Time: O(n log n) · Space: O(n)
Reasoning: Sorting dominates. Merge pass is linear.
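A sketch of the merge pass (sort first, then either extend the last interval or start a new one):

```typescript
// Time: O(n log n) for the sort; the merge pass is O(n). Space: O(n) output.
function mergeIntervals(intervals: [number, number][]): [number, number][] {
  const sorted = [...intervals].sort((a, b) => a[0] - b[0]);
  const merged: [number, number][] = [];
  for (const [start, end] of sorted) {
    const last = merged[merged.length - 1];
    if (last && start <= last[1]) last[1] = Math.max(last[1], end); // overlap: extend
    else merged.push([start, end]);                                  // gap: start fresh
  }
  return merged;
}
```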
Prompt 8
Problem: You tokenize a string into words and run a sliding window over tokens.
Answer: Time: O(T) · Space: O(W + R) — T = total characters, W = number of tokens, R = number of required tokens
Reasoning: Tokenization and window traversal are both linear in input size.
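A sketch under the stated assumptions (the helper and the window's goal, counting required tokens in a fixed-size window, are illustrative):

```typescript
// Time: O(T) to tokenize plus O(W) for the window pass. Space: O(W) tokens.
function maxRequiredInWindow(text: string, required: Set<string>, windowSize: number): number {
  const tokens = text.toLowerCase().split(/\s+/).filter(t => t.length > 0); // O(T)
  let inWindow = 0;
  let best = 0;
  for (let i = 0; i < tokens.length; i++) {
    if (required.has(tokens[i])) inWindow++;                                 // token entering
    if (i >= windowSize && required.has(tokens[i - windowSize])) inWindow--; // token leaving
    best = Math.max(best, inWindow);
  }
  return best;
}
```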
Prompt 9
Problem: You recursively traverse a tree and stop early when a condition is met.
Answer: Time: O(n) worst case · Space: O(h)
Reasoning: Worst case still visits all nodes; recursion depth depends on height.
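A sketch of an early-exit DFS (names are illustrative): the traversal returns as soon as the predicate matches, but an absent target still forces a full O(n) visit.

```typescript
type TNode = { value: number; children: TNode[] };

// Time: O(n) worst case (no match visits every node). Space: O(h) stack.
function findFirst(node: TNode, predicate: (v: number) => boolean): TNode | null {
  if (predicate(node.value)) return node; // early exit on a match
  for (const child of node.children) {
    const hit = findFirst(child, predicate);
    if (hit) return hit;                   // stop unwinding as soon as found
  }
  return null;
}
```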
Prompt 10
Problem: You build a frequency map from an array of items.
Answer: Time: O(n) · Space: O(u), where u = number of unique items
Reasoning: One pass through the array; map grows with unique keys.
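This is the classic counting pass; a generic sketch:

```typescript
// Time: O(n), one pass. Space: O(u), one counter per unique item.
function frequencies<T>(items: T[]): Map<T, number> {
  const counts = new Map<T, number>();
  for (const item of items) counts.set(item, (counts.get(item) ?? 0) + 1);
  return counts;
}
```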
Examples from the problems
Below, each solution is linked to its problem page so you can see the code and visualization. The time/space and short “why” match how we analyze them in this app.
Minimum Window Substring (Characters)
Given a string and a set of required characters (including duplicates), find the smallest substring that contains all required characters. If no such substring exists, return an empty string.
Brute-force sliding window
Time: O(n² · m) to O(n³) Space: O(m)
Re-scans the current window for every move (testSegment + countCharacters). Many overlapping windows are re-checked from scratch, leading to cubic behavior in the worst case.
Single-pass sliding window
Time: O(n) Space: O(m)
Each character is processed at most twice (once by the right pointer, once by the left). No re-scanning; optimal for this problem.
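A sketch of the single-pass window (this is the standard minimum-window technique and may differ in details from the app's code): the right pointer expands until the window covers all required characters, then the left pointer contracts while recording the best window.

```typescript
// Time: O(n), the right and left pointers each traverse s once.
// Space: O(m), counts for the distinct required characters.
function minWindow(s: string, required: string): string {
  const need = new Map<string, number>();
  for (const c of required) need.set(c, (need.get(c) ?? 0) + 1);
  let missing = required.length; // required characters not yet in the window
  let best = "";
  let left = 0;
  for (let right = 0; right < s.length; right++) {
    const rc = s[right];
    if (need.has(rc)) {
      if (need.get(rc)! > 0) missing--; // this occurrence was still needed
      need.set(rc, need.get(rc)! - 1);  // negative counts mean surplus
    }
    while (missing === 0) { // window covers everything: try to shrink
      if (best === "" || right - left + 1 < best.length) best = s.slice(left, right + 1);
      const lc = s[left++];
      if (need.has(lc)) {
        need.set(lc, need.get(lc)! + 1);
        if (need.get(lc)! > 0) missing++; // dropped a required character
      }
    }
  }
  return best;
}
```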
Component Tree Search
Given a nested component tree structure, return all components whose type matches a target value. The traversal must preserve top-to-bottom, left-to-right order.
Flatten with reduce + spread
Time: O(n²) Space: O(n)
Flattens the whole tree, then runs a separate O(n) filter. Spreading the accumulator and each subtree copies O(n) elements at each step, so degenerate (list-like) trees give quadratic time. The reduce order can also reverse the natural tree order unless carefully structured.
Flatten with accumulator, then filter
Time: O(n) (two linear passes) Space: O(n)
Two passes: O(n) to flatten, then O(n) to filter. A single-pass approach only collects matching nodes as it descends, avoiding the extra traversal and the full intermediate list.
Single-pass collector
Time: O(n) Space: O(h) + O(k)
A single pass over the tree with no intermediate full flatten: only matching nodes are added to the collector, so space is O(k) for the result plus O(h) for stack depth. Optimal for this problem.
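A sketch of the collector (the component shape is illustrative): pre-order recursion preserves top-to-bottom, left-to-right order.

```typescript
type Component = { type: string; children: Component[] };

// Time: O(n). Space: O(h) stack plus O(k) results; no intermediate flatten.
function findByType(node: Component, target: string, out: Component[] = []): Component[] {
  if (node.type === target) out.push(node); // visit parent before children
  for (const child of node.children) findByType(child, target, out);
  return out;
}
```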
Deduplicate While Preserving Order
Given a list of items with IDs, remove duplicates by ID while preserving the order of first occurrence.
Map by ID (first wins)
Time: O(n) Space: O(k), k = unique ids
Single pass over the input. Each id is looked up once and inserted at most once; Map operations are O(1) on average. Output size is O(k). Optimal for this problem.
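A sketch (the item shape is assumed): a Map keyed by id, relying on its insertion-order iteration.

```typescript
type Item = { id: string };

// Time: O(n). Space: O(k) unique ids; the first occurrence wins.
function dedupeById<T extends Item>(items: T[]): T[] {
  const seen = new Map<string, T>();
  for (const item of items) {
    if (!seen.has(item.id)) seen.set(item.id, item); // skip later duplicates
  }
  return [...seen.values()]; // Map iterates in insertion order
}
```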
Dependency Resolution (Topological Sort)
Given a list of modules and dependency pairs where one module depends on another, return an order of modules that satisfies all dependencies. If a cycle exists, return an empty list.
Kahn's algorithm (BFS + indegree)
Time: O(V + E) Space: O(V + E)
Single pass to build graph and indegree, then each node and edge is processed once. Standard optimal topological sort.
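A sketch of Kahn's algorithm (the dependency pairs are read as [module, dependsOn]; the output array doubles as the queue, with a moving head index so nothing is shifted):

```typescript
// Time: O(V + E). Space: O(V + E) for the adjacency lists and indegrees.
function topoSort(modules: string[], deps: [string, string][]): string[] {
  const indegree = new Map<string, number>();
  const edges = new Map<string, string[]>();
  for (const m of modules) {
    indegree.set(m, 0);
    edges.set(m, []);
  }
  for (const [mod, dep] of deps) {
    edges.get(dep)!.push(mod);                 // dep must come before mod
    indegree.set(mod, indegree.get(mod)! + 1);
  }
  const order = modules.filter(m => indegree.get(m) === 0);
  for (let head = 0; head < order.length; head++) { // order doubles as the queue
    for (const next of edges.get(order[head])!) {
      indegree.set(next, indegree.get(next)! - 1);
      if (indegree.get(next) === 0) order.push(next);
    }
  }
  return order.length === modules.length ? order : []; // shorter order means a cycle
}
```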
Event Rate Limiting (Rolling Window)
Given a time-ordered list of events, enforce a per-event-type rate limit such that no more than a fixed number of events occur within a rolling time window. Events exceeding the limit are dropped while preserving order.
Per-type queue + sliding window
Time: O(n) Space: O(k · L), k = types, L = max events per type in window
One pass over events. Each timestamp is added once and removed at most once from its queue. Optimal for this problem.
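A sketch under the stated assumptions (event shape and parameters are illustrative; Array.shift is O(queue length), so a ring buffer would make eviction strictly O(1)):

```typescript
type AppEvent = { type: string; timestamp: number };

// Time: O(n), each timestamp is enqueued once and evicted at most once.
// Space: O(k · L), one bounded queue per event type.
function rateLimit(events: AppEvent[], limit: number, windowMs: number): AppEvent[] {
  const queues = new Map<string, number[]>(); // type -> timestamps inside the window
  const kept: AppEvent[] = [];
  for (const e of events) {
    let q = queues.get(e.type);
    if (!q) queues.set(e.type, (q = []));
    while (q.length > 0 && e.timestamp - q[0] >= windowMs) q.shift(); // evict expired
    if (q.length < limit) {
      q.push(e.timestamp);
      kept.push(e); // within the limit: keep, preserving order
    } // else: drop the event
  }
  return kept;
}
```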
Reverse Integer
Given a 32-bit signed integer, reverse its digits. Return 0 if the reversed value would overflow the 32-bit signed integer range. Preserve the sign of the input.
Digit extraction with place values
Time: O(log₁₀ n) Space: O(1)
Uses floating point (Math.log10, Math.pow) and a single overflow check (newNumber > 2³¹) that is wrong for positive numbers: it allows 2³¹ when the maximum positive value is 2³¹−1. It also carries more state (two place-value counters) and is harder to follow than the mod-10 approach.
Pop digit (mod 10) with overflow check
Time: O(log₁₀ n) Space: O(1)
Integer-only arithmetic, no log or float. Single loop with one digit extraction and one overflow check per step. Correct bounds: positive result ≤ 2³¹−1, negative magnitude ≤ 2³¹. Standard interview solution.
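A sketch of the mod-10 loop (the overflow check is simplified to the positive bound; the one value this excludes, −2³¹ itself, cannot be produced by reversing any valid 32-bit input):

```typescript
const INT_MAX = 2 ** 31 - 1; // 2147483647

// Time: O(log₁₀ n), one step per digit. Space: O(1).
function reverseInt(x: number): number {
  const sign = x < 0 ? -1 : 1;
  let remaining = Math.abs(x);
  let reversed = 0;
  while (remaining > 0) {
    const digit = remaining % 10;           // pop the last digit
    remaining = Math.floor(remaining / 10);
    if (reversed > (INT_MAX - digit) / 10) return 0; // pushing would overflow
    reversed = reversed * 10 + digit;       // push it onto the result
  }
  return sign * reversed;
}
```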
Squares of a Sorted Array
Given an integer array sorted in non-decreasing order, return an array of the squares of each number sorted in non-decreasing order. Use O(n) time and O(n) space (excluding output).
Two pointers (fill from end)
Time: O(n) Space: O(n)
Single pass over the array. Each element is compared and written once. Optimal for this problem.
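A sketch of the two-pointer fill: the largest square must come from one of the two ends, so the result is written back-to-front.

```typescript
// Time: O(n). Space: O(n) for the output array.
function sortedSquares(nums: number[]): number[] {
  const result = new Array<number>(nums.length);
  let left = 0;
  let right = nums.length - 1;
  for (let i = nums.length - 1; i >= 0; i--) {
    if (Math.abs(nums[left]) > Math.abs(nums[right])) {
      result[i] = nums[left] ** 2;  // larger magnitude on the left
      left++;
    } else {
      result[i] = nums[right] ** 2; // larger magnitude on the right
      right--;
    }
  }
  return result;
}
```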
Binary Search (Sorted Array)
Given a sorted array of integers and a target value, return the index of the target if it exists, otherwise return -1. Use binary search: repeatedly compare the middle element to the target and narrow the search range.
Binary search (iterative)
Time: O(log n) Space: O(1)
Each step halves the search range. Optimal for sorted array search.
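The standard iterative form, as a sketch:

```typescript
// Time: O(log n), the range halves each iteration. Space: O(1).
function binarySearch(nums: number[], target: number): number {
  let lo = 0;
  let hi = nums.length - 1;
  while (lo <= hi) {
    const mid = lo + Math.floor((hi - lo) / 2); // avoids (lo + hi) overflow in other languages
    if (nums[mid] === target) return mid;
    if (nums[mid] < target) lo = mid + 1; // target is in the right half
    else hi = mid - 1;                    // target is in the left half
  }
  return -1; // not found
}
```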