Sliding Window: Fixed and Variable Window Patterns
Master the sliding window pattern for contiguous subarray and substring problems. Covers fixed-size and variable-size windows, the expand-shrink template, and the key O(n) insight most candidates miss: the left pointer never moves backward.
When to Reach for Sliding Window
Sliding window solves a specific class of problems that all share three recognition signals. First: the problem asks about a contiguous subarray or substring — not subsequences, not arbitrary subsets, but elements that must be adjacent. Second: you're optimizing over all possible windows: longest, shortest, maximum sum, minimum size. Third: there is a monotonic constraint — as you expand the window, the property either stays valid or becomes invalid in one direction only, and shrinking always restores validity.
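A minimal sketch of a problem where all three signals appear: the longest contiguous subarray with sum at most a target, assuming all elements are positive (the function name, `longest_subarray_at_most`, is illustrative, not from a standard library). Positivity is what makes the constraint monotonic: expanding only grows the sum, shrinking only reduces it.

```python
def longest_subarray_at_most(nums, target):
    """Longest contiguous subarray with sum <= target.

    Assumes all nums are positive; that assumption is what makes
    the constraint monotonic, so shrinking always restores validity.
    """
    left = 0
    window_sum = 0
    best = 0
    for right, value in enumerate(nums):
        window_sum += value           # expand: include nums[right]
        while window_sum > target:    # invalid: shrink from the left
            window_sum -= nums[left]
            left += 1
        best = max(best, right - left + 1)
    return best
```

With negative numbers the monotonicity breaks (shrinking could make the sum larger), which is exactly the kind of case the next section rules out.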
When all three signals appear, you can collapse an O(n²) brute force into an O(n) solution. The brute force tries every window: O(n) starting positions × O(n) window sizes = O(n²). Sliding window avoids this by maintaining a live window state — a running data structure (count, sum, frequency map) that you update incrementally as you move the window's boundaries rather than recomputing from scratch.
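The incremental-update idea is easiest to see in the fixed-size case. A sketch (the function name `max_sum_fixed_window` is illustrative): instead of re-summing each window at O(k) apiece, add the element entering on the right and subtract the one leaving on the left.

```python
def max_sum_fixed_window(nums, k):
    """Maximum sum over all contiguous windows of size k.

    The window sum is the live state: each slide is an O(1)
    update (one add, one subtract) instead of an O(k) re-sum.
    """
    if k > len(nums):
        raise ValueError("window larger than array")
    window_sum = sum(nums[:k])        # first window, computed once
    best = window_sum
    for right in range(k, len(nums)):
        window_sum += nums[right] - nums[right - k]  # slide by one
        best = max(best, window_sum)
    return best
```

The same principle carries over to richer window states like frequency maps: update incrementally on each boundary move, never rebuild from scratch.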
The pattern does not apply when elements can be non-contiguous (use DP for that), when the constraint has multiple validity regions (use binary search on window size), or when you need to consider all pairs regardless of adjacency (two-pointer converging is usually better there). Learning to rule out sliding window quickly is as important as knowing when to use it.
What Interviewers Are Testing
Most candidates can describe sliding window in the abstract. What separates senior-level answers is: (1) correctly identifying which window type applies before writing code, (2) articulating the invariant being maintained — what state the window must satisfy at all times, and (3) explaining why the time complexity is O(n) rather than O(n²). Interviewers at Google and Meta specifically probe whether you understand that the amortized cost is O(n) because left only advances forward — not because the inner while loop has constant iterations.
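The amortized argument is easiest to state against a concrete variable-window example, sketched here as the classic longest-substring-without-repeating-characters problem. Each index enters the window once (advanced by right) and leaves at most once (advanced by left), so the total work across all iterations of the inner while loop is bounded by n, even though a single iteration of the outer loop may shrink many times.

```python
def longest_unique_substring(s):
    """Length of the longest substring with no repeated character.

    Invariant: s[left:right+1] contains no duplicates. The inner
    loop can run many times on one step, but left only advances
    forward, so its total movement over the whole run is <= n.
    That bound, not a per-iteration constant, is why this is O(n).
    """
    seen = set()    # live window state: characters currently inside
    left = 0
    best = 0
    for right, ch in enumerate(s):
        while ch in seen:        # shrink until ch can enter
            seen.remove(s[left])
            left += 1            # left never moves backward
        seen.add(ch)
        best = max(best, right - left + 1)
    return best
```

Saying "each pointer traverses the array at most once, so total cost is at most 2n" is the one-sentence version interviewers are listening for.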