Algorithms · Intermediate · June 15, 2025 · 6 min read

Binary Search and Its Variants

A deep dive into binary search — from the classic algorithm to powerful variants used in competitive programming and system design.

Part 1 of series: Algorithm Fundamentals

Introduction

Binary search is one of the most fundamental algorithms in computer science. Despite its simplicity, it is surprisingly easy to get wrong — and its variants power everything from database indexing to machine learning hyperparameter tuning.

The algorithm was first published by John Mauchly in 1946, but the first bug-free implementation didn't appear until 1962 [1]. As Knuth famously documented, even experienced programmers frequently introduce off-by-one errors in their binary search implementations [2].

In this article we build up from the basics to several powerful variants: we start with the classic algorithm, explore its tree interpretation (see Figure 1), and then move on to advanced patterns such as search-on-answer.

The Classic Algorithm

Given a sorted array A of n elements, binary search determines whether a target value t exists in O(log n) time [3].

Python
def binary_search(arr: list[int], target: int) -> int:
    """Return the index of target in sorted arr, or -1 if absent."""
    lo, hi = 0, len(arr) - 1
    while lo <= hi:
        mid = lo + (hi - lo) // 2  # overflow-safe midpoint
        if arr[mid] == target:
            return mid
        elif arr[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1

The key insight is that each comparison eliminates half of the remaining search space, giving us logarithmic time complexity. This can be visualized as a decision tree where each node represents a comparison:

Binary search tree showing the search path for target value 7. The search follows: root 8 (7 < 8, go left), node 5 (7 > 5, go right), node 7 (found).
Figure 1. Binary search on a sorted array visualized as a BST. The highlighted path shows the search for target = 7, requiring only 3 comparisons out of 8 elements.

As shown in Figure 1, each comparison at a node eliminates an entire subtree, halving the remaining candidates. The maximum number of comparisons is ⌈log₂(n + 1)⌉.
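To make the bound concrete, here is a self-contained sketch that instruments the classic search with a comparison counter and checks every search against ⌈log₂(n + 1)⌉. The counting instrumentation is my own addition for illustration, not part of the article's implementation.

```python
import math

def binary_search_counting(arr: list[int], target: int) -> tuple[int, int]:
    """Classic binary search that also reports how many comparisons it made."""
    lo, hi = 0, len(arr) - 1
    comparisons = 0
    while lo <= hi:
        mid = lo + (hi - lo) // 2
        comparisons += 1
        if arr[mid] == target:
            return mid, comparisons
        elif arr[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1, comparisons

arr = list(range(8))                         # n = 8 sorted elements
bound = math.ceil(math.log2(len(arr) + 1))   # ceil(log2(9)) = 4
for t in arr:
    _, c = binary_search_counting(arr, t)
    assert c <= bound  # no search exceeds the theoretical maximum
```

Running the loop over all eight elements confirms that no search exceeds four comparisons, exactly as the formula predicts.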

Variant 1: Lower Bound

The lower bound variant finds the first position where an element is greater than or equal to the target. This is equivalent to C++'s std::lower_bound.

Python
def lower_bound(arr: list[int], target: int) -> int:
    """Return the first index i such that arr[i] >= target (len(arr) if none)."""
    lo, hi = 0, len(arr)
    while lo < hi:
        mid = lo + (hi - lo) // 2
        if arr[mid] < target:
            lo = mid + 1
        else:
            hi = mid
    return lo
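The symmetric upper bound variant, matching C++'s std::upper_bound, finds the first position whose element is strictly greater than the target. It differs from lower_bound only in the comparison; a minimal sketch:

```python
def upper_bound(arr: list[int], target: int) -> int:
    """Return the first index i such that arr[i] > target (len(arr) if none)."""
    lo, hi = 0, len(arr)
    while lo < hi:
        mid = lo + (hi - lo) // 2
        if arr[mid] <= target:  # note <= here, versus < in lower_bound
            lo = mid + 1
        else:
            hi = mid
    return lo
```

A handy consequence: upper_bound(arr, t) - lower_bound(arr, t) counts the occurrences of t in a sorted array.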

Variant 2: Search on Answer

Sometimes the search space is not an array but a range of possible answers. We binary-search on a monotonic predicate — a function f(x) that flips from false to true at some threshold.

f(x) = false for x < x*, true for x ≥ x*   (1)

The monotonic predicate in Equation 1 is the foundation of the search-on-answer technique. Table 1 lists common problems that reduce to this pattern.

Table 1. Common problems that reduce to binary search on a monotonic predicate.

Problem                                      | Predicate
Minimum capacity to ship packages in D days  | Can we ship all packages with capacity x in ≤ D days?
Kth smallest element in a matrix             | Are there ≥ k elements ≤ x in the matrix?
Allocate minimum pages                       | Can we allocate all books such that max pages ≤ x?
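As an illustration, the first row of Table 1 can be solved by binary-searching the smallest capacity for which the shipping predicate holds. The function names here are my own; the structure is the generic search-on-answer template:

```python
def min_ship_capacity(weights: list[int], days: int) -> int:
    """Smallest capacity that ships all packages, in order, within `days` days."""

    def can_ship(capacity: int) -> bool:
        # Greedily fill each day up to `capacity`; monotonic in capacity.
        used_days, load = 1, 0
        for w in weights:
            if load + w > capacity:
                used_days += 1
                load = 0
            load += w
        return used_days <= days

    lo, hi = max(weights), sum(weights)  # the answer range R
    while lo < hi:
        mid = lo + (hi - lo) // 2
        if can_ship(mid):
            hi = mid       # predicate true: try a smaller capacity
        else:
            lo = mid + 1   # predicate false: need more capacity
    return lo
```

Each iteration halves the answer range and evaluates the predicate once, giving the O(log R · C) time reported in Table 2.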

This technique extends binary search beyond arrays into continuous and discrete optimization. Bentley's work on multidimensional search trees [4] generalized many of these ideas to higher dimensions.

Complexity Analysis

Table 2. Time and space complexity of binary search variants.

Variant           | Time           | Space
Classic           | O(log n)       | O(1)
Lower/Upper bound | O(log n)       | O(1)
Search on answer  | O(log R · C)   | O(1)

As shown in Table 2, all variants use O(1) space. The time complexity for search-on-answer depends on the range R of the answer space and the cost C of evaluating the predicate.

Algorithm Flowchart

The following diagram illustrates the control flow of the classic binary search algorithm:

[Flowchart: Start → lo ← 0; hi ← n − 1 → lo ≤ hi? → mid ← lo + ⌊(hi − lo)/2⌋ → return mid (found) or return −1 (bounds crossed)]
Figure 2. Flowchart of the classic binary search algorithm.

As shown in Figure 2, the algorithm repeatedly halves the search space until the target is found or the bounds cross.

Key Takeaways

  1. Always compute the midpoint as lo + (hi - lo) // 2 rather than (lo + hi) // 2: in languages with fixed-width integers, lo + hi can overflow. (Python's integers are arbitrary-precision, but the habit transfers safely to C, C++, and Java.)
  2. Be precise about loop invariants — most binary search bugs come from off-by-one errors in the bounds.
  3. The "search on answer" pattern is the most versatile variant and appears in a huge number of problems.
  4. Binary search can be visualized as a BST traversal (Figure 1), which helps build intuition about its logarithmic behavior.

In the next article, we will explore graph traversal algorithms and their applications.

References

  [1] Hermann Bottenbruch (1962). “Structure and Use of ALGOL 60.” Journal of the ACM, 9(2), pp. 161–221. doi:10.1145/321119.321120
  [2] Donald E. Knuth (1997). “The Art of Computer Programming, Volume 3: Sorting and Searching.” Addison-Wesley Professional.
  [3] Thomas H. Cormen, Charles E. Leiserson, Ronald L. Rivest, and Clifford Stein (2009). “Introduction to Algorithms.” MIT Press.
  [4] Jon Louis Bentley (1975). “Multidimensional Binary Search Trees Used for Associative Searching.” Communications of the ACM, 18(9), pp. 509–517. doi:10.1145/361002.361007
Tags: algorithms · searching · interviews