Delving into Quadratic Time Complexity

KD Knowledge Diet
2 min readNov 20, 2023


Exploring Quadratic Time Complexity

Quadratic time complexity, often expressed as O(n²), is where things start to get a tad more intense. When the number of operations an algorithm performs grows in proportion to the square of the input size, you've entered the realm of quadratic complexity.

The Nested Loop Scenario

Imagine you’re organizing pairs for a dance competition where every contestant must dance with every other contestant. If there are 10 contestants, each one has to pair up with 9 others, resulting in 90 ordered pairings. Now, if that doesn’t sound exhausting enough, imagine scaling that up to a million contestants. That’s the essence of quadratic complexity: each new contestant has to pair with everyone already there, so the amount of work grows quadratically, not linearly, with the size of the input.

Here’s how a simple nested loop might look in Python:

def all_pairs(items):
    for item in items:
        for other_item in items:
            print(f"Pair: {item} - {other_item}")

# Example usage:
items_array = [1, 2, 3, 4, 5]
all_pairs(items_array)

If items_array had 10 elements, the all_pairs function would print 100 pairs (10 × 10, since each element is also paired with itself). With a million elements, well, you'd better have a lot of time on your hands!
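If you don't need self-pairs or both orderings of the same pair, one way to sketch the idea is with the standard library's itertools.combinations (the helper name unique_pairs is just for illustration). The pair count drops to n(n−1)/2, but that's still proportional to n², so the complexity remains quadratic:

```python
from itertools import combinations

def unique_pairs(items):
    """Return each unordered pair exactly once, with no self-pairs."""
    return list(combinations(items, 2))

pairs = unique_pairs([1, 2, 3, 4, 5])
print(len(pairs))  # 5 * 4 / 2 = 10 pairs
print(pairs[0])    # (1, 2)
```

Halving (roughly) the work is a nice constant-factor win, but Big-O notation ignores constant factors: n(n−1)/2 and n² both belong to O(n²).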

Visualizing the Impact of n²

When you plot this on a graph with data size on the X-axis and time on the Y-axis, the curve starts off gently but then skyrockets alarmingly as the data size increases. It's not a straight and steady climb like with linear time complexity; it's a steep ascent to the stratosphere of processing time.
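You can see that curve in the numbers themselves. A small sketch that counts the inner-loop iterations of the nested-loop pattern (count_operations is a hypothetical helper, not from the article's code) shows the operation count jumping by a factor of 100 every time the input grows by a factor of 10:

```python
def count_operations(n):
    """Count how many times the inner loop body runs for an input of size n."""
    ops = 0
    for _ in range(n):       # outer loop: n iterations
        for _ in range(n):   # inner loop: n iterations per outer iteration
            ops += 1
    return ops

for n in (10, 100, 1000):
    print(n, count_operations(n))  # 10 -> 100, 100 -> 10000, 1000 -> 1000000
```

Doubling the input quadruples the work; multiplying it by ten multiplies the work by a hundred. That's the skyrocketing curve in numeric form.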
