Generator functions in Python

Ever found yourself working with a massive list of items in Python, and thought "There's got to be a better way to handle this"? Well, let me introduce you to the world of generator functions, which can be a game changer when dealing with large data sets, or simply when you want to create iterators with a neat, Pythonic touch.


Generators are those nifty Python mechanisms that allow you to declare a function that behaves like an iterator. In other words, they let you loop through a set of values, but instead of creating and storing all the values at once, a generator produces them on the fly, one at a time, in a memory-efficient way. Built around the yield statement, generator functions can pause and resume, preserving their state and context around the point of yield, which leads to wonderfully frugal resource consumption.


Let's dig into a simple example. Suppose you want to create a sequence of numbers. In the traditional approach, using a list, you would write something like this:

def create_numbers_list(n):
    numbers = []
    for i in range(n):
        numbers.append(i)
    return numbers

numbers = create_numbers_list(10000)
print(numbers) # Be careful; this could take up a lot of memory!

The function create_numbers_list generates a list of numbers from 0 to n - 1. However, if n is a large number, you'll start feeling the memory strain as the entire list is stored in memory. Ouch!


Now, let’s rewrite the above code using a generator function:

def create_numbers_generator(n):
    for i in range(n):
        yield i

numbers = create_numbers_generator(10000)
for number in numbers:
    print(number) # This will print numbers one by one without memory overload.

Here, create_numbers_generator doesn't produce a memory-gobbling list. Instead, it produces values one by one. Each call to yield temporarily hands off an individual value to the for loop and then pauses, waiting to be resumed for the next iteration. And voila! You're now iterating over a sequence with the memory footprint of an ant!
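If you want to see that difference in actual numbers, sys.getsizeof offers a rough comparison. It only measures the container itself, not the integer objects inside the list, but the gap is already striking:

import sys

print(sys.getsizeof(create_numbers_list(10000)))       # tens of kilobytes for the list object alone
print(sys.getsizeof(create_numbers_generator(10000)))  # around 200 bytes, no matter how big n is

The generator's size stays constant because it stores nothing but its paused execution state.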


At this point, you might ask, "What's really going on behind the scenes when the Python interpreter encounters a yield statement?" It's simple: the usual function execution is suspended, and the function's state, including local variables and the execution point, is saved for later use. When the generator's __next__() method is invoked (implicitly by a for loop or explicitly by the next() function), the function resumes execution right after the yield statement.
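You can watch this pause-and-resume dance by driving a generator by hand. Here's a minimal sketch with a hypothetical countdown generator; once it runs out of values, it raises StopIteration:

def countdown(n):
    print("Starting countdown")
    while n > 0:
        yield n   # execution pauses here, handing n to the caller
        n -= 1    # execution resumes here on the next __next__() call
    print("Done")

gen = countdown(2)   # no code runs yet; we only get a generator object
print(next(gen))     # prints "Starting countdown", then 2
print(next(gen))     # resumes after the yield, prints 1
next(gen)            # resumes, prints "Done", then raises StopIteration

A for loop does exactly this under the hood: it calls __next__() repeatedly and treats StopIteration as the polite signal that the show is over.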


Here's a more creative use of generators, which shows off their lazy nature:

def fibonacci_gen():
    a, b = 0, 1
    while True:
        yield a
        a, b = b, a + b

fib = fibonacci_gen()
print(next(fib)) # 0
print(next(fib)) # 1
print(next(fib)) # 1
print(next(fib)) # 2
# and so on...

The fibonacci_gen function is an infinite generator that produces numbers from the Fibonacci sequence. By calling next(), you get the next number in the sequence without calculating the whole sequence in advance. This makes it perfect for working through long sequences where you want to avoid memory issues, or when you simply don't know in advance when to stop.
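And when you do decide where to stop, itertools.islice is the idiomatic way to carve a finite slice out of an infinite generator, for example:

from itertools import islice

# Take the first ten Fibonacci numbers; the rest are never computed.
first_ten = list(islice(fibonacci_gen(), 10))
print(first_ten)  # [0, 1, 1, 2, 3, 5, 8, 13, 21, 34]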


But let's not stop there. Generators also make for excellent pipeline components. Imagine you want to process a large log file. You could chain a sequence of generator functions to read, filter, and output lines.

def read_logs(file_name):
    with open(file_name, "r") as file:
        for line in file:
            yield line.strip()

def filter_errors(log_lines):
    for line in log_lines:
        if 'ERROR' in line:
            yield line

log_lines = read_logs("server.log")
error_logs = filter_errors(log_lines)
for error in error_logs:
    print(error)

Using generators like this, you read one line at a time, process it, and then proceed to the next. The pipeline stays neat, readable, and above all, lean on memory usage.
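For simple stages like these, Python also offers generator expressions, which give you the same lazy, line-by-line behavior without defining a function. A sketch of an equivalent pipeline, assuming the same server.log file:

with open("server.log", "r") as file:
    stripped = (line.strip() for line in file)               # lazy: nothing is read yet
    errors = (line for line in stripped if "ERROR" in line)  # also lazy
    for error in errors:                                     # lines flow through one at a time
        print(error)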


So, next time you're about to loop through a large dataset, or if you need a simple iteration with an elegant twist, give generator functions a spin. You might just find a new favored tool in your Python toolkit. Dive in, yield away, and see how they can streamline your code and lower your memory footprint.

And there you go: if handling sizable data without a cumbersome memory toll sounds appealing to you, generators could very well be your next favorite Python feature. Keep iterating, keep optimizing, and keep the code flowing!
