When I first laid hands on “Programming Pearls” by Jon Bentley, a unique guide to the world of coding, I was barely ten or eleven years old. The book was a gift from my uncle, who was responding to my budding passion for programming. I recall delving into its pages with breathless anticipation, exploring the art of refining algorithms, uncovering the secrets of computational performance, and learning to craft algorithms with both elegance and efficiency. Each chapter brought a fresh breath of knowledge and filled me with an excitement only a young person could understand. Maybe I didn’t understand everything about the code, but I remember the great stories woven through the book.

But a few years later (still in the pre-internet era), seeing how far computer technology had advanced, I remember beginning to doubt the point of these optimisations. After all, we had ever faster processors, more memory, better compilers - did anyone still bother with minor tweaks to inefficient bits of code?

Now, in the age of big data, I think back to the lessons of ‘Programming Pearls’. When we are processing terabytes of data on thousands of servers, every optimisation matters.

Take the algorithm that controls an autonomous vehicle. It processes gigabytes of sensor data every second. Improving it by just 1% eases the burden on the system and, multiplied across an entire fleet, translates into real savings.
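To make that concrete, here is a toy sketch in Python, with made-up names and numbers, of the kind of tweak Bentley’s code-tuning lessons teach: replacing per-sample arithmetic in a hot loop with a precomputed lookup table, trading a little memory for time.

```python
import timeit

# Hypothetical hot loop from a sensor pipeline: convert raw 16-bit
# readings to calibrated floats. All names and values are illustrative.
raw = list(range(65536)) * 16  # ~1M fake samples

def naive(samples, gain=0.125, offset=-40.0):
    # Recomputes the same multiply-add for every sample.
    return [s * gain + offset for s in samples]

def precomputed(samples, gain=0.125, offset=-40.0):
    # Classic code-tuning trick: only 65,536 distinct inputs exist,
    # so compute each answer once and reuse it via a table lookup.
    table = [i * gain + offset for i in range(65536)]
    return [table[s] for s in samples]

print("naive:      ", timeit.timeit(lambda: naive(raw), number=3))
print("precomputed:", timeit.timeit(lambda: precomputed(raw), number=3))
```

A saving of a few percent looks trivial on one run, but a loop like this executes millions of times per second on every vehicle.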

The same applies to AI models trained on huge data sets. A minor optimisation that cuts training time by a single day lets a technology company, at its scale, roll out a new product faster and gain a competitive advantage.
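A quick calculation, very much in the spirit of the book’s “The Back of the Envelope” column, shows the stakes. Every number below is an assumption chosen purely for illustration:

```python
# Back-of-envelope, with made-up numbers: what one saved training day
# is worth on a hypothetical 512-GPU cluster.
gpus = 512
hours_saved = 24
cost_per_gpu_hour = 2.0  # assumed USD rate; real prices vary widely

gpu_hours = gpus * hours_saved
print(f"GPU-hours saved: {gpu_hours:,}")
print(f"Rough cost saved: ${gpu_hours * cost_per_gpu_hour:,.0f}")
```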

In the world of big data, there is no such thing as an optimisation too small to matter. Even the fastest computers have their limits. This is why the teachings contained in ‘Programming Pearls’ are gaining new relevance: they allow us to make better use of finite resources in a world overtaken by ever-growing data.
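Bentley opens the book with exactly this kind of lesson: Column 1 sorts a file of distinct seven-digit phone numbers in two simple passes, using one bit of memory per possible number instead of a general-purpose comparison sort. A minimal Python sketch of that bit-vector idea:

```python
def bitmap_sort(values, max_value):
    """Sort distinct integers in [0, max_value] with a bit vector,
    as in Column 1 of "Programming Pearls"."""
    bits = bytearray((max_value + 8) // 8)  # one bit per possible value
    for v in values:                        # pass 1: mark each value seen
        bits[v >> 3] |= 1 << (v & 7)
    out = []
    for v in range(max_value + 1):          # pass 2: read marks in order
        if bits[v >> 3] & (1 << (v & 7)):
            out.append(v)
    return out

# Seven-digit "phone numbers": ~1.2 MB of bitmap covers all ten million.
print(bitmap_sort([8675309, 5551212, 4102000], 9_999_999))
```

The whole input is sorted with about 1.2 MB of memory and two linear passes - the kind of elegant frugality the book taught me long before I had terabytes to worry about.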