Algorithms shape modern computing. But where did they come from?

By Arjun Karki

Algorithms are tools humankind has mastered to solve its problems. While much is said in the media about these seemingly magical catch-alls for complicated computer processing, the general population understands little about how they actually work.

To understand them, one must first comprehend the journey that algorithms took to get where they are now.

The term “algorithm” is now understood in association with computer computation, but it was originally a mathematical term, derived from the name of the 9th-century Persian mathematician Muhammad ibn Musa al-Khwarizmi. An algorithm is a procedure for solving a mathematical problem in a finite number of steps, frequently involving the repetition of mathematical operations: addition, subtraction, multiplication, division, or a combination of them.
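A classic illustration of this definition (a modern example, not one from al-Khwarizmi's own work) is Euclid's algorithm for the greatest common divisor: a procedure that repeats one operation, division with remainder, and is guaranteed to finish in a finite number of steps.

```python
def gcd(a: int, b: int) -> int:
    """Compute the greatest common divisor by repeated division.

    Each step replaces (a, b) with (b, a % b). Because the remainder
    shrinks every time, the loop must stop after finitely many steps.
    """
    while b != 0:
        a, b = b, a % b
    return a

print(gcd(48, 18))  # 6
```

The whole procedure is just one operation repeated until a stopping condition is met, which is exactly the "finite number of steps" that defines an algorithm.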

In the modern computational world, problem solving is governed by logic built on algebraic foundations. Through widespread use, this mathematical process has penetrated every aspect of human life. It has made human life easier in some ways, but it has also introduced risk and uncertainty, owing to the negative impacts and misuses of these innovations.

In this digital age, the concept of the algorithm has created a resilient virtual space that enriches people's lives. From household applications to astronomical computation, the use of algorithms through machines has reshaped our planet.

Today, whatever new software or hardware is made, the underlying principle is robust algorithmic computation. Algorithms not only execute our tasks, facilitate computation, and analyze data, but also help in decision-making. They have reshaped our industries and our ways of fabricating digital interactions, whether through social media or websites.

Their effects are felt even in physical, face-to-face communication. The power of the machine is unparalleled: unlike the manual work of a human being, it generates data by itself and processes it into actionable insights. The use of artificial intelligence in every aspect of life is the most sophisticated application of algorithms yet.

The algorithm has therefore transcended boundaries, penetrating the patterns of resemblance and repetition found everywhere from daily life to the cosmos. Now, humans can build machines that operate autonomously thanks to algorithmic evolution.

Through generative AI, for instance, the machine is capable of converting content from one form to another, turning text into videos, presentations, and audio-visual information. Language translation is another powerful capability that generative AI has given us.

A historical review of the algorithm from ancient times to the modern age helps us better understand it. Here, we review literature and interviews to analyze the contributions of two of the greatest mathematicians and computational scientists in world history: al-Khwarizmi and Alan Turing.

Where did the algorithm come from?

Looking back, it is assumed that the idea of algorithms already existed in practice long before the term was coined.

The idea of calculation using tally bars to solve recurring problems, such as tracking weather, counting days and nights, observing visible and invisible patterns, and keeping records of livestock and other artisan goods, existed in the earliest Sumerian civilizations of Mesopotamia around 4000 B.C., though numerals were introduced much later.

Even before these innovations, life was bound to repetitive cycles of problems caused by nature, such as changing seasons, weather, and calamities, or by human activities such as pollution and degradation.

Observing all these things, al-Khwarizmi introduced Hindu–Arabic numerals, helping to develop the concepts of the algorithm and algebra as we now know them. He is sometimes called “the father of algebra,” and modern linguists believe the term algorithm comes from the Latin translation of his name, “Algoritmi.”

These mathematicians' works postulated standard rules for resolving the repetitive problems observed in the world, and those rules were later embedded in technology. Modern technologies rest on a strong foundation of mathematical operations in the form of logical computation.

Al-Khwarizmi and other pioneers

Little is known definitively by historians about the life of al-Khwarizmi. He was born around 780 A.D. in Khwarazm, a region of Greater Iran that is now part of Turkmenistan and Uzbekistan; his name means “native of Khwarazm.” He later settled in what is now Baghdad, the capital city of Iraq.

A documentary produced by Al Jazeera portrayed al-Khwarizmi as an inquisitive young man eager to quench his thirst for knowledge. Historians believe his professors encouraged him to study in Baghdad to develop his mathematical knowledge further. After moving there sometime in the 9th century, he became a member of the “House of Wisdom.”

The House of Wisdom, an intellectual center for scholars, broadened al-Khwarizmi's learning and led him to found the concepts behind algorithms.

Al-Khwarizmi is credited for his contributions to geometry, astronomy, and cartography. He wrote a masterpiece called the “Hindu Art of Reckoning,” which was later translated into Latin.

This particular book introduced Hindu–Arabic numerals to the West and the rest of the world. Al-Khwarizmi systematized mathematics through these numerals and several other analytical methods. For the first time in history, he showed the role of logic in mathematics through the invention of algebraic forms of computation.

The invention of algebra brought with it a key byproduct: variables, both dependent and independent, which are widely used today in modern technological computation.

Computing the inter-relationships between these variables, the known and the unknown, helps drive the solution of any type of problem. The algebraic practice of assigning alphabetic or alphanumeric symbols to unknown variables gave rise to another mathematical dimension: the algebraic equation. Algebraic expressions and equations are the foundational structure of the algorithmic idea.
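As a concrete sketch of this connection (a modern illustration, not a historical method), the following Python snippet treats an algebraic equation as an algorithmic problem: it solves x² − 2 = 0 for the unknown x by repeatedly halving an interval until the answer is pinned down.

```python
def bisect_root(f, lo: float, hi: float, tol: float = 1e-9) -> float:
    """Find a root of f between lo and hi by repeated halving.

    Assumes f(lo) and f(hi) have opposite signs, so a root
    must lie somewhere in between.
    """
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if f(lo) * f(mid) <= 0:   # root is in the left half
            hi = mid
        else:                     # root is in the right half
            lo = mid
    return (lo + hi) / 2

# Solve the algebraic equation x**2 - 2 = 0 for the unknown x.
x = bisect_root(lambda x: x * x - 2, 0.0, 2.0)
print(round(x, 6))  # 1.414214
```

The equation supplies the relationship between known and unknown quantities; the algorithm supplies the finite, repetitive procedure that extracts the unknown.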

Before the invention of machines, and computers in particular, algorithms were thought of as tedious tasks: the hand calculation of repetitive procedures. When the data sets were large, there were more chances of calculation errors.

However, mathematicians developed many theorems and principles to understand the recurring patterns in the world, and the invention of symbolic systems like algebra further strengthened the mathematical discipline. In the 19th century, the first mechanical computers were designed, and the programming written to run them used algebraic expressions.

The logic behind them was algorithmic, as the focus of the machines was to solve lengthy and repetitive mathematical tasks. Charles Babbage, often called the “father of the computer,” was the first to design machines to perform input, processing, storage, and output functions.

The world's first programmer, Ada Lovelace, was a mathematician, and Charles Babbage himself was also a mathematician. Because of the contributions of these two great mathematicians, we have computers today.

In this modern age, although we understand computer science and mathematics as two separate disciplines, their roots are the same. The source of programming is mathematics, and algorithms involve mathematical logic. In modern computing technologies this logic has been extended further, and thus the idea of an “algorithm” has expanded.

World War II and the work of Alan Turing

One of the noteworthy names to recall from this era is Alan Turing. In 1936, Turing wrote a paper called “On Computable Numbers, with an Application to the Entscheidungsproblem” and applied its concepts to a theoretical device called the universal machine.

The universal machine was later called the “Turing machine.” Through this machine, Turing showed that a single device could, in principle, compute anything that is computable. This particular invention pointed a clear direction toward modern technological computation. Alan Turing was born in England in 1912. From a very young age, he displayed exceptional intelligence and interest in mathematics and science. He attended Sherborne School and then King's College, Cambridge, and completed his PhD at Princeton University in 1938.
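To make the idea tangible, here is a toy sketch of a Turing machine in Python (my own simplification, not Turing's formal 1936 definition): a finite table of rules reads and writes symbols on an unbounded tape, one cell at a time. The example machine flips every bit it reads and then halts.

```python
def run(tape: str, rules: dict, state: str = "start") -> str:
    """Simulate a one-tape Turing machine.

    rules maps (state, symbol) -> (symbol_to_write, move, next_state),
    where move is "R" or "L" and "_" is the blank symbol.
    """
    cells = dict(enumerate(tape))  # sparse "infinite" tape
    head = 0
    while state != "halt":
        symbol = cells.get(head, "_")
        write, move, state = rules[(state, symbol)]
        cells[head] = write
        head += 1 if move == "R" else -1
    return "".join(cells[i] for i in sorted(cells) if cells[i] != "_")

# A machine that inverts every bit on the tape, then halts on blank.
rules = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),
}
print(run("0110", rules))  # 1001
```

Changing only the rule table changes what the machine computes, which is the heart of Turing's insight: one general mechanism, infinitely many programs.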

A colorful, abstract representation of Alan Turing, the father of AI. Photo by Imagin_Artistry on Adobe Stock.

Turing graduated in 1934 and proved the central limit theorem in the course of his graduate dissertation. Due to this brilliant work, he was selected for a fellowship and went on to Princeton University; he later joined a British code-breaking organization.

Turing is recognized for his extensive work in wartime, particularly for breaking Hitler's ciphers, and his efforts established cryptology as an important field. Turing's algorithmic practice emerged from his contribution to World War II: the work of cracking the codes produced by the Enigma machine.

The German military used radio extensively in World War II, so understanding its radio messages was a challenge. A code-breaking machine built on Turing's concepts, the Bombe, cracked the codes of the Nazis' Enigma machine. Turing's groundbreaking decoding work contributed to victory in the Battle of the Atlantic and saved hundreds of thousands of Allied lives.

That machine relied on logical computation of the kind now performed by the ALU (arithmetic logic unit), which is considered the “brain” of modern computer systems. The ALU is the part of the processing unit that carries out logical computation, and it provides the main processing capability of the computer.

Turing is credited with the concept of the processing unit that made computers smarter, and the very idea of a general-purpose computer, as distinct from a special-purpose machine, traces back to the Turing machine.

Brains meet brawn

The Enigma machine was invented in the early 1900s and was used extensively by the Germans in the 1930s and throughout World War II to command their armies. Through this machine, messages were encrypted and decrypted. It was a very sophisticated letter scrambler, and German soldiers were trained to encode and decode messages using it.

In wartime, it is important to keep messages secret, so the armies on the battlefields were trained on which day to use which encryption and decryption settings. The brilliance of the Enigma machine was that it could encrypt messages in thousands of different ways, with the settings changed regularly, so that no one else could understand the messages.
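A vastly simplified Python sketch can convey the spirit of the idea (the real Enigma used multiple wired rotors and a plugboard, so this toy cipher is only an illustration): each letter is shifted by a secret key setting, and the shift advances after every letter, so the same plaintext letter encrypts differently each time it appears.

```python
import string

ALPHABET = string.ascii_uppercase

def rotor_encrypt(text: str, setting: int) -> str:
    """Encrypt uppercase text with a stepping shift cipher."""
    out = []
    shift = setting
    for ch in text:
        out.append(ALPHABET[(ALPHABET.index(ch) + shift) % 26])
        shift += 1          # the "rotor" steps after each letter
    return "".join(out)

def rotor_decrypt(text: str, setting: int) -> str:
    """Reverse the stepping shift, given the same key setting."""
    out = []
    shift = setting
    for ch in text:
        out.append(ALPHABET[(ALPHABET.index(ch) - shift) % 26])
        shift += 1
    return "".join(out)

msg = rotor_encrypt("ATTACK", setting=3)
print(msg, rotor_decrypt(msg, setting=3))
```

Notice that the repeated T in "ATTACK" encrypts to two different letters, which is why such ciphers defeat simple letter-frequency analysis, and why recovering the daily setting was the code-breakers' central problem.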

During the war, the sky was full of different radio signals. Nazi dictator Adolf Hitler deployed hundreds of thousands of people to encrypt messages for the armies on the battlefield. The encrypted messages were believed to be unbreakable: a seemingly foolproof communication weapon for Hitler throughout World War II.

However, a minor mistake unlocked the secret logic. Machines are well suited to detecting repetitive patterns in information, and over the course of the war the German armies unknowingly repeated settings in their Enigma machines, according to Stephen Budiansky's book “Battle of Wits: The Complete Story of Codebreaking in World War II.”

Turing was able to figure out the logic of message decryption through these repetitive patterns, and Hitler's secret messages to his armies on the battlefields were uncovered. For the first time in the history of computing, the powerful applications of algorithms were demonstrated by the breaking of the Enigma machines.

Contemporary algorithms

In the modern computational world, an algorithm is understood as computer code for solving a certain problem: a step-based logical procedure that accepts certain inputs and processes them to give outputs.

Jeff Erickson, in his book “Algorithms,” says that computer programs are always represented through algorithms, but an algorithm by itself is not a program; rather, it is a mechanical procedure that can be implemented in any programming language.

He further points out that algorithms are a powerful way to tell a mathematical story about computation, since computer code is a poor medium for storytelling, and so is colloquial English. An algorithm uses a combination of pseudocode and structured English, made unambiguous through mathematical notation, to describe computations over big data that would be almost impossible to perform manually.

Big data involves large computations, and in the modern age such data can even be generated by machines themselves. An algorithm involves a great deal of idiomatic structure, especially conditionals, loops, functions, and recursion, which serve as its basic building blocks.
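All four of those building blocks can appear in a single short algorithm. The following merge sort, in Python, is a standard textbook example (not tied to any particular author's presentation): a function that calls itself recursively, uses a loop to merge results, and uses conditionals to compare elements.

```python
def merge_sort(items: list) -> list:
    """Sort a list using recursion, loops, and conditionals."""
    if len(items) <= 1:                 # conditional: base case
        return items
    mid = len(items) // 2
    left = merge_sort(items[:mid])      # recursion on each half
    right = merge_sort(items[mid:])
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):  # loop: merge step
        if left[i] <= right[j]:              # conditional: compare
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    return merged + left[i:] + right[j:]

print(merge_sort([5, 3, 8, 1, 9, 2]))  # [1, 2, 3, 5, 8, 9]
```

Because the list is halved at every level of recursion, the procedure scales well to large inputs, which is precisely why algorithmic structure matters for big data.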

A JavaScript program in a VS Code editor with the Dracula theme. Photo by Joan Gamell on Unsplash.

With time, the world marched forward into innovation, and it is now possible to solve complex and sophisticated problems using logical representations of those problems. The buzzwords we use nowadays, like cloud computing, big data, and data analytics (Google Analytics, social media analytics, and other business website analytics), all describe capabilities made possible by powerful machines that can handle complex mathematical computations.

The idea of the algorithm is expanding to an extent not yet known. Modern devices are made fit and workable for almost all human spheres, extending the algorithm's power into all walks of life. Very recently, artificial intelligence has become the catchword used in computation to bring algorithmic utility to diverse fields. Inventions are happening rapidly in both hardware and software.

The most powerful language-processing example we have encountered recently is ChatGPT, an AI program that analyzes a vast database of language and returns a synopsis or a detailed response to the user's inquiry. Innovations, therefore, are far more sophisticated and powerful in the modern age.
