In computer science, an “algorithm” is defined as a computational procedure that returns a result based on checking whether certain conditions hold.
In other words, an algorithm can be thought of as data processing that follows a series of well-defined steps and returns a single result to a given problem.
If we were to draw an algorithm, we might picture it as a flowchart: the order in which we arrange the instructions is crucial if the result is to be as expected.
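To make this concrete, here is a minimal sketch in Python (the temperature-conversion example is ours, purely for illustration): the same two instructions give different results depending on the order in which they are executed.

```python
# The same two operations, executed in a different order, give a different result --
# which is why an algorithm, like a flowchart, fixes the order of its steps.

def celsius_to_fahrenheit(c: float) -> float:
    # Intended order: step 1, multiply by 9/5; step 2, add 32.
    return c * 9 / 5 + 32

def celsius_to_fahrenheit_wrong_order(c: float) -> float:
    # Same operations, swapped: add 32 first, then multiply by 9/5.
    return (c + 32) * 9 / 5

print(celsius_to_fahrenheit(100))              # 212.0 -- the expected result
print(celsius_to_fahrenheit_wrong_order(100))  # 237.6 -- same steps, wrong order
```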
Today the term “algorithm” is back in the limelight, as it is frequently associated with “Artificial Intelligence,” “Machine Learning,” and marketing automation.
Particularly in the latter case, systems not only execute the instructions they are given but, by executing them, improve and in effect rewrite the algorithm itself.
In this context, reference is often made to a particular type of algorithm: predictive algorithms, which learn from collected data in order to predict future trends.
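As a rough illustration of what “learning from collected data to predict a trend” can mean, here is a minimal sketch in Python. The data and the choice of a simple least-squares straight line are our assumptions for the example; real predictive systems use far richer models, but the principle is the same: estimate parameters from past observations, then extrapolate.

```python
def fit_line(xs, ys):
    """Least-squares fit of y = a*x + b; returns (a, b)."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    den = sum((x - mean_x) ** 2 for x in xs)
    a = num / den
    b = mean_y - a * mean_x
    return a, b

# Hypothetical monthly sales figures (made-up data, for illustration only).
months = [1, 2, 3, 4, 5, 6]
sales  = [100, 112, 119, 133, 141, 154]

a, b = fit_line(months, sales)          # "learn" the trend from past data
next_month = 7
print(f"Predicted sales for month {next_month}: {a * next_month + b:.1f}")
```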
There are different types of algorithms, but we can identify a set of common characteristics they share (a short example follows the list below):
- atomicity: the steps of the scheme must not be further decomposable;
- unambiguity: each step of the scheme must be interpretable in only one way;
- finiteness: the number of steps must be finite;
- termination: the algorithm must complete in a finite time;
- effectiveness: it must arrive at exactly one result.
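A classic example that exhibits all of these properties is Euclid's algorithm for the greatest common divisor; the comments in the sketch below map each property to the code (the mapping is our illustration, not a formal proof).

```python
def gcd(a: int, b: int) -> int:
    """Euclid's algorithm for the greatest common divisor of two positive integers."""
    while b != 0:        # unambiguity: each step admits exactly one interpretation
        a, b = b, a % b  # atomicity: a comparison and a remainder, nothing left to decompose
    return a             # effectiveness: exactly one result is returned

# finiteness/termination: b strictly decreases at every iteration and is bounded
# below by 0, so the loop always stops after a finite number of steps.
print(gcd(48, 36))  # 12
```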
The algorithm is a fundamental concept in computer science, and the earliest notions and references to it can be found in documents dating back as far as the 17th century B.C.
However, the term as we know it today comes from the Latin transcription of the name of the Persian mathematician al-Khwarizmi.
The most famous mathematical formalisation is undoubtedly that of the Turing machine, which dates back to the 1930s.