          # Algorithm

• Flowchart of an algorithm (Euclid's algorithm) for calculating the greatest common divisor (g.c.d.) of two numbers a and b in locations named A and B. The algorithm proceeds by successive subtractions in two loops: IF the test B ≥ A yields "yes" or "true" (more accurately, the number b in location B is greater than or equal to the number a in location A) THEN the algorithm specifies B ← B − A (meaning the number b − a replaces the old b). Similarly, IF A > B, THEN A ← A − B. The process terminates when (the contents of) B is 0, yielding the g.c.d. in A. (Algorithm derived from Scott 2009:13; symbols and drawing style from Tausworthe 1977.) Ada Lovelace's diagram from "Note G", the first published computer algorithm.
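The two subtraction loops described in the caption can be sketched directly in Python; the function name `gcd_subtract` is an illustrative choice, not taken from the source.

```python
def gcd_subtract(a, b):
    # Follows the flowchart: while B is nonzero, subtract the smaller
    # number from the larger; the g.c.d. is left in A on termination.
    while b != 0:
        if b >= a:
            b -= a   # B <- B - A
        else:
            a -= b   # A <- A - B
    return a
```

For example, `gcd_subtract(1071, 462)` repeatedly subtracts until B reaches 0, leaving 21 in A.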

In mathematics and computer science, an algorithm is a finite sequence of well-defined, computer-implementable instructions, typically to solve a class of problems or to perform a computation. Algorithms are always unambiguous and are used as specifications for performing calculations, data processing, automated reasoning, and other tasks.

As an effective method, an algorithm can be expressed within a finite amount of space and time, and in a well-defined formal language for calculating a function. Starting from an initial state and initial input (perhaps empty), the instructions describe a computation that, when executed, proceeds through a finite number of well-defined successive states, eventually producing "output" and terminating at a final ending state. The transition from one state to the next is not necessarily deterministic; some algorithms, known as randomized algorithms, incorporate random input.

The concept of algorithm has existed since antiquity. Arithmetic algorithms, such as a division algorithm, were used by ancient Babylonian mathematicians c. 2500 BC and Egyptian mathematicians c. 1550 BC. Greek mathematicians later used algorithms in the sieve of Eratosthenes for finding prime numbers, and the Euclidean algorithm for finding the greatest common divisor of two numbers. Arabic mathematicians such as Al-Kindi in the 9th century used cryptographic algorithms for code-breaking, based on frequency analysis.
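The sieve of Eratosthenes mentioned above survives essentially unchanged as a textbook algorithm; a minimal Python sketch follows (the function name is an illustrative choice).

```python
def sieve(limit):
    # Sieve of Eratosthenes: starting from 2, mark every multiple of
    # each remaining prime as composite; unmarked numbers are prime.
    is_prime = [True] * (limit + 1)
    is_prime[0:2] = [False, False]
    for p in range(2, int(limit ** 0.5) + 1):
        if is_prime[p]:
            for multiple in range(p * p, limit + 1, p):
                is_prime[multiple] = False
    return [n for n, prime in enumerate(is_prime) if prime]
```

Calling `sieve(30)` yields the primes up to 30: 2, 3, 5, 7, 11, 13, 17, 19, 23, 29.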

The word algorithm itself is derived from the name of the 9th-century Persian mathematician Muḥammad ibn Mūsā al-Khwārizmī, Latinized as Algoritmi. A partial formalization of what would become the modern concept of algorithm began with attempts to solve the Entscheidungsproblem (decision problem) posed by David Hilbert in 1928. Later formalizations were framed as attempts to define "effective calculability" or "effective method". Those formalizations included the Gödel–Herbrand–Kleene recursive functions of 1930, 1934 and 1935, Alonzo Church's lambda calculus of 1936, Emil Post's Formulation 1 of 1936, and Alan Turing's Turing machines of 1936–37 and 1939.

• Etymology
• Informal definition
• Formalization
• Design
• Implementation
• Computer algorithms
• Examples
• Algorithmic analysis
• Classification
• Continuous algorithms
• Legal issues
• History: Development of the notion of "algorithm"