# Factor calculator

In mathematical calculations, factorization — the decomposition of natural numbers into prime factors — is frequently needed. Each prime factor divides the original number without remainder and, after decomposition, is written as part of a product together with the other factors.

In general form this looks like a = b × c × d × …, where a is the original number and b, c, d are the prime factors of which it consists. Each factor can appear with its own multiplicity (power): a repeated factor such as b × b is written as b² in the product.
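The decomposition described above can be sketched with simple trial division; this is a minimal illustration (the function name `prime_factors` is my own), not an efficient algorithm for large inputs:

```python
from collections import Counter

def prime_factors(n: int) -> Counter:
    """Decompose a natural number n > 1 into prime factors with multiplicities."""
    factors = Counter()
    d = 2
    while d * d <= n:
        while n % d == 0:  # divide out each prime completely
            factors[d] += 1
            n //= d
        d += 1
    if n > 1:
        factors[n] += 1  # whatever remains is itself prime
    return factors

# 360 = 2³ × 3² × 5
print(prime_factors(360))  # Counter({2: 3, 3: 2, 5: 1})
```

The returned multiplicities are exactly the exponents in the product notation above.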

A common goal of the calculation is to find the greatest common divisor (GCD) and the least common multiple (LCM). Most often, factorization is used in arithmetic operations with common fractions. For large numbers it is computationally expensive and is carried out with electronic computing devices.
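The GCD and LCM follow directly from the factorizations: the GCD takes each common prime at its minimum exponent, the LCM takes every prime at its maximum exponent. A short sketch (helper names are my own; `Counter`'s `&` and `|` conveniently compute exactly those minima and maxima):

```python
import math
from collections import Counter

def prime_factors(n: int) -> Counter:
    """Trial-division factorization of n > 1 into primes with multiplicities."""
    factors, d = Counter(), 2
    while d * d <= n:
        while n % d == 0:
            factors[d] += 1
            n //= d
        d += 1
    if n > 1:
        factors[n] += 1
    return factors

def from_factors(f: Counter) -> int:
    """Rebuild the integer from its prime-exponent map."""
    return math.prod(p ** e for p, e in f.items())

def gcd_lcm(a: int, b: int) -> tuple[int, int]:
    fa, fb = prime_factors(a), prime_factors(b)
    # GCD: common primes at minimum exponent; LCM: all primes at maximum exponent
    return from_factors(fa & fb), from_factors(fa | fb)

print(gcd_lcm(12, 18))  # (6, 36)
```

In practice the GCD is computed far faster with the Euclidean algorithm (`math.gcd`), but the factor-based route shows why the identity GCD(a, b) × LCM(a, b) = a × b holds.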

## Historical background

The possibility of decomposing integers into prime factors was proved in Ancient Greece. According to the fundamental theorem of arithmetic, every integer greater than one can be written, uniquely up to the order of factors, as a product of primes — numbers divisible only by one and themselves. In elementary algebra, factorization was primarily used to factor polynomials and find their roots.

In the 9th century, the Arab scholar Al-Khwarizmi applied factorization to simplify equations and described the process in detail in The Compendious Book on Calculation by Completion and Balancing. The next significant event for factorization occurred only eight centuries later — in 1631, when the English astronomer and mathematician Thomas Harriot published in his Artis Analyticae Praxis tables for addition, subtraction, multiplication and division of monomials, binomials and trinomials. In this work, Harriot used factorization in a new way, expanding an expression of the form (a − b) × (a + c) into a × a + a × c − a × b − b × c.

Systematic decomposition of natural numbers into prime factors came into wider use only in the 17th century, thanks to the French mathematician Pierre de Fermat, whose method was later extended by his compatriot Adrien-Marie Legendre. Fermat's method represents a number as a difference of squares, while Legendre applied continued fractions to factoring. In the 18th century, research was continued by Leonhard Euler and Carl Friedrich Gauss, and the end of the 20th century became a turning point for this area of mathematics.
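Fermat's difference-of-squares idea mentioned above can be sketched directly: search for x with x² − n a perfect square y², so that n = x² − y² = (x − y)(x + y). A minimal illustration (the function name is my own; this assumes n is odd and composite, and is efficient only when the two factors are close together):

```python
from math import isqrt

def fermat_factor(n: int) -> tuple[int, int]:
    """Fermat's method: find x, y with n = x² − y² = (x − y)(x + y).
    Assumes n is odd and composite."""
    x = isqrt(n)
    if x * x < n:
        x += 1  # start at the smallest x with x² ≥ n
    while True:
        y2 = x * x - n
        y = isqrt(y2)
        if y * y == y2:  # x² − n is a perfect square
            return x - y, x + y
        x += 1

print(fermat_factor(5959))  # (59, 101)
```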

Thus, in 1977, Ronald Rivest, Adi Shamir, and Leonard Adleman developed the RSA algorithm, which opened up new possibilities for encryption. RSA's security rests on the practical difficulty of factoring very large numbers — hundreds of decimal digits long. Initially the algorithm was used to encrypt and decrypt data, and it later became a foundation of public-key cryptography.

In 1994, on the initiative of the Dutch scientist Arjen Klaas Lenstra, a 129-digit (428-bit) number was factored for the first time. The computation was distributed: about 600 volunteers contributed some 1,600 computers and spent more than 200 days gathering the data, after which the resulting system of linear equations was solved in about 2 days.

This effort became a milestone in the history of factorization and led to an important conclusion: a 512-bit cipher can be broken despite its complexity, but doing so would require a great deal of time (several months) and serious investment (several million dollars in 1994 terms).