DeepMind’s AI Discovers a New Method of Computing


An advanced artificial intelligence created by the company DeepMind has discovered a new way to multiply matrices of numbers. Described as the first advance of its kind in more than 50 years, the discovery could speed up certain calculations by as much as 20%.

Improving matrix multiplication

Matrix multiplication, which combines two grids of numbers into a third, is a fundamental computing task used by a wide range of software, including graphics, neural networks and scientific simulations. Even a small improvement in the efficiency of these algorithms could therefore translate into significant performance gains and noticeable energy savings.
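For readers unfamiliar with the operation, here is a minimal sketch of the textbook definition, written in plain Python purely for illustration (the function name and the toy matrices are ours, not DeepMind's):

```python
def matmul(A, B):
    """Textbook matrix multiplication: C[i][j] = sum over k of A[i][k] * B[k][j]."""
    n, m, p = len(A), len(B), len(B[0])
    assert all(len(row) == m for row in A), "inner dimensions must match"
    C = [[0] * p for _ in range(n)]
    for i in range(n):
        for j in range(p):
            for k in range(m):
                C[i][j] += A[i][k] * B[k][j]
    return C

# Toy example: multiplying two 2x2 grids this way uses 2*2*2 = 8 scalar multiplications.
A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
print(matmul(A, B))  # [[19, 22], [43, 50]]
```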

For centuries it was believed that the most efficient way to multiply matrices was dictated by the number of elements involved, the task becoming proportionately more complex as the matrices grew. In 1969, however, the mathematician Volker Strassen proved that multiplying a 2 × 2 matrix by another of the same size does not necessarily require eight multiplications: thanks to a clever trick, it can be done with seven.
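Strassen's seven products are well documented; the sketch below spells them out for the 2 × 2 case and checks the result against the textbook computation (the variable names are ours):

```python
def strassen_2x2(A, B):
    """Strassen's 1969 scheme: multiply two 2x2 matrices with only 7 multiplications."""
    (a11, a12), (a21, a22) = A
    (b11, b12), (b21, b22) = B

    # Seven products instead of the usual eight.
    m1 = (a11 + a22) * (b11 + b22)
    m2 = (a21 + a22) * b11
    m3 = a11 * (b12 - b22)
    m4 = a22 * (b21 - b11)
    m5 = (a11 + a12) * b22
    m6 = (a21 - a11) * (b11 + b12)
    m7 = (a12 - a22) * (b21 + b22)

    # Recombine the results using only additions and subtractions.
    return [[m1 + m4 - m5 + m7, m3 + m5],
            [m2 + m4,           m1 - m2 + m3 + m6]]

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
print(strassen_2x2(A, B))  # [[19, 22], [43, 50]] -- same result, one fewer multiplication
```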

Known as Strassen’s algorithm, this approach requires a few extra additions, yet it remains much more efficient: the additions it introduces are far cheaper than the multiplications it saves, especially when the method is applied recursively to large matrices, where each saved multiplication is itself an entire sub-matrix product. Widely used for more than half a century, it has recently been surpassed by a technique discovered by DeepMind’s artificial intelligence AlphaTensor, which is faster and also better suited to current hardware.


Without any prior knowledge of existing solutions, AlphaTensor had to construct a working algorithm that accomplishes the task in a minimum of steps. The AI eventually found a way to multiply two 4 × 4 matrices using only 47 multiplications, beating the 49 required when Strassen’s scheme is applied recursively (seven block products, each costing seven multiplications).
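To see where those figures come from, the short sketch below counts scalar multiplications when Strassen's 2 × 2 scheme is applied recursively to 4 × 4 matrices. The figure of 47 is AlphaTensor's result as reported in Nature (for arithmetic modulo 2); the algorithm itself is not reproduced here, since its coefficients are not included in this article:

```python
def strassen_mult_count(n):
    """Scalar multiplications used when Strassen's 2x2 scheme is applied
    recursively to n x n matrices (n a power of two)."""
    if n == 1:
        return 1
    # One n x n product becomes 7 products of (n/2) x (n/2) blocks.
    return 7 * strassen_mult_count(n // 2)

def naive_mult_count(n):
    """Scalar multiplications in the textbook algorithm: n cubed."""
    return n ** 3

print(naive_mult_count(4))     # 64
print(strassen_mult_count(4))  # 49 = 7 * 7
# AlphaTensor's algorithm needs only 47 for this case (in modulo-2 arithmetic).
```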

An Algorithmic Breakthrough That Heralds Others

While the results, detailed in the journal Nature, are mathematically sound, they are quite counter-intuitive for humans. “Currently, we don’t really know how the system achieved this,” says Hussein Fawzi of DeepMind. “In a way, neural networks have an intuition about what looks good and what looks bad. I think there is theoretical work to be done here to figure out exactly how deep learning manages to do this kind of thing.”

According to Oded Lachish of the University of London, this algorithmic breakthrough heralds others. “I think we will see AI-generated results for other problems of a similar nature. Fewer operations mean not only faster results but also less energy consumed.”

As impressive as AlphaTensor’s capabilities are, these advances don’t necessarily mean that human coders will soon be sidelined. “Should programmers be worried? Maybe in the distant future. Automatic optimization has been practiced in the microprocessor design industry for decades, and this is just another important tool in the coder’s arsenal,” concludes Lachish.
