Sep 24, 2021
A Computer Breakthrough Helps Solve a Complex Math Problem 1 Million Times Faster
Posted by Kelvin Dafiaghor in categories: information science, mathematics, robotics/AI
Reservoir computing, a machine learning approach that mimics the workings of the human brain, is changing how scientists tackle the most complex data-processing challenges. Now, researchers have discovered a new technique that can make it up to a million times faster on certain tasks while requiring far fewer computing resources and far less training data.
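What makes reservoir computing cheap to train is that the large, randomly connected recurrent network (the "reservoir") is left fixed, and only a simple linear readout is fitted to its states. The sketch below is a minimal illustration of that general idea in Python, not the researchers' specific method; the network size, weight scales, and ridge parameter are arbitrary choices for demonstration.

```python
import numpy as np

# Minimal echo-state-network sketch (illustrative parameters, not the paper's).
rng = np.random.default_rng(0)
n_inputs, n_reservoir = 1, 300

W_in = rng.uniform(-0.5, 0.5, (n_reservoir, n_inputs))  # fixed input weights
W = rng.normal(0.0, 1.0, (n_reservoir, n_reservoir))    # fixed recurrent weights
W *= 0.9 / max(abs(np.linalg.eigvals(W)))               # scale spectral radius below 1

def run_reservoir(inputs):
    """Drive the fixed reservoir with an input sequence and collect its states."""
    states = np.zeros((len(inputs), n_reservoir))
    x = np.zeros(n_reservoir)
    for t, u in enumerate(inputs):
        x = np.tanh(W @ x + W_in @ np.atleast_1d(u))
        states[t] = x
    return states

# Train ONLY the linear readout, via ridge regression: predict the next value.
u = np.sin(0.1 * np.arange(2000))  # toy input signal
X = run_reservoir(u[:-1])
y = u[1:]
ridge = 1e-6
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_reservoir), X.T @ y)

print("one-step prediction error:", np.mean((X @ W_out - y) ** 2))
```

Because training reduces to a single linear regression rather than backpropagation through time, fitting the readout takes a fraction of a second even on a desktop machine.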
Using the next-generation technique, the researchers solved a complex computing problem in less than a second on a desktop computer. Problems of this kind, such as forecasting how dynamical systems like the weather evolve over time, are exactly why reservoir computing was developed in the early 2000s.
These systems can be extremely difficult to predict, with the “butterfly effect” being a well-known example. The concept, closely associated with the work of mathematician and meteorologist Edward Lorenz, describes how a butterfly fluttering its wings can influence the weather weeks later. Reservoir computing is well suited to learning such dynamical systems and can provide accurate projections of how they will behave in the future; however, the larger and more complex the system, the more computing resources, the larger the network of artificial neurons, and the more time are required to obtain accurate forecasts.
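For concreteness, the Lorenz system, Lorenz's textbook model of chaotic dynamics, is just three coupled differential equations, and it is a standard benchmark for reservoir-computing forecasts. The sketch below (a simple Euler integration with the classic parameters σ=10, ρ=28, β=8/3; the step size and initial conditions are arbitrary choices for illustration) shows the butterfly effect directly: two trajectories that start a hair's breadth apart end up far from each other.

```python
import numpy as np

def lorenz_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One Euler step of the Lorenz equations with the classic chaotic parameters."""
    x, y, z = state
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    return state + dt * np.array([dx, dy, dz])

# Two trajectories starting a tiny distance apart diverge rapidly.
a = np.array([1.0, 1.0, 1.0])
b = a + 1e-8  # perturb one coordinate set by one part in a hundred million
for _ in range(5000):  # integrate 50 time units
    a, b = lorenz_step(a), lorenz_step(b)
print("separation after 50 time units:", np.linalg.norm(a - b))
```

That exponential sensitivity to initial conditions is what limits long-range forecasts, and it is why the speed and data efficiency of the forecasting method matter so much.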