The recent paper, Discovering faster matrix multiplication algorithms with reinforcement learning by DeepMind, has been garnering much attention from both the ML and TCS communities. The algorithm in the paper, called AlphaTensor, can find fast matrix multiplication algorithms for some fixed-size matrices.

Some in the ML community hail it as yet another outstanding achievement for deep RL:

"Congrats to for developing AlphaTensor, a truly remarkable application of deep reinforcement learning to algorithm discovery. These are early steps in AI inventing new ideas in math and physics worthy of a Nobel Prize and Fields Medal. We live in exciting times!" - Lex Fridman, October 6, 2022

On the other hand, the TCS community has a slightly different opinion about the work:

"A Nobel Prize and a Fields Medal for slightly improving the rank of matrix mult over GF(2) in a few cases? As math, it wouldn't even be accepted in STOC/FOCS/SODA. Deep learning has made amazing progress in some areas, but this ain't (yet) one of them."

Since matrix multiplication is so pervasive in modern computing, the paper claims that these new algorithms could lead to a 10-20% improvement across trillions of calculations:

"Since 1969 Strassen's algorithm has famously stood as the fastest way to multiply 2 matrices - but with #AlphaTensor we've found a new algorithm that's faster, with potential to improve efficiency by 10-20% across trillions of calculations per day!" - Demis Hassabis, October 5, 2022

However, its applicability might be limited due to the algorithms being numerically unstable.
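To make the kind of object AlphaTensor searches for concrete, here is Strassen's classic 1969 scheme, which multiplies two 2x2 matrices using 7 scalar multiplications instead of the naive 8. AlphaTensor discovers schemes of this flavor (expressed as low-rank tensor decompositions) for other fixed sizes; this sketch is just the well-known Strassen identity, not an algorithm from the paper.

```python
def strassen_2x2(A, B):
    """Multiply 2x2 matrices (nested tuples) with 7 multiplications.

    Because the scheme uses only additions, subtractions, and products
    of the entries, it applies recursively to block matrices, giving
    the O(n^2.81) Strassen algorithm for large n.
    """
    (a11, a12), (a21, a22) = A
    (b11, b12), (b21, b22) = B

    # The 7 products of Strassen's scheme.
    m1 = (a11 + a22) * (b11 + b22)
    m2 = (a21 + a22) * b11
    m3 = a11 * (b12 - b22)
    m4 = a22 * (b21 - b11)
    m5 = (a11 + a12) * b22
    m6 = (a21 - a11) * (b11 + b12)
    m7 = (a12 - a22) * (b21 + b22)

    # Recombine into the entries of C = A @ B.
    return ((m1 + m4 - m5 + m7, m3 + m5),
            (m2 + m4, m1 - m2 + m3 + m6))
```

For example, `strassen_2x2(((1, 2), (3, 4)), ((5, 6), (7, 8)))` returns `((19, 22), (43, 50))`, matching the naive product. The trade-off the post alludes to is visible here: the extra additions and subtractions can amplify rounding error in floating point, which is one reason the numerical stability of such discovered schemes matters.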