

Poster

Learning to compute Gröbner bases

Hiroshi Kera · Yuki Ishihara · Yuta Kambe · Tristan Vaccon · Kazuhiro Yokoyama

East Exhibit Hall A-C #1004
Fri 13 Dec 11 a.m. PST — 2 p.m. PST

Abstract:

Solving a polynomial system, or computing an associated Gröbner basis, is a fundamental task in computational algebra. However, it is also notorious for its doubly exponential worst-case time complexity in the number of variables. This paper is the first to address the learning of Gröbner basis computation with Transformers. Training requires many pairs of a polynomial system and its associated Gröbner basis, which raises two novel algebraic problems: the random generation of Gröbner bases and their transformation into non-Gröbner generating sets, termed the backward Gröbner problem. We resolve these problems for 0-dimensional radical ideals, the ideals that appear in various applications. Further, we propose a hybrid input embedding that handles coefficient tokens with a continuity bias and avoids growth of the vocabulary set. The experiments show that our dataset generation method is a few orders of magnitude faster than a naive approach, overcoming a crucial challenge in learning to compute Gröbner bases, and that Gröbner basis computation is learnable for a particular class of ideals.
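To make the forward/backward data-generation idea concrete, the following is a minimal sketch (not the authors' pipeline) using SymPy: the forward direction computes a Gröbner basis, and the backward direction illustrates why producing non-Gröbner generating sets of the same ideal is itself nontrivial. The polynomials and the random-combination scheme are illustrative assumptions only.

```python
# Illustrative sketch only; not the paper's dataset-generation algorithm.
import random
import sympy as sp

x, y = sp.symbols("x y")

# Forward direction: computing a Groebner basis is the expensive task the
# Transformer is trained to approximate.
F = [x**2 + y**2 - 1, x*y - 2]
G = sp.groebner(F, x, y, order="lex")
print(list(G))

def random_poly(max_deg=2):
    # Random bivariate polynomial with small integer coefficients.
    return sp.Poly.from_dict(
        {(i, j): random.randint(-3, 3)
         for i in range(max_deg + 1) for j in range(max_deg + 1 - i)},
        x, y).as_expr()

# Backward direction (naive version): take random polynomial combinations of
# the basis elements. This only guarantees a sub-ideal of <G>; ensuring the
# new set generates exactly the same ideal is the harder part that the
# paper's backward Groebner problem addresses.
G_list = list(G)
F_new = [sp.expand(sum(random_poly() * g for g in G_list)) for _ in range(3)]
print(F_new)
```

The hybrid input embedding can likewise be sketched. Below is a minimal, assumed design (not the paper's exact module): symbol tokens use a learned embedding table, while numeric coefficient tokens pass their value through a small MLP, which gives a continuity bias between nearby coefficients without enlarging the vocabulary. The placeholder `<COEFF>` token id and layer sizes are hypothetical.

```python
# Illustrative sketch of a hybrid (discrete + continuous) input embedding.
import torch
import torch.nn as nn

class HybridEmbedding(nn.Module):
    def __init__(self, vocab_size: int, d_model: int, coeff_token_id: int):
        super().__init__()
        self.coeff_token_id = coeff_token_id
        self.symbol_emb = nn.Embedding(vocab_size, d_model)
        self.coeff_mlp = nn.Sequential(
            nn.Linear(1, d_model), nn.GELU(), nn.Linear(d_model, d_model)
        )

    def forward(self, token_ids: torch.Tensor, coeff_values: torch.Tensor):
        # token_ids: (batch, seq); coeff_values: (batch, seq) real-valued,
        # meaningful only where token_ids == coeff_token_id.
        emb = self.symbol_emb(token_ids)
        coeff_emb = self.coeff_mlp(coeff_values.unsqueeze(-1))
        mask = (token_ids == self.coeff_token_id).unsqueeze(-1)
        return torch.where(mask, coeff_emb, emb)

# Example: token id 5 is a hypothetical <COEFF> placeholder whose numeric
# value flows through the continuous branch.
emb = HybridEmbedding(vocab_size=32, d_model=64, coeff_token_id=5)
ids = torch.tensor([[3, 5, 7, 5]])
vals = torch.tensor([[0.0, 2.5, 0.0, -1.0]])
print(emb(ids, vals).shape)  # torch.Size([1, 4, 64])
```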
