

Poster in Workshop: Second Workshop on Efficient Natural Language and Speech Processing (ENLSP-II)

SymbolicGPT: A Generative Transformer Model for Symbolic Regression

Mojtaba Valipour · Bowen You · Maysum H Panju · Ali Ghodsi

Keywords: ENLSP-Main


Abstract:

Symbolic regression is the task of identifying a mathematical expression that best fits a given dataset of input and output values. Because the space of mathematical expressions is so rich, symbolic regression is generally a challenging problem. While conventional approaches based on evolutionary (genetic) algorithms have been used for decades, deep learning-based methods are relatively new and remain an active area of research. In this work, we present SymbolicGPT, a novel transformer-based language model for symbolic regression. This model exploits the advantages of probabilistic language models such as GPT, including strong performance, scalability, and flexibility. Through comprehensive experiments, we show that our model performs strongly compared to competing models.
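To make the task concrete, the sketch below illustrates the symbolic-regression setting described in the abstract: given input-output data, a candidate expression with unknown constants is fitted to the observations. The skeleton here is hand-picked for illustration rather than produced by SymbolicGPT, and the function names (skeleton, mse) and constant placeholders are hypothetical; the abstract does not specify the paper's actual decoding or fitting pipeline.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical expression skeleton a generative model might propose:
#     "C0 * x**2 + C1 * x + C2"
# where C0, C1, C2 are numeric constants to be fitted to the data.
def skeleton(x, c):
    return c[0] * x ** 2 + c[1] * x + c[2]

# Toy dataset drawn from y = 1.5*x**2 - 2*x + 0.5 with small noise.
rng = np.random.default_rng(0)
x = np.linspace(-3.0, 3.0, 200)
y = 1.5 * x ** 2 - 2.0 * x + 0.5 + 0.05 * rng.standard_normal(x.size)

# Fit the constants by minimizing mean squared error on the data.
def mse(c):
    return float(np.mean((skeleton(x, c) - y) ** 2))

result = minimize(mse, x0=np.zeros(3), method="BFGS")
print("fitted constants:", np.round(result.x, 3))  # roughly [1.5, -2.0, 0.5]
print("mean squared error:", round(mse(result.x), 6))
```

In a generative approach such as the one the abstract describes, the expression skeleton itself would be produced by the language model conditioned on the dataset, rather than chosen by hand as in this sketch.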
