Poster in Workshop: NeurIPS 2023 Workshop: Machine Learning and the Physical Sciences

Towards an Astronomical Foundation Model for Stars

Henry Leung


Abstract:

Rapid strides are currently being made in the field of artificial intelligence using Large Language Models (LLMs) built on the Transformer architecture. Aside from some use of Transformers' core technical component, the attention mechanism, the real potential of Transformers for creating artificial intelligence in astronomy has not yet been explored. Here, we introduce a novel perspective on such models in data-driven astronomy by proposing a framework for astronomical data that uses the same core techniques and architecture as natural-language LLMs. Using a variety of observations and labels of stars as an example, we build a prototype of a foundation model, and we show that this model can be easily adapted to and trained with cross-survey astronomical data sets. This single model can perform both discriminative and generative tasks, even though it was not trained for any of the specific tasks we test it on. This demonstrates that foundation models in astronomy are well within reach and will play a large role in the analysis of current and future large surveys.
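The abstract does not spell out the architecture, but the core idea it describes is a single Transformer that ingests heterogeneous stellar quantities and predicts any requested quantity from the rest. Below is a minimal sketch of one way such a model could be structured: each scalar observation or label is treated as a token (its value plus a learned embedding identifying what kind of quantity it is) fed to a shared Transformer encoder. This is an illustrative assumption, not the authors' implementation; all names, token-type ids, and dimensions (`StarTransformer`, `n_token_types`, `d_model`, etc.) are hypothetical.

```python
# A minimal sketch (not the paper's actual model) of a foundation model for
# stars: heterogeneous observations and labels become tokens, and one encoder
# serves both discriminative and generative queries.
import torch
import torch.nn as nn


class StarTransformer(nn.Module):
    def __init__(self, n_token_types: int, d_model: int = 128,
                 n_heads: int = 4, n_layers: int = 2):
        super().__init__()
        # Each scalar input (e.g. a magnitude, a stellar label, a spectral
        # pixel) is embedded from its value plus a learned "unit" embedding
        # identifying which physical quantity the token represents.
        self.value_proj = nn.Linear(1, d_model)
        self.type_embed = nn.Embedding(n_token_types, d_model)
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)
        # A shared head decodes any requested quantity from the
        # contextualized representation of its token.
        self.head = nn.Linear(d_model, 1)

    def forward(self, values, token_types, padding_mask=None):
        # values: (batch, seq, 1); token_types: (batch, seq) integer ids
        x = self.value_proj(values) + self.type_embed(token_types)
        h = self.encoder(x, src_key_padding_mask=padding_mask)
        return self.head(h)


# Hypothetical usage: given two observed tokens, query a third quantity by
# appending its token with a placeholder value and reading off the prediction.
model = StarTransformer(n_token_types=8)
values = torch.tensor([[[14.2], [1.1], [0.0]]])  # last value is the query slot
types = torch.tensor([[0, 1, 3]])                # e.g. g mag, bp-rp, T_eff
pred = model(values, types)[:, -1, 0]            # prediction for the query token
```

Because the same encoder attends over whatever tokens are present, the model can, in principle, infer labels from observations (discriminative) or observations from labels (generative) without task-specific training, which matches the cross-task behavior the abstract claims.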
