
Learning Fractional White Noises in Neural Stochastic Differential Equations
Anh Tong · Thanh Nguyen-Tang · Toan Tran · Jaesik Choi

Thu Dec 01 09:00 AM -- 11:00 AM (PST) @ Hall J #430

Differential equations play important roles in modeling complex physical systems. Recent advances open interesting research directions by combining differential equations with neural networks. By including noise, stochastic differential equations (SDEs) allow us to model uncertainty and measurement imprecision in data. Many variants of noise are known to exist in real-world data; for example, previous models idealize white noise as being induced by Brownian motion. Nevertheless, there is a lack of machine learning models that can handle more general noise. In this paper, we introduce a generalized fractional white noise into existing models and propose an efficient approximation of noise sample paths based on classical integration methods and sparse Gaussian processes. Our experimental results demonstrate that the proposed model can capture noise characteristics, such as continuity, from various time series data, thereby improving model fit over existing models. We also examine how our approach can be applied to score-based generative models, showing that there is a case in which our generalized noise yields a better image generation metric.

Author Information

Anh Tong (KAIST)
Thanh Nguyen-Tang (Johns Hopkins University)
Toan Tran (Vinai artificial intelligence application and research JSC)
Jaesik Choi (KAIST)
