TEMPiRL: Foundational Compounding Temporal Drift Theory for Temporal-Graph Adaptation in Large Language Models
Abstract
The core architecture of foundation models, including Large Language Models (LLMs), does not treat time in sequential data as a continuous conditioning variable, which prevents such models from learning from evolving information. This limitation is most evident in interactive systems where LLMs are deployed. This study introduces TEMPiRL, a mathematical framework for analyzing parameter-efficient temporal-graph adaptations. The framework posits an additive architecture that combines low-rank modulators with Time2Vec embeddings to produce temporally and graph-structured representations. TEMPiRL offers three main theoretical guarantees: a Lipschitz-based bound on model-output drift that is proportional to the norm of the low-rank adapter; a Rademacher complexity bound on the generalization error that grows with the rank r of the adapter; and a formal condition for performance gains that trades off the strength of the temporal signal, measured in expectation under a time-averaged decision rule, against the approximation error and the estimation error. This work lays a foundation for additive foundation-model architectures that support continual adaptation.
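To make the additive architecture concrete, the following is a minimal numerical sketch, not the paper's implementation: a Time2Vec embedding (one linear component plus sinusoidal components, following the standard Time2Vec formulation) conditions the input, and a low-rank modulator B A is added to a frozen base weight W. All shapes, the scaling factor `alpha`, and the concatenation of the time embedding with the input are illustrative assumptions. The final lines check the abstract's drift claim in this toy setting: the output drift introduced by the adapter is bounded by the adapter's spectral norm times the input norm.

```python
import numpy as np

def time2vec(tau, omega, phi):
    """Time2Vec: first component is linear in time, the rest are periodic (sin)."""
    v = omega * tau + phi
    return np.concatenate([v[:1], np.sin(v[1:])])

def adapted_forward(x, W, A, B, t_emb, alpha=1.0):
    """Additive low-rank adaptation: y = (W + alpha * B A) h,
    where h is the input conditioned on the temporal embedding."""
    h = np.concatenate([x, t_emb])
    return (W + alpha * B @ A) @ h

rng = np.random.default_rng(0)
d_in, k, d_out, r = 4, 3, 5, 2          # toy dimensions; r is the adapter rank

omega, phi = rng.normal(size=k), rng.normal(size=k)
t_emb = time2vec(0.5, omega, phi)       # embed timestamp tau = 0.5

W = rng.normal(size=(d_out, d_in + k))          # frozen base weight
A = 0.1 * rng.normal(size=(r, d_in + k))        # low-rank factors
B = 0.1 * rng.normal(size=(d_out, r))
x = rng.normal(size=d_in)

h = np.concatenate([x, t_emb])
y_base = W @ h
y_adapted = adapted_forward(x, W, A, B, t_emb)

# Drift of the adapted output from the base output is controlled by the
# adapter norm: ||y_adapted - y_base|| <= ||B A||_2 * ||h||.
drift = np.linalg.norm(y_adapted - y_base)
bound = np.linalg.norm(B @ A, ord=2) * np.linalg.norm(h)
assert drift <= bound + 1e-12
```

In this linear toy model the bound is exact linear algebra; the abstract's Lipschitz-based result extends the same idea through the nonlinear layers of a full model.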