Learning with streaming data has attracted much attention during the past few years. Though most studies consider data streams with fixed features, in real practice the features may be evolvable. For example, features of data gathered by limited-lifespan sensors will change when these sensors are substituted by new ones. In this paper, we propose a novel learning paradigm: Feature Evolvable Streaming Learning, where old features vanish and new features emerge. Rather than relying only on the current features, we attempt to recover the vanished features and exploit them to improve performance. Specifically, we learn two models, one from the recovered features and one from the current features. To benefit from the recovered features, we develop two ensemble methods. In the first method, we combine the predictions of the two models and theoretically show that, with the assistance of old features, the performance on new features can be improved. In the second approach, we dynamically select the best single prediction and establish a better performance guarantee when the best model switches. Experiments on both synthetic and real data validate the effectiveness of our proposal.
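The first ensemble strategy described above, combining the predictions of the model on recovered features with the model on current features, can be sketched with exponential (Hedge-style) weighting. This is a minimal illustration under assumed squared loss; the function name, the learning rate `eta`, and the interface are illustrative, not the paper's actual implementation.

```python
import numpy as np

def weighted_combination(pred_old, pred_new, loss_old, loss_new, eta=0.5):
    """Combine two model predictions with exponential weights.

    pred_old / pred_new: predictions from the model on recovered (old)
    features and the model on current (new) features.
    loss_old / loss_new: cumulative losses each model has suffered so far;
    the model with smaller cumulative loss receives larger weight.
    Returns the combined prediction and the normalized weight vector.
    """
    w = np.exp(-eta * np.array([loss_old, loss_new], dtype=float))
    w /= w.sum()  # normalize so the weights form a convex combination
    return w[0] * pred_old + w[1] * pred_new, w

# With equal cumulative losses the two models are averaged; as one model's
# loss grows, the combination shifts toward the other model's prediction.
combined, weights = weighted_combination(1.0, 3.0, loss_old=0.0, loss_new=0.0)
```

The second strategy (dynamic selection) would instead follow the single model currently holding the larger weight, which is what yields the improved guarantee when the best model switches over time.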
Author Information
Bo-Jian Hou (LAMDA Group)
Lijun Zhang (Nanjing University (NJU))
Zhi-Hua Zhou (Nanjing University)
More from the Same Authors
- 2021 Poster: Revisiting Smoothed Online Learning (Lijun Zhang · Wei Jiang · Shiyin Lu · Tianbao Yang)
- 2021 Poster: Dual Adaptivity: A Universal Algorithm for Minimizing the Adaptive Regret of Convex Functions (Lijun Zhang · Guanghui Wang · Wei-Wei Tu · Wei Jiang · Zhi-Hua Zhou)
- 2021 Poster: Online Convex Optimization with Continuous Switching Constraint (Guanghui Wang · Yuanyu Wan · Tianbao Yang · Lijun Zhang)
- 2020 Poster: Dynamic Regret of Convex and Smooth Functions (Peng Zhao · Yu-Jie Zhang · Lijun Zhang · Zhi-Hua Zhou)
- 2018 Poster: Adaptive Online Learning in Dynamic Environments (Lijun Zhang · Shiyin Lu · Zhi-Hua Zhou)
- 2018 Poster: $\ell_1$-regression with Heavy-tailed Distributions (Lijun Zhang · Zhi-Hua Zhou)
- 2018 Poster: Fast Rates of ERM and Stochastic Approximation: Adaptive to Error Bound Conditions (Mingrui Liu · Xiaoxuan Zhang · Lijun Zhang · Rong Jin · Tianbao Yang)
- 2017 Poster: Scalable Demand-Aware Recommendation (Jinfeng Yi · Cho-Jui Hsieh · Kush Varshney · Lijun Zhang · Yao Li)
- 2017 Poster: Improved Dynamic Regret for Non-degenerate Functions (Lijun Zhang · Tianbao Yang · Jinfeng Yi · Rong Jin · Zhi-Hua Zhou)
- 2017 Poster: Subset Selection under Noise (Chao Qian · Jing-Cheng Shi · Yang Yu · Ke Tang · Zhi-Hua Zhou)
- 2015 Poster: Subset Selection by Pareto Optimization (Chao Qian · Yang Yu · Zhi-Hua Zhou)
- 2013 Poster: Mixed Optimization for Smooth Functions (Mehrdad Mahdavi · Lijun Zhang · Rong Jin)
- 2013 Poster: Linear Convergence with Condition Number Independent Access of Full Gradients (Lijun Zhang · Mehrdad Mahdavi · Rong Jin)