In this work, we study data-driven decision-making and depart from the classical independent and identically distributed (i.i.d.) assumption. We present a new framework in which historical samples are generated from unknown and different distributions, which we dub \textit{heterogeneous environments}. These distributions are assumed to lie in a heterogeneity ball with known radius, centered around the (also unknown) future (out-of-sample) distribution on which the performance of a decision will be evaluated. We quantify the asymptotic worst-case regret achievable by central data-driven policies such as Sample Average Approximation, as well as by rate-optimal policies, as a function of the radius of the heterogeneity ball. Our work shows that the achievable performance varies considerably across different combinations of problem classes and notions of heterogeneity. We demonstrate the versatility of our framework by comparing achievable guarantees for the heterogeneous versions of widely studied data-driven problems such as pricing, ski-rental, and newsvendor. En route, we establish a new connection between data-driven decision-making and distributionally robust optimization.
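As a point of reference for the Sample Average Approximation (SAA) policy mentioned above, here is a minimal sketch of SAA for the newsvendor problem, one of the problem classes the abstract cites. The cost parameters, the demand model, and the mean-drift "heterogeneity" in the synthetic data below are all illustrative assumptions, not taken from the paper.

```python
import numpy as np

def saa_newsvendor(samples, underage_cost, overage_cost):
    """SAA for the newsvendor problem: the order quantity minimizing the
    empirical average cost is the empirical quantile of demand at the
    critical ratio b / (b + h)."""
    critical_ratio = underage_cost / (underage_cost + overage_cost)
    return float(np.quantile(np.asarray(samples), critical_ratio))

# Hypothetical heterogeneous environment: each historical sample is drawn
# from a slightly different demand distribution (means drift within a
# small "ball" around the out-of-sample mean of 100).
rng = np.random.default_rng(0)
means = 100 + rng.uniform(-5, 5, size=500)  # perturbed per-sample means
samples = rng.normal(means, 10)

# Underage cost 3, overage cost 1 => critical ratio 0.75.
q = saa_newsvendor(samples, underage_cost=3, overage_cost=1)
```

SAA simply treats the pooled historical samples as if they were i.i.d. draws from the future distribution; the paper's framework quantifies how much regret this incurs as the heterogeneity radius grows.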
Author Information
Omar Besbes (Columbia University)
Will Ma (Columbia University)
Omar Mouchtaki (Columbia University)