First-Order Methods for Two-Stage Stochastic Optimization

Sun 26.01 12:30 - 13:00

Abstract: The emergence of big data has highlighted the growing importance of data-driven approaches in stochastic optimization. Sample Average Approximation (SAA) is a widely used method, known for its simplicity and compatibility with first-order optimization techniques. However, SAA often overfits when the amount of data is limited, and it struggles to obtain feasible solutions when the feasible set is given only implicitly. To address these limitations, distributionally robust optimization (DRO) approaches, particularly those based on Wasserstein-type ambiguity sets, have been proposed as an alternative, offering improved performance at the cost of higher computational complexity. Our research explores solving the Sample Robust Optimization (SRO) model, which is equivalent to distributionally robust optimization with the type-∞ Wasserstein ambiguity set, through first-order methods. Specifically, we focus on two-stage settings and identify conditions under which tractable first-order algorithms can be applied to the SRO problem. We then apply first-order methods such as Mirror Descent, as well as primal-dual methods such as Mirror-Prox, in both the stochastic and online settings. To support these developments, we validate the performance of our methods through numerical experiments. These experiments compare the effectiveness, out-of-sample performance, and convergence rates of SRO, SAA, and type-1 Wasserstein-based DRO across offline and online applications, showcasing the practical utility of our approach.
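As a concrete illustration of the kind of first-order method the abstract mentions (this sketch is not from the talk itself), below is a minimal entropic Mirror Descent loop on the probability simplex; the linear objective `c` and all parameter values are placeholder assumptions chosen only for demonstration:

```python
import numpy as np

def mirror_descent_simplex(grad, x0, step, n_iters):
    """Entropic mirror descent (exponentiated gradient) on the simplex."""
    x = x0.copy()
    avg = np.zeros_like(x0)
    for _ in range(n_iters):
        g = grad(x)
        # Mirror step with the entropy mirror map: multiplicative update,
        # then renormalize back onto the probability simplex.
        x = x * np.exp(-step * g)
        x /= x.sum()
        avg += x
    return avg / n_iters  # averaged iterate, standard for convergence guarantees

# Toy example: minimize f(x) = <c, x> over the simplex;
# the minimizer puts all mass on the smallest coordinate of c.
c = np.array([0.3, 0.1, 0.6])
x_star = mirror_descent_simplex(lambda x: c, np.ones(3) / 3, step=0.5, n_iters=500)
```

In a two-stage setting, `grad` would instead return a (stochastic) subgradient of the first-stage objective plus the value of the second-stage recourse problem, but the update structure stays the same.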

Speaker

Ido Iutcu

Technion

  • Advisor: Shimrit Shtern

  • Academic Degree: M.Sc.