Simultaneous Incremental Gradient Method for Inconsistent Convex Optimization Problem

Thomas Katsekpor

Research output: Contribution to journal › Article › peer-review

Abstract

We consider the minimization of an additive cost function, consisting of a large number of smooth convex component functions, over a large number of closed convex sets. Minimizations of this type arise in several settings, including large-scale data processing, distributed optimization across a finite number of agents, and statistical estimation. We adopt a first-order simultaneous incremental gradient method, which takes a finite number of steps sequentially along the gradients of the component functions, with intermediate adjustment of the variables after each component function is processed in cyclic order. The last iterate obtained at the end of a cycle is then projected onto each of the constraint sets, and the new iterate is a relaxed convex combination of these projections. We prove convergence of this algorithm from any starting point: to a feasible solution of the optimization problem in the consistent case, where the intersection of the convex sets is non-empty, and to a weighted least-squares-type solution in the inconsistent case.
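The iteration described in the abstract can be sketched as follows. This is a minimal illustration under assumed ingredients, not the paper's algorithm: the step size, relaxation parameter, weights, and the toy component functions and box constraints below are all hypothetical choices made for the example.

```python
import numpy as np

def simultaneous_incremental_gradient(x0, grads, projections, weights,
                                      step=0.01, relaxation=1.0, n_iters=200):
    """Sketch of one cycle-based scheme in the spirit of the abstract:
    sequential gradient steps through the component functions, then a
    relaxed convex combination of projections onto all constraint sets."""
    x = np.asarray(x0, dtype=float)
    for _ in range(n_iters):
        y = x.copy()
        # incremental pass: one gradient step per component function, cyclic order
        for g in grads:
            y = y - step * g(y)
        # simultaneous step: weighted average of projections of y onto every set
        p = sum(w * P(y) for w, P in zip(weights, projections))
        # relaxed convex combination of the cycle's end point and the projections
        x = (1 - relaxation) * y + relaxation * p
    return x

# Toy consistent example: f(x) = ||x - a||^2 + ||x - b||^2 over two boxes
# whose intersection [0.25, 0.75]^2 contains the unconstrained minimizer.
a, b = np.array([1.0, 0.0]), np.array([0.0, 1.0])
grads = [lambda x: 2 * (x - a), lambda x: 2 * (x - b)]
def box_projection(lo, hi):
    return lambda x: np.clip(x, lo, hi)
projections = [box_projection(0.0, 1.0), box_projection(0.25, 0.75)]
x_star = simultaneous_incremental_gradient(np.zeros(2), grads, projections,
                                           weights=[0.5, 0.5])
```

In this consistent toy case the iterates settle near the minimizer (0.5, 0.5), which lies inside both boxes; when the sets do not intersect, a scheme of this shape instead balances the weighted projections, in line with the weighted least-squares-type limit the abstract describes.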

Original language: English
Article number: 221
Journal: Mediterranean Journal of Mathematics
Volume: 22
Issue number: 8
DOIs
Publication status: Published - Dec 2025

Keywords

  • Simultaneous projection
  • closed convex sets
  • consistent and inconsistent case
  • continuously differentiable functions
  • gradient method
  • relaxation parameter
