Prof. Andrea Walther (Department of Mathematics, Humboldt University Berlin)
hosted by Seminar Series on Scientific Computing
"On a semismooth conjugate gradient method"
In machine learning and other large-scale applications, deterministic and stochastic variants of the steepest descent method are nowadays widely used to minimize objectives that are only piecewise smooth. As an alternative, in this talk we present a deterministic descent method based on a generalization of the rescaled conjugate gradient method proposed by Phil Wolfe in 1975 for convex objectives. Without the convexity assumption, the new method exploits semismoothness to obtain conjugate pairs of generalized gradients, so that it can converge only to Clarke stationary points. In addition to the theoretical analysis, we present preliminary numerical results.
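To make the setting of the abstract concrete, the sketch below minimizes a toy piecewise smooth (but convex) objective with a plain subgradient descent loop. This is the baseline class of method the abstract contrasts against, not the semismooth conjugate gradient method of the talk; the objective, step-size rule, and all names are illustrative assumptions.

```python
import numpy as np

def f(x):
    # Toy piecewise smooth objective: nonsmooth kink at x = 1
    # plus a smooth strongly convex quadratic term.
    return np.abs(x - 1.0).sum() + 0.5 * (x ** 2).sum()

def clarke_subgrad(x):
    # One element of the Clarke generalized gradient of f:
    # sign(x - 1) is set-valued at the kink; np.sign picks 0 there.
    return np.sign(x - 1.0) + x

def subgradient_descent(x0, steps=500):
    # Basic subgradient method with diminishing step sizes 1/(k+2);
    # for this strongly convex f the iterates approach the minimizer x = 1.
    x = x0.copy()
    for k in range(steps):
        x = x - clarke_subgrad(x) / (k + 2)
    return x

x_star = subgradient_descent(np.array([5.0, -3.0]))
```

Because the objective is nonsmooth at the kink, the method cannot use exact gradients there; it picks a single generalized gradient per step, which is why such schemes are only guaranteed to approach Clarke stationary points in the nonconvex case the talk addresses.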
Time: Thursday, 23.06.2022, 12:00
Place: hybrid, Room 32-349 and via Zoom