Caltech

H.B. Keller Colloquium

Monday, February 27, 2023
4:00pm to 5:00pm
Annenberg 105
Swarm-Based Gradient Descent Method for Non-Convex Optimization
Eitan Tadmor, Distinguished University Professor, Department of Mathematics, University of Maryland

We introduce a new swarm-based gradient descent (SBGD) method for non-convex optimization. The swarm consists of agents, identified with positions x and masses m.

There are three key aspects to the SBGD dynamics:

(i) persistent transition of mass from higher to lower ground;

(ii) a random choice of marching direction aligned with the orientation of the steepest gradient descent; and

(iii) a time-stepping protocol, h(x,m), which decreases with m.

The interplay between positions and masses leads to a dynamic distinction between 'leaders' and 'explorers'. Heavier agents lead the swarm near local minima with small time steps.

Lighter agents explore the landscape in random directions with large time steps, leading to improved positions, i.e., reduced 'loss' for the swarm.
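As a rough illustration of these dynamics, the sketch below implements a toy swarm-based descent in Python. The mass-transfer fraction q, the step-size range (h_min, h_max), the random step scaling, and the test function are illustrative assumptions, not the speaker's exact protocol.

```python
import numpy as np

def sbgd(F, grad_F, X0, n_steps=200, q=0.1, h_max=1.0, h_min=1e-3, rng=None):
    """Toy swarm-based gradient descent sketch (illustrative, not the authors' exact scheme).

    X0 : (N, d) array of initial agent positions; all agents start with equal mass.
    q  : fraction of mass each non-leading agent sheds toward the current best agent per step.
    """
    rng = np.random.default_rng() if rng is None else rng
    X = np.array(X0, dtype=float)
    N = X.shape[0]
    m = np.full(N, 1.0 / N)                      # masses sum to one

    for _ in range(n_steps):
        vals = np.array([F(x) for x in X])
        best = np.argmin(vals)                   # current 'leader' on lowest ground

        # (i) transfer mass from agents on higher ground to the leader
        shed = q * m * (vals > vals[best])
        m -= shed
        m[best] += shed.sum()

        # (iii) step size decreases with mass: heavy leaders take small steps,
        #       light explorers take large steps
        h = h_min + (h_max - h_min) * (1.0 - m / m.max())

        # (ii) march along the steepest-descent direction; a random factor on the
        #      step length supplies the stochastic element (an assumption here)
        for i in range(N):
            g = grad_F(X[i])
            ng = np.linalg.norm(g)
            if ng > 0:
                X[i] -= h[i] * rng.uniform(0.5, 1.0) * g / ng

    best = np.argmin([F(x) for x in X])
    return X[best], F(X[best])

if __name__ == "__main__":
    # Non-convex test problem: a quadratic bowl with sinusoidal ripples
    F = lambda x: np.sum(x**2) + 3 * np.sin(3 * x[0]) * np.sin(3 * x[1])
    grad_F = lambda x: 2 * x + 9 * np.array([np.cos(3 * x[0]) * np.sin(3 * x[1]),
                                             np.sin(3 * x[0]) * np.cos(3 * x[1])])
    X0 = np.random.default_rng(0).uniform(-3, 3, size=(20, 2))
    x_star, f_star = sbgd(F, grad_F, X0)
    print(x_star, f_star)
```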

Convergence analysis and numerical simulations demonstrate the effectiveness of the SBGD method as a global optimizer.

For more information, please contact Diana Bohler by phone at 626-395-1768 or by email at [email protected].