Optimization Nuggets: Stochastic Polyak Step-size, Part 2 : Keep the gradient flowing

This post was syndicated from Planet SciPy.

This blog post discusses the convergence rate of Stochastic Gradient Descent with the Stochastic Polyak Step-size (SGD-SPS) for minimizing a finite-sum objective. Building on the proof from the previous post, we show that the convergence rate can be improved to O(1/t) under the additional assumption that …
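To make the setting concrete, here is a minimal sketch of SGD-SPS on a synthetic interpolating least-squares problem. The step-size at each iteration is the stochastic Polyak rule (f_i(x) − f_i*) / ‖∇f_i(x)‖², where f_i* is the minimum of the sampled component; the problem is constructed so that f_i* = 0 for every i. The function name `sgd_sps` and all problem sizes are illustrative, not taken from the original post.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic interpolation problem: consistent least squares with
# f_i(x) = 0.5 * (A[i] @ x - b[i])**2, so every f_i* = 0.
n, d = 20, 50  # illustrative sizes (underdetermined, so interpolation holds)
A = rng.standard_normal((n, d))
b = A @ rng.standard_normal(d)  # consistent system by construction

def sgd_sps(A, b, n_iters=3000):
    """SGD with the stochastic Polyak step-size (illustrative sketch)."""
    x = np.zeros(A.shape[1])
    m = A.shape[0]
    for _ in range(n_iters):
        i = rng.integers(m)                 # sample one component uniformly
        residual = A[i] @ x - b[i]
        loss_i = 0.5 * residual**2          # f_i(x)
        grad_i = residual * A[i]            # grad of f_i at x
        g2 = grad_i @ grad_i
        if g2 > 0:
            # Polyak step: (f_i(x) - f_i*) / ||grad f_i(x)||^2, with f_i* = 0 here
            x -= (loss_i / g2) * grad_i
    return x

x_hat = sgd_sps(A, b)
print(np.mean((A @ x_hat - b) ** 2))  # mean squared residual, near zero
```

On this interpolating problem the iterates drive every component loss to zero, which is the regime where the O(1/t) rate discussed in the post applies.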

November 19, 2023 at 04:30 AM
