Many kinds of problems, such as modeling, optimal control, and machine learning, can be formulated as optimization problems. Gradient descent is the most popular method for solving such problems, and many accelerated variants of gradient descent have been designed to improve its performance. In this paper, we analyze basic gradient descent, momentum gradient descent, and Nesterov accelerated gradient descent from a system perspective, and we find that all of them can be formulated as feedback control problems for tracking an extreme point. On this basis, a unified gradient descent design procedure is given, in which a high-order transfer function is considered. Furthermore, as an extension, both a fractional integrator and a general fractional transfer function are considered, resulting in fractional gradient descent. Due to the infinite-dimensional nature of fractional-order systems, a numerical inverse Laplace transform and the Matlab command stmcb() are used to obtain a finite-order implementation of fractional gradient descent. Besides the simplified design procedure, simulation results show that the convergence rate of fractional gradient descent is more robust to the step size.
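For reference, the three integer-order methods named above admit standard update rules; the following is a minimal illustrative sketch (not the paper's control-theoretic derivation), minimizing a hypothetical ill-conditioned quadratic f(x) = 0.5 xᵀAx, with step size, momentum coefficient, and iteration count chosen as assumptions for the example.

```python
import numpy as np

def grad(x, A):
    """Gradient of f(x) = 0.5 * x^T A x, i.e. A x."""
    return A @ x

def gd(x0, A, lr=0.1, steps=200):
    """Basic gradient descent: x <- x - lr * grad(x)."""
    x = x0.copy()
    for _ in range(steps):
        x -= lr * grad(x, A)
    return x

def momentum_gd(x0, A, lr=0.1, beta=0.9, steps=200):
    """Momentum (heavy-ball): velocity accumulates past gradients."""
    x, v = x0.copy(), np.zeros_like(x0)
    for _ in range(steps):
        v = beta * v + lr * grad(x, A)
        x -= v
    return x

def nesterov_gd(x0, A, lr=0.1, beta=0.9, steps=200):
    """Nesterov: gradient is evaluated at the look-ahead point x - beta*v."""
    x, v = x0.copy(), np.zeros_like(x0)
    for _ in range(steps):
        v = beta * v + lr * grad(x - beta * v, A)
        x -= v
    return x

# Hypothetical test problem: diagonal quadratic with condition number 10.
A = np.diag([1.0, 10.0])
x0 = np.array([5.0, 5.0])
for method in (gd, momentum_gd, nesterov_gd):
    print(method.__name__, np.linalg.norm(method(x0, A)))
```

All three iterations drive the iterate toward the minimizer at the origin; viewed as a feedback loop, the gradient plays the role of the tracking error and the update rule the role of the controller.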
