GitHub / EliaFantini / ZO-AdaMM-vs-FO-AdaMM-convergence-and-minima-shape-comparison
Implementation and comparison of zero-order vs. first-order methods on the AdaMM (aka AMSGrad) optimizer: analysis of convergence rates and minima shape. Illustrative sketches of both ideas follow the metadata below.
Fork of OptML-KEC/optml-mini-project
Stars: 0
Forks: 0
Open issues: 0
License: None
Language: Jupyter Notebook
Size: 752 MB
Dependencies parsed at: Pending
Created at: almost 3 years ago
Updated at: almost 3 years ago
Pushed at: over 2 years ago
Last synced at: about 2 years ago
Topics: amsgrad, cnn-filters, convergence-analysis, convergence-rate, cosine-similarity, deep-learning, first-order-adamm, first-order-methods, machine-learning, minima-analysis, optimization-algorithms, optimizers, python, pytorch, t-sne, zero-order-adammm, zero-order-methods, zo-sgd