GitHub / NILAY2233 / Machine_Learning---Learning-Gradient-Descent-optimization-techniques
Gradient Descent is an iterative optimization technique used to fit machine learning models with differentiable loss functions. At each step it computes the first-order derivative (the gradient) of the loss with respect to the model parameters and moves the parameters a small step in the direction that decreases the loss.
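The update rule described above can be sketched in a few lines of NumPy. The repository's notebooks are not shown here, so the data, learning rate, and iteration count below are illustrative assumptions, not the repo's actual code:

```python
import numpy as np

# Vanilla (batch) gradient descent on a least-squares loss.
# Synthetic data: y = 3*x + 1 plus a little noise (hypothetical example).
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 1))
y = 3.0 * X[:, 0] + 1.0 + rng.normal(scale=0.1, size=100)

# Add a bias column so the parameters are [slope, intercept].
Xb = np.column_stack([X, np.ones(len(X))])
theta = np.zeros(2)
lr = 0.1  # learning rate (step size), chosen by hand here

for _ in range(500):
    residual = Xb @ theta - y             # prediction error on the full dataset
    grad = 2 * Xb.T @ residual / len(y)   # first-order derivative of the MSE loss
    theta -= lr * grad                    # step against the gradient

print(theta)  # close to [3.0, 1.0]
```

Because the whole dataset is used for every gradient computation, this is the "batch" variant; the learning rate trades off convergence speed against the risk of overshooting the minimum.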
JSON API: http://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/NILAY2233%2FMachine_Learning---Learning-Gradient-Descent-optimization-techniques
PURL: pkg:github/NILAY2233/Machine_Learning---Learning-Gradient-Descent-optimization-techniques
Stars: 0
Forks: 0
Open issues: 0
License: None
Language: Jupyter Notebook
Size: 718 KB
Dependencies parsed at: Pending
Created at: about 1 year ago
Updated at: about 1 year ago
Pushed at: about 1 year ago
Last synced at: about 1 year ago
Topics: batch-gradient-descent, gradient-descent, minibatch-gradient-descent, stochastic-gradient-descent
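The topics above name the three standard gradient-descent variants, which differ only in how many samples feed each gradient estimate: the full dataset (batch), a single sample (stochastic), or a small random subset (mini-batch). A minimal sketch covering all three, with hypothetical function and parameter names not taken from the repository:

```python
import numpy as np

def gd_variant(Xb, y, lr=0.05, epochs=200, batch_size=16, seed=0):
    """Mini-batch SGD on a least-squares loss.

    batch_size=len(y) gives batch gradient descent,
    batch_size=1 gives pure stochastic gradient descent.
    """
    rng = np.random.default_rng(seed)
    theta = np.zeros(Xb.shape[1])
    n = len(y)
    for _ in range(epochs):
        order = rng.permutation(n)           # shuffle once per epoch
        for start in range(0, n, batch_size):
            idx = order[start:start + batch_size]
            residual = Xb[idx] @ theta - y[idx]
            # Gradient of MSE estimated on this batch only
            theta -= lr * 2 * Xb[idx].T @ residual / len(idx)
    return theta
```

Smaller batches give noisier but cheaper updates, which is why mini-batch is the usual compromise between the stability of batch descent and the per-step cost of stochastic descent.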