GitHub / pngo1997 / Neural-Network-Modifications-Hyperparameter-Experiments
Modifies a neural network's hyperparameters, activation functions, cost functions, and regularization methods to improve training performance and generalization.
Stars: 0
Forks: 0
Open issues: 0
License: None
Language: Jupyter Notebook
Size: 73.2 KB
Dependencies parsed at: Pending
Created at: 4 months ago
Updated at: 4 months ago
Pushed at: 4 months ago
Last synced at: 3 months ago
Topics: activation, deep-learning, dropout-rates, epoch, hyperparameter-optimization, leaky-relu, neural-network, neural-network-training, python, regularization, relu, sigmoid-function, tanh
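The repository's own notebooks are not reproduced here, so the following is only a minimal sketch of the kind of experiment the description and topics suggest: sweeping activation functions and dropout rates on a small dense classifier and comparing validation accuracy. The choice of TensorFlow/Keras, the MNIST dataset, the layer sizes, and the hyperparameter grid are all assumptions for illustration, not the project's actual setup.

```python
# Hypothetical sketch (not the repository's code): compare activations and
# dropout rates on a small fully connected network, assuming TensorFlow/Keras
# and MNIST as a stand-in dataset.
import tensorflow as tf

def build_model(activation="relu", dropout_rate=0.0):
    """Small dense classifier; activation and dropout rate are the experiment knobs."""
    return tf.keras.Sequential([
        tf.keras.layers.Flatten(input_shape=(28, 28)),
        tf.keras.layers.Dense(128, activation=activation),
        tf.keras.layers.Dropout(dropout_rate),
        tf.keras.layers.Dense(10, activation="softmax"),
    ])

# Load and scale the data to [0, 1].
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0

# Grid over activations named in the repo topics and an assumed dropout grid.
for activation in ("sigmoid", "tanh", "relu"):
    for dropout_rate in (0.0, 0.2, 0.5):
        model = build_model(activation, dropout_rate)
        model.compile(optimizer="sgd",
                      loss="sparse_categorical_crossentropy",
                      metrics=["accuracy"])
        history = model.fit(x_train, y_train, epochs=5, batch_size=32,
                            validation_data=(x_test, y_test), verbose=0)
        print(f"{activation:8s} dropout={dropout_rate:.1f} "
              f"val_acc={history.history['val_accuracy'][-1]:.4f}")
```

The same loop structure extends to the other knobs the description mentions, e.g. swapping the loss function passed to `compile` or adding an L2 kernel regularizer to the `Dense` layers.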