An open API service providing repository metadata for many open source software ecosystems.

GitHub topics: relu-derivative

mansi-k/BackPropagation

Implements the back-propagation algorithm for a neural network from scratch using Tanh and ReLU derivatives, with experiments performed for learning purposes

Language: Jupyter Notebook - Size: 348 KB - Last synced at: about 2 months ago - Pushed at: almost 4 years ago - Stars: 0 - Forks: 0
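The entry above uses Tanh and ReLU derivatives inside a from-scratch back-propagation loop. A minimal sketch of those two activation derivatives in NumPy (the function names and the choice of derivative value at zero are illustrative assumptions, not taken from the repository):

```python
import numpy as np

def tanh_deriv(x):
    # d/dx tanh(x) = 1 - tanh(x)^2
    return 1.0 - np.tanh(x) ** 2

def relu_deriv(x):
    # d/dx ReLU(x) = 1 for x > 0, else 0 (0 chosen at x == 0)
    return (x > 0).astype(x.dtype)

x = np.array([-2.0, 0.0, 3.0])
print(relu_deriv(x))                 # [0. 0. 1.]
print(tanh_deriv(np.array([0.0])))   # [1.]
```

During back-propagation these derivatives are multiplied element-wise with the upstream gradient at each layer's pre-activation values.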

bhattbhavesh91/why-is-relu-non-linear

A small walk-through showing why ReLU is non-linear!

Language: Jupyter Notebook - Size: 119 KB - Last synced at: about 2 months ago - Pushed at: about 4 years ago - Stars: 2 - Forks: 5
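The point the walk-through above makes can be checked directly: a linear function must satisfy f(a + b) = f(a) + f(b), and ReLU does not. A minimal counterexample (values chosen here for illustration, not taken from the notebook):

```python
import numpy as np

relu = lambda x: np.maximum(0, x)

# Linearity would require relu(a + b) == relu(a) + relu(b) for all a, b.
a, b = np.array([2.0]), np.array([-3.0])
print(relu(a + b))        # [0.]
print(relu(a) + relu(b))  # [2.]
```

Because the two results differ, ReLU is non-linear, which is what lets stacked ReLU layers represent functions a purely linear network cannot.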

aex-nirvael/ReLU

Backward pass of the ReLU activation function for a neural network.

Language: Python - Size: 1.95 KB - Last synced at: about 2 years ago - Pushed at: over 5 years ago - Stars: 0 - Forks: 0
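A ReLU backward pass like the one the entry above describes is usually written as a forward function that caches its input plus a backward function that masks the upstream gradient. A minimal sketch under that common convention (the names and cache layout are assumptions, not taken from the repository):

```python
import numpy as np

def relu_forward(x):
    # Forward pass: clamp negatives to zero; cache x for the backward pass.
    cache = x
    return np.maximum(0, x), cache

def relu_backward(dout, cache):
    # Backward pass: gradient flows only where the input was positive.
    x = cache
    return dout * (x > 0)

out, cache = relu_forward(np.array([-1.0, 0.0, 2.0]))
dx = relu_backward(np.ones(3), cache)
print(out)  # [0. 0. 2.]
print(dx)   # [0. 0. 1.]
```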

juliusberner/regularity_relu_network

Towards a regularity theory for ReLU networks (construction of approximating networks, ReLU derivative at zero, theory)

Language: Jupyter Notebook - Size: 229 KB - Last synced at: about 2 years ago - Pushed at: over 5 years ago - Stars: 0 - Forks: 0
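The entry above touches on the ReLU derivative at zero, where ReLU is not differentiable and any value in [0, 1] is a valid subgradient, so implementations must pick a convention. A minimal sketch making that choice explicit (the parameterization is an illustrative assumption, not the repository's construction):

```python
import numpy as np

def relu_deriv(x, at_zero=0.0):
    # 1 where x > 0, 0 where x < 0; at x == 0 use the chosen
    # subgradient value (any value in [0, 1] is valid).
    d = (x > 0).astype(float)
    d[x == 0] = at_zero
    return d

x = np.array([-1.0, 0.0, 1.0])
print(relu_deriv(x))               # [0. 0. 1.]
print(relu_deriv(x, at_zero=0.5))  # [0.  0.5 1. ]
```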