R-based implementation of a neural network with one hidden layer of width 50, trained with Stochastic Gradient Descent. The algorithm updates the parameters through forward-backward propagation on minibatches of size 50. The aim is to estimate the model parameters from a training dataset sampled from a 1-dimensional function.
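A minimal sketch of the idea (not the repository's own code): a one-hidden-layer network of width 50 trained by minibatch SGD with batch size 50 on a 1-D regression task. The target function, tanh activation, squared-error loss, learning rate, and epoch count are assumptions made for illustration.

```r
set.seed(42)

# Training data: noisy samples of a 1-dimensional target function (assumed).
n <- 1000
x <- matrix(runif(n, -3, 3), ncol = 1)
y <- matrix(sin(x) + rnorm(n, sd = 0.1), ncol = 1)

# Network dimensions: 1 input, 50 hidden units, 1 output.
H  <- 50
W1 <- matrix(rnorm(1 * H, sd = 0.1), nrow = 1)  # input -> hidden weights
b1 <- rep(0, H)                                 # hidden biases
W2 <- matrix(rnorm(H * 1, sd = 0.1), nrow = H)  # hidden -> output weights
b2 <- 0                                         # output bias

lr         <- 0.05  # learning rate (assumed)
batch_size <- 50    # minibatch size from the description
epochs     <- 200   # number of passes over the data (assumed)

for (epoch in seq_len(epochs)) {
  idx <- sample(n)  # shuffle before forming minibatches
  for (start in seq(1, n, by = batch_size)) {
    b  <- idx[start:min(start + batch_size - 1, n)]
    xb <- x[b, , drop = FALSE]
    yb <- y[b, , drop = FALSE]
    m  <- nrow(xb)

    # Forward pass
    z1   <- sweep(xb %*% W1, 2, b1, "+")  # hidden pre-activations
    a1   <- tanh(z1)                      # hidden activations
    yhat <- a1 %*% W2 + b2                # network output

    # Backward pass for mean squared error loss
    d_out    <- 2 * (yhat - yb) / m               # dL/dyhat
    gW2      <- t(a1) %*% d_out                   # gradient w.r.t. W2
    gb2      <- sum(d_out)                        # gradient w.r.t. b2
    d_hidden <- (d_out %*% t(W2)) * (1 - a1^2)    # backprop through tanh
    gW1      <- t(xb) %*% d_hidden                # gradient w.r.t. W1
    gb1      <- colSums(d_hidden)                 # gradient w.r.t. b1

    # SGD parameter update on this minibatch
    W1 <- W1 - lr * gW1; b1 <- b1 - lr * gb1
    W2 <- W2 - lr * gW2; b2 <- b2 - lr * gb2
  }
}
```

After training, predictions on new inputs follow the same forward pass (`tanh(sweep(xnew %*% W1, 2, b1, "+")) %*% W2 + b2`).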
LeeoBianchi/SDG_NeuralNetwork