New ask Hacker News story: Neograd – A deep learning framework created from scratch using Python and NumPy

Neograd – A deep learning framework created from scratch using Python and NumPy
2 by pranftw | 0 comments on Hacker News.
Hey everyone! I released v0.0.2 of neograd, a deep learning framework created from scratch using Python and NumPy, with automatic differentiation capabilities.

I’d taken for granted that I understood how convolutions work. Just implement a sliding window, perform element-wise multiplication, take the sum - sounds simple, right? Add to that accounting for the running time of the algorithm, the backward pass to get its gradients, and convolutions over volumes, and it turned out to be an excruciating undertaking. (A rough sketch of the sliding-window idea follows after the links below.)

This release includes:
- Gradient checking to verify the correctness of the gradients calculated by autograd
- Optimization algorithms like Momentum, RMSProp, and Adam
- 2D and 3D Convolution and 2D and 3D Pooling layers for Convolutional Neural Networks
- Saving trained models and weights to disk and loading them whenever required
- Adding checkpoints while training the model
- Documentation hosted at https://ift.tt/FcCQ5Ev

Check out the GitHub repo - https://ift.tt/2IeXDUM
Explore the new features on Google Colab - https://ift.tt/vTNk9Kt
https://ift.tt/npPzGHX
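
For the curious, here is a minimal NumPy sketch of the sliding-window convolution described above, together with a finite-difference check in the spirit of the gradient-checking feature. It assumes a single channel, no padding, and stride 1, and it is an illustration only - not neograd's actual implementation or API.

import numpy as np

def conv2d_naive(x, kernel):
    # Sliding-window 2D convolution (single channel, no padding, stride 1):
    # at each output position, multiply the window element-wise with the
    # kernel and sum the products.
    kh, kw = kernel.shape
    oh, ow = x.shape[0] - kh + 1, x.shape[1] - kw + 1
    out = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = np.sum(x[i:i + kh, j:j + kw] * kernel)
    return out

x = np.random.randn(5, 5)
k = np.random.randn(3, 3)
loss = lambda kern: conv2d_naive(x, kern).sum()

# Finite-difference gradient check for one kernel entry (a generic sketch,
# not neograd's API): perturb k[0, 0] and compare against the analytic
# gradient, which for this summed loss is the sum of x over all window offsets.
eps = 1e-6
k_plus, k_minus = k.copy(), k.copy()
k_plus[0, 0] += eps
k_minus[0, 0] -= eps
numeric = (loss(k_plus) - loss(k_minus)) / (2 * eps)
analytic = x[:3, :3].sum()
print(np.isclose(numeric, analytic))  # True

The backward pass over volumes, strides, and padding is exactly where the bookkeeping gets hard, which is what the post above is getting at.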

Comments