Neural Network Learns Sine Function with Backprop in Python

Build a neural network from scratch in NumPy to learn the sine function using backpropagation for gradient estimation. 🚀

Machine Learning & Simulation
3.1K views • May 30, 2023

About this video

Backpropagation is a method to obtain a gradient estimate for the weights and biases in a neural network. As a special case of reverse-mode automatic differentiation, it is a function transformation of the forward pass. Let's implement it in NumPy. Here is the code: https://github.com/Ceyron/machine-learning-and-simulation/blob/main/english/neural_network_sine_learning/nn_learns_sine_custom_backpropagation_in_numpy.ipynb
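As a companion to the notebook linked above, here is a minimal sketch of the same idea: a one-hidden-layer MLP with sigmoid activation, Xavier Glorot initialization, a hand-written backward pass, and gradient descent on a sine toy dataset. Layer sizes, learning rate, and epoch count are my own illustrative choices, not necessarily those of the video.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dataset: inputs on [0, 2*pi], targets y = sin(x)
x = np.linspace(0.0, 2.0 * np.pi, 200).reshape(-1, 1)
y = np.sin(x)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Xavier Glorot (uniform) initialization: limit = sqrt(6 / (fan_in + fan_out))
n_in, n_hidden, n_out = 1, 20, 1
lim_1 = np.sqrt(6.0 / (n_in + n_hidden))
lim_2 = np.sqrt(6.0 / (n_hidden + n_out))
W_1 = rng.uniform(-lim_1, lim_1, (n_in, n_hidden))
b_1 = np.zeros(n_hidden)
W_2 = rng.uniform(-lim_2, lim_2, (n_hidden, n_out))
b_2 = np.zeros(n_out)

learning_rate = 0.1
loss_history = []
for epoch in range(10_000):
    # Forward/primal pass
    z_1 = x @ W_1 + b_1
    a_1 = sigmoid(z_1)
    y_hat = a_1 @ W_2 + b_2  # linear output layer
    loss = np.mean((y_hat - y) ** 2)  # mean squared error
    loss_history.append(loss)

    # Backward/reverse pass: propagate the cotangent through each primal op
    d_y_hat = 2.0 * (y_hat - y) / y.size  # dL/d(y_hat)
    d_W_2 = a_1.T @ d_y_hat
    d_b_2 = d_y_hat.sum(axis=0)
    d_a_1 = d_y_hat @ W_2.T
    d_z_1 = d_a_1 * a_1 * (1.0 - a_1)  # sigmoid'(z) = s(z) * (1 - s(z))
    d_W_1 = x.T @ d_z_1
    d_b_1 = d_z_1.sum(axis=0)

    # Plain gradient-descent update
    W_1 -= learning_rate * d_W_1
    b_1 -= learning_rate * d_b_1
    W_2 -= learning_rate * d_W_2
    b_2 -= learning_rate * d_b_2

print(f"MSE after training: {loss_history[-1]:.4f}")
```

The loss history can then be plotted alongside the trained network's prediction against the true sine curve, as done at the end of the video.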

-------

👉 This educational series is supported by the world-leaders in integrating machine learning and artificial intelligence with simulation and scientific computing, Pasteur Labs and Institute for Simulation Intelligence. Check out https://simulation.science/ for more on their pursuit of 'Nobel-Turing' technologies (https://arxiv.org/abs/2112.03235 ), and for partnership or career opportunities.

-------

📝 : Check out the GitHub Repository of the channel, where I upload all the handwritten notes and source-code files (contributions are very welcome): https://github.com/Ceyron/machine-learning-and-simulation

📢 : Follow me on LinkedIn or Twitter for updates on the channel and other cool Machine Learning & Simulation stuff: https://www.linkedin.com/in/felix-koehler and https://twitter.com/felix_m_koehler

💸 : If you want to support my work on the channel, you can become a Patreon here: https://www.patreon.com/MLsim

🪙: Or you can make a one-time donation via PayPal: https://www.paypal.com/paypalme/FelixMKoehler

-------

Timestamps:
00:00 Intro
02:00 The dataset
02:25 MLP architecture with sigmoid activation function
03:26 Forward/Primal pass
06:40 Xavier Glorot weight initialization
08:06 Backward/Reverse pass
14:15 "Learning": approximately solving an optimization problem
15:10 More details on the backward pass and pullback operations
16:52 Imports
17:07 Setting random seed
17:24 Constants/Hyperparameters
18:08 Toy dataset generation
19:56 Defining nonlinear activation functions
20:39 Implementing parameter initialization
24:45 Implementing forward pass
27:20 Implementing loss function
29:06 Backward function of the loss
30:36 Backward pass of the network
45:29 Training loop
48:15 Plot loss history
48:36 Plot trained network prediction
49:20 Summary
50:59 Outro
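For the "pullback operations" discussed at 15:10, a single primal operation can be illustrated in isolation. Below is a hedged sketch (function names and shapes are my assumptions, not the notebook's) of the pullback, i.e. the vector-Jacobian product, of one affine layer z = x @ W + b:

```python
import numpy as np

def affine_forward(x, W, b):
    # Primal operation: z = x @ W + b
    # x: (batch, d_in), W: (d_in, d_out), b: (d_out,)
    return x @ W + b

def affine_pullback(x, W, d_z):
    # Given the cotangent d_z = dL/dz of the output, return the
    # cotangents of every input of the primal operation.
    d_x = d_z @ W.T        # dL/dx, passed on to the previous layer
    d_W = x.T @ d_z        # dL/dW
    d_b = d_z.sum(axis=0)  # dL/db (bias was broadcast over the batch)
    return d_x, d_W, d_b
```

Chaining such pullbacks in the reverse order of the forward pass, starting from the cotangent of the loss, yields the full backward pass of the network.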

Video Information

Likes: 81 • Duration: 52:04

User Reviews: 4.5 (3 ratings)
