Week 2: Training and Gradients
Objective
Understand how a model improves by measuring loss, computing gradients, and updating parameters with gradient descent.
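The core loop behind this week is small enough to sketch in a few lines. The example below is a toy illustration, not the contents of week2_training.py: it fits y = w * x to four points by repeatedly measuring the mean squared error, computing its gradient with respect to w, and stepping downhill.

```python
# Toy gradient descent: fit y = w * x by minimizing mean squared error.
# All names and values here are illustrative, not from the course script.
xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.0, 4.0, 6.0, 8.0]  # the true relationship is y = 2x

w = 0.0    # the parameter we update
lr = 0.01  # the learning rate (step size)
for step in range(200):
    # loss = mean((w*x - y)^2), so d(loss)/dw = mean(2 * (w*x - y) * x)
    grad = sum(2 * (w * x - y) * x for x, y in zip(xs, ys)) / len(xs)
    w -= lr * grad  # move w a small step against the gradient
print(w)  # converges toward 2.0
```

Every term in the update is something the checkpoint questions ask about: the loss measures how wrong the current w is, the gradient says which direction increases the loss, and the learning rate controls how far each step moves.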
Required Videos
- 3Blue1Brown: “Gradient descent, how neural networks learn”
- Optional: any short derivative refresher if slopes feel fuzzy
Tasks
- Run python week2_training.py.
- Open the saved loss plot and describe its shape.
- Change the learning rate and compare the results.
- Change the number of training steps and note how the final loss and fit change.
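The learning-rate and step-count experiments above can be sketched as a small comparison loop. This assumes the same toy mean-squared-error setup as before; week2_training.py may organize things differently.

```python
# Compare final loss for several learning rates on a toy fit of y = w * x.
# train() is an illustrative helper, not a function from the course script.
def train(lr, steps):
    xs = [1.0, 2.0, 3.0, 4.0]
    ys = [2.0, 4.0, 6.0, 8.0]
    w = 0.0
    for _ in range(steps):
        grad = sum(2 * (w * x - y) * x for x, y in zip(xs, ys)) / len(xs)
        w -= lr * grad
    # return the final mean squared error
    return sum((w * x - y) ** 2 for x, y in zip(xs, ys)) / len(xs)

for lr in (0.001, 0.01, 0.1):
    print(lr, train(lr, steps=100))
```

Running this shows the pattern the tasks are pointing at: a very small learning rate still has visible loss after 100 steps, while a well-chosen one drives the loss close to zero.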
Deliverables
- A saved loss plot
- A note describing which learning rate worked best
- A short explanation of why the loss changed over time
Checkpoint Questions
- What does the loss function measure?
- What are we changing after each training step?
- Why is the gradient useful?
- What can go wrong if the learning rate is too high?
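The last checkpoint question can be answered by experiment. In the same toy setup (an illustrative sketch, not the course script), a step size that is too large overshoots the minimum on every step, so the loss grows instead of shrinking:

```python
# A too-large learning rate makes gradient descent diverge on y = w * x.
xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.0, 4.0, 6.0, 8.0]

def loss(w):
    return sum((w * x - y) ** 2 for x, y in zip(xs, ys)) / len(xs)

w, lr = 0.0, 0.2  # too large for this problem
history = []
for _ in range(5):
    grad = sum(2 * (w * x - y) * x for x, y in zip(xs, ys)) / len(xs)
    w -= lr * grad
    history.append(loss(w))
print(history)  # each loss is larger than the one before: divergence
```

Each update jumps past the minimum to a point that is farther away than where it started, which is why the loss plot for a too-high learning rate climbs or oscillates rather than falling.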
Stretch Idea
Add random noise to the training data and see how the final fit changes.
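One way to try this stretch idea, sketched against the same toy setup (the noise level and seed are arbitrary choices, not from the course materials):

```python
import random

random.seed(0)
xs = [1.0, 2.0, 3.0, 4.0]
ys = [2 * x + random.gauss(0, 0.5) for x in xs]  # noisy version of y = 2x

w = 0.0
for _ in range(200):
    grad = sum(2 * (w * x - y) * x for x, y in zip(xs, ys)) / len(xs)
    w -= 0.01 * grad
print(w)  # near 2, but pulled slightly off by the noise
```

With noise in the targets, gradient descent still converges, but to the best fit of the noisy data rather than the exact underlying slope, so the final w lands near 2 instead of exactly on it.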