AI Course/week02_training_and_gradients/README.md

Week 2: Training and Gradients

Objective

Understand how a model improves by measuring loss, computing gradients, and updating parameters with gradient descent.
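The loop described above can be sketched in a few lines. This toy example is not the course script — it just fits a single slope parameter `w` to made-up data so each piece (loss, gradient, update) is visible:

```python
# Minimal gradient descent sketch (illustrative, not week2_training.py).
# Fit y = w * x to toy data by minimizing mean squared error.

xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.0, 4.0, 6.0, 8.0]  # generated with the true slope w = 2

w = 0.0              # initial parameter guess
learning_rate = 0.05

for step in range(200):
    # Gradient of the mean squared error with respect to w:
    # d/dw mean((w*x - y)^2) = mean(2 * (w*x - y) * x)
    grad = sum(2 * (w * x - y) * x for x, y in zip(xs, ys)) / len(xs)
    # Update: step opposite the gradient, scaled by the learning rate.
    w -= learning_rate * grad

print(round(w, 3))  # → 2.0, the true slope
```

Each iteration nudges `w` downhill on the loss surface; the learning rate controls how big each nudge is.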

Required Videos

  • 3Blue1Brown: “Gradient descent, how neural networks learn”
  • Optional: any short derivative refresher if slopes feel fuzzy

Tasks

  1. Run `python week2_training.py`.
  2. Open the saved loss plot and describe its shape.
  3. Change the learning rate and compare the results.
  4. Change the number of training steps and observe how the final loss changes.
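Task 3 can be explored with a quick sweep. The sketch below uses the same toy slope-fitting setup as a stand-in (the internals of `week2_training.py` may differ) to show how the final loss depends on the learning rate:

```python
def final_loss(learning_rate, steps=100):
    """Fit y = w * x by gradient descent and return the final MSE."""
    xs = [1.0, 2.0, 3.0, 4.0]
    ys = [2.0, 4.0, 6.0, 8.0]
    w = 0.0
    for _ in range(steps):
        grad = sum(2 * (w * x - y) * x for x, y in zip(xs, ys)) / len(xs)
        w -= learning_rate * grad
    return sum((w * x - y) ** 2 for x, y in zip(xs, ys)) / len(xs)

# Small rates converge slowly; larger (but still safe) rates converge faster.
for lr in [0.001, 0.01, 0.05, 0.1]:
    print(f"lr={lr}: final loss {final_loss(lr):.6f}")
```

On this toy problem the larger rates reach a lower loss in the same number of steps, but that trend reverses once the rate gets big enough to overshoot.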

Deliverables

  • A saved loss plot
  • A note describing which learning rate worked best
  • A short explanation of why the loss changed over time

Checkpoint Questions

  • What does the loss function measure?
  • What are we changing after each training step?
  • Why is the gradient useful?
  • What can go wrong if the learning rate is too high?
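For the last question, a step that is too large overshoots the minimum, so each update can make the loss worse instead of better. A toy demonstration on the same assumed slope-fitting setup:

```python
def loss_curve(learning_rate, steps=10):
    """Return the loss after each gradient descent step on y = w * x."""
    xs = [1.0, 2.0, 3.0, 4.0]
    ys = [2.0, 4.0, 6.0, 8.0]
    w = 0.0
    curve = []
    for _ in range(steps):
        grad = sum(2 * (w * x - y) * x for x, y in zip(xs, ys)) / len(xs)
        w -= learning_rate * grad
        curve.append(sum((w * x - y) ** 2 for x, y in zip(xs, ys)) / len(xs))
    return curve

print(loss_curve(0.01)[-1] < loss_curve(0.01)[0])  # True: small step, loss shrinks
print(loss_curve(0.3)[-1] > loss_curve(0.3)[0])    # True: huge step, loss blows up
```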

Stretch Idea

Add random noise to the training data and see how the final fit changes.
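One way to prototype this (with hypothetical data; adapt it to whatever `week2_training.py` actually fits) is to fit the same model on clean and noisy versions of the data and compare the recovered parameter:

```python
import random

def fit_slope(xs, ys, lr=0.02, steps=500):
    """Gradient descent on the toy model y = w * x (no bias term)."""
    w = 0.0
    for _ in range(steps):
        grad = sum(2 * (w * x - y) * x for x, y in zip(xs, ys)) / len(xs)
        w -= lr * grad
    return w

random.seed(0)  # make the noise reproducible
xs = [0.5 * i for i in range(1, 9)]
clean_ys = [2.0 * x for x in xs]
noisy_ys = [y + random.gauss(0.0, 0.5) for y in clean_ys]

print(fit_slope(xs, clean_ys))  # recovers the true slope 2.0
print(fit_slope(xs, noisy_ys))  # close to 2.0, but pulled off by the noise
```

With more noise the fitted slope wanders further from the true value, which is worth connecting back to the shape of the loss plot.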