Finite Difference vs Backpropagation

This post explores two techniques for computing gradients and updating model weights in neural networks: the finite difference method and backpropagation. While both achieve the same end goal of computing derivatives of the cost function with respect to the weights, their implementations differ so much in runtime that the choice between them is decisive for most practical models.
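To make the contrast concrete, here is a minimal sketch (not taken from the post itself) comparing the two on a linear model with a mean squared error loss. The finite difference version perturbs one weight at a time, so it needs on the order of one loss evaluation per parameter, while the backpropagation-style version computes the exact gradient analytically in a single forward and backward pass. The function names and the synthetic data are assumptions for illustration only.

```python
import numpy as np

def loss(w, X, y):
    """Mean squared error of a linear model y_hat = X @ w."""
    residual = X @ w - y
    return np.mean(residual ** 2)

def finite_difference_grad(w, X, y, eps=1e-6):
    """Approximate dL/dw with central differences: ~2 loss evaluations per weight."""
    grad = np.zeros_like(w)
    for i in range(len(w)):
        w_plus, w_minus = w.copy(), w.copy()
        w_plus[i] += eps
        w_minus[i] -= eps
        grad[i] = (loss(w_plus, X, y) - loss(w_minus, X, y)) / (2 * eps)
    return grad

def analytic_grad(w, X, y):
    """Exact gradient via the chain rule: one forward pass, one backward pass."""
    residual = X @ w - y               # forward pass
    return 2 * X.T @ residual / len(y)  # backward pass

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
y = rng.normal(size=100)
w = rng.normal(size=5)

print(finite_difference_grad(w, X, y))  # numerical approximation
print(analytic_grad(w, X, y))           # exact gradient, agrees to ~6 decimal places
```

For a model with millions of parameters, the finite difference loop would require millions of full forward passes per update, which is why backpropagation is the practical choice.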



