PyTorch Internals 05 - How the Autograd Graph and Engine Work
Autograd is not just automatic differentiation; it is a graph-construction and backward-execution runtime
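The "graph-construction runtime" view can be seen directly: every differentiable op on a `requires_grad` tensor records a backward `Node`, reachable via `grad_fn`, and the nodes link to each other through `next_functions`. A minimal sketch:

```python
import torch

# Each op on a requires_grad tensor records a backward Node as it runs.
x = torch.tensor(2.0, requires_grad=True)
y = x * 3
z = y + 1

# z.grad_fn is the Node for the add; its next_functions link back to the
# multiply's Node, forming the graph the engine later walks in backward().
print(type(z.grad_fn).__name__)                        # AddBackward0
print(type(z.grad_fn.next_functions[0][0]).__name__)   # MulBackward0
```

Calling `z.backward()` then executes this recorded graph in reverse topological order.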
Custom autograd functions are a practical place to define forward-backward contracts before dropping to lower-level extensions
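A minimal custom `torch.autograd.Function` makes that forward-backward contract explicit: `forward` declares what to save, `backward` declares how saved state turns incoming gradients into input gradients. The `Square` class here is an illustrative example, not part of PyTorch itself:

```python
import torch

class Square(torch.autograd.Function):
    """Contract: y = x**2 forward, dy/dx = 2*x backward."""

    @staticmethod
    def forward(ctx, x):
        ctx.save_for_backward(x)   # stash exactly what backward needs
        return x * x

    @staticmethod
    def backward(ctx, grad_out):
        (x,) = ctx.saved_tensors
        return grad_out * 2 * x    # chain rule against the incoming grad

x = torch.tensor(3.0, requires_grad=True)
y = Square.apply(x)
y.backward()
print(x.grad)  # tensor(6.)
```

The same `forward`/`backward` pair is the contract you would later re-express in a C++ or Triton extension.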
Backward design is really a question about what to save, what to recompute, and how to preserve correct semantics
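The save-versus-recompute trade-off is visible in `torch.utils.checkpoint`: the checkpointed path discards intermediates and reruns the function during backward, yet must produce the same gradients. A small sketch, assuming a recent PyTorch with `use_reentrant=False` support:

```python
import torch
from torch.utils.checkpoint import checkpoint

def block(x):
    # An arbitrary differentiable block with an intermediate (relu output).
    return torch.relu(x) * x

x = torch.randn(4, requires_grad=True)

# Normal path: intermediates are saved for backward.
y1 = block(x).sum()
# Checkpointed path: intermediates are recomputed during backward,
# trading compute for memory while preserving gradient semantics.
y2 = checkpoint(block, x, use_reentrant=False).sum()

g1 = torch.autograd.grad(y1, x)[0]
g2 = torch.autograd.grad(y2, x)[0]
print(torch.allclose(g1, g2))  # True: same gradients either way
```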