PyTorch Internals 12 - Backward Implementation Patterns and Saved-State Strategy
Backward design is really a question about what to save, what to recompute, and how to preserve correct semantics
The goal of studying PyTorch internals is not to collect trivia, but to build the ability to connect custom operators, kernel work, profiling, and distributed runtime behavior.
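The save-versus-recompute tradeoff described above can be sketched with a toy operator. This is a hypothetical, dependency-free illustration (not the actual `torch.autograd.Function` API): for `y = w * exp(x)`, the backward pass needs `exp(x)`, which can either be saved during forward (costing memory) or recomputed from the saved input (costing compute). Both strategies must produce identical gradients.

```python
import math

class ExpMul:
    """Toy op y = w * exp(x), illustrating saved-state strategy.

    Hypothetical sketch, not PyTorch's real autograd interface.
    """

    def __init__(self, save_activations=True):
        self.save_activations = save_activations
        self.saved = None    # intermediate kept for backward (memory cost)
        self.inputs = None   # inputs kept for recomputation (compute cost)

    def forward(self, x, w):
        e = math.exp(x)
        if self.save_activations:
            self.saved = e          # strategy A: save the intermediate
        else:
            self.inputs = (x,)      # strategy B: save only the input
        self.w = w
        return w * e

    def backward(self, grad_out):
        # dy/dx = w * exp(x); dy/dw = exp(x)
        e = self.saved if self.save_activations else math.exp(self.inputs[0])
        return grad_out * self.w * e, grad_out * e

op_save = ExpMul(save_activations=True)
op_recompute = ExpMul(save_activations=False)
op_save.forward(1.0, 2.0)
op_recompute.forward(1.0, 2.0)
# Correct semantics are preserved: both strategies yield the same gradients,
# differing only in what state they hold between forward and backward.
print(op_save.backward(1.0))
print(op_recompute.backward(1.0))
```

The choice between the two modes mirrors what activation checkpointing does at a larger scale: trading backward-pass recomputation for a smaller saved-state footprint.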