Jae's Tech Blog

Posts tagged "nccl"

January 12, 2026

Distributed LLM Training 03 - All-Reduce, Ring, and How to Read Communication Cost

To reason about distributed training performance, you need a concrete mental model for all-reduce and collective communication cost

Lectures
January 24, 2026

Distributed LLM Training 07 - NCCL and Topology: Why the Same GPU Count Can Behave Very Differently

In distributed training, performance is often shaped more by how GPUs are connected than by the raw number of GPUs

Lectures
February 26, 2026

Distributed LLM Training 18 - Deadlocks, Timeouts, and OOMs: Debugging Distributed Training

Debugging distributed training is about narrowing down which rank, which collective, and which state transition went wrong

Lectures

© 2025 Jae · Notes on systems, software, and building things carefully.

RSS