Name | Date | Size | #Lines | LOC
---|---|---|---|---
README.md | 25-Apr-2025 | 343 | 8 | 4
_IR.py | 25-Apr-2025 | 47.5 KiB | 1,244 | 906
__init__.py | 25-Apr-2025 | 661 | 29 | 25
_backward.py | 25-Apr-2025 | 13.4 KiB | 371 | 288
_debug.py | 25-Apr-2025 | 557 | 22 | 15
_unflatten.py | 25-Apr-2025 | 741 | 28 | 21
_utils.py | 25-Apr-2025 | 2.5 KiB | 100 | 75
microbatch.py | 25-Apr-2025 | 15.9 KiB | 470 | 299
schedules.py | 25-Apr-2025 | 87 KiB | 2,163 | 1,659
stage.py | 25-Apr-2025 | 56.6 KiB | 1,469 | 1,114
README.md
# Pipeline Parallelism for PyTorch

`torch.distributed.pipelining` is a package for implementing pipeline parallelism on your model.

Our documentation is available [here](https://pytorch.org/docs/main/distributed.pipelining.html).