
README.md

`distributed_data_parallel.py` and `run.sh` show an example of using Amp with `apex.parallel.DistributedDataParallel` or `torch.nn.parallel.DistributedDataParallel` and the PyTorch multiprocess launcher script, `torch.distributed.launch`. Using Amp with DistributedDataParallel requires no changes relative to ordinary single-process use. The only gotcha is that the model must be wrapped in DistributedDataParallel *after* the call to `amp.initialize`. Test via

bash run.sh
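The ordering constraint can be sketched as follows. This is a minimal illustration, not the contents of `distributed_data_parallel.py`; the helper name `build_amp_ddp_model` and the `opt_level` default are assumptions for the sketch:

```python
def build_amp_ddp_model(model, optimizer, opt_level="O1", use_apex_ddp=True):
    """Prepare (model, optimizer) for mixed-precision distributed training.

    The order matters: amp.initialize must run first, so Amp can patch the
    model and optimizer before DistributedDataParallel registers its
    gradient hooks.
    """
    from apex import amp  # deferred import: requires apex to be installed

    # 1) Let Amp set up mixed precision on the model and optimizer.
    model, optimizer = amp.initialize(model, optimizer, opt_level=opt_level)

    # 2) Only now wrap the returned model with DistributedDataParallel.
    if use_apex_ddp:
        from apex.parallel import DistributedDataParallel as DDP
        model = DDP(model)
    else:
        import torch
        model = torch.nn.parallel.DistributedDataParallel(model)

    return model, optimizer
```

Either DDP flavor works here; the sketch defaults to the apex variant simply because this example's imports already assume apex is available.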

This is intended purely as an instructional example, not a performance showcase.