Answers for "torch distributed address already in use"



# Kill zombie processes left behind by torch.distributed.launch
# If only a few GPUs are in use, find the stale PIDs and kill them by hand:
$ ps -a
$ kill -9 [pid]
# For larger numbers of GPUs, kill every process running your training script:
$ kill $(ps aux | grep YOUR_TRAINING_SCRIPT.py | grep -v grep | awk '{print $2}')
Posted by: Guest on July-09-2020
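
This error usually means the TCP port that torch.distributed uses as its rendezvous address (MASTER_PORT, default 29500) is still bound, either by zombie workers from a crashed run or by another job on the same machine. Besides killing the stale processes, you can start the new run on a free port instead. A minimal sketch, assuming the script is launched with torch.distributed.launch; YOUR_TRAINING_SCRIPT.py and the port number 29501 are placeholders:

# Shorter equivalent of the kill pipeline above, assuming pkill is available
$ pkill -9 -f YOUR_TRAINING_SCRIPT.py
# Or sidestep the stale port entirely by picking a different one for the new run
$ python -m torch.distributed.launch --nproc_per_node=4 --master_port=29501 YOUR_TRAINING_SCRIPT.py

If you call init_process_group yourself rather than going through the launcher, exporting MASTER_PORT to a free port before starting the job has the same effect.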
