About hyperparameter sweeping for a DDP program

Hi, I have a program that needs multiple GPUs to run at the same time; currently I launch it with DDP. I wonder how I can do the sweeping so that the program is still launched in DDP mode (using all GPUs) for each trial. Thanks!
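What I have in mind is roughly an outer driver like this (a sketch only, not working code; `train.py` stands in for my actual DDP training script, and the grid values are made up):

```python
import itertools
import shlex

# Hypothetical sweep grid over two hyperparameters.
GRID = {
    "lr": [1e-4, 3e-4],
    "batch_size": [32, 64],
}

def build_torchrun_cmd(trial_args, nproc_per_node=4, script="train.py"):
    """Build the torchrun command line for one trial.

    `script` is a placeholder for the DDP training script, which is
    expected to parse --lr and --batch_size itself. Launching each
    trial as its own torchrun job means every trial uses all GPUs.
    """
    cmd = ["torchrun", f"--nproc_per_node={nproc_per_node}", script]
    for name, value in trial_args.items():
        cmd.append(f"--{name}={value}")
    return cmd

# Enumerate every point in the grid as one trial.
keys = list(GRID)
trials = [dict(zip(keys, values)) for values in itertools.product(*GRID.values())]

for trial in trials:
    cmd = build_torchrun_cmd(trial)
    print(shlex.join(cmd))
    # In the real sweep each trial would be run to completion here,
    # e.g. subprocess.run(cmd, check=True), so trials execute
    # sequentially and never share GPUs.
```

So the sweep logic lives outside the DDP job, and each trial is a fresh multi-GPU launch. Is that the recommended pattern, or is there built-in support for this?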


Hi Weipeng,

I am happy to help you with this. Are you using PyTorch or PyTorch Lightning?

Hi Weipeng,

I am following up on your recent issue with DDP. If you require further assistance, please do reach out again.