🚀 Feature
The documentation says that `torch.compile` is not currently supported with distributed training. Since `torch.compile` can speed up training by as much as 2x, using the Lightning Trainer without compilation is no longer cost-efficient, so it would be great to support it.

It's also a bit unclear to me what happens if I compile the model before passing it to the LightningModule: will the compiled model actually be used under DDP, or not?
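To make the question concrete, here is a minimal sketch of the "compile before wrapping" pattern being asked about (assumptions: PyTorch >= 2.0; the toy `nn.Linear` model is illustrative, and `backend="eager"` is used only so the snippet runs anywhere without a compiler toolchain; real training would use the default inductor backend):

```python
import torch
import torch.nn as nn

# A tiny stand-in for the user's model.
net = nn.Linear(4, 2)

# Compile *before* the module would be handed to a LightningModule / DDP.
# backend="eager" skips codegen, keeping this sketch portable.
compiled = torch.compile(net, backend="eager")

x = torch.randn(3, 4)

# The compiled wrapper shares parameters with the original module,
# so outputs match; the open question is whether this wrapper is
# preserved (or re-applied) once Lightning wraps the model in DDP.
assert torch.allclose(compiled(x), net(x))
print(compiled(x).shape)
```

Whether the dynamo-compiled graph survives Lightning's own DDP wrapping, or gets silently bypassed, is exactly what the docs leave unclear.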