multiprocessing cpu only training #222
Comments
good question. we only support distributed GPU but would welcome a PR to support multi-CPU options.
Hah! I was just looking for this. If nothing else, it would make testing DDP much easier.
This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.
I would like to take this up.
Cool! Thx @skepticleo
Looks like this was recently added in #1158 :)
I would like to know if it's possible to train a model with multiprocess parallelism (no GPU available) using Lightning (a sync analogue of https://pytorch.org/docs/stable/notes/multiprocessing.html#hogwild)? After a quick glance, I have the impression that in Trainer all available options for parallelism are GPU based (if I'm not mistaken, torch DDP supports multiprocess CPU-only training).
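For context on the Hogwild-style training mentioned above, here is a minimal, framework-free sketch of the idea: several worker processes apply SGD updates to a shared parameter vector without locking, analogous to what `torch.multiprocessing` with `model.share_memory()` does. All names below are illustrative, not part of Lightning's or PyTorch's API.

```python
# Hogwild-style lock-free SGD sketch using only the standard library.
# Workers share one parameter array and update it without synchronization;
# the races are benign for small, bounded updates (the core Hogwild idea).
import multiprocessing as mp
import os
import random


def worker(weights, data, lr, steps):
    """Run plain SGD on the shared weights (no locks)."""
    random.seed(os.getpid())  # decorrelate sampling across workers
    for _ in range(steps):
        x, y = random.choice(data)
        pred = weights[0] * x + weights[1]
        err = pred - y
        # Unsynchronized writes: racy, but convergent for this problem size.
        weights[0] -= lr * err * x
        weights[1] -= lr * err


def train(num_workers=4, steps=2000, lr=0.05):
    # Toy target: y = 3x + 1, with x in [-1, 1].
    data = [(x / 10.0, 3 * (x / 10.0) + 1) for x in range(-10, 11)]
    # Shared, unsynchronized double array visible to all child processes.
    weights = mp.Array('d', [0.0, 0.0], lock=False)
    procs = [
        mp.Process(target=worker, args=(weights, data, lr, steps))
        for _ in range(num_workers)
    ]
    for p in procs:
        p.start()
    for p in procs:
        p.join()
    return weights[0], weights[1]


if __name__ == "__main__":
    w, b = train()
    print(f"learned w={w:.2f}, b={b:.2f}")  # should approach w=3, b=1
```

This only illustrates the shared-memory, multi-process CPU pattern the question asks about; a real Lightning/PyTorch setup would instead use `DistributedDataParallel` with the CPU-capable `gloo` backend, which is what the fix referenced in #1158 concerns.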