PyTorch Lightning TPU support #1611
Comments
Hey @williamFalcon, are you not on Workplace chat anymore? I was looking for you there. How are we supposed to organize our foodie tour of NY (and possibly other places)? As for PyTorch Lightning + TPUs, it sounds cool, but the PyTorch/XLA team is already pretty committed for H1. If it's going to happen, I think you'll have to do it. Is that cool? Also, while we can respond to specific questions about the PyTorch/XLA API, we don't typically work out how to use it from a particular system. Have you checked out our docs yet? The code to run on TPUs is very similar to the code to run on other device types.
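For context, here is a minimal single-core sketch (not from this thread) of what "very similar" means with the PyTorch/XLA API: the model and data are just moved to the XLA device, and the only TPU-specific change in the training step is stepping the optimizer through `xm.optimizer_step` so the lazily-built XLA graph gets executed.

```python
import torch
import torch.nn as nn
import torch_xla.core.xla_model as xm  # PyTorch/XLA

# Acquire the TPU as an ordinary torch device; everything else is plain PyTorch.
device = xm.xla_device()

model = nn.Linear(10, 2).to(device)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = nn.CrossEntropyLoss()

data = torch.randn(32, 10, device=device)
target = torch.randint(0, 2, (32,), device=device)

optimizer.zero_grad()
loss = loss_fn(model(data), target)
loss.backward()
# The main TPU-specific change: step through xm.optimizer_step, which also
# triggers execution of the recorded XLA graph (barrier=True for single-core use).
xm.optimizer_step(optimizer, barrier=True)
```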
@mruberry haha. I'm back at NYU for a bit :) I can do it. I just remember there were 3 minor changes I had to make, but I forgot what they were.
Integrated! https://colab.research.google.com/drive/1-_LKx4HwAxl5M6xPJmqAAu444LTDQoa3#scrollTo=kr8cql-aaKnC
Did you add the
Yup! Would love a sanity check here if you have time.
❓ Questions and Help
Hi! I spoke with @dlibenzi at NeurIPS about adding TPU support to Lightning. He mentioned 3-4 steps we'd need to take to support it. Any chance someone can list those out again?
The API needs to be implemented in this file:
https://github.com/PyTorchLightning/pytorch-lightning/blob/master/pytorch_lightning/trainer/distrib_data_parallel.py
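As a rough illustration of the kind of multi-core logic a trainer-level hook like `distrib_data_parallel.py` would need, here is a hypothetical sketch (not Lightning's actual implementation) using the multi-core PyTorch/XLA API. The names `fit_on_tpu`, `model_fn`, `loader_fn`, and `num_tpu_cores` are assumptions for illustration; the `torch_xla` calls (`xmp.spawn`, `pl.ParallelLoader`, `xm.optimizer_step`) are the standard ones.

```python
import torch
import torch_xla.core.xla_model as xm
import torch_xla.distributed.xla_multiprocessing as xmp
import torch_xla.distributed.parallel_loader as pl

def _tpu_train_fn(index, model_fn, loader_fn, num_epochs):
    # Each spawned process owns one TPU core.
    device = xm.xla_device()
    model = model_fn().to(device)
    optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

    for epoch in range(num_epochs):
        # ParallelLoader overlaps host-to-TPU data transfers with computation.
        para_loader = pl.ParallelLoader(loader_fn(), [device])
        for data, target in para_loader.per_device_loader(device):
            optimizer.zero_grad()
            loss = torch.nn.functional.cross_entropy(model(data), target)
            loss.backward()
            # All-reduces gradients across cores before stepping.
            xm.optimizer_step(optimizer)

def fit_on_tpu(model_fn, loader_fn, num_epochs=1, num_tpu_cores=8):
    # Spawn one process per TPU core, analogous to how DDP spawns per GPU.
    xmp.spawn(_tpu_train_fn, args=(model_fn, loader_fn, num_epochs),
              nprocs=num_tpu_cores, start_method='fork')
```

The trainer-side changes then largely reduce to: picking the XLA device, wrapping dataloaders in a `ParallelLoader`, and routing the optimizer step through `xm.optimizer_step`.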