transformers example #1656
Conversation
Thanks for the PR @ahmedo42! Looks good!
I left a few comments to improve the docs and the code.
Let me try to run it locally and see how it works...
@ahmedo42 I updated the example a bit more (maybe the same changes should be applied to the CIFAR10 example as well).
This is definitely cleaner; I agree that the same changes should be made to the CIFAR10 example.
Yes, I checked on GPU and TPU on Kaggle and Colab (as I don't have a GPU 😄). The default batch size (128) was meant for TPU.
Could you please rerun the check on your side to ensure that the example is working? I checked on GPU(s). Maybe what remains is TPUs...
LGTM! Thanks @ahmedo42 !
It works on TPU.
Fixes #960
Description:
Transformers example using HuggingFace and Ignite; works on TPU & GPU. I tried to stick as closely as possible to the code in the CIFAR10 example.
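For context, the pattern these examples follow is a training-step function wrapped in an Ignite `Engine`. Below is a minimal, self-contained sketch of that train-step pattern in plain PyTorch (no Ignite or `transformers` imports, so it runs anywhere): `TinyClassifier` and the synthetic batches are hypothetical stand-ins for the pretrained HuggingFace model and tokenized dataset the actual example uses.

```python
# Hedged sketch: TinyClassifier stands in for a HuggingFace model, and the
# plain loop at the bottom stands in for ignite.engine.Engine.run().
import math
import torch
import torch.nn as nn

torch.manual_seed(0)

class TinyClassifier(nn.Module):
    """Hypothetical stand-in for AutoModelForSequenceClassification."""
    def __init__(self, vocab_size=100, dim=16, num_classes=2):
        super().__init__()
        self.emb = nn.EmbeddingBag(vocab_size, dim)
        self.fc = nn.Linear(dim, num_classes)

    def forward(self, input_ids):
        return self.fc(self.emb(input_ids))

model = TinyClassifier()
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-2)
criterion = nn.CrossEntropyLoss()

def train_step(batch):
    # In the real example a function like this is passed to
    # ignite.engine.Engine; here we just call it from a plain loop.
    model.train()
    input_ids, labels = batch
    optimizer.zero_grad()
    loss = criterion(model(input_ids), labels)
    loss.backward()
    optimizer.step()
    return loss.item()

# Synthetic batches standing in for a tokenized text dataset.
data = [(torch.randint(0, 100, (8, 5)), torch.randint(0, 2, (8,)))
        for _ in range(20)]
losses = [train_step(b) for b in data]
print(f"ran {len(losses)} steps, last loss {losses[-1]:.4f}")
```

The real example replaces the stand-ins with a pretrained transformer and a DataLoader, and gets TPU/GPU portability from Ignite's `auto_model`/`auto_optim` helpers rather than from the step function itself.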
Check list: