Segmentation Fault with Distilled Models on CPU when word_timestamps=True #1283
Comments
Post the config.json file of that model.
This model is not for faster-whisper.
config.json
"alignment_heads" is same as in original whisper https://huggingface.co/openai/whisper-base/blob/main/generation_config.json I think it should be different for distil model. |
That's what I got directly through the following code:
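A minimal sketch of what such a snippet could look like, assuming the transformers GenerationConfig API; the model id is illustrative, not taken from the issue:

```python
# Sketch: read "alignment_heads" from a checkpoint's generation_config.json
# via transformers. The model id here is illustrative.
from transformers import GenerationConfig

gen_config = GenerationConfig.from_pretrained("openai/whisper-base")
print(gen_config.alignment_heads)  # list of [layer, head] pairs
```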
Did you check in original Whisper if word timestamps work with this model?
Thanks for the suggestion. I've tested this further with both the Hugging Face transformers library and the openai-whisper library.
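For reference, a hedged sketch of how the transformers check could look; the checkpoint id and audio path are placeholders, not from the issue:

```python
# Sketch: request word-level timestamps from the transformers ASR pipeline.
# Note return_timestamps="word" (a string), as pointed out below.
from transformers import pipeline

asr = pipeline("automatic-speech-recognition", model="distil-whisper/distil-large-v2")
result = asr("audio.wav", return_timestamps="word")
for chunk in result["chunks"]:
    print(chunk["timestamp"], chunk["text"])
```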
Can you share that ct2 model?
In transformers, word timestamps are requested with return_timestamps='word', not True. This model also errors in Hugging Face transformers and openai-whisper, because the distillation process removes layers, which leaves the alignment heads referring to heads in nonexistent layers.
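A quick way to verify that claim; a sketch assuming the transformers config APIs, with an illustrative distilled checkpoint id:

```python
# Sketch: flag alignment_heads entries whose layer index exceeds the number
# of decoder layers actually present. Checkpoint id is illustrative.
from transformers import AutoConfig, GenerationConfig

repo = "distil-whisper/distil-large-v2"
num_layers = AutoConfig.from_pretrained(repo).decoder_layers
heads = getattr(GenerationConfig.from_pretrained(repo), "alignment_heads", []) or []

bad = [(layer, head) for layer, head in heads if layer >= num_layers]
print(f"decoder layers: {num_layers}, out-of-range alignment heads: {bad}")
```

If bad is non-empty, word-timestamp alignment will index into layers the distilled decoder no longer has.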
I am encountering a "Segmentation fault (core dumped)" error when using faster-whisper under specific conditions: a distilled model, running on CPU, with word_timestamps=True. The error does not occur if word_timestamps is left disabled.
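A minimal sketch of the failing setup as described; the model directory and audio path are placeholders:

```python
# Sketch: distilled CTranslate2 model on CPU with word_timestamps=True,
# the combination reported to segfault. Paths are placeholders.
from faster_whisper import WhisperModel

model = WhisperModel("path/to/distil-ct2-model", device="cpu")
segments, info = model.transcribe("audio.wav", word_timestamps=True)
for segment in segments:
    for word in segment.words or []:
        print(word.start, word.end, word.word)
```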