Describe the bug:
Currently the config for LLaMA 3.2 3B does not use tied weights; only the 1B config enables weight tying (https://github.com/facebookresearch/fairseq2/blob/main/src/fairseq2/models/llama/_config.py#L257). The reference LLaMA 3.2 3B model ties its embedding and output projection, so the 3B config should do the same.
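For context on what the flag controls: with weight tying, the output projection reuses the embedding matrix as a single shared parameter, roughly halving the vocabulary-related parameter count, which matters most for small models like 1B/3B. A minimal, self-contained PyTorch sketch of the general technique (toy dimensions; `TinyLM` is a hypothetical illustration, not fairseq2 code):

```python
import torch
from torch import nn

class TinyLM(nn.Module):
    """Generic illustration of embedding/output-projection weight tying."""

    def __init__(self, vocab_size: int, model_dim: int, tied: bool) -> None:
        super().__init__()
        self.embed = nn.Embedding(vocab_size, model_dim)
        self.final_proj = nn.Linear(model_dim, vocab_size, bias=False)
        if tied:
            # Share a single Parameter between embedding and projection.
            self.final_proj.weight = self.embed.weight

    def forward(self, token_ids: torch.Tensor) -> torch.Tensor:
        return self.final_proj(self.embed(token_ids))

tied = TinyLM(128, 16, tied=True)
untied = TinyLM(128, 16, tied=False)

# Tying halves the parameters devoted to the two vocabulary matrices
# (parameters() deduplicates shared tensors).
print(sum(p.numel() for p in tied.parameters()))    # 2048
print(sum(p.numel() for p in untied.parameters()))  # 4096
```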
Describe how to reproduce:
Loaded LLaMA 3.2 3B and confirmed the weights are tied (elementwise equal after loading):

```python
In [13]: (model.decoder_frontend.embed.weight == model.final_proj.weight).all()
Out[13]: tensor(True)
```
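Note that elementwise equality only shows both tensors were loaded with the same values; if the config does not tie them, they still live in separate storage and would diverge during fine-tuning. A storage-identity check such as `data_ptr()` distinguishes the two cases. A self-contained sketch using toy modules named after the fairseq2 attributes above, not the real model:

```python
import torch
from torch import nn

vocab_size, model_dim = 128, 16
embed = nn.Embedding(vocab_size, model_dim)
final_proj = nn.Linear(model_dim, vocab_size, bias=False)

# Copying the same checkpoint tensor into both modules makes them equal ...
with torch.no_grad():
    final_proj.weight.copy_(embed.weight)
print((embed.weight == final_proj.weight).all())                # tensor(True)

# ... but they remain two independent parameters in separate storage.
print(embed.weight.data_ptr() == final_proj.weight.data_ptr())  # False

# True tying shares one Parameter, so the storage check also passes.
final_proj.weight = embed.weight
print(embed.weight.data_ptr() == final_proj.weight.data_ptr())  # True
```

Running the same `data_ptr()` comparison on the loaded fairseq2 model would confirm whether the 3B parameters are actually shared or merely equal copies.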