
[BUG] AttributeError: 'torch_dtype' Not Found in VLLMModelConfig When dtype Not Specified #601

Open
fabfish opened this issue Mar 4, 2025 · 0 comments
Labels
bug Something isn't working


When using vLLM without specifying a dtype, I encountered the following error:

AttributeError: 'VLLMModelConfig' object has no attribute 'torch_dtype'

I was running the open-r1 MATH-500 evaluation:

MODEL_ARGS="pretrained=$MODEL,max_model_length=32768,gpu_memory_utilization=0.8,data_parallel_size=$NUM_GPUS,generation_parameters={max_new_tokens:32768,temperature:0.7,top_p:0.95}"

OUTPUT_DIR=data/evals/$MODEL

lighteval vllm $MODEL_ARGS "custom|math_500|0|0" \
    --custom-tasks src/open_r1/evaluate_math.py \
    --use-chat-template \
    --output-dir $OUTPUT_DIR \
    --save-details 

The error originates from this code section (utils.py#L55):

if config is not None:
    return config.torch_dtype

Proposed Solution

To improve robustness, check whether config.torch_dtype exists before accessing it, or update the documentation to state that dtype must be passed to avoid this error.
