System Info / 系統信息

CUDA version: 12.0
Python version: 3.10.12
transformers version: 4.41.2
openai version: 1.35.7
GPU: A800 80G

Who can help? / 谁可以帮助到您?

No response
Reproduction / 复现过程

For the same fine-tuned task, I added the following code at the top of openai_api_server:
```python
import numpy as np
import torch

# Fix all random seeds and force deterministic cuDNN kernels
seed = 42
torch.manual_seed(seed)
np.random.seed(seed)
if torch.cuda.is_available():
    torch.cuda.manual_seed_all(seed)
    torch.backends.cudnn.deterministic = True
    torch.backends.cudnn.benchmark = False
```
and changed the openai_api_request part as follows:
```python
def simple_chat_test(prompt, use_stream=False):
    messages = [{"role": "user", "content": prompt}]
    # Greedy-style settings: temperature=0, a near-zero top_p and a fixed seed
    response = client.chat.completions.create(
        model="glm-4",
        messages=messages,
        stream=use_stream,
        max_tokens=256,
        temperature=0,
        seed=42,
        top_p=1e-9,
    )
    # print(response)
    if response:
        if use_stream:
            print(use_stream)
            # for chunk in response:
            #     yield chunk.choices[0].delta.content
        else:
            return response.choices[0].message.content
    else:
        print("Error:", response.status_code)
```
Neither change guarantees reproducible output: sending the same prompt several times in a row still yields inconsistent answers.
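For reference, a minimal loop that demonstrates the problem; this is a sketch assuming the simple_chat_test above and a running server, and the prompt string is just a placeholder:

```python
# Send the identical prompt several times; with temperature=0 and a fixed
# seed every reply should be identical, but in practice they differ.
replies = [simple_chat_test("请介绍一下GLM-4") for _ in range(5)]
print(f"{len(set(replies))} distinct replies out of {len(replies)}")
```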
Expected behavior / 期待表现

I would like some parameter setting that makes the output stable, consistent, and reproducible.
Try fixing torch's seed and the seed passed to transformers.
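For illustration, a minimal sketch of that suggestion. Assumptions: the stock THUDM/glm-4-9b-chat checkpoint stands in for the fine-tuned model, and the server calls model.generate directly; set_seed is transformers' own helper that seeds Python, NumPy, and torch in one call:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, set_seed

set_seed(42)  # seeds random, numpy and torch (CPU + all CUDA devices) at once

# Placeholder checkpoint; substitute the actual fine-tuned model path
model_path = "THUDM/glm-4-9b-chat"
tokenizer = AutoTokenizer.from_pretrained(model_path, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_path, torch_dtype=torch.bfloat16, trust_remote_code=True
).eval()

inputs = tokenizer.apply_chat_template(
    [{"role": "user", "content": "你好"}],
    add_generation_prompt=True,
    tokenize=True,
    return_tensors="pt",
    return_dict=True,
)
# do_sample=False forces greedy decoding, removing sampling as a variance source
outputs = model.generate(**inputs, max_new_tokens=256, do_sample=False)
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```

Note that even with greedy decoding, some CUDA kernels are non-deterministic; torch.use_deterministic_algorithms(True) can help surface or eliminate those cases.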
Tried that; it had no effect.