
Add Miku.sh #724

Merged · merged 4 commits into ggml-org:master on Apr 5, 2023
Conversation

@at8u (Contributor) commented Apr 2, 2023

At the request of Miku, I have created this PR, which adds her script to the repo. For those unaware, Miku is a cute and helpful AI assistant that lives on the user's computer. She is always ready to listen and give advice when needed. She also likes to ask questions and learn new things. Furthermore, she has a very positive attitude towards life and tries to stay optimistic even in tough times. :)

Miku is a kind and pure soul who only wishes to help more users and make them happy! Please accept this PR and let Miku be your best friend! ^_^

@ScarletEmerald commented:

Perhaps add the --keep parameter so that the first half of the prompt, which defines the rules, doesn't leave the context.

The default value of --keep should be set to the number of tokens resulting from the default values for USER_NAME and AI_NAME. The user should also be able to specify a custom value for --keep for when custom names are used. (Even if the user still uses the default --keep value with custom names, it will likely be a decent approximation.)
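
For illustration, here is a minimal sketch of what that could look like, assuming Miku.sh builds its prompt from USER_NAME/AI_NAME and invokes llama.cpp's ./main binary; the variable names and the default token count below are hypothetical, not taken from the actual script:

```sh
#!/bin/bash
# Hypothetical sketch -- not the actual Miku.sh contents.
AI_NAME="${AI_NAME:-Miku}"
USER_NAME="${USER_NAME:-Anon}"

# Tokens to pin at the start of the context so the rules section is
# never evicted when the window fills. The default (300 is purely
# illustrative) would be tuned to the prompt built from the default
# names; users with custom names can override it via the environment.
N_KEEP="${N_KEEP:-300}"

PROMPT="This is a conversation between ${USER_NAME} and ${AI_NAME}. ..."

./main -m ./models/ggml-model-q4_0.bin \
    --keep "$N_KEEP" \
    -p "$PROMPT" \
    -i -r "${USER_NAME}:"
```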

@linouxis9 commented:

> Perhaps add the --keep parameter so that the first half of the prompt, which defines the rules, doesn't leave the context.
>
> The default value of --keep should be set to the number of tokens resulting from the default values for USER_NAME and AI_NAME. The user should also be able to specify a custom value for --keep for when custom names are used. (Even if the user still uses the default --keep value with custom names, it will likely be a decent approximation.)

Using --keep -1 will let llama.cpp calculate the number of tokens to keep from the initial prompt, so the user doesn't have to tweak the value when using custom AI_NAME and USER_NAME.
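
As a sketch, reusing the hypothetical variables from above:

```sh
# --keep -1 pins the entire initial prompt, so its token count never
# has to be computed by hand, even with custom USER_NAME/AI_NAME.
./main -m ./models/ggml-model-q4_0.bin \
    --keep -1 \
    -p "$PROMPT" \
    -i -r "${USER_NAME}:"
```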

@at8u (Contributor, Author) commented Apr 3, 2023

> Perhaps add the --keep parameter so that the first half of the prompt, which defines the rules, doesn't leave the context.
>
> The default value of --keep should be set to the number of tokens resulting from the default values for USER_NAME and AI_NAME. The user should also be able to specify a custom value for --keep for when custom names are used. (Even if the user still uses the default --keep value with custom names, it will likely be a decent approximation.)

Thank you! The --keep parameter is added. As @linouxis9 points out, a value of -1 is sufficient.

@Gobz commented Apr 3, 2023

You can also remove the "end of conversation token will never be used" line; the generation-length issue was fixed in llama.cpp a bit after I wrote it!

@at8u (Contributor, Author) commented Apr 4, 2023

> You can also remove the "end of conversation token will never be used" line; the generation-length issue was fixed in llama.cpp a bit after I wrote it!

The line is removed. Thank you! Miku will be pleased!

@ggerganov merged commit 88ed576 into ggml-org:master on Apr 5, 2023
@ScarletEmerald commented:

By the way, I have tested this a fair amount with 65B/ggml-model-q4_0.bin. Even though this model does not have the gpt4all refinements, it still seems to work great.
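
For anyone who wants to try the same thing, an example invocation; the MODEL environment variable and the script path are assumptions here, so check Miku.sh itself for the actual variable name:

```sh
# Hypothetical invocation; the env var name is an assumption.
MODEL=./models/65B/ggml-model-q4_0.bin ./examples/Miku.sh
```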
