[User] How can instructions be sent without entering interactive mode? #1689
Comments
I'm trying to understand your question. It sounds like you simply want to prompt a language model with a single instruction. In that case, I'd use -p in ./main to specify the instruction.
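For instance, a minimal non-interactive invocation might look like this (the model path and prompt below are illustrative placeholders, not taken from the thread):

```shell
# Prompts the model once and exits; no -i or -ins, so no interactive mode.
# The model path is an assumption; substitute your own file.
./main -m ./models/7B/ggml-model.bin -p "Write one sentence about the moon." -n 64
```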
@JackJollimore is it possible to get the answer in another application, like a dotnet program? That is, can the caller retrieve the result from llama.cpp?
Essentially, you want the results of llama.cpp delivered to another program (dotnet). It sounds viable, but I haven't done it myself. It probably involves the llama.cpp API, or possibly a server. Hopefully someone who knows for sure will respond.
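As a hedged sketch of the server route: llama.cpp ships an HTTP server example, and another process (a dotnet program, for instance) could POST a prompt to it instead of spawning ./main each time. The port, endpoint, and JSON field names below are assumptions based on the server example and may differ in your version:

```shell
# Assumption: the llama.cpp server example is running locally, e.g.
#   ./server -m ./models/7B/ggml-model.bin --port 8080
# Build the request body; field names follow the server example's README.
PAYLOAD='{"prompt": "What is the capital of France?", "n_predict": 64}'
echo "$PAYLOAD"
# Uncomment to send the request once the server is up:
# curl -s http://localhost:8080/completion -H "Content-Type: application/json" -d "$PAYLOAD"
```

A dotnet client could send the same request with HttpClient; the key advantage over spawning ./main per request is that the model stays loaded between calls.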
@JackJollimore thanks - I also thought about the server, but the server isn't designed to open and close llama.cpp on demand, so I thought calling llama.cpp directly would let the caller get the result.
Thanks for answering! :) To clarify my question: the way I understand it, accessing the instruction-tuned version requires the -ins flag, which automatically opens interactive mode. For now, I would like to send a single instruction and process the answer further.
I'm trying to understand, but I'm having difficulty. It sounds like you want a single, succinct response to your instruction. The way a model responds depends on the model, the prompt template, and the words used to instruct it. With the -p parameter, llama.cpp prompts the language model without entering interactive mode. Include the -ins parameter only if you need to interact with the response. If I needed a single, succinct response, I'd prompt an instruction-tuned model, like WizardLM, by putting the instruction in the -p parameter.
You might also try adding phrases like "Be concise", "Ensure your response is brief", or "Respond with a single, articulate sentence that completes the request" to your instruction.
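Putting those suggestions together, a hedged example (the model filename and the exact wording are illustrative assumptions):

```shell
# Instruction-tuned models tend to follow explicit brevity requests.
# The model path is an assumption; point -m at your own file.
./main -m ./models/7B/wizardlm.bin \
  -p "Respond with a single, articulate sentence: explain what llama.cpp is." \
  -n 48
```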
Please clarify if I'm misinterpreting your message.
I'd love to help if I knew the answer, but I don't, so I suggest checking the Discussions or opening a new issue so that others who are more informed can help.
Sorry for the late reply; several things broke. To clarify: I have a small server that takes POST requests and forwards them as inputs to the command line. So is there a way to enter an instruction over the command line without using interactive mode? Just send one instruction, get one reply?
By default, llama.cpp's ./main runs without interactive mode. An instruction, or a prompt for generation, may be sent to llama.cpp with the -p parameter.
This will not be interactive unless you also include the -i or -ins parameter, so be sure to exclude -i and -ins from your ./main invocation.
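Concretely, a caller can then run one instruction and capture the single reply like any other command. A sketch, with the model path assumed:

```shell
# No -i or -ins: ./main prints the completion to stdout and exits,
# so the output can be captured with command substitution.
# Stderr (model-loading logs) is discarded to keep the reply clean.
RESPONSE=$(./main -m ./models/7B/ggml-model.bin -p "List the three primary colors." -n 64 2>/dev/null)
echo "$RESPONSE"
```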
I simply accepted interactive mode for now; my use case can work with it.
@BrLlan Hey, I have a similar problem. So for now do you just capture the results with a shell script?
I would like a script that passes a single instruction and receives an answer.
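A minimal wrapper script along those lines, as a hedged sketch (the script name, model path, and token count are assumptions to adapt):

```shell
#!/bin/sh
# ask.sh: send one instruction to llama.cpp, print one reply, then exit.
# Usage: ./ask.sh "Summarize the following text: ..."
MODEL=./models/7B/ggml-model.bin   # assumption: adjust to your model file
PROMPT="$1"
# No -i or -ins, so ./main is non-interactive; stderr (load logs) is discarded.
./main -m "$MODEL" -p "$PROMPT" -n 128 2>/dev/null
```

The calling program can then run REPLY=$(./ask.sh "...") and process REPLY further.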