README.md (+15 −1)
@@ -56,6 +56,7 @@ The test cases do a good job of providing discrete examples for each of the API
 -[Chat with Images](#chat-with-images)
 -[Embedding Generation](#embedding-generation)
 -[Debug Information](#debug-information)
+-[Manual Requests](#manual-requests)
 -[Single-header vs Separate Headers](#single-header-vs-separate-headers)
 -[About this software:](#about-this-software)
 -[License](#license)
@@ -389,7 +390,20 @@ Debug logging for requests and replies to the server can easily be turned on and
 ollama::show_requests(true);
 ollama::show_replies(true);
 ```
-
+
+### Manual Requests
+
+For those looking for greater control of the requests sent to the ollama server, manual requests can be created through the `ollama::request` class. This class extends `nlohmann::json` and can be treated as a standard JSON object.
+This provides the most customization of the request. Users should take care to ensure that valid fields are provided; otherwise, an exception will likely be thrown when the response is received. Manual requests can be made for the generate, chat, and embedding endpoints.
+
 ## Single-header vs Separate Headers
 For convenience, ollama-hpp includes a single-header version of the library in `singleheader/ollama.hpp` which bundles the core ollama.hpp code with single-header versions of nlohmann json, httplib, and base64.h. Each of these libraries is available under the MIT license and their respective licenses are included.
 The single-header include can be regenerated from these standalone files by running `./make_single_header.sh`
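This extraction omits the usage example that accompanies the new Manual Requests section. As a minimal sketch of what a manual request could look like, assuming a default-constructible `ollama::request` and an `ollama::generate` overload that accepts it (both are assumptions drawn from the paragraph above, not shown in this diff):

```cpp
#include <iostream>
#include "ollama.hpp"

int main() {
    // ollama::request extends nlohmann::json, so fields can be assigned
    // with the same keys the Ollama REST API expects for /api/generate.
    ollama::request request;                     // assumed default constructor
    request["model"]  = "llama3";                // any locally available model
    request["prompt"] = "Why is the sky blue?";
    request["stream"] = false;                   // request one complete reply

    // Assumed overload; the README text says manual requests can target
    // the generate, chat, and embedding endpoints.
    ollama::response response = ollama::generate(request);
    std::cout << response << std::endl;
    return 0;
}
```

Because the class is an `nlohmann::json`, any field the server accepts can be set the same way; invalid fields are exactly what the paragraph warns will likely surface as an exception when the response arrives.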
ollama.hpp

@@ -400,14 +404,19 @@ class Ollama
 if (ollama::log_requests) std::cout << request_string << std::endl;
 if (ollama::use_exceptions) throw ollama::exception("No response returned from server "+this->server_url+". Error was: "+httplib::to_string( res.error() ));
 }
 
-return response;
+return response;
 }
 
-// Generate a streaming reply where a user-defined callback function is invoked when each token is received.
singleheader/ollama.hpp

@@ -35190,14 +35194,19 @@ class Ollama
 if (ollama::log_requests) std::cout << request_string << std::endl;
 if (ollama::use_exceptions) throw ollama::exception("No response returned from server "+this->server_url+". Error was: "+httplib::to_string( res.error() ));
 }
 
-return response;
+return response;
 }
 
-// Generate a streaming reply where a user-defined callback function is invoked when each token is received.
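Both hunks show the same error path in the core and single-header builds: if the POST yields no reply and `ollama::use_exceptions` is set, an `ollama::exception` carrying the server URL and the httplib error string is thrown instead of a response being returned. A hedged sketch of what that means for calling code, assuming `ollama::exception` exposes `what()` like a standard exception and that a two-argument `ollama::generate(model, prompt)` form exists:

```cpp
#include <iostream>
#include "ollama.hpp"

int main() {
    try {
        // If the server cannot be reached, the error path shown in the
        // hunks above throws rather than returning an empty response.
        ollama::response response = ollama::generate("llama3", "Hello!");
        std::cout << response << std::endl;
    } catch (const ollama::exception& e) {
        // Message format per the throw site in the diff:
        // "No response returned from server <url>. Error was: <error>"
        std::cerr << "Request failed: " << e.what() << std::endl;
    }
    return 0;
}
```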