Hi, this is not a real issue, but I want to share that we made a running app with interactive mode on mobile using your repo.
You can find our repo here: https://github.com/Bip-Rep/sherpa
Unfortunately it doesn't include your latest commit, because there is an error at runtime, so we made a fork here: https://github.com/Bip-Rep/llama.cpp. You need to be on the "for_mobile" branch to build the libraries. We are using an older working commit on that branch.
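For anyone who wants to reproduce the build, the steps above boil down to cloning the fork and switching to the "for_mobile" branch before compiling. A minimal sketch; the `make` invocation is an assumption based on upstream llama.cpp's standard build at the time, not something documented in this thread, and you may need to adapt it for your mobile toolchain:

```shell
# Clone the Bip-Rep fork rather than upstream llama.cpp,
# since the app depends on an older working commit.
git clone https://github.com/Bip-Rep/llama.cpp
cd llama.cpp

# Switch to the branch that pins the mobile-compatible commit.
git checkout for_mobile

# Build the libraries (assumption: the fork keeps upstream's
# standard Makefile build; cross-compile flags are up to you).
make
```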
We translated the main functions of llama.cpp into Dart.
Working demo
Click on the image to view the video on YouTube.
Hope this helps.
Looks really cool! 👍
This definitely captures the essence of "inference on the edge", what this project is all about. 😄
This would otherwise be moved to a discussion, but if there is a runtime error on the newer commits, that definitely sounds like an issue. Could you elaborate on that?
Have fun