Chinese version
Running LLaMA, a ChatGPT-like large language model released by Meta, locally on an Android phone. I use antimatter15/alpaca.cpp, which is forked from ggerganov/llama.cpp.
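For anyone who wants the concrete steps, here is a minimal sketch of the Termux workflow; the package names and the model filename (ggml-alpaca-7b-q4.bin) are the usual ones but may differ on your setup, and the 4-bit 7B Alpaca weights are assumed to be downloaded separately.

```sh
# Inside Termux on the phone; package names may vary by Termux version.
pkg update
pkg install git clang make      # build toolchain

git clone https://github.com/antimatter15/alpaca.cpp
cd alpaca.cpp
make chat                       # builds the chat binary

# Copy the downloaded ggml-alpaca-7b-q4.bin into this directory, then:
./chat -m ggml-alpaca-7b-q4.bin
```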
Oobabooga was very slow. I tried h2ogpt, which was good, and I could pass it docs for custom training too, but it was still slow.
Lord of the language models is the easiest to set up and has a nice interface, but it is still a bit slow.
Ollama is the fastest I have tried so far, though I couldn't make its web-based UI work yet; I hope to have success with it. After that I still need a way to pass it custom docs. The basic CLI usage I mean is sketched below.
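A quick sketch of that basic Ollama CLI usage, assuming the ollama binary and its background service are already installed; "llama2" here is just an example model tag.

```sh
ollama pull llama2     # download the model weights
ollama run llama2      # interactive chat in the terminal
```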
This is the only guide I found for Android; on Linux I just discovered Ollama, which is great.
Thanks for the suggestions. Is there any way to run it with 4 GB of RAM? Maybe with smaller 2B models instead of 7B?