- cross-posted to:
- chatgpt@lemdro.id
I want to like assistants like this, but I can't see myself using one regularly until I can run it locally on my device.
Yep. My next phone is going to have at least 16GB of RAM so I can run a modestly capable LLM on it.
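FWIW, the back-of-envelope math checks out. Here's a rough sketch (the ~20% overhead factor for KV cache and runtime buffers is an assumption; actual usage varies with context length and runtime):

```python
# Rough estimate: RAM needed to hold a quantized LLM's weights in memory.
# Formula: params * bits_per_weight / 8 bytes, times an assumed ~20%
# overhead for KV cache and runtime buffers.

def weight_ram_gb(params_billions: float, bits_per_weight: float,
                  overhead: float = 1.2) -> float:
    bytes_needed = params_billions * 1e9 * bits_per_weight / 8
    return bytes_needed * overhead / 1e9

for params, bits in [(7, 4), (13, 4), (7, 8)]:
    print(f"{params}B @ {bits}-bit ≈ {weight_ram_gb(params, bits):.1f} GB")
```

So a 4-bit 7B model fits comfortably in 16GB with room for the OS and apps, and even a 13B quant is plausible.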
I hope LLMs encourage vendors to stop being so skimpy with storage and RAM. 8GB/128GB has been the norm for like 4 years now. Why is it not advancing?!?
And it's even worse on the iPhone side. 6GB...