Hacker News

I'd stay away from Ollama; just use llama.cpp. It is more up to date, better performing, and more flexible.


But you can't just switch between installed models like in ollama, can you?
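Not with a single managed command, but with llama.cpp "switching models" is just pointing the binary at a different GGUF file. A minimal sketch, assuming the llama.cpp binaries are built and the model paths below are placeholders for files you've downloaded:

```shell
# Run a prompt against one model, then "switch" by passing a different file.
./llama-cli -m ./models/model-a.Q4_K_M.gguf -p "Hello"
./llama-cli -m ./models/model-b.Q4_K_M.gguf -p "Hello"

# Or serve a model over llama.cpp's built-in HTTP server:
./llama-server -m ./models/model-a.Q4_K_M.gguf --port 8080
```

To change the served model you restart `llama-server` with a different `-m` path; Ollama's convenience is that it does this download-and-swap bookkeeping for you.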





