Hacker News
by magic_hamster 2 days ago
by fortyseven 1 day ago
Strangely, I haven't had a lot of luck with vLLM; I finally ended up ditching Ollama and going straight to the tap with llama-server in llama.cpp. No regrets.
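For anyone curious, a minimal sketch of the setup described above — running llama.cpp's bundled `llama-server` directly instead of going through Ollama. The model path and port here are placeholders, not anything from the comment:

```shell
# Serve a local GGUF model with llama.cpp's built-in server.
# ./models/model.gguf is a placeholder -- substitute your own file.
llama-server -m ./models/model.gguf --host 127.0.0.1 --port 8080 -c 4096

# llama-server exposes an OpenAI-compatible endpoint, so existing
# clients can point at it with only a base-URL change:
curl http://127.0.0.1:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"messages": [{"role": "user", "content": "Hello"}]}'
```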