by alfiedotwtf, 14 hours ago
by pshirshov, 10 hours ago:
In that case the model descriptor on ollama.com is incorrect, because it defaults to 16k, so I have to manually change that to 64k/128k. I think you are talking about the maximum context size.
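(For reference, a minimal sketch of how one might override the default per-model context window in Ollama, via the `num_ctx` parameter in a Modelfile; the model name `llama3` and the 64k value are placeholders, adjust to your model and hardware:)

```shell
# Sketch: persistently raise the context window for a local model.
# Assumes "llama3" is already pulled; 65536 = 64k tokens (placeholder value).
cat > Modelfile <<'EOF'
FROM llama3
PARAMETER num_ctx 65536
EOF
# Then build a variant with the larger context:
#   ollama create llama3-64k -f Modelfile
# For a one-off session, the same parameter can be set interactively:
#   ollama run llama3
#   >>> /set parameter num_ctx 65536
```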
by trvz, 10 hours ago:
No, the default context length in Ollama varies with the available memory:
https://docs.ollama.com/context-length