In that case the model descriptor on ollama.com is misleading, because it defaults to 16k, so I have to manually raise it to 64k/128k. I think you are talking about the maximum context size.
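For reference, one way to pin the context length so it doesn't fall back to the default is a Modelfile; the base model name below is just a placeholder:

```
# Hypothetical Modelfile: derive a variant with a fixed 64k context window
FROM llama3.1
PARAMETER num_ctx 65536
```

Then build it with `ollama create mymodel -f Modelfile` and run that variant instead of the base model. The same `num_ctx` option can also be passed per-request in the API's `options` object.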
No, the default context in Ollama varies by the memory available: https://docs.ollama.com/context-length