[1] https://github.com/ggml-org/llama.cpp/blob/master/grammars/R...
Practically no one outside of Anthropic knows, since the chat template is applied on the API backend; we only know the shape of the API request. You can get a rough idea of what it might look like from the chat templates published for various open models, but the actual details are opaque.
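For illustration, here is a rough sketch of how an open model's ChatML-style template turns an API-shaped "messages" payload into the prompt string the model actually sees. The tokens and layout come from open models; nothing here reflects Anthropic's actual (unpublished) template.

    # Hypothetical sketch: rendering an API-style "messages" list with a
    # ChatML-style template, as used by several open models.
    # Anthropic's real template is not public; this only shows the general idea.
    def render_chatml(messages, add_generation_prompt=True):
        prompt = ""
        for msg in messages:
            prompt += f"<|im_start|>{msg['role']}\n{msg['content']}<|im_end|>\n"
        if add_generation_prompt:
            # Leave the assistant turn open so the model continues from here.
            prompt += "<|im_start|>assistant\n"
        return prompt

    messages = [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Hello!"},
    ]
    print(render_chatml(messages))

The API request only exposes the structured messages; whatever special tokens, role markers, or tool/system scaffolding get wrapped around them happens server-side, which is why the open-model templates are the closest public approximation.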