Hacker News
by thedevilslawyer | 22 hours ago
pyonpyon | 22 hours ago
I would say it certainly can be denser, but even when it is, the tokenizer counts it as more. Last time I checked my agents.md in the OpenAI tokenizer, it ate roughly 30-40% more tokens than the English version for roughly the same meaning.