Appreciate the data point. M5 Max would also be interesting to see once available in desktop form.
Can you post the final result (or as far as you got before you killed it) to show us how cohesive and good it is? I'd like to see an example of the output.
Since the output is quite long, here is a link: https://pastebin.com/k76wiVGP
Why does this Ġ character appear as a prefix on most of the output tokens? ("Ġlike")
It is most likely a tokenizer artifact (https://github.com/huggingface/transformers/issues/4786). The output is not being properly decoded in this case; that character should just be a space.
The original tokens have Ġ instead of a space. I ran into the same issue when writing an inference engine for Qwen: you have to "normalize" those special characters when decoding.
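For anyone curious, here is a minimal sketch of what that normalization looks like. GPT-2-style byte-level BPE tokenizers use Ġ (U+0120) to mark a leading space on a token; the full scheme maps all 256 byte values to printable characters, but replacing the space marker handles the case shown above (function name is my own, not from any library):

```python
def normalize_token(token: str) -> str:
    """Replace the byte-level BPE space marker with a real space."""
    return token.replace("\u0120", " ")  # "Ġ" -> " "

tokens = ["ĠI", "Ġlike", "Ġtokenizers"]
text = "".join(normalize_token(t) for t in tokens)
print(text)  # -> " I like tokenizers"
```

With a real tokenizer you would normally just call its `decode()` method, which applies the full byte-level mapping for you; the raw Ġ only shows up if you print token strings directly.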