I'd rather see a distill of the 26B model that uses only 3.8B parameters at inference time. It seems like it would be hugely useful for locally-hosted stuff.
gemma4-31b-it-claude-opus-4-6-distilled-abliterated-heretic-GGUF-q4-k-m