The impact on context tokens would be more of a 'you're holding it wrong' problem, no? The GH MCP burning tokens is an issue with the GH MCP server, not with the protocol itself. (Though since the gh CLI is strongly represented in the training dataset, I'd say it's more beneficial to just use the CLI in this case.)

I do think that we should adopt Amp's MCPs-on-skills model that I've mentioned in my original comment more (hence allowing on-demand context management).

reply
Verbosity of the output seems orthogonal to the CLI vs MCP distinction? When I made MCP tools and noticed a lot of tokens being used, I changed the default to output less and added options to expose different kinds of detailed info depending on what the model wants. A CLI can support similar behavior.
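As a rough sketch of that pattern (names and fields here are hypothetical, not the real GH MCP API): the tool handler returns a compact summary by default and only the full upstream payload when the model explicitly asks for it.

```python
# Hypothetical sketch of "less output by default" for an MCP-style tool.
# get_issue, the detail levels, and FULL_ISSUE are illustrative stand-ins,
# not the actual GitHub MCP server's interface.

FULL_ISSUE = {  # stand-in for a verbose upstream API response
    "number": 42,
    "title": "Flaky test on CI",
    "state": "open",
    "body": "Long markdown body... " * 50,
    "labels": [{"id": 1, "name": "bug", "color": "d73a4a"}],
    "reactions": {"+1": 3, "-1": 0, "heart": 1},
}

def get_issue(number: int, detail: str = "summary") -> dict:
    """Tool handler: compact fields by default, full payload on request."""
    issue = FULL_ISSUE  # a real server would fetch this upstream
    if detail == "full":
        return issue
    # Default path: only the fields a model typically needs to decide
    # its next step, keeping the context cost small.
    return {
        "number": issue["number"],
        "title": issue["title"],
        "state": issue["state"],
        "labels": [label["name"] for label in issue["labels"]],
    }
```

The model starts cheap with the summary and only pays the token cost of the body and reaction metadata when it calls the tool again with `detail="full"`.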
reply
On the front page there's a project that attempts to reduce the boilerplate of MCP output in Claude Code.

Eventually I hope the models themselves become smarter and don't keep the whole 54k tokens in their context window.

reply