When you use the API
There are some exceptions, e.g. Claude Max.
> Copilot Chat uses one premium request per user prompt, multiplied by the model's rate.
> Each prompt to Copilot CLI uses one premium request with the default model. For other models, this is multiplied by the model's rate.
> Copilot coding agent uses one premium request per session, multiplied by the model's rate. A session begins when you ask Copilot to create a pull request or make one or more changes to an existing pull request.
https://docs.github.com/en/copilot/concepts/billing/copilot-...
and now I see your comment mentions that explicitly. The output was quite unambiguous. :shrug:
That being said, I don't know why anyone would want to pay for LLM access anywhere else.
ChatGPT and claude.ai (free) and GitHub Copilot Pro ($100/yr) seem to be the best combination to me at the moment.
Use other flows under standard billing to do iterative planning, spec building, and resource loading for a substantive change set — e.g., something 5k+ LOC across 10+ files.
Then throw that spec document as your single prompt to the per-request-billed Copilot agent. Include a caveat in the prompt along the lines of: "We are being billed per user request. Go as far as you can from this prompt alone. If you hit difficult, underspecified decision points, implement multiple options where feasible and note in the completion document where the user must make a selection. Implement the specified test structures and run them against your implementation until everything passes."
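For concreteness, the per-request caveat could be packaged as a reusable preamble prepended to the spec document. The exact wording below is illustrative, not a quote from any official guidance:

```
BILLING NOTE: Each user request costs one premium request, so maximize
progress per prompt.

- Work from the attached spec; do not pause to ask clarifying questions.
- At underspecified decision points, implement multiple reasonable options
  and record in COMPLETION.md which choices need a human selection.
- Implement the test structures described in the spec, then run them
  against your implementation and iterate until all tests pass.

[spec document follows]
```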
Most of my major chunks of code are written this way, and I never manage to use up the 100 available prompts.
Just about the absolute best deal in the AI market.
A cloud agent works iteratively on your requests, making multiple commits.
I put large features into my requests and the agent has no problem making hundreds of changes.
In the past we had to buy an expensive license of some niche software, used by a small team, for a VP "in case he wanted to look".
It's worse in many gov agencies: whenever they buy software, if it's relatively cheap, everyone gets it.