That's pretty much the approach we took with context-mode. Tool outputs get processed in a sandbox, only a stub summary comes back into context, and the full details stay in a searchable FTS5 index the model can query on demand. Not trained into the model itself, but gets you most of the way there as a plugin today.
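A minimal sketch of the idea, using SQLite's FTS5 directly (names like `record_tool_output` are illustrative, not the plugin's actual API): the full tool output gets indexed, only a short stub returns to the context, and the model can run full-text searches to pull details back in.

```python
import sqlite3

# In-memory DB for the sketch; a real plugin would persist this per session.
db = sqlite3.connect(":memory:")
db.execute("CREATE VIRTUAL TABLE tool_log USING fts5(call_id, output)")

def record_tool_output(call_id: str, output: str, stub_chars: int = 120) -> str:
    """Index the full output; return only a stub for the model's context."""
    db.execute("INSERT INTO tool_log VALUES (?, ?)", (call_id, output))
    stub = output[:stub_chars].rstrip()
    return f"[{call_id}] {stub}... ({len(output)} chars indexed, searchable on demand)"

def search_tool_log(query: str, limit: int = 3) -> list[tuple[str, str]]:
    """Full-text query the model can issue to retrieve details later."""
    cur = db.execute(
        "SELECT call_id, snippet(tool_log, 1, '[', ']', '...', 12) "
        "FROM tool_log WHERE tool_log MATCH ? ORDER BY rank LIMIT ?",
        (query, limit),
    )
    return cur.fetchall()
```

The context only ever carries the stub strings, so a huge tool result costs a constant number of tokens until the model actually asks for it.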
Is it because of prompt caching? If the context changes arbitrarily every turn, then you'd have to throw away the cache each time.
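A toy illustration of that caching point (assumed mechanics, not any specific engine's implementation): cached KV entries are only reusable for the longest unchanged token prefix, so appending a turn keeps the whole cache, while rewriting an earlier turn forfeits most of it.

```python
def reusable_prefix(old_tokens: list[int], new_tokens: list[int]) -> int:
    """Number of leading tokens whose cached entries can be reused."""
    n = 0
    for a, b in zip(old_tokens, new_tokens):
        if a != b:
            break
        n += 1
    return n

history = [1, 2, 3, 4, 5, 6, 7, 8]
appended = history + [9, 10]              # normal turn: append only
edited = [1, 2, 99] + history[3:] + [9]   # an early turn was rewritten

assert reusable_prefix(history, appended) == 8  # full cache reuse
assert reusable_prefix(history, edited) == 2    # cache mostly invalidated
```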