What’s the fireable offense? Does the boss want to stitch those tools together themselves?
If the output is crap, regardless of the tool, that’s a different story, and one we don’t have enough info to evaluate.
It depends on how mission-critical their brainstorming is to the company. LLMs can brainstorm too.
That means OP’s job may be _safer_, because they are getting higher leverage on their time.
It’s their colleague, the one ignoring AI, who I see as being at higher risk.