This is a polite way of saying unreliable and untrustworthy.
The problem facing enterprises is best understood by treating LLMs like any other unreliable program.
> We’ve found that treating LLMs as suggestion engines rather than decision makers changes the architecture completely.
Sure does. Look at the damage LLM "suggestions" are doing to open source projects worldwide.