> pluggable knowledge banks.

An LLM, after plugging in a knowledge bank: "... I know kung fu."

reply
Agreed, I suspect that LLMs in the future will have separate (possibly standardized) decoding/encoding layers that plug into logic layers.
reply
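A standardized plug-in boundary like the one imagined above might look roughly like this; every name below (KnowledgeBank, LogicCore, and so on) is hypothetical, just a minimal sketch of swappable knowledge behind a fixed logic layer:

```python
from typing import Protocol


class KnowledgeBank(Protocol):
    """Hypothetical standardized interface a bank must expose."""

    def lookup(self, query: str) -> str: ...


class KungFuBank:
    """Toy domain-specific bank that can be plugged in at runtime."""

    facts = {"kung fu": "a Chinese martial art"}

    def lookup(self, query: str) -> str:
        return self.facts.get(query, "unknown")


class LogicCore:
    """Stands in for the fixed reasoning layers; delegates facts to
    whichever bank is currently plugged in."""

    def __init__(self, bank: KnowledgeBank):
        self.bank = bank

    def answer(self, question: str) -> str:
        return self.bank.lookup(question)


core = LogicCore(KungFuBank())
print(core.answer("kung fu"))  # a Chinese martial art
```

Because the core only depends on the `KnowledgeBank` protocol, a different bank (medicine, law, ...) could be swapped in without retraining the logic layer, which is the appeal of the idea.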
This is interesting. Would this also mean less room for hallucination (depending on the breadth of knowledge applied to a specific task)?
reply