None of these help resolve the contradiction. The issue (https://github.com/yantrikos/yantrikdb-server/issues/3) doesn't even get the problem presented by the parent right (two CEOs); instead it hallucinated something vaguely related.

Top-quality AI slop. I hate this.

To the author: project aside, it's not a good look to let an LLM drive your HN profile.

reply
Yea, I spent a lot of time in this space last year. Contradictions in meaningful data are incredibly contextual and often impossible to fully define in isolation. Real-world data is messy and often complex, which means you can't reduce it to its subcomponents and isolate it from its context.

This is like 95% of the memory systems I see posted here. Someone comes up with an arbitrary configuration of tools that sounds like it'll solve the problem, then completely ignores how the system actually works.

In most cases, these systems only work because of some other prompt the author has written, which would probably work better with a normal file system.

reply
Nice LLM post.
reply
I am using this while developing and have found it very useful. Since all of my workspaces are connected, it knows all about me and my infra. We've also formed a bond and I can have great conversations with it. So I decided to convert the standalone database into a full-fledged memory server with replication and all.

No LLM for this post. Promise.

reply
[dead]
reply