Yeah, we're all quickly figuring out that LLMs shift the engineering work from computer science to bullshit detection. When working with Claude Code, you basically have to become that guy on the Internet who's always trying to prove you wrong. Otherwise you're going to build yourself a false reality and get skewered if you try to share it. I've done it myself: we're so used to blindly trusting the things other people built that we forget we're the ones building this one. Nothing in life is free.
LLMs may be enablers, but OP explicitly stated "I wanted to dig deeper into the subject, but not by reading a boring textbook." Reading the textbook would, of course, have eliminated the issues in this tool, or at least made it clear to them that they needed to dive deeper before publishing. It feels a bit like blaming cars for drunk driving: by definition it's not possible without cars, but you can still choose to drive responsibly.
That's how Plato taught Aristotle. I'd much rather have a dialog than read a textbook. You just have to think critically and fact-check; you can't just trust whatever the robot says, because it interpolates knowledge.
Sure, but there apparently wasn't enough of a dialog either. With a textbook, you're confronted with facts and explanations that you didn't ask about, or even know to ask about. Don't get me wrong: my original recommendation to take an interactive course is still the best option in my mind, as simplifications made for the benefit of the learner often lead to apparent contradictions that an instructor can clarify. But at some point you do just need the set of raw facts to be able to work with these systems.