> Several of them come from experienced software engineers who use LLMs professionally in their work.

So, not from personal experience. And we don't know which examples came from which users or what they used them for. We get enough hearsay on HN, and again, there's nothing in this series that hasn't already been discussed here. There is, however, a ton of other hearsay missing from the series: the utility so many people are finding (in many cases, backed by actual data or open-source projects).

> Every six-ish months we hear ...

I've been yelling about LLMs since early 2024 [0]! They needed much more "holding it right" back then. It's way easier now, but the massive potential was clear even then.

> They also still make the same classes of stupid mistakes, are pretty much as dangerously unreliable as they always have been.

Yes, and this is where a lot of the skill in managing them comes into play. Hint: people are dangerously unreliable too.

> One can rely on both one's informed understanding of the fundamentals behind the system under consideration, as well as first-hand testimony from enormous-bosom-equipped people to arrive at that conclusion.

Of course, but when faced with many contradictory opinions, I prefer data. And the preponderance of data I've looked at and discussed [0] paints a very different picture.

> There's a certain subtlety to this that you missed.

From TFA:

> I want to use them. I probably will at some point.

My complaint is that he is speaking entirely from second-hand information and provides no new insight of his own. That he has trepidations about actually getting his hands dirty with them doesn't change that, and it only makes it worse that he spent ten pages going on about them! He's a technologist, not a journalist! So, I'm genuinely curious: what subtlety did I miss?

[0] Available in my comment history. To allay suspicion that I only engage in breathless boosterism, some relevant comments about the negatives: https://news.ycombinator.com/item?id=47405189 or https://news.ycombinator.com/item?id=46830919

[1] is so bad, like the worst imaginable thing you can think of... like, if this is the possible fuckup, all bets are off as to what other fuckups you might need to deal with. I got hit by this problem several times and I was like, "well, this is just impossible..." Absolutely mind-blown.