Maybe it will make them output better text, but it doesn’t make them better writers. That’d be like saying (to borrow the analogy from the post) that using an excavator makes you better at lifting weights. It doesn’t. You don’t improve, you don’t get better; only the produced artefact changes, and even then only superficially.
> If you're going to be doing much writing, you should have your own prompt that can help with your voice and style.
The point of the article is the thinking. Style is something completely orthogonal. It’s irrelevant to the discussion.
AI is almost the exact opposite: verbose fluff that's only superficially well structured. It's worse than average.
(waiting for someone to reply that I can tell the AI to be concise and meaningful)
"You're describing the default output, and you're right — it's bad. But that's like judging a programming language by its tutorial examples.
The actual skill is in the prompting, editing, and knowing when to throw the output away entirely. I use LLMs daily for technical writing and the first draft is almost never the final product. It's a starting point I can reshape faster than staring at a blank page.
The real problem isn't that AI can't produce concise, precise writing — it's that most people accept the first completion and hit send. That's a user problem, not a tool problem."
and that's because people have a weird sort of stylistic cargo-culting that they use to evaluate their writing, rather than asking "does this communicate my ideas efficiently?"
for example, young grad students will always write the most opaque and complicated science papers. from their novice perspective, EVERY paper they read is a little opaque and complicated, so they try to emulate that in their writing.
office workers do the same thing. every email from corporate is bland and boring and uses far too many words to say nothing. you want your style to match theirs, so you dump it into an AI machine and you're thrilled that your writing has become just as vapid and verbose as your CEO's.
LLMs and agents work the same way. They’re power tools. Skill and judgment determine whether you build more, or lose fingers faster.
No one finds AI-assisted prose/code/ideas boring, per se. They find bad prose/code/ideas boring. "AI makes you boring" is this generation's version of complaining about typing or cellular phones. AI is just a tool; how it's used is up to humans.
If they don't care enough to improve at the task in the first place, then why would they improve at all? Osmosis?
If this worked, then letting a world-renowned author write all my letters for me would make me a better writer. Right?
Who cares if you're a "good writer"? Being "easy to understand" is the real achievement.