My current pet peeve is using a period instead of a comma, as in:
> My people lived the other side of this equation. Not the factory floor. The receiving end.
Ostensibly this is supposed to add gravitas, but it's very often done in places where that gravitas isn't needed, and it comes off as if I'm reading the script for an action movie trailer.
Quite paradoxical: when it's a person's native language we can spot it a mile away, yet there's no shortage of engineers claiming how good the code output is.
Whatever the reason for the default tone of AI in English, it's still there when generating code. It makes me think that the senior engineers who claim it produces awesome output just don't understand the specific programming language the way someone who thinks in it almost natively does.
Content creators are starting to incorporate these traits into their scripts now, too. It's uncanny when you (literally) hear it.
Why would you assume this when the more likely explanation is that the 'content creators' are just pasting LLM output?
All the top results had the same AI feel to them. The same format and structure.
The best part? None of them said yes or no. None of them answered the question. They just listed common dairy and non-vegan ingredients to look out for. So, all that AI and nobody put in the ingredients list. Lol
The text has few of the obvious AI tells. The only thing that, to me, looks characteristic of LLM-generated text is the short and terse sentence structure, but this has been a "prestigious" way to write in English since Hemingway.
The most obvious patterns here are: antithesis constructions, word choices and distribution, an attempt at profundity in every paragraph that instead yields runs of text that don't say anything, and even the perfect use of compound hyphenation. I can appreciate that there is definitely an attempt at personalization and guidance to make it less LLM-y, not just a default prompt, but it's still kind of obvious. You could use a detector tool too, of course.
This article is clearly LLM-generated, even the title. A key indicator is that it almost makes sense: we forgot how to manufacture because that got sent to a different nation. The coding thing isn’t getting sent anywhere, so humanity is forgetting how to code. The distinction undermines a lot of the emotional baggage about offshoring that the article wants you to bring along.
Hemingway writes simple sentences with a kind of detachment to make the emotional flow of his stories as transparent as possible.
LLM slop reads more like slide bullet points extrapolated into prose-length text.
Find some pre-2020 examples that are, and you'd have a point.