> I think what we have is mostly AGI.

I agree that the term AGI is the problem. If I have something as intelligent as a mouse, that should be AGI; if I have something as intelligent as a bird, that should be AGI. Same if it's as intelligent as a 2 year old human, or someone with an IQ of 75. Those would all clearly be Artificial, General Intelligences.

But the problem is the term AGI also oddly has this bar that it must be equal to or better than human (a standard the majority of humans would fail based on the definition of intelligence alone). Plus better than all humans across multiple disciplines (something it would take a super genius for a human to accomplish).

Given the current definition, if you took a bright high schooler and made them artificial, they wouldn't count as AGI, which makes the definition silly.

And that is separate from the entire concept of sentience, which may not even be a requirement for intelligence.

It's all a bunch of squishy definitions mashed together.

reply
Yeah, LLMs fulfill every goalpost I had in mind years ago for what AGI would look like, whether the starship voice AI in Star Trek or merely a chat bot that could handle arbitrary input.

Crazy how fast people acclimate to sci-fi tech.

reply
The Mass Effect universe distinguishes between AI, which is smart enough to be a person—like EDI or the geth—and VI (virtual intelligence), which is more or less a chatbot interface to some data system. So if you encounter a directory on the Citadel, say, and it projects a hologram of a human or asari that you can ask questions about where to go, that would be VI. You don't need to worry about its feelings, because while it understands you in natural language, it's not really sentient or thinking.

What we have today in the form of LLMs would be a VI under Mass Effect's rules, and not a very good one.

reply
Peter F. Hamilton's sci-fi novels do something similar: they differentiate between SI (Sentient Intelligence), which is basically its own being and is not used by people, as that would essentially be slavery, and, for general-purpose "AI," RI (Restricted Intelligence), which has strict limits placed around it.
reply
The SI in Peter F. Hamilton's Commonwealth duology is pretty badass!
reply
This is a great analogy.

The term AGI so obviously means something way smarter than what we have. We do have something impressive but it’s very limited.

reply
The term AGI explicitly refers to something as smart as us: humans are the baseline for what "General Intelligence" means.
reply
To clarify what I meant, “what we have” means “the AI capabilities we currently have,” not “our intelligence.”

I.e., what I mean is that we don’t have any AI system close to human intelligence.

reply