I agree that the term AGI is the problem. If I have something as intelligent as a mouse, that should be AGI; if I have something as intelligent as a bird, that should be AGI. Same if it's as intelligent as a 2-year-old human, or someone with an IQ of 75. Those would all clearly be artificial, general intelligences.
But the problem is that the term AGI also oddly carries this bar that it must be equal to or better than human (a standard the majority of humans would fail by the definition of intelligence alone), plus better than all humans across multiple disciplines (something it would take a super genius for a human to accomplish).
Given the current definition, if you took a bright high schooler and made them artificial, they wouldn't count as AGI, which makes the definition silly.
And that is separate from the entire concept of sentience, which may or may not even be a requirement for intelligence.
It's all a bunch of squishy definitions mashed together.
Crazy how fast people acclimate to sci-fi tech.
What we have today in the form of LLMs would be a VI under Mass Effect's rules, and not a very good one.
The term AGI so obviously means something way smarter than what we have. We do have something impressive, but it's very limited.
In other words, we don't have any AI system close to human intelligence.