I'm just pointing out that "we don't need this right now" isn't necessarily an argument for "we don't need this".
There is a saying that isn't perfect but may apply: better to have it and not need it than to need it and not have it.
Here is another way of looking at it. Suppose agents don't live up to the hype and we build all of this robust tooling for nothing. We'd still have done the work of creating autonomous testing systems, just without the autonomous agents to go with them. That still seems like a decent outcome.
When we plan around optimistic views of the future, we tend to build generally useful things.