If only... Despite providing a useful service, they are not as considerate of site owners as one would hope.
Internet Archive says:
> We see the future of web archiving relying less on robots.txt file declarations geared toward search engines
https://blog.archive.org/2017/04/17/robots-txt-meant-for-sea...
They are not alone in that. "Archiveteam", a different organization (not to be confused with archive.org), also doesn't respect robots.txt, according to their wiki: https://wiki.archiveteam.org/index.php?title=Robots.txt
I think it is safe to say that there is little consideration for site owners from the largest archiving organizations today. Whether there should be is a different debate.
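For reference, a robots.txt declaration that such archivers disregard might look like the sketch below. The `ia_archiver` user-agent string is the one the Internet Archive's crawler has historically identified as; treat the specific token as an assumption:

```
# Ask the Internet Archive's crawler (historically "ia_archiver") to stay away entirely
User-agent: ia_archiver
Disallow: /

# Ask all other crawlers to skip a specific directory
User-agent: *
Disallow: /private/
```

Note that robots.txt is purely advisory: nothing enforces it, which is exactly why a crawler can simply choose to ignore it.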
> The "Archiveteam", a different organization, not to be confused with archive.org, also doesn't respect robots.txt according to their wiki
"Archiveteam" exists in a different context. Their usual purpose is to get a copy of something quickly because it's expected to go offline soon. This both a) makes it irrelevant for ordinary sites in ordinary times and b) gives the ones about to shut down an obvious thing to do, i.e. just give them a better/more efficient way to make a full archive of the site you're about to shut down.