You don't even need to file removal requests if you own the URL. robots.txt changes are applied retroactively, which means you can disallow crawling of /abc, request a re-crawl, and every past snapshot matching the new rule will be removed. For example, see the sketch below.
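A minimal sketch of what such a rule might look like; /abc is a placeholder for whatever path you want hidden, and ia_archiver is the user agent the Internet Archive has historically documented for exclusions (targeting it instead of * keeps other crawlers unaffected):

```
# robots.txt at the site root
# ia_archiver: the Internet Archive's historically documented crawler agent
User-agent: ia_archiver
# /abc is a placeholder path; all snapshots matching it get excluded
Disallow: /abc
```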
Trying to search the Wayback Machine almost always gives me their made-up 498 error, and when I do get a result, the interface for scrolling through dates is janky at best.