However, following the tempo map is a very different challenge from following user-directed edits between warp markers, and neither RubberBand nor Staffpad really offers a good API for this.
In addition, the GUI side of this raises a lot of questions: do you regenerate waveforms on the fly to be accurate, or just apply a GUI-only scaling of an existing waveform to display things during the editing operation?
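If the GUI-only route were taken, the cheap version is to resample the existing peak data by the current stretch ratio for display, without re-analysing any audio. A minimal sketch, assuming the waveform is stored as a simple array of (min, max) peak pairs (all names here are hypothetical, not actual Ardour code):

```python
# Hypothetical sketch of GUI-only waveform scaling during a warp edit.

def scale_peaks(peaks, ratio):
    """Resample an existing (min, max) peak array by a stretch ratio.

    No audio is re-analysed: each output display bin simply reads the
    nearest source peak, which is cheap enough to recompute per GUI
    frame while the user drags a warp marker.
    """
    if not peaks or ratio <= 0:
        return []
    n_out = max(1, round(len(peaks) * ratio))
    last = len(peaks) - 1
    # Map each display bin back to its nearest source peak index.
    return [peaks[min(last, int(i / ratio))] for i in range(n_out)]


# e.g. stretching a clip to twice its length doubles the display bins
stretched = scale_peaks([(-0.5, 0.5)] * 100, 2.0)
```

The trade-off is visible fidelity: nearest-neighbor resampling is fine as live feedback during the drag, with accurate peaks regenerated once the edit is committed.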
We would certainly like to do this, and have a pretty good idea of how to do it. The devil, as usual, is in the details, and there are rather a lot of them.
There's also the detail that having clips be bpm-synced addresses somewhere between 50% and 90% of user needs for audio warping, which reduces the priority of implementing the human-edited workflow.
Just use GUI scaling, and only if regenerating waveforms on the fly proves too challenging.