The biggest issue here is that the best library for doing audio warping (ZPlane) is not available to us. We already do realtime audio warping for clip playback, just like Ableton, using RubberBand (and might consider using Staffpad at some point, which we have available for static stretches).

However, following the tempo map is a very different challenge from following user-directed edits between warp markers, and neither RubberBand nor Staffpad really offers a good API for this.
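To make the warp-marker case concrete, here's a minimal sketch of the mapping such an edit workflow implies. It assumes (hypothetically) that markers are stored as (beat, sample) pairs and that the mapping between them is piecewise linear; the stretcher would then have to chase a locally varying ratio derived from this map, which is the API gap described above.

```python
from bisect import bisect_right

def make_warp_map(markers):
    """Build a beat-time -> sample-position mapping from warp markers.

    `markers` is a hypothetical list of (beat, sample) pairs sorted by
    beat; between adjacent markers the mapping is piecewise linear.
    """
    beats = [b for b, _ in markers]
    samples = [s for _, s in markers]

    def beat_to_sample(beat):
        # Clamp outside the marked range, interpolate linearly inside it.
        if beat <= beats[0]:
            return samples[0]
        if beat >= beats[-1]:
            return samples[-1]
        i = bisect_right(beats, beat) - 1
        frac = (beat - beats[i]) / (beats[i + 1] - beats[i])
        return samples[i] + frac * (samples[i + 1] - samples[i])

    return beat_to_sample

# Three markers: the 4..8 beat span is squeezed relative to 0..4,
# so the stretch ratio changes at beat 4.
warp = make_warp_map([(0, 0), (4, 200_000), (8, 360_000)])
```

Note that the per-segment slope (samples per beat) is exactly the quantity a realtime stretcher would need to be re-fed as playback crosses each marker.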

In addition, the GUI side of this poses a lot of questions: do you regenerate waveforms on the fly to be accurate, or just use a GUI-only scaling of the existing waveform to display things during the editing operation?

We would certainly like to do this, and have a pretty good idea of how to do it. The devil, as usual, is in the details, and there are rather a lot of them.

There's also the detail that having clips be bpm-synced addresses somewhere between 50% and 90% of user needs for audio warping, which reduces the priority for doing the human-edited workflow.
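Part of why the bpm-synced case is so much easier: it reduces to a single time ratio whenever the session tempo changes, which is exactly what a stretcher API like RubberBand's `setTimeRatio` expects. A trivial sketch (the function name is mine, not any library's):

```python
def sync_time_ratio(clip_bpm: float, session_bpm: float) -> float:
    """Time ratio (output duration / input duration) to hand to a
    realtime stretcher so a clip recorded at `clip_bpm` plays in time
    at `session_bpm`.

    A 120 bpm clip in a 100 bpm session must play 1.2x longer,
    so the ratio is clip_bpm / session_bpm.
    """
    if clip_bpm <= 0 or session_bpm <= 0:
        raise ValueError("tempi must be positive")
    return clip_bpm / session_bpm
```

The human-edited warp-marker workflow, by contrast, needs a different ratio per marker segment, updated sample-accurately mid-stream.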

reply
>do you regenerate waveforms on the fly to be accurate, or just use a GUI-only scaling of an existing waveform, to display things during the editing operation

just use GUI scaling, and even then only if the former (regenerating on the fly) is too challenging

reply
It's not as if a constantly changing single-axis non-linear transform is trivial to accomplish in the GUI either :(
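For what it's worth, the "GUI-only scaling" option roughly amounts to resampling the already-computed peak data through the warp transform instead of re-reading audio. A hedged sketch, with all names hypothetical (`peaks` being a precomputed per-bin (min, max) list, `beat_to_sample` the non-linear warp):

```python
def warp_peaks(peaks, beat_to_sample, samples_per_peak,
               beat_start, beat_end, width_px):
    """Redraw an existing peak array under a (possibly non-linear)
    single-axis warp without touching the audio: for each display
    column, map its beat position back to a source sample and reuse
    that bin's precomputed peak."""
    step = (beat_end - beat_start) / max(width_px - 1, 1)
    cols = []
    for x in range(width_px):
        sample = beat_to_sample(beat_start + step * x)
        bin_index = min(int(sample // samples_per_peak), len(peaks) - 1)
        cols.append(peaks[bin_index])
    return cols

# Fake peak data and a linear warp of 200 samples per beat,
# 100 source samples per peak bin, drawn into 5 columns.
peaks = [(-i, i) for i in range(10)]
cols = warp_peaks(peaks, lambda b: b * 200, 100, 0, 4, 5)
```

Even this cheap version has to be recomputed per frame while a marker is being dragged, which is the point above: it's not trivial, just cheaper than re-reading audio.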
reply
You often want sample-accurate waveform visualization when tuning samples that are time- or pitch-warped, so you can set start and loop points at zero crossings and avoid clicks without needing fades.
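The zero-crossing part of that workflow is simple enough to sketch; the search window size here is an arbitrary choice for illustration:

```python
def snap_to_zero_crossing(samples, pos, max_search=256):
    """Snap a start/loop point to the nearest sign change to avoid a
    click, searching outward up to `max_search` samples either side.
    Returns `pos` unchanged if no crossing is found in the window."""
    n = len(samples)
    for d in range(max_search + 1):
        for p in (pos - d, pos + d):  # nearest crossing wins
            if 0 < p < n and samples[p - 1] * samples[p] <= 0:
                return p
    return pos

wave = [0.5, 0.4, 0.1, -0.2, -0.5, -0.1, 0.3]
snapped = snap_to_zero_crossing(wave, 1)  # moves to the sign change
```

The GUI catch is the one raised above: to let the user do this snapping visually, the displayed waveform has to be sample-accurate at that zoom level, which a scaled-peaks approximation is not.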
reply