This is an instance of someone familiar with complex file access patterns not understanding the normal use case for these services.
The people using these bidirectional sync services want last-writer-wins behavior. The mildly and moderately technical people I work with all get it and work with it. They know how to use the UI to look for old versions if someone accidentally overwrites their file.
Your characterization of it as "complete chaos with constant problems" doesn't mesh with the reality of the countless low-tech teams I've seen use Dropbox-type services since they launched.
No, I'm not joking. We used to allow arbitrary paths in a cloud API I owned. Within about a month someone had figured out that the cost to store a single-byte file was effectively zero, and that they could encode arbitrary files into the paths of those things. It wasn't too long before there was a library to do it on GitHub. We had to put limits on it because otherwise people would store their data in the path, not the file.
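The trick described above is straightforward to reproduce. A minimal sketch (my own illustration, not the actual library; the segment-length limit is an assumption): split the payload into base64 chunks, use each chunk as part of an object key, and store zero-byte objects under those keys.

```python
import base64

# Hypothetical illustration of smuggling file contents into object *keys*
# rather than object bodies. Every object stored under these paths can be
# zero bytes; the data lives entirely in the key names.

MAX_SEGMENT = 200  # assumed per-segment key length limit

def encode_to_paths(data: bytes, prefix: str = "smuggle") -> list[str]:
    """Split data into base64 chunks and return fake object paths."""
    b64 = base64.urlsafe_b64encode(data).decode()
    chunks = [b64[i:i + MAX_SEGMENT] for i in range(0, len(b64), MAX_SEGMENT)]
    # A zero-padded index prefix keeps ordering recoverable from a key listing.
    return [f"{prefix}/{i:08d}-{chunk}" for i, chunk in enumerate(chunks)]

def decode_from_paths(paths: list[str]) -> bytes:
    """Reassemble the payload from the path names, in any order."""
    parts = []
    for p in sorted(paths):
        segment = p.rsplit("/", 1)[1]
        # Split only on the first "-": the separator after the index.
        parts.append(segment.split("-", 1)[1])
    return base64.urlsafe_b64decode("".join(parts))
```

Since the provider bills by stored bytes and the objects are all empty, storage is effectively free until the provider caps key length or counts key bytes toward quota.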
Reason: to avoid overcomplicating things or giving the appearance of nickel-and-diming.
Would there be any engineering/management pushback on the customer side? "We have to write a tiny script," "this is non-standard," "why are you the only ones who charge us for filenames?"
(have limited knowledge here)
You can build such a system yourself quite trivially by getting an FTP account, mounting it locally with curlftpfs, and then using SVN or CVS on the mounted filesystem. From Windows or Mac, this FTP account could be accessed through built-in software.
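For concreteness, the setup described above looks roughly like this (a sketch only; the hostnames, credentials, and repo URL are placeholders):

```shell
# Mount a remote FTP account as a local filesystem via FUSE.
mkdir -p ~/ftp-mount
curlftpfs ftp://user:password@ftp.example.com/ ~/ftp-mount

# Then treat the mounted directory as an ordinary working area
# and layer version control on top of it.
cd ~/ftp-mount
svn checkout https://svn.example.com/repo/trunk project
```

Whether this actually works "trivially" is another matter: FUSE-over-FTP latency and partial-write semantics are exactly the kind of thing Dropbox papered over.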
If we remove the whole Linux section and just ask "why not map a folder in Explorer," it's a reasonable question, probably even more reasonable in 2026 than in 2007. The network got faster and more reliable, and Dropbox access got slower.
But the moment that hits normal users, yeah, it's a mess.
It works perfectly fine as long as you keep how it works in mind, and probably most importantly don't have multiple users working directly on the same file at once.
I've been using these systems for over a decade at this point and never had a problem. And if I ever do have one, my real backup solution has me covered.
“Every file is only ever written to from a single client, and will be asynchronously made available to all other clients, and after some period of time has elapsed you can safely switch to always writing to the file from a different client”.
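The contract quoted above can be made concrete with a toy model (assumed semantics for illustration, not any specific vendor's implementation): every upload carries a client-reported timestamp, and the server simply keeps whichever version was written last.

```python
from dataclasses import dataclass

@dataclass
class Version:
    content: str
    mtime: float   # client-reported modification time
    client: str

class LastWriterWinsStore:
    """Toy model of a sync server that resolves conflicts by timestamp."""

    def __init__(self):
        self.files: dict[str, Version] = {}

    def upload(self, path: str, version: Version) -> bool:
        """Accept the upload only if it is at least as new as what we hold."""
        current = self.files.get(path)
        if current is None or version.mtime >= current.mtime:
            self.files[path] = version
            return True
        return False  # stale write: it silently loses, as the thread describes
```

Two clients editing the same file concurrently is exactly where this breaks down: whichever upload lands with the later timestamp wins, and the other edit survives only in version history (or a "conflicted copy"), not in the live file.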
:P
What do you use and how do you test / reconcile to make sure it’s not missing files? I find OneDrive extremely hard to deal with because the backup systems don’t seem to be 100% reliable.
I think there are a lot of solutions these days that err on the side of claiming success.
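On the reconciliation question: the simplest check I know of (my own sketch, not any vendor's tool) is to walk both trees and compare content hashes, reporting anything missing or differing on either side, rather than trusting the sync client's "up to date" claim.

```python
import hashlib
from pathlib import Path

def tree_hashes(root: Path) -> dict[str, str]:
    """Map each file's relative path to a SHA-256 hash of its contents."""
    out = {}
    for p in sorted(root.rglob("*")):
        if p.is_file():
            out[str(p.relative_to(root))] = hashlib.sha256(p.read_bytes()).hexdigest()
    return out

def reconcile(source: Path, backup: Path) -> dict[str, list[str]]:
    """Report files missing from the backup, extra in it, or mismatched."""
    src, bak = tree_hashes(source), tree_hashes(backup)
    return {
        "missing": sorted(set(src) - set(bak)),
        "extra": sorted(set(bak) - set(src)),
        "differs": sorted(k for k in src.keys() & bak.keys() if src[k] != bak[k]),
    }
```

Running something like this periodically against the synced copy is the only way I've found to catch a client that quietly skipped files.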
That being said, I understand how it works at a high level.