The problem right now is that they can be held liable for distributing CSAM on their services and, since April 3, they can also be fined if they try to detect that content. It's an impossible situation.
Now, I'm not claiming that these companies always have noble intentions. But there's nothing nefarious here -- they just want regulatory certainty: do X, Y, and Z and you won't be fined or sued.
My impression is that they don't like the bad PR currently associated with the various debates around children's use of social media. At the same time, they don't want to implement policies that would be popular with the general public but would hurt their bottom line (i.e. they don't want to do the right thing).
So instead they make a big deal about various imperfections to justify draconian solutions that would let them implement all sorts of privacy-violating measures. Thankfully that failed, so now they're engaging in a smear campaign.
The current conduct of these companies in this regard is openly evil.
Implementing end-to-end encryption on relevant communication services could mitigate many risks that come with hosting user content.
It would protect users from Big Tech spying while still allowing affected users to report when something sketchy is going on. Best of both worlds.
In any case, it would be a good start.
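The "protect in transit, report at the endpoint" idea above can be sketched as a toy model: the platform stores only ciphertext it cannot read, while the recipient, who holds the key, can still decrypt and voluntarily forward abusive content to moderators. This is purely illustrative, not any platform's actual protocol; the one-time-pad cipher and the `report_abuse` helper are stand-in assumptions for real authenticated encryption and real reporting flows.

```python
import secrets

# Toy end-to-end encryption sketch (illustrative only, NOT production crypto).
# The key is shared only between sender and recipient; the "server" stores
# ciphertext it cannot decrypt. A one-time pad is used as the simplest
# correct stand-in: the key must be as long as the message and never reused.

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    assert len(key) == len(plaintext), "one-time pad: key must match length"
    return bytes(k ^ p for k, p in zip(key, plaintext))

decrypt = encrypt  # XOR is its own inverse

# Sender side: key is shared out-of-band with the recipient, never the server.
message = b"hello"
key = secrets.token_bytes(len(message))
ciphertext = encrypt(key, message)

# The platform stores only ciphertext; without the key it learns nothing.
server_storage = [ciphertext]

# Recipient side: decrypts locally.
received = decrypt(key, server_storage[0])

def report_abuse(plaintext: bytes, stored_ciphertext: bytes) -> dict:
    # Recipient-initiated report (hypothetical flow): forwards the decrypted
    # content plus the ciphertext the server already holds, so moderators can
    # act without the platform ever scanning messages in transit.
    return {"plaintext": plaintext, "ciphertext": stored_ciphertext}

report = report_abuse(received, server_storage[0])
```

The point of the sketch is architectural: scanning happens nowhere, yet abuse reporting still works, because the report originates from a party who legitimately holds the plaintext.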
It's impossible to imagine democratic societies in which four fat cats know everything about everyone while most people know almost nothing about them, and in which information, instead of being scattered everywhere for resilience, is concentrated in just a few hands.