It does. People drive these entities. People hide behind the liability shields and authority of these entities. Also notice that I generalized with the phrase “…and trusting anyone…”
I'm not an expert in political theory or ethics either, but in my worldview, power relationships matter in these discussions. I believe power and responsibility should go hand in hand, and I hold entities to a standard that is proportional to their power to influence others' lives.
If an entity's power is decentralized, for example when it is democratically organized to some degree, then that disperses both power and responsibility.
Uh, what? People have been killing each other over values misalignments since there have been people. We invented civilization in part to protect our farms and granaries from people who disagreed with us on whose grain was in said granaries.
Broad-based alignment doesn't come from nothing, but it is surprisingly easy to achieve when a population recognizes a shared stake. A synthesis between selfishness and altruism emerges when you consider who you can call a "neighbor".
Sure. But it takes work for anything larger than a small, close-knit community. I'm pushing back on the notion that this comes naturally and is a default state. It's not, at least not relative to people naturally forming in-groups and out-groups.
The armchair commenters are probably folks who have never organized a group of people before outside a commercial context.
But that shared stakeholding doesn't naturally drive alignment. You need journalists, fiction writers, organizers and delegates. Travel and curiosity. These each take effort, resources and organization. It's something we do well, but it isn't spontaneous the way small-group kinship is; kinship literally emerges if you just put people in proximity.
Whatever the difference between naturalness and a state of nature, it has nothing to do with education or middle-class existence.
> i.e. without brain-washing and deliberately working to create out-groups
It really isn't. The whole point of the market system is to collectively align people's actions towards a shared target of "Pareto-optimized total welfare". And even then the alignment is approximate and heavily constrained due to a combination of transaction costs (which also account for e.g. externalities) and information asymmetries. But transaction costs and information asymmetries apply to any system of alignment, including non-market ones. The market (augmented with some pre-determined legal assignment of property rights, potentially including quite complex bundles of rules and regulations) is still your best bet.
What you describe is factually not how human society formed.
I'd strongly suggest reading his books. They profoundly changed my understanding of how human institutions and society form.
No, it does not, and that's Graeber's whole point.
"Markets" are not some sort of physical law of the universe.
A simple example of this is it's the norm in hunter gatherer societies to take care of people who never will make an equal contribution back in the transactional sense.
Because the social ties in those societies are not simply transactions.
If your model fails to accurately describe empirical reality, time to improve/expand the model.
I like economics and math too, but the whole discussion of markets is a terrible starting place for deriving results in ethics/psychology. If you insist though, notice that unions will happen unless some other organization is working to prevent them. What do you suppose this means? People are aligned with each other exactly because they've noticed their coworkers are not corporations or governments.
Although the two are entangled, politics is a more relevant framing than economics here. If people weren't broadly aligned on basic stuff, then autocrats, theocrats, kleptocrats and so on would simply not be interested in dismantling democracies. They make that effort because they must.
Historically, we did essentially the opposite. We figured out many aspects of human ethics and psychology first, and deduced from them how and why markets work as they do.
> ... If people weren't broadly aligned on basic stuff, then autocrats, theocrats, kleptocrats and so on would simply not be interested in dismantling democracies. They make that effort because they must.
This implies that people are only weakly aligned in the first place, otherwise no such attempt at dismantling could ever succeed. That's not a very interesting claim; it does not refute the usefulness of some external mechanism to more directly foster aligned action. Markets do this with a maximum of decentralized power and a minimum of institutional mechanism.
This is not the history, it is a mythology in opposition to the empirical evidence.
Which is why you should read Graeber.
Anyhow, replying is clearly past the point of utility here.
"Understandable in market terms" doesn't mean the thing is actually understood, and in fact may be dangerously misunderstood.
Not OP, but for me, kind family and friends, and various feel-good pieces of fiction and other writing, at least let me envision the possibility of a perfectly kind/dedicated/innocent/naive individual who is truly on my side 100%. But even that is mostly imagination and fiction... although convincing others of that isn't necessarily an argument worth making.
Commercial entities have a fundamental purpose of profit. While profit doesn't have to be a zero-sum game - ideally, everyone benefits in a somewhat balanced way - there's some fundamental tension, in that each party's profit is necessarily limited by the other party's.
Government entities have a fundamental purpose of executing the will of the state, which is rather explicitly not the same thing as the will of you as an individual.
Both commercial and government entities also tend to involve multiple people, which gets statistics working against you - did you really gather that many people who would put your needs above their own, with exactly zero "impostors"? In this context an impostor just means someone with a bit of rational self-interest.
> I guess I'm trying to wonder why this line of thinking (in theory) doesn't turn to paranoia about everybody. I don't know much ethics or political theory or anything.
Just because you're paranoid, doesn't mean they aren't out to get you. Trust, but verify.
You might not be able to put absolute blind trust in anybody. I certainly can't. However, one can hedge one's bets, and diversify trust. Build social circles of people with good character, good judgement, and calm temperaments - and statistics will start working for you. It's unlikely they'll all conspire to betray you simultaneously, especially if you've ensured betrayal costs much and gains little. While petty and jealous people can indeed be irrational enough to betray under such circumstances, it'll be harder for them to create the kind of conspiracy necessary for mass betrayal that might cause significant enough damage to warrant proper paranoia. You might still have to watch out for gaslighters stealing credit (document your work!) and framing people (document your character!) and other such dishonest and manipulative behavior... but if everyone's looking out for the same thing, well, that's just everyone looking out for everyone else! That's a community looking out for each other, and holding everyone honest and accountable. Most find comfort in that, rather than the stress paranoia implies.
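The statistics here can be made concrete with a back-of-the-envelope sketch. Assuming (purely for illustration - the probabilities and independence are made-up modeling assumptions, not measurements) that each person independently defects with some small probability, the chance of assembling a group with zero self-interest shrinks fast with group size, while the chance of the whole group conspiring at once shrinks far faster:

```python
# Toy model of "diversify trust" statistics. Assumes each person acts
# independently; p and n below are illustrative numbers, not data.

def p_no_defectors(p: float, n: int) -> float:
    """Chance that none of n people ever acts selfishly: (1-p)^n.
    This is why expecting zero 'impostors' in a large group is unrealistic."""
    return (1 - p) ** n

def p_mass_betrayal(p: float, n: int) -> float:
    """Chance that all n people join the same conspiracy at once: p^n.
    This is why simultaneous mass betrayal is astronomically unlikely."""
    return p ** n

if __name__ == "__main__":
    p, n = 0.05, 10  # assumed 5% per-person defection chance, circle of 10
    print(f"zero impostors among {n}: {p_no_defectors(p, n):.3f}")
    print(f"all {n} conspire at once: {p_mass_betrayal(p, n):.2e}")
```

With these numbers, "everyone is perfectly loyal" happens only about 60% of the time, but "everyone betrays you together" is on the order of 10^-13 - which is the whole argument for spreading trust across many people rather than betting everything on one.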
Put yourself in a room full of manipulators and schemers, on the other hand, and "paranoia about everyone" might be the only reasonable or rational response!
There was a Japanese visual novel in the 2000s about a girl who was your personal maid, and was so devoted she would always take your side in any conflict, accept and support you just the way you are, even if you were a horrid person to your friends. It turns out she was a ghost, or a kind of yokai, or something. Anyhoo, back on 2ch she attracted a fandom, and there was a second group of people on 2ch who labelled her a "useless person manufacturer", because if you actually had a person who always accepted you just the way you are and never pushed back, that can actually be a trap that prevents you from developing.
It's a theme that's relevant today when people have AI servitors that always glaze them. It puts even certain utopian AI fiction, like Richard Stallman's story "Made for You", into a whole new light.
One of the ideas I've toyed with, even before all the AI hype, is a dumb, semi-adversarial servitor. Something to nag or taunt me about chores not done, to interrupt me when I'm doomscrolling, to use as a vessel for precommitment, to challenge me in various ways. I've been too lazy to build it thus far. Many existing tools already overlap the problem space, so I don't really have that excuse - perhaps I should give StayFocusd another shot.
Conflict and other stressors - in moderation, within the limits of one's ability to handle - are important for growth and health. A tree shielded from wind is weakened as it fails to develop stress wood and structural strength. A good debate can sharpen my thoughts and mind, walking to lunch keeps my cardiovascular system healthy, rising to life's various challenges gives me the security of knowing I can rise to the occasion and gives me more skills.
Profit is obtained by maximizing traded benefits and minimizing costs. None of this requires taking anything away from any other party.
Gain is obtained by the easiest means available. Your narrow definition of profit is seldom the easiest path; cheating is far "superior", especially when it's legal for some.
> None of this requires taking anything away from any other party.
"required" and "preferred" (e.g. because it's far easier) are as different as night and day.