Big companies sometimes beg governments to add new regulations that will cost them millions of dollars to comply with. They’ll send expensive lobbyists to Washington and basically say, “Please give us more paperwork! Please make us hire more compliance officers!”
Wtf? Why??
If a sector is full of rules and regulations that cost millions of dollars and lots of legal headaches to comply with, all the scrappy little bootstrapped companies will go out of business, no one new will be able to start up, and the big companies won’t have to compete with anyone. So even though the regulations are costly, they still ultimately get the big companies what they want: fewer competitors.
I think a lot of social groups work the same way, including EA. People with a lot of existing social status in EA are like the big companies. Their status could be threatened by all the new people getting involved in EA. So one natural response is to make more rules. Things like:
“You’re not really an EA unless you live frugally so you can donate a lot”
“I don’t think we should weigh people’s opinions too heavily unless they actually understand Bayesian reasoning”
“Real EAs are vegan”
A lot of the time, when people suggest these new rules, they and their friends wouldn’t have to change anything about their lifestyles, but a whole group of new people would no longer really count as EAs. But sometimes they’ll actually suggest something that would require them to change - something like “I’m not a vegetarian, but I think EAs should be vegetarian.” My theory is that, just as the big companies would rather pay large compliance costs than deal with so many competitors, maybe there’s a part of us that would too.
I don’t think this explains everything that’s going on when people make these kinds of suggestions - for alternative hypotheses, search “costly signal” on the EA Forum - but it’s a theory that makes just enough sense that whenever I suggest a new rule for EAs, I try to pause and ask myself why I’m really suggesting it.
I think another alternative explanation, which I find plausible, is simply that being an EA means being different, and that those differences are important. Another way of putting that is the concern about “watering down”, or “when everything is X, nothing is”; it’s about preserving what’s special and important (not *just* different). I think the status and costly-signal explanations make sense as well.
This is a really interesting framing of barriers to entry in EA!! It reminded me a bit of this piece: https://forum.effectivealtruism.org/posts/AJtbfPQL7gLqaNAcC/my-bargain-with-the-ea-machine