r/SneerClub archives

I don’t see (or do) much sneering at the Rationalism-adjacent Effective Altruism movement, because they at least seem to be doing something worthwhile. Apparently they are plenty sneerworthy though.

https://twitter.com/NathanpmYoung/status/1464688390620725259

https://preview.redd.it/8df54pm00d281.png?width=1482&format=png&auto=webp&v=enabled&s=b1a3b9cafd8d276ef46223301c0ba21cd02323c0

Personal gripe: even their flipping name is condescending, as if everyone who isn’t in alignment with their program is implicitly saying, “No, actually, I prefer ineffective altruism. Excuse me while I light my money on fire like an idiot.”

That's a little harsh, I think. It's just a marketing slogan trying to convey their distinguishing principle in a couple of words. There are certainly charities with a different focus than "effectiveness", like say the Make-a-Wish Foundation, which has a more explicitly sentimental raison d'être.
Exclusion from the group they created does not imply exclusion from the concepts they value. Your disgusting logic implies that everyone who isn't part of an organization implicitly claims the opposite stance of the organization. If you can't see how that's dumb then you need some education. You and everyone who agrees with you is an idiot for thinking this. If I am wrong about my above point then prove it.
DISCUSTAN lol
Discusting! You'll never address my point, but that is what I expected of you.
What point? Also, we don't use the r-word around here, buddy.
Ah, thanks. I've edited my post so as not to offend. My point is that the organization's name does not imply that those who are not aligned with the organization prefer ineffective altruism. As an example: If you are not a part of "The Israeli Basket Weaving Advocate Group" does that mean that you do not advocate for Israeli Basket Weaving? You can still advocate for something without being aligned with a specific group. EDIT: Join TIBWAG
dang dude sorry I should join Effective Arguing, because I am obviously ineffective at it.
You aren't aligned with them so it must mean you can't be effective at arguing! Cheers for being a good sport.
You seem really angry. Show us where the Rationalists touched you.
I get angry at dummies who think they're smart. At least I know I'm a dummy. Can you defend /u/o7___o7's logic posted above my comment? Am I missing something here? How does not being aligned with a program automatically imply that you are against it? That is not the implication of the name at all and it's the poster's fault for thinking that. If I am wrong please let me know, I'm here to learn.
Like always, context and history is important. It’s not just their name that comes off as elitist; it’s their entire organization and the individuals who run and support it.
> even their flipping name is condescending,

If you look at the claim made, there is no context provided, and none needed for the conclusion I made: The claim is that the name implies that anyone not aligned with their program is implicitly saying "No, actually, I prefer ineffective altruism. Excuse me while I light my money on fire like an idiot." The claim of implication is solely based on the name, and you'll see how flawed this logic is if you try to apply the same rules to any other organization's name. Context doesn't matter because the claim is not about the organization or it's actions, the claim is about the name and what it implies. This is basic logic. I don't mean to be personal, but for people who type real smart, some of you don't seem to think much.

EDIT: If the poster intended something different, as if to say something about context or the organization/individuals, then they should have used the words that mean that. You can't argue for the use of inaccurate language, "oh this is what they must have meant". No I don't care what they must have meant, that is their fault for not using correct English and I will not be held responsible for that. What they SAID is based on flawed logic and should be corrected. Normally I can give the benefit of the doubt to people in communications, but I have made the personal decision that a community that prides itself on picking people apart (not that there's anything wrong with that) doesn't deserve the same leeway. Live by the sword, die by the sword. Best make sure your ducks (or words) are in a row before commenting.
"It's actions". Checkmate.
> It's

I admit my mistake, it does not diminish my point. I am not immune from criticism and I make mistakes too. Can anyone else do the same and admit their mistakes, or is the ego too big? A proud bunch on here, I love knocking the proud down a peg, I'll be hanging around if they let me. My points have still not been addressed.
Ok, you are right and we were wrong, Mr (hope that is correct) Moon Doge. I'm happy, finally I no longer have the dumbest username.
I'll take that title, thank you very much... At the risk of getting off topic, is there a story behind Soyweiser?
I randomly picked the word from [Shadowrun](https://en.wikipedia.org/wiki/Shadowrun) because I thought it sounded funny, before I knew about [soylent green](https://en.wikipedia.org/wiki/Soylent_Green) and budweiser (something which I don't associate with the nickname at all). So in a way it is a testament to my own ignorance.

[deleted]

For those who hadn't heard: [link](https://www.reddit.com/r/SneerClub/comments/bat2sl/lets_use_30000_dollars_to_give_away_already/).
Ehhh idk, I’ve skirted the EA community for a good while (basically since its coalescence a decade and some ago, and usually in a detached & pseudonymous capacity, but occasionally also serving as a volunteer, advisor, director, etc. to various orgs) and the MIRI / philosophical AI stuff has always seemed a bit more tangential to me, if not sometimes embarrassing for its inclusion. It’s a major horn of the broader movement, sure, but in eg the more short term animal welfare / rights or developing world health cause areas they’re looked at more as strange and begrudging bedfellows than as foremost thought leaders or anything ¯\\\_(ツ)_/¯ lol
Can you tell us how central fighting global warming is in EA? It’s pretty clearly the largest future negative. Is it something they understand to be important?
EA organisations in general take global warming/climate change - whatever you want to call it - very seriously, and act to that effect. The issue is that the most exciting EA stuff these days for some people is the Bostrom-Yudkowsky stuff about the future of AI and shit like that. I know of a bunch of people who turned away from the general movement and said “fuck this theoretical bullshit, I’m gonna do a direct action”.
Industrial Society and the Methods of Rationality.
> EA organisations in general take global warming/climate change - whatever you want to call it - very seriously, and act to that effect.

Is their administration still mostly independent of the wingnut faction? The impression I'd gotten is that EA advertises as one thing, but the people involved in organizing and managing are increasingly drawn from the wingnuts, and this ends up determining organizational priorities even in contradiction of the original and stated principles.
Yes, in my read at least this is the accurate picture. The most exciting wingnut sci-fi stuff eats up the time, money, organisation etc. Animal liberation and climate change get a look in here and there but somebody motivated to get involved tends to be motivated less by the basics and more by the flashy stuff.
I think it's well recognized but not central -- e.g. it's probably the single most talked-about subject in the [effective environmentalism](https://www.facebook.com/groups/1509936222639432) FB group, which has ~2k members; to give context, [effective animal advocacy](https://www.facebook.com/groups/EffectiveAnimalAdvocacy/) has ~6k. There are articles in the [EA sub-magazine](https://www.vox.com/future-perfect/2019/12/2/20976180/climate-change-best-charities-effective-philanthropy), [talks](https://www.eaglobal.org/talks/#) at EA Global, discussions on the [main forum](https://forum.effectivealtruism.org/posts/pcDAvaXBxTjRYMdEo/climate-change-is-neglected-by-ea), etc.

A lot of the discussion afaict centers on, as you might expect, more 'quantifiable' searches for the biggest bang-for-your-buck carbon offset, and less mainstream (sulfide)-pie-in-the-(stratospheric)-sky geoengineering / carbon-sequestering solutions, rather than e.g. political lobbying / structural change.

My interpretation for the lack of stronger emphasis is the focus on the importance-neglectedness-tractability framework. The bad effects of climate change are important to mitigate, but where humans are concerned a lot of them come from a lack of mobility, food insecurity, exposure to novel pathogens or increased pathogen densities, etc., and are thus plausibly downstream from global poverty & developing world health. There's a lot of mainstream attention paid to climate change already (e.g. [Nature in 2019](https://www.nature.com/articles/d41586-019-02712-3) says "more than half a trillion dollars a year is going into climate-related activities", guided also by, like, actual scientists), so it's arguably not as neglected. And there are many billions of dollars flowing in the exact opposite direction to protect vested interests, not to mention more that might arise to meet farther-reaching intervention, so it's maybe not as tractable? Probably also some founder effects in there, too.
> if you want to save the highest number of conscious beings, the most effective thing you can do with your money is to donate it to MIRI

The real bodhisattvas were the machine gods we made along the way.

you guys seen this one?

Apparently the “long term future fund”, for cutting edge research and such, mainly spends its money on funding EA members’ graduate degrees. Pretty good deal if you’re still in undergrad tbh. Get in there!

Edit: actually saying “graduate degrees” was too generous, many of these people were conducting “independent research”, “upskilling”, “leveling up”, “facilitating conversations” and various other totally real activities. Aaaand a bunch of the money ends up at MIRI as well.

> The Fund has supported events such as the Catalyst Biosummit, which brings together synthetic biologists, policymakers, academics, and **biohackers** to collaborate on mitigating biorisks

I can't imagine anyone I'd trust less on 'mitigating biorisks' than people who inject themselves with homemade herpes drugs.
Hey, also implanting magnets. (E: Don't get me wrong, I think that is pretty interesting, not something for EA however). Fun to see how science-fiction driven this all is.
> Fun to see how science-fiction driven this all is.

A continuing theme I see in everything rationalist-adjacent is how vulnerable it is to various flavors of Pascal's Mugging. Science fiction is frequently driven by the idea of very large payoffs (negative or positive), which I think explains your observation. Whether their love for Bayesian statistics is driven by this, or vice versa, seems much like the snake swallowing its own tail. Each justifies the other in turn. The rather bizarre thing is that generally speaking they are aware of this problem of the hope of large payoffs … and then proceed on as if it’s not a problem that needs to be solved and continue to be obsessed by them.

SA's recent snippet about Newcomb's “Paradox” brings this to mind. Make the payoffs $100 and $101 and all of a sudden the “argument” for one-boxing disappears, but you rarely see this discussed. You need an excessively large difference in payouts to make the “paradox” appear.
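To make that payoff point concrete, here is a minimal sketch of the expected-value arithmetic behind the evidential case for one-boxing. The 90%-accurate predictor and the `expected_values` helper are my own illustrative assumptions, not anything from the comment: one-boxing pays roughly `p * big` while two-boxing pays `small + (1 - p) * big`.

```python
# Minimal sketch of the expected-value arithmetic behind Newcomb's problem.
# Assumes a predictor with accuracy p; all payoff numbers are illustrative only.

def expected_values(small, big, p):
    """Return (one_box_ev, two_box_ev) for a predictor of accuracy p.

    `small` is the amount always in the transparent box; `big` is the amount
    placed in the opaque box only when one-boxing was predicted.
    """
    one_box = p * big                # predictor usually saw the one-boxing coming
    two_box = small + (1 - p) * big  # the big box is usually empty for two-boxers
    return one_box, two_box

if __name__ == "__main__":
    p = 0.9  # hypothetical 90%-accurate predictor
    print(expected_values(1_000, 1_000_000, p))  # classic payoffs: one-boxing dominates
    print(expected_values(100, 101, p))          # $100 vs $101: one-boxing now loses
```

With the classic $1,000 / $1,000,000 payoffs this prints roughly 900,000 vs 101,000, so one-boxing wins; with $100 / $101 it prints about 90.9 vs 110.1, and one-boxing only beats two-boxing once the predictor is accurate beyond roughly 99.5%, which is the commenter's point that the "paradox" depends on an enormous payoff gap.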
Oh my gosh, the "Why you might choose not to donate to this Fund" section of that is hilarious. In particular because it ignores the very big point of "you don't think that this fund is actually spending its money wisely to accomplish the goals it claims to be trying to accomplish".
Feels a lot like strategies brought into the mainstream by the American conservative-intellectual-industrial apparatus, where a relatively small amount of money and networking for ambitious, resource-strapped college students ensures lifelong friendliness, if not outright loyalty, to the movement.
I didn't realize how incestuous it is--several of the grant recipients are described as friends or roommates or...whatever this is:

> at the time of making the grant, Stag was living with me and helping me with various odd jobs, as part of his plan to meet people in the EA community and help out where he could

(That guy got $23,000 to do...whatever. No commitment to any specific project.)

Maybe the worst is $30,000 (2 separate grants) to a company developing a paid *note-taking app*--founded by the fund manager's friend/roommate. Justified because it'll help the EAs think good, which is good for the future of humanity. Really? How does this stuff get approved without opposition? Are their millionaire donors OK with this?

Because it's bad in a boring way. Much worth sneering at, too little that is fun to sneer at.

“Effective Altruism” is a dodge by the ultra-wealthy to avoid paying higher taxes. When the ultra-wealthy are pressured to pay higher taxes, they counter with the idea that they know how best to spend their funds, and why are you criticizing their charity? I guess your orphans and widows don’t need my cash? Ignoring the fact that if they were paying tax and not paying lobbyists to ensure the government(s) can’t tax them and can’t run effectively, said widows and orphans wouldn’t need charity to feed and house them. My two cents.

Is that argument actually wrong? When was the last time the government raised taxes on the wealthy to pay for cost-effective programs to help the poor? Even in the rare case they do social programs are usually bundled with a bunch of other pork barrel spending.
You can't really use the actions of a corrupted system to criticize hypothetical actions of non-corrupt systems. Well, I guess you can, but it wouldn't actually mean anything. Plenty of nations have systems set up to ensure children don't starve to death in the streets.
you really must like the taste of Elon Musk's taint judging by your comment history.
Great argument. Glad I can have such good discussions on this sub
We're glad you appreciate the sneerclub experience. Please leave us a positive Yelp review.
https://youtu.be/A_pIPTih5iM
https://www.youtube.com/watch?v=xpAvcGcEc0k
Waaah...govmint bad, robber barons good. :(((
The 60s

[deleted]

My very vague impression is that EA usually gets sneered at further down in comment threads, so it might be less obvious.
Working as intended.
[deleted]
[deleted]
[deleted]
Was this the woman who was dismissed by SA as just being hyperbolic and complaining about abuse by everyone, or was this somebody else? E: right, read the rest of the twitter thread, Soy. It was. E2: for completeness' sake, she really stressed that her problem was with the sexual abuse and not EA, so please be aware of this when this is mentioned. No matter your opinion of EA, I think we shouldn't use this as a weapon against EA.
[deleted]
Yeah, it always makes me a little bit uncomfortable when it is brought up here (I'm not innocent in that, however), so I think it is important to mention it. It just is sad to me, however, that she wrote basically a book about sexual abuse, including almost a hundred references, and the comments were just people nitpicking. Considering all that, it doesn't make me want to sneer but more despair. But even that is a bit hard to mention, as she probably didn't want us to discuss this at all, considering our reputation in the Rationalist sphere.

On a certain fringe, it tells people the best thing they can do with their lives is to become rich and then give a portion of their lucre to the neediest, “take from the rich and give to the poor” except the rich is yourself. Sounds like one-person redistribution but maybe it’s more like one-person reputation laundering. It allows people in very comfortable and exploitive positions to feel that they’re doing good deeds for the world without examining the broken system that gave them control of such disproportionate resources in the first place. A world in which billionaires donate to end malaria is better than one in which billionaires just hoard money they can’t use, and probably also better than one in which billionaires donate only to opera houses and art museums, but maybe what we should be striving for is a world in which people’s basic survival doesn’t depend on the personal ethical philosophies of billionaires.

Literally The Gospel of Wealth

We target awful EAers all the time; any worthwhile aspects of the movement are being drowned out by its association with rationalism and the associated AI-risk robocult. The global poverty side of it has done some good at least, though: more focus on evidence and against harmful fake charities is a good thing, albeit with the usual caveats that individual donations can’t really solve systemic problems.

I’ll give credit where credit is due: they convinced me to start giving a percentage of my income to charity (it was around 10% when I started, a little higher these days), but I want nothing to do with them otherwise: partially due to their contributions to MIRI’s bullshit, partially due to how unpleasant they seem to be, partially because fuck utilitarianism, partially because we just have different values, partially because, well… I’m honestly not too impressed with their so-often-vaunted ability to choose good charities. (Most of my donations end up going to independent nonprofit journalism, like ProPublica and The Center for Investigative Reporting, to various environmental organizations, or to the HALO Trust, because fuck landmines.)

[deleted]
It's boring, joyless, philosophical pap that so often seems to get people defending horrific supervillain positions, like Peter Singer's disgusting views on disabled children. EA's positions have, well, definitely started warping in those weird directions thanks to longtermism, which is a variant/offshoot of utilitarianism, ugh. But again, utilitarianism is, as I mentioned, just... really boring. Really, really boring. Just by far the least interesting aspect of my critique against EA, and what I'm least interested in discussing.
What are Peter Singer’s views on disabled children?
He wants parents to be able to legally kill their disabled infants in order to preserve their future autonomy. It's super fucked up shit.
You mean after they’re born, or during their time in the womb when they find out their child has a disability?
After they're born. Full on infanticide, not abortion.
Hmm. I’m not entirely opposed, although it depends on their specific ailments. Ideally you would just give your unwanted child to the state.

There’s a decent portion of the movement that is actually pretty wholesome; GiveWell comes to mind, though it does mostly predate EA.

It’s largely just the techbro AI cults and rationalists within that community that suck. I feel like this sub does an appropriate amount of sneering at them.

People’s trust in organizations depends on not thinking about how they’re controlled by people.

And, ironically, trust in people depends on not thinking about how controlled they are by organizations.

Similar with charity donations being seen as more sensible and meaningful than just giving money or food to people on the street, when charities have a long and sordid history of using money in ways that are worse than useless.

There are plenty of sneer worthy EA individuals and aspects of the EA community, but doesn’t the basic premise make sense? Like we should support charities that actually do stuff, instead of charities that exist as vanity projects for the directors/don’t do anything because they’re poorly run. I feel like most people just donate to whichever charity has the cutest puppy or baby on the marketing materials without checking to see if they do a good job?

Plenty of the critique is from people that want to feel good about donating to no kill animal shelters.

Someone get their Gray Mirror today?

God, no.
What's Gray Mirror?
A neoreactionary blog. Basically somebody who read A Confederacy of Dunces and went "this is inspiring, I'm going to create a whole new political movement," and ended up with techmonarchy for racists. I don't see the relevance here, btw.