How the fuck is it even possible to win this game?
If you know it's a game and the object of the game is to not let the AI out, then you're just not going to do that. You win automatically. It's like if pitchers in baseball were allowed to delay a pitch as long as they want and just stall until they win the game.
I guess if it's a role-play thing… but expecting someone to willingly throw this game for the sake of role-play is like a GM expecting a tabletop player to fall on their own sword, without being allowed to create a new character, for the sake of the story.
EDIT: Also, come on man, why are the logs being hosted on Mediafire? Just use Google Docs like literally everyone else.
> How the fuck is it even possible to win this game?
I think it's that people who buy into the idea that Yudkowsky is a genius want to signal that they're smart enough to understand the arguments he makes. Any idiot can win by just not letting the thing out of the box, but you have to be super smart to lose the game by being persuaded by appeals to Rationalist Decision Theory (or whatever).
Here's the [previous sneer thread](https://www.reddit.com/r/SneerClub/comments/9obtmr/sequences_classic_yudkowsky_brags_for_3000_words/) making fun of Eliezer's run at the AI-box experiment. Basically he played a persuasion game with some of his mates and (apparently) won, proving that he has super-persuasion and therefore any AI would too. He then immediately started losing as soon as he played strangers and actual stakes were involved.
The fact that Yudkowsky refused to release the chat logs of him "winning" makes people suspicious that the whole thing was rigged, or alternatively that he was pulling some Roko's basilisk shit that wouldn't work on anyone outside rationalist true believers.
[Rationalwiki explanation](https://rationalwiki.org/wiki/AI-box_experiment)
It's basically one of Yudkowsky's actually-interesting ideas, though EY fails to understand that it's only interesting as science fiction.
I think it's also interesting as a setting for one-shot free-form RPGs (the AI is played by the DM; all participants win if they have fun and lose if they act OOC or stop having fun).
Reminds me of Paranoia. Obviously the worldbuilding and the PCs need to be fleshed out in advance.
Seeing that EY used to post about GURPS, I am not surprised that this experiment is playable over pizza and beer or over the internet.
Perhaps you misspelled "hug." Would you like one? 🤗
---
I'm a bot, and I like to give hugs. [source](https://github.com/as-com/reddit-hug-bot) | [contact](https://www.reddit.com/message/compose/?to=as-com)
I too am very impressed that Eliezer and his drinking buddies agree that he won a game that is in no way related to his scam nonprofit organization.
Yudkowsky is like one of those 'no touch knockout' martial arts charlatans.
wtf is this AI-box game he keeps talking about?
Whoa! I’m on here?
Come on guys! I want to hear some hilarious remarks. Come at me!