Apparently I’ve fallen slightly behind on my weekly update schedule. I planned to update yesterday, at the last minute (of course), and fell prey to the planning fallacy.

Eliezer showed up at the Tortuga Less Wrong meetup yesterday, and it was a blast.

Tortuga is a communal living group in Mountain View. The Boot Camp is in Berkeley, which isn’t exactly close to Mountain View; getting there by public transport is extremely inconvenient and expensive. Fortunately, some people had cars we could pile into.

There were probably 30 or 40 people there. We did an exercise called “The Strangest Thing an AI Could Tell You”, where we tried to come up with the strangest thing an AI could tell you that you would actually believe, rather than concluding that the AI was broken or that you were going crazy.

One of the interesting questions we came up with was: why does the AI output this particular statement? Did you ask it “do all humans have a tail that we just can’t see?” and it said yes? Or did you ask it to “tell me something surprising” and it produced this gem? The exercise doesn’t specify, and the answers seem quite different depending on the process by which the AI generated the statement.

We spent a while talking about this, then moved on to random other topics, and the whole thing was a great time. More than half the attendees stuck around for at least four hours. Eventually some people went to hang out naked in the hot tub in the back. I went, of course, and so did Eliezer, and now I have the privilege and bragging rights of having played Truth or Dare naked in a hot tub with Eliezer Yudkowsky.