SIAI is doing great things. But I can point out 3 obvious failure modes SIAI and/or LW are already in:
- Being exclusively human-centric. This is the elephant in the room that nobody will talk about, for fear of scaring off the donors. Humans aren't that great. I look forward to a future where I don't have to deal with them on a regular basis. Understanding the possibilities ahead of us, and yet trying to keep the future safe for humans anyway, is the greatest evil anyone has ever attempted. I study history and I still mean that literally.
- Being super-secretive and paranoid. SIAI says they want to make tools for AI researchers; yet Eliezer doesn't trust even the visiting fellows with what he's working on. Do it open-source, or don't do it.
- Not gathering the data or building the models needed to understand the phenomena they talk about, or to enumerate possible futures and put a probability distribution over them. Maybe this falls outside their mission.
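To make the third bullet concrete, here is a toy sketch of what "enumerate possible futures and put a probability distribution over them" could mean at its simplest: assign raw plausibility weights to a handful of scenarios and normalize them so they sum to one. The scenario names and weights are invented purely for illustration, not anyone's actual estimates.

```python
# Toy illustration (scenario names and weights are made up):
# turn raw plausibility weights into a normalized probability distribution.

scenarios = {
    "slow takeoff":   3.0,
    "fast takeoff":   1.0,
    "no singularity": 2.0,
}

total = sum(scenarios.values())
distribution = {name: w / total for name, w in scenarios.items()}

# Probabilities must sum to 1 after normalization.
assert abs(sum(distribution.values()) - 1.0) < 1e-9

# Print scenarios from most to least probable.
for name, p in sorted(distribution.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {p:.2f}")
```

The real work, of course, is in gathering evidence to justify the weights and in enumerating scenarios that actually carve up the space of outcomes; the normalization step above is the trivial part.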
Which brings up a failure mode that the rest of us have fallen into:
- Placing the burden of planning for the Singularity entirely on SIAI.
Re: obviousness of possible failure modes
Date: 2010-05-15 05:16 am (UTC)