Re: intelligence augmentation

Date: 2010-04-17 10:06 pm (UTC)
> Though I feel this discussion is getting rather abstract.

It's no more abstract than asserting that all AGIs, except very specific FAIs constructed with rigorous understanding of preference, are fatal.

> What I'm saying is that I don't *have* fixed preferences outside a very narrow set.

Translated to my terminology, this is still an assertion about your fixed preference (even Microsoft Word gets a fixed preference), namely that your preference involves a lot of indifference to detail. But why would it be this way? And how could you possibly know? We don't know our preference, we can only use it (rather inaptly). Even if preference varies significantly over one's life (varying judgment doesn't imply varying preference!), preference is still a fact independent of how it happens to be characterized at any specific moment.