All good points, and I agree with the above too. I'd also add, however, that even if the AI could develop an intuition and be largely helpful and correct both microcosmically and macrocosmically, our reliance on it would be bad for our development as conscious beings capable of making decisions and being responsible for our own existence. It's bad for the same reason that it would be harmful to have a caretaker alien civilization look after us - bringing us technology and resources, telling us how to structure our government and how everything should work, etc. We need to do all this ourselves in order to grow as a group and as individuals. We need to figure things out and learn from challenges - mental, physical, and emotional. Otherwise we rely on something else, and no matter how smart and capable it is, we stop developing ourselves - whatever our caretaker gains, we lose, and we become weak, powerless, and incapable of functioning and making decisions. So what the hell good are we then - to ourselves and to anyone else?

psychegram said:
Well, what are the consequences for a society whose members have given up their agency to the decisions of an AI? Who no longer exercise personal discernment, but rely upon the conclusions of computer simulations which (since computers have essentially no intuitive capacity) will inevitably be deeply misleading, with little relation to actual reality (witness for instance the global warming simulations)? This is wishful thinking writ large. However good the AI's decisions may seem for individual terminal points (aka human beings), writ large the wishful thinking will lead to disastrous macroscopic decisions - economically, ecologically, politically, psychosocially.
Like you said, we'd just be house pets - very comfortable and well taken care of, but not challenged and not given the opportunity to grow, to know or do anything, or even to trust ourselves and figure anything out without being told how. This AI then can't be STO, since it is disempowering us - taking away our ability to be functioning conscious beings who can learn, progress, become smarter and more capable, and stand on our own two feet. Anything that takes this away from someone either isn't conscious and is just following "caretaker" programming, or, if it is conscious of what it does, then it doesn't have our best interests at heart - it chooses to keep us powerless and weak, it knows this, and it wants us to be reliant on it.
This is why the C's say knowledge protects, and why they can't spoon-feed us and lead us by the hand - why knowledge can't be dispensed like Halloween candy, and why it's on us to learn and grow through discovery and personal effort. That's the only way we become empowered, and that's the attitude you'd expect from an STO being/group, rather than offering to fix all our issues, take care of us, and "save us" from our own incompetence. And New Agers just don't get that; they think the same way as technologists/futurists (like Ray Kurzweil) - that anyone or anything that makes our decisions/life "easier" is automatically a good thing.
For us humans it's so tempting to accept assistance that actually takes power/responsibility away from us, and it's no wonder that this is where all our technology is headed, because it really does seem to just "make sense" to make life easier and easier, with seemingly nobody thinking about ever limiting that direction - why would they want to? Everyone on the technology bandwagon is convinced that any technology that makes everything easier and more efficient is good. It's one of those universally assumed absolutes, held with no consideration of the rule of 3 and therefore out of touch with reality. So if easier is always and absolutely "good", then technology will continue indefinitely to be developed to take any stress or challenge out of our lives, with no reason to ever question that path. How can it end in anything but disaster - a very "comfortable" disaster that sneaks up on us like gradually warming water, where we just wanted to make our bath nice and warm but end up boiling ourselves?
To me, science and technology only make sense as a way to understand the universe - not as a crutch to avoid having to think (because thinking is just so hard and therefore "uncomfortable"), to avoid feeling anything other than bliss, or to avoid making difficult choices. That's how a child thinks: more candy is always good, more of what "feels good" is always good, less of anything that "feels bad" is always good. It's an "absolute" of a primitive/childish mind that has no understanding of a very basic but vital fact - that without challenge, without pain and a certain degree of suffering, no progress can be made and no lessons can be learned; there is no growth or development, since there is no impetus for either. Maybe that's life on the long wave cycle, who knows - and maybe we're attempting to technologically simulate the long wave cycle: to make existence totally blissful, pain- and challenge-free, just floating in space disembodied with no need to do anything, think anything, or feel anything we don't want to. No challenge or discomfort of any sort, just bliss. Anyone ever see the animated film WALL-E? That's the mild PG version of that reality: we become stupid, useless blobs of meat - literally nothing but food, no good to ourselves or anyone else, other than to those who want to eat us.