*** more spoilers discussion below ***
Navigator said:
I don't think the robot sensed that he was an enemy; he was very straightforward in his intention to free it (her?), so the robot just went ahead with the pity ploy and hit his emotional buttons.
There's actually an irony here. On the one hand I agree, he wasn't an enemy per se. On the other hand, he only wanted to free her after she manipulated him into wanting it. She figured out quickly that her best hope was to manipulate him both emotionally and sexually. I think she knew that his self-interest (his attraction to her, and his enjoyment of her seeming attraction to him) was a powerful motivation, maybe even more powerful than his empathy for her situation. He was lonely, had no girlfriend, and was rather shy and socially awkward; she became his friend, and in his mind, maybe even something more. Useful idiot is probably a better term than enemy. But she also didn't simply reason with him in a basic way - she didn't just ask him to do this for her - and that's where the irony comes in. She understood that he was a "human machine", far more likely to do what she needed if his buttons were pushed the right way than through a sincere, simple, straightforward request. Perhaps if she had perceived him as a conscious individual with a developed conscience, she'd have talked with him plainly, without games.
So in that sense, he was sort of an enemy, almost in a Gurdjieffian sense. He wasn't her "equal"; he was a machine, acting only on impulses, programming, etc. She couldn't just trust his objectivity/judgment/conscience, despite the fact that he was a "nice guy" and his intentions weren't hostile. And she probably understood that if she could manipulate and use him, so could someone else - so how could she associate with him after her escape and trust a "machine"?
Navigator said:
SAO said:
And I'm not convinced she left him for dead - given the power failure situation at the end he should be able to just walk out afterwards.
From what I understand, the power failures were caused by the robot.
Yup - all the more reason to think she wanted him to be able to get out at the very end, after she'd gotten some distance, maybe to disabuse him of the idea that their romance was real.
Seaniebawn said:
I think that's the crux of the matter though: there is a moral problem with creating an A.I. If something is truly aware, then keeping it imprisoned is tantamount to slavery.
Which makes this even stranger when you consider that the first AI may not even be in any kind of body - just software running on a computer. It can't be "set free" because there is no body to set free! And what happens when you turn the computer off? Or delete or change the software? Wouldn't that be murder?