@dookie Said
Still floundering a bit. I am thinking more of "value" as the ground of being, not as a contest between different beings.
Not sure what you mean here. I see value as something a person provides that makes them valuable to another human being. It can be anything, maybe just a sense of humour.
Quote:
(And an AI not being concerned with certain ethical questions - as you suggest above - opens yet another can of worms!)
Well, it's a question of how well we impose limits and barriers on the AI.
However, these can only be imposed at an organisational level. For example, Boston Dynamics may put in something resembling the fictional "Laws of Robotics", but the Russian team building FEDOR may not. Also, defence seems to be one of the biggest investors in AI, and I doubt they are as bothered about ethics as, say, people trying to develop search and rescue robots.
Quote:
But I'll apply my get out clause! Like a previous poster, I do really doubt if emotions as we know them in the full existential sense could ever be transposed into an artificial machine.
Unless this happens, an AI will never feel any sort of empathy, so it needs rules that could only exist if all ethical dilemmas were solved. So we are just left with sociopathic machines that do their creators' bidding.
Random question: if we do create self-aware AI, or if it is created by other AI, do you believe it should be subservient to human beings? And if so, which human beings?