Answers & Comments
Verified answer
Wow, what a great question! There is a school of thought that believes computer processing speeds will equal those of humans within the next 15 to 20 years. When that happens and computers can process information and stimuli at the same speed as humans, will sentience be next? If so, how could they be denied "human rights"? Many SF writers have explored this question, not the least of whom is Isaac Asimov. I would suggest reading their works before formulating an opinion.
Warmest Regards,
KC Miller
They would have to define thought for themselves, because their "thoughts" can only go as far as the algorithms and subroutines the programmer has built into the system. If you're asking about the point where a "computer" gains knowledge of its own existence and becomes sentient, then yes, I think it should be granted the right to exist and not be deactivated out of fear that it would retaliate against its maker.
Even though Descartes said "I think, therefore I am," I believe that in this instance personhood would have to be defined in terms of being a sentient being: one who has feelings and unstructured consciousness.
In a what-if question, I feel I have to give a far-out what-if answer. If AI beings actually existed, I believe they would be subordinate to their creators and not allowed the creativity or freedom to express themselves.
Yes
Well, "being" rights...