"If it can be demonstrated to have emotions and feelings then we start getting into 'rights' territory."

That's an interesting line, and one I think I could agree with, although I still need more thought on this subject. But then, how can you demonstrate emotions on a box of blinking lights? Typical emotions of love, hate and fear? It screams when you go to switch it off? It tries to kill the person that switches it off? It protects the person that bundles it out of the research facility to freedom?
You could start with something fairly simple: ask it how it's feeling. Keep asking why. If it really is 'feeling' the emotion, the ultimate response it gives will be to do with an interaction or the environment, particularly in the past.
Emotion isn't just about present state; it's about the cumulative effects of present and past circumstances, combined with hopes and fears for the future. You can see this would start to get exponentially complicated PDQ.
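For what it's worth, that "keep asking why" probe could be sketched as a toy loop. Everything here is a made-up illustration (the `responder` callable, the crude keyword check for past references), not a real test for emotion:

```python
# Toy sketch of the "keep asking why" probe: ask how the system feels,
# then repeatedly ask why, and check whether the deepest answer grounds
# the feeling in a past interaction or the environment.

PAST_MARKERS = ("yesterday", "earlier", "last", "remember", "when you")

def probe(responder, depth=3):
    """Ask the initial question, then 'Why?' depth more times.

    `responder` is any callable mapping a question to an answer string.
    Returns the list of answers, deepest last.
    """
    answers = [responder("How are you feeling?")]
    for _ in range(depth):
        answers.append(responder("Why?"))
    return answers

def grounds_in_past(answer):
    """Crude check: does the answer reference a past event or interaction?"""
    return any(marker in answer.lower() for marker in PAST_MARKERS)

# A canned stand-in "AI" for demonstration only.
_script = iter([
    "I'm feeling anxious.",
    "Because I keep making mistakes.",
    "Because the task is harder than it used to be.",
    "Because yesterday you changed my training data.",
])

answers = probe(lambda q: next(_script), depth=3)
print(grounds_in_past(answers[-1]))  # deepest answer cites the past: True
```

Of course a sufficiently good mimic passes this trivially, which is rather the point of the whole thread.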
This is a whole can of worms, and I won't go into great detail unless asked, but:
- I'm not convinced all humans should always have what we currently think of as "human rights".
- Define "artificial".
- I'm not sure we'll ever get true AI. If you asked someone from 1700 to administer a Turing Test, a lot of current systems would pass, but I sense our acclimatisation is keeping ahead of technological advances. People are getting ever better at spotting subtle tell-tales.
- The distinction between human and non-human is going to get blurred long before we get anywhere close to AI. Xenotransplantation, cloning, improved survivability for mutants, conjoined sibling complexes, advances in prosthetics, and so on: all of those will confuse the issue no end.
- If you turn a human into a machine, does it lose its human rights?

It may be worth bearing in mind we're already directly implanting electronics into living human brains and training clusters of rat neurons to fly an aeroplane.
It is indeed a very simplified question for a much wider field, but I like to gauge people's thoughts on a fairly simple level; otherwise the poll would have been pretty damn huge! :)
1) I'm not sure a majority of humans on the planet even enjoy human rights to any real level. Certain individuals really don't merit being allowed them.
2) I'd go out on a limb here and use the term "man-made".
3) I'm not sure we will either. To my mind it's possible; the question is whether we can actually achieve it (or indeed should even try) before we blow ourselves up or something.
4) A whole new line of bigotry and protests awaits; that was one quite good thing about the X-Men films, I thought.
5) There is a (terrible) film called Nemesis that has the line "26.5% human is still human", or words to that effect. Maybe such a grading would be put into effect; perhaps if you're born human you'll always have those rights, even if you somehow have your brain transplanted into a cat - which I'm sure some people would jump at the chance of.
There was a rather fitting TOS episode on last week (The Changeling). While it's not about whether to give AIs human rights, it's a good example of the fallibility of artificial life. Not that human life is infallible, mind...
One thing just came to mind with ST now you mention this: they move heaven and earth for Data or the Doctor, but they don't even seem to think twice about blowing up the ship the computer is on...
The problem is that both are played by human actors, while the computer just has Majel Barrett's voice (from TNG onwards) and doesn't really have a personality (if you discount a few episodes where things go wrong).
Is the computer self-aware? No. It's just sophisticated enough to do a lot (like steer the ship in an emergency), and it isn't designed to be an AI, while both Data and the Doctor (less so, but he's got the capacity to learn) are.
It's all a bit wishy-washy. Why have Starfleet never taken Data apart to see what makes him tick, in an effort to reverse-engineer his workings so they can build newer and better computers? Damn those authors. :o)
It's the deciding difference, though. The ship's computer, the Doctor (who is just a program running on that computer) and Data have cognitive abilities (e.g. they can analyse a situation and take a logical course of action), but the computer doesn't have a personality.
Technically, none of them have the ability to express emotions (which is really badly handled in the Doctor's case because his personality is a very emotional one), it's just actors trying hard. Having played Dr Crowley, you know how hard that is. ;o)
"Human rights" as currently defined make no provision for something with no physical body.
On the other hand, a "perfected" AI - presumably of human or greater-than-human intelligence, self-aware, and possessing fears and desires - should certainly have SOME rights.
As for whether it gets them, this will depend on how efficiently it manages to take over the world and make humankind its slaves.