[Poll #787460]
(deleted comment)

Date: 2006-08-07 02:28 pm (UTC)
From: [identity profile] robinbloke.livejournal.com
I have not heard of that; who is the author?
(deleted comment)

Date: 2006-08-07 02:50 pm (UTC)
From: [identity profile] robinbloke.livejournal.com
Aha, thanks :)

Date: 2006-08-07 02:39 pm (UTC)
From: [identity profile] azekeil.livejournal.com
Error: Need more input. Define 'perfection' in this instance.

Date: 2006-08-07 02:49 pm (UTC)
From: [identity profile] robinbloke.livejournal.com
The creation of a biological/mechanical and/or computer intelligence capable of cognitive processes.

Date: 2006-08-07 02:52 pm (UTC)
From: [identity profile] azekeil.livejournal.com
That doesn't sound like something capable of emotions. If it isn't, then I don't believe it should have any 'rights', any more than any other algorithm.

If it can be demonstrated to have emotions and feelings then we start getting into 'rights' territory.

Date: 2006-08-07 03:01 pm (UTC)
From: [identity profile] robinbloke.livejournal.com
That's an interesting line; and one I think I could agree with - although I still need more thought on this subject.
But then, how can you demonstrate emotions on a box of blinking lights?
Typical emotions of love, hate and fear? It screams when you go to switch it off? It tries to kill the person that switches it off? It protects the person that bundles it out of the research facility to freedom?

Date: 2006-08-07 03:07 pm (UTC)
From: [identity profile] azekeil.livejournal.com
Well now there's a whole new kettle of fish :)

You could start with something fairly simple: ask it how it's feeling. Keep asking why. If it really is 'feeling' the emotion, the ultimate response it gives will be to do with an interaction or the environment, particularly in the past.

Emotion isn't just about present state; it's about the cumulative effect of present and past circumstances, combined with hopes and fears for the future. You can see this would get exponentially complicated PDQ.

Date: 2006-08-07 03:26 pm (UTC)
From: [identity profile] robinbloke.livejournal.com
And this is why psychology is something I need to study :)

Date: 2006-08-07 02:44 pm (UTC)
gerald_duck: (frontal)
From: [personal profile] gerald_duck
This is a whole can of worms, and I won't go into great detail unless asked, but:
  • I'm not convinced all humans should always have what we currently think of as "human rights".
  • Define "artificial".
  • I'm not sure we'll ever get true AI. If you asked someone from 1700 to administer a Turing Test, a lot of current systems would pass, but I sense our acclimatisation is keeping ahead of technological advances. People are getting ever better at spotting subtle tell-tales.
  • The distinction between human and non-human is going to get blurred long before we get anywhere close to AI. Xenotransplantation, cloning, improved survivability for mutants, conjoined sibling complexes, etc., advances in prosthetics — all those will confuse the issue no end.
  • If you turn a human into a machine, does it lose its human rights?
It may be worth bearing in mind we're already directly implanting electronics into living human brains and training clusters of rat neurons to fly an aeroplane.

Date: 2006-08-07 02:58 pm (UTC)
From: [identity profile] robinbloke.livejournal.com
It is indeed a very simplified question for a much wider field; but I like to gauge people's thoughts on a fairly simple level, otherwise the poll would have been pretty damn huge! :)

1) I'm not sure a majority of humans on the planet even enjoy human rights to any real degree sometimes, and certain individuals really don't merit being allowed them.
2) I'd go out on a limb here and use the term "Man made".
3) I'm not sure we will either, to my mind it's possible; just if we can actually achieve it (or indeed should even try) before we blow ourselves up or something.
4) A whole new line of bigotry and protests awaits; one quite good thing about the X Men films I thought.
5) There is a (terrible) film called Nemesis that has the line "26.5% human is still human", or words to that effect. Maybe such a grading would be put into effect; perhaps if you're born human you'd always keep those rights, even if you somehow had your brain transplanted into a cat - which I'm sure some people would jump at the chance of.

Date: 2006-08-07 03:09 pm (UTC)
From: [identity profile] karohemd.livejournal.com
There was a rather fitting TOS episode on last week (The Changeling). While it's not about giving AIs human rights or not, it's a good example for the fallacies of artificial life. Not that human life is infallible, mind...

Date: 2006-08-07 03:25 pm (UTC)
From: [identity profile] robinbloke.livejournal.com
One thing just came to mind with ST now you mention it; they move heaven and earth for Data or the Doctor, but they don't even seem to think twice about blowing up the ship the computer runs on...

Date: 2006-08-07 03:39 pm (UTC)
From: [identity profile] karohemd.livejournal.com
Ah, a typical ST quandary, that. ;o)

The problem is that both are played by human actors while the computer just has Majel Barret's voice (from TNG onwards) and doesn't really have a personality (if you discount a few episodes where things go wrong).

Is the computer self-aware? No, it's just sophisticated enough to do a lot (like steer the ship in an emergency) and it's not designed to be an AI while both Data and the Doctor (less so but he's got the capacity to learn) are.

It's all a bit wishy-washy. Why has Starfleet never taken Data apart to see what makes him tick, in an effort to reverse-engineer his workings so they can build newer and better computers?

Damn those authors. :o)

Date: 2006-08-07 03:47 pm (UTC)
From: [identity profile] robinbloke.livejournal.com
People's personalities on a ship? What could possibly go wrong?

Date: 2006-08-07 04:00 pm (UTC)
From: [identity profile] karohemd.livejournal.com
It's the deciding difference, though.
The ship's computer, the Doctor (who is just a program running on that computer) and Data all have cognitive abilities (e.g. they can analyse a situation and take a logical course of action), but the computer doesn't have a personality.

Technically, none of them have the ability to express emotions (which is really badly handled in the Doctor's case, because his personality is a very emotional one); it's just actors trying hard.
Having played Dr Crowley, you know how hard that is. ;o)

Date: 2006-08-07 06:02 pm (UTC)
From: [identity profile] dreamfracture.livejournal.com
"Human rights" as currently defined make no provision for something with no physical body.

On the other hand, a "perfected" AI, which is presumably of human or greater-than-human intelligence, is self-aware, and possesses fears and desires, should certainly have SOME rights.

As for whether it gets them, this will depend on how efficiently it manages to take over the world and make humankind its slaves.

Date: 2006-08-08 01:40 am (UTC)
From: [identity profile] dancing-darkly.livejournal.com
I've seen far too many sci-fi films to know where this sort of thing leads... I'm even scared of my own computer...
