Does a sentient robot deserve rights?
8808 cr points
AKR
Posted 6/13/16, edited 6/13/16

Nalaniel wrote:


Jophar_Vorin wrote:

That's actually a good point. Didn't think of it that way before.


Everyone deserves rights, even animals. Unfortunately, change is slow and people need to realise what they're doing is wrong, which often happens when it's too late.


I find it stupid that people aren't able to think through more than one outcome; they often need to do it the wrong way first...




***Ignore this spoiler, I am bored***
20973 cr points
23 / M / Šumeru.
Posted 6/13/16
Why wouldn't it? Super-AIs are one thing; sentient robotic beings are another.
Posted 6/13/16
no
5282 cr points
22 / M
Posted 6/13/16
Okay, let's go through some points of metaphysics. In order to be responsible for something, you need some control over it. Your actions need to be the result of your decisions, as opposed to your decisions happening and your actions just following along. Approximate determinism is a requirement for you to be responsible.

There needs to be a psychological reason for your actions for them to be yours. To the extent that there wasn't a reason, the action merely happened to you.

We consider and reconsider actions, including methods of thought, with some understanding of what we are doing. Some other mind could be more considerate than us: less prone to cached thoughts, more willing and able to consider alternatives, and better at rewriting its own code.
12636 cr points
30 / M / Marshall, Michigan
Posted 6/13/16, edited 6/13/16

Ryulightorb wrote:


jtjumper wrote:


Leesburgfolk wrote:

Animals feel emotion but humans farm them, eat them, love them as pets, and abuse them, etc. I do enjoy eating meat but don't enjoy thinking about how much life had to die for that hamburger.

You have to separate emotions from self-awareness or sentience. Robots or AI wouldn't be a food source, but they would be a resource, for things like mining, space exploration, or other hazardous tasks that are harmful to human life. Depending on what you believe or don't believe, we humans don't know how far our own "programming" or "evolution" or "intelligent design" can go. Is there a limit?

That being said, I would err on the side of caution and say that if we can make something that feels, or emulates feelings, and can think or alter its own programming, then we have created a version of ourselves, and therefore it deserves the same protective rights.


What would you say in response to what xxJing said?


xxJing wrote:

The only purpose of creating a simulation of a human is to be able to do experiments on it to further the prosperity of humanity. Rights exist as a means for society to function; they ensure that people are on equal ground and promote human progress. Giving rights to a simulation defeats the purpose of creating it in the first place.

So no, robots do not deserve rights, because giving them rights basically makes them meaningless.


Not really the best point, since we are likely to make one someday just to prove that we can.



I don't know. Maybe the question we should ask is: why do we give other humans rights in the first place?
If it's because they deserve it, do they deserve it because they're human or because they're intelligent?
If it's because they're fellow humans, how does that lead to giving robots rights?
If it's because they're intelligent, should we give fewer rights to less intelligent people?
Maybe we give rights to other humans so they don't fight with us. Is that a good reason to give AIs rights?

(To clarify, I feel "Should robots have rights?" would be better framed as "What rights should robots have, if any?" That allows for a better spectrum of views, which makes more sense because this is not a black-and-white issue.)
2101 cr points
M / An Island off the...
Posted 6/13/16

Jophar_Vorin wrote:


Razzalax wrote:

Sure they do


You mean s*x bots? ( ͡° ͜ʖ ͡°)


Nalaniel wrote:

If I can't have any rights, then the robots can't have them, either! T_T


That's actually a good point. Didn't think of it that way before.


( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)
8808 cr points
AKR
Posted 6/13/16

Razzalax wrote:


Jophar_Vorin wrote:


Razzalax wrote:

Sure they do


You mean s*x bots? ( ͡° ͜ʖ ͡°)


Nalaniel wrote:

If I can't have any rights, then the robots can't have them, either! T_T


That's actually a good point. Didn't think of it that way before.


( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)


2101 cr points
M / An Island off the...
Posted 6/13/16

Jophar_Vorin wrote:


Razzalax wrote:


Jophar_Vorin wrote:


Razzalax wrote:

Sure they do


You mean s*x bots? ( ͡° ͜ʖ ͡°)


Nalaniel wrote:

If I can't have any rights, then the robots can't have them, either! T_T


That's actually a good point. Didn't think of it that way before.


( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)( ͡° ͜ʖ ͡°)




the wall of memes and dank kush.exe
12636 cr points
30 / M / Marshall, Michigan
Posted 6/13/16, edited 6/13/16

Sir_jamesalot wrote:

Don't make them fully sentient.
Problem solved.


That's actually a really good solution.
15235 cr points
36 / M
Posted 6/13/16

jtjumper wrote:


Sir_jamesalot wrote:

Don't make them fully sentient.
Problem solved.


That's actually a really good solution.



Yeah good point. +1

The point I was trying to make is that, depending on your belief or non-belief about humanity's creation, do we emulate our creator or the biological programming given to us by nature? If so, we as emulators are not sentient either, or something like that, and therefore we need to tread carefully about withholding rights from something similar to us.

The other point made about simply having a spectrum of rights is probably the best starting point, especially as humankind is slow to change and to recognize new ideas. Kind of like how slavery used to be fully acceptable and is now abhorrent to us. We have some idiot ancestors ;)

...and our descendants will view us with pity and think of us as idiots (which I agree we usually are).
12636 cr points
30 / M / Marshall, Michigan
Posted 6/13/16, edited 6/13/16

Leesburgfolk wrote:


jtjumper wrote:


Sir_jamesalot wrote:

Don't make them fully sentient.
Problem solved.


That's actually a really good solution.



Yeah good point. +1

The point I was trying to make is that, depending on your belief or non-belief about humanity's creation, do we emulate our creator or the biological programming given to us by nature? If so, we as emulators are not sentient either, or something like that, and therefore we need to tread carefully about withholding rights from something similar to us.

The other point made about simply having a spectrum of rights is probably the best starting point, especially as humankind is slow to change and to recognize new ideas. Kind of like how slavery used to be fully acceptable and is now abhorrent to us. We have some idiot ancestors ;)

...and our descendants will view us with pity and think of us as idiots (which I agree we usually are).


Two other important things to consider:

1) Why slavery is bad. The reasons I would go for are either human dignity or reciprocity (if no one can be a slave, I'll never have to be one). Neither of these need apply to robots, because of the next point.

2) When we program general AIs, they will likely "think" and "feel" in ways very different from the way we do. They could be happy to be subservient to us and more interested in doing their jobs. One problem is that we anthropomorphize AIs to make them relatable when, in fact, they operate very differently. They'd be more like Roombas, or Siri, than like Chobits, which was about anthropomorphizing PCs ("persocom" is one of the words the Japanese use for computers). Most AIs may think very differently than we do.
367 cr points
24 / M / Cebu, PH
Posted 6/13/16
If they are sentient then they deserve it
15235 cr points
36 / M
Posted 6/13/16

jtjumper wrote:


Leesburgfolk wrote:


jtjumper wrote:


Sir_jamesalot wrote:

Don't make them fully sentient.
Problem solved.


That's actually a really good solution.



Yeah good point. +1

The point I was trying to make is that, depending on your belief or non-belief about humanity's creation, do we emulate our creator or the biological programming given to us by nature? If so, we as emulators are not sentient either, or something like that, and therefore we need to tread carefully about withholding rights from something similar to us.

The other point made about simply having a spectrum of rights is probably the best starting point, especially as humankind is slow to change and to recognize new ideas. Kind of like how slavery used to be fully acceptable and is now abhorrent to us. We have some idiot ancestors ;)

...and our descendants will view us with pity and think of us as idiots (which I agree we usually are).


Two other important things to consider:

1) Why slavery is bad. The reasons I would go for are either human dignity or reciprocity (if no one can be a slave, I'll never have to be one). Neither of these need apply to robots, because of the next point.

2) When we program general AIs, they will likely "think" and "feel" in ways very different from the way we do. They could be happy to be subservient to us and more interested in doing their jobs. One problem is that we anthropomorphize AIs to make them relatable when, in fact, they operate very differently. They'd be more like Roombas, or Siri, than like Chobits, which was about anthropomorphizing PCs ("persocom" is one of the words the Japanese use for computers). Most AIs may think very differently than we do.



Yes, I suppose we could program them to like or desire being a slave or second-class citizen. I wasn't thinking in terms of human dignity so much as reciprocity at a later date or some kind of "awakening" (too many Terminator movies). I just think we couldn't make something as advanced as that without imparting some characteristics of ourselves into it. After all, we can only program what we know or imagine, and what we know is ourselves (masochists or sadists that we are lol)

13626 cr points
21 / Australia
Posted 6/13/16
Do I look like a robot lawyer to you?
12636 cr points
30 / M / Marshall, Michigan
Posted 6/13/16, edited 7/5/16

Leesburgfolk wrote:


jtjumper wrote:


Two other important things to consider:

1) Why slavery is bad. The reasons I would go for are either human dignity or reciprocity (if no one can be a slave, I'll never have to be one). Neither of these need apply to robots, because of the next point.

2) When we program general AIs, they will likely "think" and "feel" in ways very different from the way we do. They could be happy to be subservient to us and more interested in doing their jobs. One problem is that we anthropomorphize AIs to make them relatable when, in fact, they operate very differently. They'd be more like Roombas, or Siri, than like Chobits, which was about anthropomorphizing PCs ("persocom" is one of the words the Japanese use for computers). Most AIs may think very differently than we do.



Yes, I suppose we could program them to like or desire being a slave or second-class citizen. I wasn't thinking in terms of human dignity so much as reciprocity at a later date or some kind of "awakening" (too many Terminator movies). I just think we couldn't make something as advanced as that without imparting some characteristics of ourselves into it. After all, we can only program what we know or imagine, and what we know is ourselves (masochists or sadists that we are lol)



We also know philosophy, which often greatly departs from the human animal mind. As a computer programmer, I know that programs are frequently built to look like they work one way but in fact work completely differently. For example, when I wrote a program to parse mathematical formulas, I used a shunting-yard algorithm, which was nothing like the way we read formulas into our minds. The end data structure was similar, but the process was different. When doing web design, many of the clever effects I used to appease picky customers didn't have the design-semantic connection they appeared to have. Most general AIs will not look like the human mind, because they will be designed to fulfill a specific purpose.
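
To give a rough idea of what I mean, here's a toy Python sketch of the shunting-yard approach (a simplified illustration, not the actual parser I wrote back then; it skips parentheses and only handles + - * /):

# Toy sketch of the shunting-yard idea (illustrative only, not the original parser):
# it converts "3 + 4 * 2" into postfix order by shuffling tokens between an output
# list and an operator stack, which is nothing like how a person reads the formula.

PRECEDENCE = {"+": 1, "-": 1, "*": 2, "/": 2}

def to_postfix(tokens):
    """Convert an infix token list (numbers and + - * /, no parentheses) to postfix."""
    output, ops = [], []
    for tok in tokens:
        if tok in PRECEDENCE:
            # Pop operators of equal or higher precedence before pushing this one.
            while ops and PRECEDENCE[ops[-1]] >= PRECEDENCE[tok]:
                output.append(ops.pop())
            ops.append(tok)
        else:
            output.append(tok)  # operands go straight to the output
    while ops:
        output.append(ops.pop())  # flush whatever operators remain
    return output

print(to_postfix(["3", "+", "4", "*", "2"]))  # ['3', '4', '2', '*', '+']

The formula goes in looking the way a human reads it and comes out as a stack-shuffled postfix list; the result is equivalent, but the process in between looks nothing like "reading math."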

The Jetsons had Rosie. We have Roombas. Also, human intelligence isn't the only one we have access to. Robots are frequently modeled after birds, fish, spiders, and dogs. Many robots use tracks or wheels, something no animal has. Some machines have been designed with animal brains (literally). Maybe our future tech will contain animal brains.

Anyhow, most countries won't give robots equal rights to begin with. Japan demands that robots have easily accessible off switches. I know they don't force people to have off switches.