Will A.I. robots have human rights?
17945 cr points
55 / M /
Online
Posted 8/30/14
I cannot see humanity making machines that are completely indistinguishable from humans, so that part of the argument is moot. Now we are talking about thinking toasters. Like I said before, there will be safeguards made for the AI, but are most people here confusing emotion with thinking? To borrow from the example above: what made the AI cry? Machines can NOT have emotions, even if I do think that sometimes some are capricious. Finally, I would hope that Asimov's 3 Laws of Robotics will be incorporated into all AIs.



I'm not a Luddite, nor am I unsympathetic to the idea of sentient AIs. But ultimately a tool is just a tool. A good tradesman takes care of his tools. Humanity HAS to become a better steward of the world. Once we do that, then AI rights, etc. would not be needed.


17384 cr points
23 / M
Offline
Posted 8/30/14

Gafennec wrote:

I cannot see humanity making machines that are completely indistinguishable from humans, so that part of the argument is moot. Now we are talking about thinking toasters. Like I said before, there will be safeguards made for the AI, but are most people here confusing emotion with thinking? To borrow from the example above: what made the AI cry? Machines can NOT have emotions, even if I do think that sometimes some are capricious. Finally, I would hope that Asimov's 3 Laws of Robotics will be incorporated into all AIs.

I'm not a Luddite, nor am I unsympathetic to the idea of sentient AIs. But ultimately a tool is just a tool. A good tradesman takes care of his tools. Humanity HAS to become a better steward of the world. Once we do that, then AI rights, etc. would not be needed.




If the machines behave like they have emotions, what does it matter if they actually do or not?
When you start reducing things down to what they really are, people are a complicated system made of elements that combine in complicated ways to produce an overall effect outside of the area you denote as being "the human".
Looking at it from that perspective, I see no reason why computers can't have emotions, since human behavior can be simulated. So once machines can behave reasonably similarly to humans, we can say they have emotions when they behave like they do.
You shouldn't think of a decision-making engine as a tool for achieving a goal, because if you do, then there's nothing stopping you from doing the same thing to humans, and I AM NOT simply a tool for others to play with.
If my toaster can think and make decisions, then I maintain that it should have rights, emotions or not.
10137 cr points
24 / F / Johnstown, PA, USA
Offline
Posted 8/30/14 , edited 8/30/14
I imagine that some countries, provinces, etc. would, but that most wouldn't. Personally, if A.I. become similar enough to humans, I would essentially treat them like us. At the very least, I'd view them on pretty much the same level as I do most animals, though I'd be significantly colder to those that I don't know. Just like how I am with my family dog(s) versus random birds and whatnot.

EDIT:
Also, Gafennec has a good point about taking care of them. Even if they're very inhuman in terms of mannerisms/looks, it's a bit of a shame to bust up or neglect a nice piece of equipment. I'd almost baby an A.I., the way I would with any sort of prized possession, especially if they're impressive/my own. It'd be like keeping a nice car, knife, gun, china set, etc. in tip-top shape.
15283 cr points
23 / M / UK
Offline
Posted 8/30/14
I imagine it would be similar to the question "if we met aliens, would they have human rights?" If robots were developed to the level where they could become self-aware and had the ability to learn and to create original works, then they would likely be given rights. The entire world would have to change its perception, however, on how it defines life and what makes an individual unique.
12247 cr points
21 / M
Offline
Posted 8/30/14
If a robot behaves exactly like a human, that robot would resemble humans in that neither of you can prove to me that you're actually conscious. You can only prove that to yourself.
5945 cr points
19 / NB / US
Offline
Posted 8/30/14
This is a sci-fi film-like disaster just waiting to happen.
6013 cr points
Offline
Posted 8/30/14
Why would they want to be treated like humans? We treat each other like garbage.
9138 cr points
20 / M / In my head.
Offline
Posted 8/30/14
Seeing as how some guy (an idiot) brought a case to the U.S. Supreme Court trying to give apes and chimpanzees the same rights as humans... it's very possible people (idiots) will try to give robots human rights.
3318 cr points
42 / M / NW
Offline
Posted 8/30/14
Well, corporations seem to have garnered human rights for themselves, so that blurs the line between people and something considered self-aware. Not much of a stretch to remove the human condition from that.
20695 cr points
39 / M / Kansas
Offline
Posted 8/30/14
In my mind, they would have equal rights. I see no reason to discriminate. However,

tbclogistic wrote:

Why would they want to be treated like humans ? We treat each other like garbage.

It's taken us centuries to get past the injustice/inequality towards others of the "wrong" race, creed, gender, sexual orientation... oh, who am I kidding, we're not even there yet. So there will probably be enough jerks who treat the robots like crap that it will turn out like The Matrix, Terminator, or the Geth.

16390 cr points
21 / F / Arizona, US
Offline
Posted 8/30/14
No, because they aren't human. Simple as that.
36056 cr points
21 / M / Florida
Offline
Posted 8/30/14
The logical answer is no, unless the AI is developed enough to be approved by most psychologists, which is not achievable in the near future. Even so, I think they would deserve special rights of their own.
17945 cr points
55 / M /
Online
Posted 8/30/14

Nobodyofimportance wrote:


Gafennec wrote:
I'm not a Luddite, nor am I unsympathetic to the idea of sentient AIs. But ultimately a tool is just a tool. A good tradesman takes care of his tools. Humanity HAS to become a better steward of the world. Once we do that, then AI rights, etc. would not be needed.




If the machines behave like they have emotions, what does it matter if they actually do or not?
When you start reducing things down to what they really are, people are a complicated system made of elements that combine in complicated ways to produce an overall effect outside of the area you denote as being "the human".
Looking at it from that perspective, I see no reason why computers can't have emotions, since human behavior can be simulated. So once machines can behave reasonably similarly to humans, we can say they have emotions when they behave like they do.
You shouldn't think of a decision-making engine as a tool for achieving a goal, because if you do, then there's nothing stopping you from doing the same thing to humans, and I AM NOT simply a tool for others to play with.
If my toaster can think and make decisions, then I maintain that it should have rights, emotions or not.


I highlighted the key part in my previous post. But first off, a human being is NOT a tool. Animals are NOT tools. Plants are NOT tools. Yes, we may feed off the animal and the plant, but to denote them as tools disrespects their sacrifice. Now, a rock can be a tool. A stake is a tool. A screwdriver is a tool. A calculator is a tool. A computer is a tool. A bullet is a tool. A nuclear bomb is a tool. What matters isn't whether something is a tool, but how we USE THAT TOOL. Humans can't help but project their hopes and fears onto things, and that's okay as long as we are proper stewards of the things in our care. I have NEVER said or implied that we should abuse anything in our care. But I will never raise a tool to the same level as a human. If push comes to shove and I had to choose between saving a human or an AI, the human would win every time, even if he or she were the lowest of the low. I take that back: I won't invoke Godwin's Law here, but there are a few humans I'd save the AI over.
17945 cr points
55 / M /
Online
Posted 8/30/14

anikevin wrote:

The logical answer is no, unless the AI is developed enough to be approved by most psychologists, which is not achievable in the near future. Even so, I think they would deserve special rights of their own.


I don't know if I'd call them rights, but surely they deserve protection.
1223 cr points
ɪ ᴀᴍ ɴᴏᴛ ᴀ ʜᴇʀᴏ
Online
Posted 8/30/14

RedExodus wrote:


Augment wrote:

Humans have human rights.

Animals have animal rights.

Robots will have robot rights.

Logic.


Humans behave like humans.

Animals behave like animals.

Super robots can be even more humane than humans.

I would expect robot rights to be so similar to human rights that I wouldn't be able to distinguish between them much.


No. Even if robots ever become superior somehow, it still won't make them human, for obvious reasons. Even if they develop emotion and thought, they are still machines, just a far superior version, like we are of Homo erectus. That said, I agree they would have rights similar to ours, but only because we built them, they live in our society, they learn from us like children, and the desire for them to become more and more like us will only grow. Even if and when humans are extinct, robots will possibly develop an even better and more perfect set of rights. Ranting, ranting~ Brb, I'm going to create an android, sit back, and watch me revolutionize the world.