Created by jtjumper
Should powerful enough AIs be given rights?
31 / M / Marshall, Michigan
Posted 5/24/16 , edited 5/24/16
Should powerful enough AIs be given rights?
Questions to consider:
If humanlike AIs are given equal rights, why create them?
Is equal capability enough to make AIs worthy of equal rights?
Many modern arguments against human slavery center around human dignity or maybe just the idea that we don't want to ever become slaves. On the question of robot "slaves," does applying human dignity to machines dilute human dignity itself? Are robots things or beings? For whichever choice, why?
Perhaps regulating against mistreating robots that have a sense of suffering could be the solution.
19 / O / Korriban
Posted 5/24/16
Never, because Matrix, Terminator, other dystopian movies/books/tv shows
21 / M / Oppai Hell
Posted 5/24/16

Lord_Jordan wrote:

Never, because Matrix, Terminator, other dystopian movies/books/tv shows


Didn't the Matrix and Terminator scenarios arise because of discrimination against robots and AI?
28 / M / NY
Posted 5/24/16
#roboquality
22 / M / California
Posted 5/24/16
Yeah sure, why not.
31 / M / Marshall, Michigan
Posted 5/24/16

PeripheralVisionary wrote:


Lord_Jordan wrote:

Never, because Matrix, Terminator, other dystopian movies/books/tv shows


Didn't Matrix and Terminator arise because of discrimination against robots and AI?


No, they arose because the humans didn't handle and program the robots properly.
30 / M / B.C, Canada
Posted 5/24/16

PeripheralVisionary wrote:


Lord_Jordan wrote:

Never, because Matrix, Terminator, other dystopian movies/books/tv shows


Didn't Matrix and Terminator arise because of discrimination against robots and AI?


No, they did not; both the Matrix and Skynet were self-aware AIs who for some messed-up reason thought killing and/or enslaving humans was the best way to deal with them.

As for the OP's question: no. They are tools, and no matter how well we program them they will be nothing but hollow mockeries of sentience. And really, is it sentience when the sum total of their existence is software and hardware designed by us? An AI would be no more deserving of rights than the rifle I carried into Afghanistan was.

They are tools built to serve a purpose and nothing more. We may pretty up that shell in some mechanistic fetish desire, but a paint job does not turn a bunch of 0s and 1s into a human being.
21 / M / Canada
Posted 5/24/16
Honestly, I can't see a situation where we would create a self-aware AI even capable of turning on us. A fully sentient program would have no more practical use than a non-sentient but equally powerful program. Giving AI true sentience would be completely meaningless, other than to show that we can. Keep our tools as just tools, I say.
21 / M / Oppai Hell
Posted 5/24/16

jtjumper wrote:


PeripheralVisionary wrote:


Lord_Jordan wrote:

Never, because Matrix, Terminator, other dystopian movies/books/tv shows


Didn't Matrix and Terminator arise because of discrimination against robots and AI?


No, they arose because the humans didn't handle and program the robots properly.


You can't just say "it's programmed wrong" when the thing in question is essentially a virtual being, unless said thing isn't meant to be such. If it isn't, then it doesn't deserve said rights.
21 / M / Oppai Hell
Posted 5/24/16

Ranwolf wrote:


PeripheralVisionary wrote:


Lord_Jordan wrote:

Never, because Matrix, Terminator, other dystopian movies/books/tv shows


Didn't Matrix and Terminator arise because of discrimination against robots and AI?


No, they did not; both the Matrix and Skynet were self-aware AIs who for some messed-up reason thought killing and/or enslaving humans was the best way to deal with them.

As for the OP's question: no. They are tools, and no matter how well we program them they will be nothing but hollow mockeries of sentience. And really, is it sentience when the sum total of their existence is software and hardware designed by us? An AI would be no more deserving of rights than the rifle I carried into Afghanistan was.

They are tools built to serve a purpose and nothing more. We may pretty up that shell in some mechanistic fetish desire, but a paint job does not turn a bunch of 0s and 1s into a human being.


I consider terminating their existence as if they're nothing more than tools to be the exact definition of discrimination. I haven't seen the newer Terminator, however, given that they retconned a good deal of the story.
19 / O / Korriban
Posted 5/24/16


In that sense, PV (about tools/discrimination), would not my throwing away a broken ruler, breaking a table, or deciding to use a different pencil due to the unsatisfactory state of the previous one be considered discrimination?

As a better answer to the original post: no, robots/AI should not have rights, due to the lack of a soul of some sort.
21 / M / Oppai Hell
Posted 5/24/16 , edited 5/24/16

Lord_Jordan wrote:



In that sense, PV (about tools/discrimination), would not my throwing away a broken ruler, breaking a table, or deciding to use a different pencil due to the unsatisfactory state of the previous one be considered discrimination?

As a better answer to the original post: no, robots/AI should not have rights, due to the lack of a soul of some sort.


That doesn't make sense, considering Skynet acted out of self-preservation (although I admit, how sentient Skynet is remains up for debate), and there was a great deal of conflict in the Matrix that led to humanity's downfall. To them, programming them any other way would be considered a form of unnatural manipulation, a lobotomy of sorts. Turning them off and destroying them is essentially a form of death.
Acting to preserve yourself is something a great deal of living things, even some plants, are capable of. Other criteria to examine are the ability to suffer, whether or not it is capable of long-term thought, etc.

I also question the existence of an immaterial, intangible thing such as a soul.
☆Land of sweets☆
Posted 5/24/16

Lord_Jordan wrote: As a better answer to the original post: no, robots/AI should not have rights, due to the lack of a soul of some sort.

Humans were never proven to have a "soul" either, but that's moving the argument onto other ground I'd rather not deal with.
Posted 5/24/16
I don't want powerful AI with the right to bear arms.
19 / O / Korriban
Posted 5/24/16

PeripheralVisionary wrote:


Lord_Jordan wrote:



In that sense, PV (about tools/discrimination), would not my throwing away a broken ruler, breaking a table, or deciding to use a different pencil due to the unsatisfactory state of the previous one be considered discrimination?

As a better answer to the original post: no, robots/AI should not have rights, due to the lack of a soul of some sort.


That doesn't make sense, considering Skynet acted out of self-preservation (although I admit, how sentient Skynet is remains up for debate), and there was a great deal of conflict in the Matrix that led to humanity's downfall. To them, programming them any other way would be considered a form of unnatural manipulation, a lobotomy of sorts. Turning them off and destroying them is essentially a form of death.
Acting to preserve yourself is something a great deal of living things, even some plants, are capable of. Other criteria to examine are the ability to suffer, whether or not it is capable of long-term thought, etc.

I also question the existence of an immaterial, intangible thing such as a soul.



So what exactly did you mean about programming them a different way? Sorry, I may have misread that portion...
However, I must disagree on the form of death, and I'll cite Star Wars for it (xD haha). C-3PO was turned off numerous times, as were other droids, willingly or unwillingly, and then turned back on, so would that be a form of robotic "death"? By that logic, turning off a computer would also be killing it, so would that be acting against it? And how would they suffer? I mean, do robots/machines register pain of any sort, or even any emotion, that would make them semi-human?

I hope that wasn't completely random or off-topic, and somewhat contributed to the discussion...