What's the BIG deal?
24047 cr points
25 / M / Eastern Time Zone...
Posted 10/3/08
I'm going along with the crowd on this one and saying that this is an awesome anime. I don't think about robots on a daily basis, but when something like I, Robot or Time of Eve comes along, it gets the gears grinding again.

I AM BENKAI89 AND I AM AN ANDROID-holic!
Member
30 cr points
36
Posted 10/3/08

And if androids were to become human, there would be a serious war between androids and humans, as humans would most probably panic and fight for supremacy, etc.

What everyone here seems to forget is that these robots have Isaac Asimov's Three Laws of Robotics embedded in them:
1. A robot may not harm a human, or through inaction allow a human to be harmed.
2. A robot must obey human orders, as long as that does not conflict with the first law.
3. A robot must protect itself, as long as that does not conflict with the first or second law.
(written from memory, but that's the gist of it)


To begin with, this means that robots can NOT hurt people, and they will even try to prevent people from being hurt. Going by the second episode, Time of Eve seems to take that even further, in a way that is an extension of the first law, but a logical/plausible one.

Secondly, robots will do what they are told (as long as it doesn't hurt people), so robots will always be subservient to humans. If you tell a robot to dismantle itself, then it will happily do so...

The third law is really just there to stop robots from damaging themselves, as they are (presumably) fairly expensive.
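
Just to make the priority ordering concrete, here's a rough Python sketch of the "as long as it doesn't conflict with the earlier law" wording. The flag names and the example values are entirely my own invention, not anything from Asimov or the show:

def permitted(action):
    # First Law: a robot may not harm a human being.
    if action["harms_human"]:
        return False
    # Second Law: obey human orders, unless obeying would break the First Law.
    if action["disobeys_order"] and not action["needed_to_protect_human"]:
        return False
    # Third Law: protect your own existence, unless that conflicts with Law 1 or 2.
    if action["damages_self"] and not (action["needed_to_protect_human"]
                                       or action["needed_to_obey_order"]):
        return False
    return True

# "Dismantle yourself" damages the robot but breaks neither higher law,
# so the robot happily complies:
print(permitted({"harms_human": False, "disobeys_order": False,
                 "damages_self": True, "needed_to_protect_human": False,
                 "needed_to_obey_order": True}))   # True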


The second law means that however many human attributes robots pick up, they will always be (intrinsically) subservient to humans. But as Asimov's Robot City stories explore, perhaps we *shouldn't* give (unreasonable) orders to robots? Let them be as free as their Three Laws allow? That may be the direction Time of Eve is heading in... especially since Asimov's robots were almost always made of metal, whereas in this anime they look completely human.

BTW, the movie I, Robot was (approximately) based upon Asimov's robot stories & Three Laws.
17 cr points
25 / At a University
Posted 10/3/08
Don't get me wrong. I really do like this series, and will be fully excited for each new episode.

But in terms of this ever happening, that is a definite no.

First of all, if people made robots that could act humanly, there would be no point in making those robots. They would probably be expensive, and the reason for getting one is like the reason you have a computer: robots would exist to make humanity's life easier (make us lazy). If your computer started doing things on its own, having its own wants and needs, and not turning on when you wanted it to, that would be a waste of money. Same with a robot.

Also, I think someone mentioned earlier how hard it would be to make machines that could act humanly. This is true. I'll bring up one point I learned while taking an introductory programming class. We learned that when you tell the computer to generate a random number, the number is actually not random at all. There is no possible way to make a computer guess or choose on its own without giving it some guidance for making its choice. It has to have some predetermined instructions to generate anything. The random number it makes is produced by an algorithm that only pretends to be random; if you knew the algorithm and its starting state, you could actually predict every number the computer generates.

Even if you wanted a computer to make a guess based on information, it wouldn't really be guessing; it would simply see which piece of information matches best compared to the others and relay its answer. It can never have gut instincts, because it all comes back to the fact that computers are based on instructions. They only follow simple instructions ("when something happens, I must do this"), and even if they learned to learn, someone still has to write the instructions for how they learn. That's why playing against the CPU in video games is so easy: it has predictable patterns. The computer will do the same thing in each situation it is assigned to react to, and if it's not assigned to do anything in that situation, it won't.
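
To make that concrete, here's a tiny Python sketch of the kind of generator I mean (a simple linear congruential generator; the constants are the classic textbook ones and the seed is arbitrary):

def lcg(seed, count, a=1103515245, c=12345, m=2**31):
    # A linear congruential generator: every "random" number is computed
    # from the previous state, so the whole sequence is fixed by the seed.
    numbers = []
    state = seed
    for _ in range(count):
        state = (a * state + c) % m
        numbers.append(state % 100)   # pretend we asked for numbers from 0 to 99
    return numbers

print(lcg(42, 5))
print(lcg(42, 5))   # same seed, identical "random" sequence: fully predictable
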
It is possible to make a computer imitate a real person. If we give it enough programming, it will seem as though it is making decisions, but it ever being able to go against its own programming is impossible. The closest thing to a computer rebelling against its own programs would be if we told it to write over its programming. We would still have to tell it when to overwrite its own programs (for example, once it gained a certain amount or type of data, say we tell it that it may overwrite its own rules if its database finds that humanity's behavior is a threat to humanity itself) and what to overwrite them with (whatever new program it just received, e.g. a solution to humanity's self-destruction). Otherwise it would never think, "Hmm, I want to change humanity because I don't like how it currently is." Computers never want anything. They only appear to want something if we tell them to.
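
Here's a toy Python sketch of that "overwriting its own programming" idea. All the names and the trigger condition are invented for the example; the point is just that both the trigger and the replacement have to come from the programmer:

def default_behaviour():
    return "serve humans as instructed"

def contingency_behaviour():
    return "run the pre-approved contingency plan"

class Robot:
    def __init__(self):
        # The robot's current "programming" is whichever function its makers installed.
        self.behaviour = default_behaviour

    def step(self, threat_detected):
        # The robot never decides on its own to rewrite itself; it only swaps
        # behaviours when the human-written trigger fires, and only with the
        # human-written replacement.
        if threat_detected:
            self.behaviour = contingency_behaviour
        return self.behaviour()

robot = Robot()
print(robot.step(threat_detected=False))   # serve humans as instructed
print(robot.step(threat_detected=True))    # run the pre-approved contingency plan
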

Setting those limitations aside: if computers could choose on their own, why do people always assume robots would want to become human? I suppose it's a human-superiority thing to think that everything would want to become human. Robots would probably think humans would want to become robots. We have to constantly be eating, sleeping, relieving ourselves, building buildings to shield ourselves from the weather, trying not to forget things. We can't upgrade what we are born with; if we lose an arm, it's gone (though if we have androids, chances are we can re-grow limbs). Wouldn't it be nice if you never had to sleep again? Think of all the things you could do during the time you sleep. Yes, I know, I've seen all the movies and read the books that point out that these limitations are what make us human (try reading The Giver), along with our emotions (seriously, it's our silly emotions that make us go to war), but only a human would think it would be cool to live like that. A robot is designed to be efficient. It would probably rather just keep making the world more efficient and try to make our lives easier, just as we told it to do when we created it, because we told it that's what it should like.

In closing, my Intro to 3D Maya teacher gave us this poem, and I think it sums up how computers understand the world:
I really hate this dumb machine
it makes me want to sell it.
It never does what I want
Only what I tell it.
Posted 10/3/08
Why are people shunned?

Well, it's just good old peer pressure. If more people do something one way, they consciously or unconsciously create a group. This group promotes its ways, calling them The Right Way, etc., and it tries to destroy the image of the other groups (putting labels on them, as you saw in Eve no Jikan, e.g. "dori-kei: they can't communicate with other human beings") so that it becomes the only right one. Something along those lines. People want justification for their actions. They want to make things make sense, or rather make their own view make sense.
What's interesting about this anime is that the robots (or the more advanced ones) can actually think and act completely human. So being a "dori-kei" is not being an idiot who misunderstands things. On the contrary, it may be those who insist robots are mere tools who are afraid to face the fact that maybe the robots are sentient enough to be recognized as their equals.
Of course, if you could actually make someone who is your equal with your bare hands, it would complicate a lot of things, such as the law-making process, school systems, job competition, discrimination, etc. Politicians would, of course, avoid or suppress these issues. Hence the promotion of anti-"dori-kei" sentiment.
129 cr points
Posted 10/3/08
I don't think we will have this issue of robot rights in my lifetime.
Or maybe we might.
Member
310 cr points
25 / M / Arizona
Posted 10/4/08, edited 10/4/08
I'm not reading all of this, blurgh blurgh.

There are a few logical reasons for humans not to want non-biological creations to be considered societal equals. For one, they were created as tools, and while tools should be respected and not abused, they are still tools, not people. A hammer cannot be a slave, a gun cannot be a murderer; they are only tools. You only start thinking of androids as emotionally equal to human beings because you gave them a face that you identify with; you're fooled by a mimic. That said, if we could create learning computers that behaved similarly to humans and gave them faces just like ours, I have to admit I would end up treating them as if they were human, because it would hurt me to see something human-like treated as an "inferior thing" (actually, it kind of hurts me to see things in general treated that way).
Wait, lost my train of thought... Oh yes.

Robots being treated as humans poses a threat to human survival: for one, it could lower birthrates considerably, and a "robot uprising" is always possible. Robots are also more susceptible to being "turned"; that is, a criminal organisation (or even just the corporation that created them) could use the robots placed in sensitive areas to do quite a bit of damage... Actually, that last bit is true even without robots being treated as sapient creatures.
In the first episode, there was a commercial from the... robot safety thingy, I forget the name. Some sort of council, I think. Anyway, it was about farmers being put out of work by the robots. Something like "Would you eat fruit from the cold hand of a machine?", or so. When a nation is given free labor, as from slaves or these robots, who end up taking the menial jobs, it's damaging to the lower classes, who previously took those jobs to support themselves and their families.


Uh... That's the end of this post. I'll read the rest of the thread now and I might post again. I think the treatment of created beings is one of the more discussed themes of S/F; maybe you all should read some of the classics.

Edit: Oh, by the way, it upset me a bit when those kids at the main character's school were mildly abusing their androids, but I suspect that was mostly just a knee-jerk protective instinct because they were human-shaped. Female-shaped too, if I recall. And as I said before, a tool should not be treated with disrespect.
Ah... And if I were presented with a robot that, by some mistake, had a human-like mind, I wouldn't try to destroy it. Bicentennial Man, I, Robot... Ah, Isaac Asimov...

Edit again:

By the way, the Cafe is not proof of robot sapience or emotional/spiritual equality. If I were living in that time, being a skeptic (although a pliable skeptic, I admit), I would suspect, given the advertisements I had found in my android's logs, that the owners of the Cafe had modified my android's programming. I actually came up with this as one possible explanation for how the androids are able to behave indistinguishably from humans while in the cafe, but not outside of it... The rings might not only denote the "legal zones" but might also mean that the android is functioning properly... When an android comes into proximity of the cafe, it may have its programming altered to behave more humanly. In fact, there may be computers running human simulations and transferring the resulting actions to the androids. This would make sense, since the androids were never meant to behave humanly; they were meant to be tools.

I doubt the show is this cynical, however.


Edit the third:

Oh, by the way, there would be a more specific stigma against people who came to view androids as equals in Japan, I believe; at least the show draws something of a correlation with Japan's NEET problem... If you gave people the ability to *think* they were having normal human relations (while actually being able to modify the robot's opinions at leisure), there would surely be a much larger percentage of the population becoming NEET, which means fewer productive members of society, which means a general downfall of "good stuffs" happening in the world. And your nation might crumble due to lack of productivity, by the way.

Edit for the very fourth time:

Someone said they believed that androids (an android, by the way, is an automaton (robot) built to resemble humans) were given human shapes to be efficient... This may be true for housework and such, but I believe it's mainly because humans carry a fear of having inhuman or nearly-human-looking things near them. Ever heard of the uncanny valley? That's the point on the scale of human likeness where things look somewhat human but inspire terror instead of comfort, like store mannequins or those French dolls... The new model of robots as depicted in the movie version of I, Robot is kind of in the valley for me... *shudder*
*points higher in the post* That bit where I mention the commercial from the complaining farmers? It showed robots that basically look like modern wheat harvesters, which are a lot more efficient than a human shape for doing that job, by the way.


Oh... And can anyone translate some of the more garbled posts? I'm having trouble comprehending the mangled English that's trying to express complex ideas... *looks higher up the post* I guess I lied about that being the end of the post...

Edit...:

Someone said that robots are, and will be for a very long time at least, fundamentally artificial, even in thought, and that they cannot express emotion beyond what we program them to.
Let me ask you this, sir: have you not been programmed to behave the way you do your whole life? If your mother had never told you to say "thank you" when someone helps you or gives you something, would you? If she never said "I love you", would you say you loved her?
And beyond just parents, with the amount of media exposure even homeless people have, we're all "programmed" with certain expectations in our cultures: to love, to be loved, to have freedom, to have comfort, to have food. And it's even less evident in social interactions that I can't quite dissect, because it's so hard-wired into how I view the world. Imagine you had never seen another person, that you had never been taught (even through inadvertent imitation) to love, to laugh, to hate, to lie. What emotions would you actually be able to express, how would you express them, and to whom would you be expressing them? In fact, imagine you were born blind. Imagine how different your view of the world would probably be. Are people who are born blind entirely human (or sane, whatever)? Sight is a very important ability to most people, and without that major ability our species, or specific cultures, might have been very different (or, actually, might never have existed, because we would have died out, but whatever). Bringing this up makes me wonder whether people born without abilities we normally take for granted (sight, smell, touch, hearing) will end up forming their own culture, since more of them will manage to survive in the modern world, where having to fend for yourself is at least slightly less pressing than before... I wonder if they would eventually found their own nation due to differences in the majority of thought... Hm. I should really find some blind people and ask them things, to see if they perceive the world in a different light than sighted people. Other than the whole "not seeing stuff" thing, I mean.

Ah... The human mind cannot really generate a random number either, and if you understood everything that influenced a person's thoughts at a given time, you would be able to predict what number they would choose. We use dice rather than "thinking up a number at random" for more reasons than just avoiding cheating, you know.
The human mind also runs "a series of algorithms" to determine what's going to happen: where to step at any given moment, where a dart will fly when you throw it at a certain angle. Much of it is hard-wired reflex or subconscious thought. You are nearly always going by a "set pattern based on each scenario".

The idea of a "soul" (in relation to robots) is more or less the idea that humans have the ability to "go beyond experience", whereas robots are bound to experience...

Also, I'd be pleased if you all didn't tl;dr this post... I'm not sure I could even give you a short summary of what I bring up here.
Moderator
2177 cr points
26 / F / Quezon City, Phil...
Posted 10/7/08, edited 10/7/08

Vortor wrote:

[Vortor's post, quoted in full above, snipped]


Thanks for the in-depth analysis of the subject matter.
Reading your post entertained me for a while, and I'll try to remember your pointers for review.
622 cr points
24 / M / Beautiful world o...
Posted 10/15/08
622 cr points
24 / M / Beautiful world o...
Posted 10/15/08
Oh, I forgot to mention that I found an interesting article at Yahoo... I think it's similar to this topic of robots being treated as tools, except it's about chimpanzees being treated as 'people'. I found the article biased, though; anyway, look it up and decide for yourself...
Oh, and we can't really know what chimpanzees are thinking, unlike robots, which were designed for humans...

Sorry for the double post. If my post is off topic, ignore this one; I just wanted to share it.
Posted 10/16/08
I read every comment and now my head hurts.
Member
2978 cr points
73 / M / Manila
Posted 10/21/08
Great story, and I like the fact that it looks ahead to a time when robots and humans coexist in society... We humans have a pretty good imagination, don't we?
Member
1172 cr points
33 / M
Posted 10/29/08
I can say that as long as a programmer doesn't hack into the android's company, they're fine. But to treat a robot as a human?? No, I wouldn't... I'd just be nice to it, though. What if it had a trojan, and that virus gave a person everything that robot had seen? That'd suck; you'd have to master-clear it and start over xD
Member
1172 cr points
33 / M
Posted 10/29/08

poetax wrote:
[poetax's post from earlier in the thread, snipped]

But the robot's hardware wouldn't be the big cost; it'd be the programmers' software. Go watch Battle Programmer Shirase or whatever it's called, it's BPS. And it's true that a robot will do whatever we program it to do, so if a hacker rewrote the program and had it infect other robots and then kill humans, we'd be screwed financially/money-wise.
There is still hope in fusing robot parts with a human body, such as a robotic arm with nerves running through it, or a blind person using a stimulation device to focus brain waves and see again. It's possible. Meh, oh well, I'm only 13, so if you don't get my point I don't care xD. It's out of my league to reprogram something, but I can still look at the programming and tell you what it'll do.




Member
1172 cr points
33 / M
Posted 10/29/08
And it's not the machine, it's the user xD
517 cr points
24 / M / Somewhere Else, NY
Posted 11/11/08

Vortor wrote:

[Vortor's post, quoted in full earlier in the thread, snipped]


I read the entire post, and I have a few issues.

To Vortor:
First of all, you seem to be ignoring the fact that these androids might be used as companions. That's one of the reasons the designers may have given them human form, which ties into one of your very points, the uncanny valley: robots that fall into it are shunned by humans. A better companion would be one shaped like a dog or a cat, or, if they could make it well enough, a person. Sure, androids could be mere tools, but they are tools we use to invest human feeling in.

Your second paragraph is mostly bogus. First of all, if robots started to lower birthrates, the androids, bound by Asimov's First Law of Robotics, would be forced to withdraw, forcing humans to mate with each other. Logically, even if an android knew it would be hurting many individual humans, it would be saving the countless unborn humans who would never exist if it did not force the issue. If the humans would rather die than mate, then the androids would come back, because staying away would keep humans from surviving altogether, which leads to my counterpoint to the uprising. Free will is something all humans need to at least feel they have. If, somehow, robots twisted the first law to the point that humans were enslaved so that they were "safe", they would see that humans would rather die than live without free will, and stop. In fact, as androids became more intelligent, they would run this logic tree in their heads and never revolt in the first place. The same could be said for the ludicrous criminal organisation: the robots might be enticed to help with some sort of Robin Hood type scheme, but it would be impossible to make them any more corrupt than that.

Your remark about the removal of jobs by robots is also iffy. Think: robots would do everything better than humans, so the lower classes could be supported by robots that don't need food, sleep, or shelter. In fact, all of humanity could invest itself in becoming culturally perfect. Those who want to feel productive can still do what they want, and they could make a buck selling human-grown goods, which would be rare and therefore valuable. We would be productive in a different fashion: freed from the constraints of manual labor, we could work on higher problems. It's a utopian dream, so it can never be fully achieved, but even just the basics are a good thing, yes? I mean, I suppose halfway between here and there you get a situation where robots have taken the jobs and humans are unemployed but still need the work, but the robots would see that humans were suffering because of them and stop.

Your comments about human patterns were interesting, but not original. Asimov's Laws of Humanics speculate about a set of laws, far more complicated than the Laws of Robotics, that govern human behavior.
Blindness is another thing entirely, although I speculate that blind people may perceive voices instead. When I think of all the people in the world, I see them. The blind might hear them; thus, they might perceive that so many voices would drown them out and allow no other voices through. It's really cool.
Also, humans are made with experience. We are more than the sum of our parts; that's all this spiritual thing is. Thus, we can grant souls to robots merely by recognizing that robots, too, are made with experience.

To poetax: It's pretty clear that video game AIs are not the pinnacle of robot programming. Plus, you completely ignore the possibility of robots changing their own programming based on past experience. Like: a robot drops a cup on the counter from three feet up, and it breaks. The next time, the robot tries two feet, and the cup rolls off the counter and breaks. Through a system of trial and error, it learns that a mug should simply be set down there, and it never drops one again. Learning robots will, after a while, be able to imitate humans far better than anyone may realize. If a learning robot is given the instruction "be human", then it will continue to learn how to be human until it is indistinguishable, no matter how complex the behavior. In fact, robots may want to become human in order to do a task. Ex: get a jar of mayo from a certain store. Unfortunately, the store is humans-only. A human-looking android may learn to act human in order to complete that task. Combine that with the law that says to protect humans, and you can account for all robot imitation: a person is sad, and the robot needs, per the first law, to prevent harm from coming to the human, including emotional harm. Thus, it needs to find out why the person is sad, and that may require the ability to empathize or otherwise communicate in a more human manner, especially if it wants to help.
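
Here's a toy Python sketch of that trial-and-error idea (the break threshold and the drop heights are made up for the example; the point is just that the robot settles on a placement that has never failed):

import random

def mug_breaks(height_ft):
    # Made-up physics for the example: anything dropped from 2 ft or higher shatters.
    return height_ft >= 2

safe_heights = set()
for trial in range(20):
    height = random.choice([0, 1, 2, 3])
    if mug_breaks(height):
        continue                     # that attempt failed; don't remember it as safe
    safe_heights.add(height)         # this placement worked, so keep it

# After a few trials the robot just sets the mug down low and never drops it
# from higher up again.
print("placements that never broke a mug:", sorted(safe_heights))
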


Thank you for taking the time to read this entire post before responding to it.