What's the BIG deal?
373 cr points
23 / M / Lost
Posted 11/12/08

argylegargoyle wrote:

It's like I, Robot.

The technology they have created can be dangerous. These androids are less likely to make mistakes in calculations and are like some super-smart version of a human, except they lack emotion, so there are differences between humans and the bots.

Also, if you can't distinguish between a bot and a human... people might start hitting on them and embarrass themselves when they realize that the android is what it is and won't respond in that way to them... if it ain't programmed to.

Ooh, I wonder if they have 'love androids' like in A.I. D:


It seems that the androids do have feelings though.
Member
157 cr points
21 / M
Posted 11/13/08
Well, that is a while away from reality, but this "feeling" from the robots could be real. They will all need to learn on a day-to-day basis, so they will develop some emotions to cope with everyday life; it is impossible for a robot to function like a normal human being without learning!!!

(by the way, a group of scientists created a robot with the same intelligence as a 5-month-old baby and it is still learning... i think. this news is way old, so i don't know if they still have it...)

that goes to show you that artificial intelligence is coming along at a fast pace, scary isn't it!
Member
310 cr points
26 / M / Arizona
Posted 11/20/08 , edited 11/20/08
Hm.. I never assumed that robots were necessarily programmed with the 3 laws, although I'm sure it'd become a common thing for them to be. I was referring to the specific transitional time when people would be put out of the only jobs they know how to do, too... Oh, and I'm afraid I haven't read all the Asimov stuff; I really should some time.

I'll come revisit this when I'm not sick and overtired.

Edit:
Oh, and I suppose I was actually pointing out things people would freak out about rather than logical problems with robots becoming prominent. Sorry about that.
The general public isn't exactly logical in its fears.

Ah... Some of your ideas require that robots would be extremely broadly educated about the meaning of some things...
Member
2584 cr points
24 / M / Goldsboro, North...
Posted 11/25/08

1xPoohBear wrote:

It does remind me of I, Robot, with the 3 laws and all.
But what I don't get is the reverse thing Rikuo said.
Humans resemble Androids... Androids resemble Humans

Wow, some of these posts are getting deep


I'll try to help you understand that with what I think, in another deep post.

Androids resemble humans for the obvious reason: we created them. That doesn't just apply on the physical level, but on the mental one as well: how we reach our hands out for things, looking both ways before crossing a street, and more. However, in this show, what has started happening is that these androids have started to think for themselves, and no one knows it except for a select few (like the people that go to the Time of Eve bar).
Now androids resemble humans for a not-so-obvious reason: they think for themselves, and they start feeling emotions (like Sammy feeling depressed over the coffee issue). So when the things humans have created start thinking for themselves, they start learning and becoming something different from what they were created for.

Humans resemble what they learn, and humans automatically learn from other humans (if we see a man try to jump across the Grand Canyon, we don't need to go to school to learn that we can't do it without failing). We even learn from other animals (if you see a dog looking at a certain spot a lot, you or someone else might investigate; if you find something interesting, you'll start thinking differently, and you might even if you didn't find anything).

Originally it was impossible for humans to resemble androids. A simple question arose: "How do we resemble something we created?" We don't, because our creations resemble us, not the other way around. But humans aren't creating the thoughts of the androids. That is how it becomes possible: the androids are creating their own thoughts, and feelings.

We can't turn our learning powers off, no matter what some teachers or parents might believe (of course, we aren't going to learn from what we don't see or hear, so people who don't pay attention in class might not learn). We learn from everything we haven't already learned from, even though sometimes we forget, or choose to ignore what we learned.

Simply put, we created androids, but androids have become something that isn't an android anymore, yet isn't human either. We can't possibly NOT learn from them, and once we do, we start to resemble them in some manner.
Member
19450 cr points
22 / F / Canada
Posted 11/28/08

freyt wrote:


[...]

Simply put, we created androids, but androids have become something that isn't an android anymore, yet isn't human either. We can't possibly NOT learn from them, and once we do, we start to resemble them in some manner.


Oh wow
A very deep post indeed, but I think I get it now.
Thank you very much
Member
2584 cr points
24 / M / Goldsboro, North...
Posted 11/28/08
Yeah, I was bored when I saw that.
Member
16409 cr points
21 / F / Under your bed!!!...
Posted 12/2/08
The series is pretty interesting. I mean, for androids to have human emotions, to want to be equal to them, in short to co-exist with humans. The thing that bothers me is that humans become so dependent on these human-like machines that they go as far as treating them badly.

Though on the other hand, I guess they treat them like that because they don't know androids are capable of feeling. It would creep me out to have an android; it would feel like it's just there watching.
Member
47 cr points
26 / M / California, USA
Posted 12/15/08

freyt wrote:

Androids resemble humans for the obvious reason: we created them. That doesn't just apply on the physical level, but on the mental one as well: how we reach our hands out for things, looking both ways before crossing a street, and more. However, in this show, what has started happening is that these androids have started to think for themselves, and no one knows it except for a select few (like the people that go to the Time of Eve bar).
Now androids resemble humans for a not-so-obvious reason: they think for themselves, and they start feeling emotions (like Sammy feeling depressed over the coffee issue). So when the things humans have created start thinking for themselves, they start learning and becoming something different from what they were created for.

I'm not sure it has been shown that, at least in Eve, they're becoming something different from what they were created for. Thus far, the androids have been acting in accordance with the purpose they were created for. Koji is a great example, as he's still attempting to help his "master"; although the relationship with Rina does complicate things a bit, it seems to have at least initially been done to assist with his role. With Sammy, she was concerned about whether or not she was doing a good job, and was seeking some consoling words, but does doubt mean she's doing something besides what she was created for? Worry about our efficiency is what allows us to analyze our errors and try to become something better. For Akiko, you could say she's attempting to learn more about the ways humans think and feel so that she doesn't hurt her "family" through inaction. Rina could be an example of an android attempting to protect herself: by the sounds of it, if she goes in to get "repaired", they'll change her in such a way that she would no longer be herself. From that insecurity she is reaching out to Koji. In every case, from what we know so far, the androids are acting precisely as they were created to, just in far more depth.

Thinking about all that, however, couldn't we say that overall we act in a similar manner towards children? Most often, children are raised not to hurt others, neither by sitting by and watching when they're able to help, nor by inflicting harm themselves. If they do otherwise, they'll likely be scolded. A child must obey what their respected elders tell them (respected because we're taught not to trust a stranger who isn't trusted by a parent, guardian, or other trusted elder). And finally, one must protect oneself so long as doing so doesn't violate the above rules. Self-defense isn't even something we consider teaching younger kids, though they often will act that way without being told.
The largest difference between human children and androids in general is that human children have the option to act fully against what they've been taught or how they've been raised.

I love the diversity shown, however! Does the diversity amongst androids show the influence of the family and people they interact with? Or was it preprogrammed? Does nature versus nurture influence androids as it does humans? How deep might the similarities run? Androids as in Time of Eve must be capable of learning in order to better serve their purpose, and wouldn't that include altering their personality in order to better fit in with and assist the android's family? Is that any different from what humans do once they join a new family? A step-mother, step-father, step-brother, or step-sister might all feel extremely awkward and off-balance in a new house with new family members, unsure of how to act and react. Over time a level of comfort is reached and people adjust to better get along with the other members. We change.

Doesn't it seem they might as well? Or am I blurring the line too much?

Member
2584 cr points
24 / M / Goldsboro, North...
Posted 12/16/08
Well, being machines, they have the 3 laws set in stone for them. And supposedly they had a restraint circuit that isn't working anymore. And Sammy did go to that cafe for advice, but she didn't go there the first time for advice; she couldn't have known about it, and it's not like people everywhere were talking about it. We don't know yet if that little cat girl is a robot or not, or if the manager is. Although we do know that Koji and Rina don't know that they're both androids. I can't exactly see the answer, but the answer you're looking for is at least hinted at in that alone.

Besides that, their 3 laws are centered around humans. If they only had the one law about not hurting humans or allowing them to come to harm, I think they would be completely different altogether.
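
Just to illustrate what I mean by the laws being "centered around humans", here's a rough sketch of how a set of priority-ordered rules like that could be written out. I'm making all of this up (the rule wording, the action flags, the whole structure); it's not how the androids in the show, or any real system, actually work:

```
# Toy sketch of priority-ordered rules in the spirit of Asimov's Three Laws.
# Everything here (rule wording, action flags) is my own invention.

RULES = [
    (1, "may not injure a human or allow a human to come to harm",
     lambda action: not action.get("harms_human", False)),
    (2, "must obey orders given to it by humans",
     lambda action: action.get("obeys_order", True)),
    (3, "must protect its own existence",
     lambda action: not action.get("harms_self", False)),
]

def action_allowed(action):
    """Check the laws from highest to lowest priority; the first one violated blocks the action."""
    for number, text, check in RULES:
        if not check(action):
            print(f"blocked by law {number}: {text}")
            return False
    return True

# an order that would hurt a human gets refused by the first law,
# even though carrying it out would satisfy the second:
action_allowed({"harms_human": True, "obeys_order": True})
```

The point is that every rule in that list is about protecting or obeying humans; nothing in it treats the android itself as anything more than property. That's why I think changing those laws would make them completely different.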
Posted 5/13/09 , edited 5/13/09

gravion17 wrote:

Androids wanting to be more human-like is a bad thing... why? I just finished watching the premiere episode and i loved it! I was hooked 7 minutes and 30 seconds in... so apparently in this society A.I. has advanced to the level that humans have human-looking androids to do their bidding. it's apparent that the Humans at large just see these androids as just another tool and not as a life form... so much so that the androids are required to display a halo in order to identify themselves. Humans who treat the androids like sentient beings or, Heaven forbid, love these androids are shunned from society! HUH... so, these humans are so AWESOME that they are able to see themselves as GODS with no equal! GOD, i hope the androids go all MATRIX on them and enslave all of Humanity!

what do you guys think?



first question: nope, not bad.

i would say let them be, but truly i'd be scared sh1tless, lol.
but not to the point of starting to restrict them.
i mean, it makes no difference: if it's made to imitate humans, then treat them like humans.

and once someone thinks they are god, that person has just gone... crazy? corrupt? ...fried circuits, lol.

once you think about it, anything can truly be personified (made to act humanly, i.e. personification).

plus, like always, it's all in the opinion of the person.

the same problems will occur as with everything.

um, wars of ideals.

religion (which one is true), like when religions go against each other (over the differences between them),
territories, you know, expanding empires, etc.

...and i just forgot my point in all this, oh well.

and to tell the truth, the debate over robots, um, the war over robots, has already started.

and i guess i can just say the part about "all things are living"... yeah, don't think that was it, lol.

all in all, this is a great anime, and i await the clarification of this story (all i've got are speculations, lol).
Member
1932 cr points
F / In your fucking h...
Posted 8/2/09
There would be no use for humans if these androids were to have human emotions.
42 cr points
35 / M / Portland, OR, USA
Posted 9/24/10
ok, i've actually studied AI and AI ethics at the University of Sussex; i have an MSc in Intelligent Systems:
http://www.sussex.ac.uk/study/pg/2011/taught/1572/23810
one of my former professors is an AI and technology ethicist:
http://en.wikipedia.org/wiki/Blay_Whitby

i just thought i would address some issues that have been raised thus far, as well as express some of my own thoughts.

research and studies in AI are FAR beyond anything in video games; i know this because i also happen to work in the video game industry as a software engineer. both the philosophical and technological sides are being researched at many places all over the world. the reasons that video game AI is so primitive are twofold: the technological limitations of the hardware, and the fact that cognitive science has not yet reached a full understanding of the mind. it's generally a trade-off: even a moderately complex AI model might take too much processing and slow down gameplay.
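
to make that trade-off concrete, here's a toy sketch of the kind of budgeting a game loop ends up doing (this isn't real engine code, and every name in it is something i made up just for illustration): the AI only gets a small slice of each frame, and whatever doesn't fit has to wait.

```
import time

FRAME_BUDGET = 1.0 / 60          # roughly 16.7 ms per frame at 60 fps
AI_BUDGET = FRAME_BUDGET * 0.2   # pretend the AI is only allowed 20% of that

class Agent:
    def think(self):
        # stand-in for whatever planning, pathfinding or inference the agent does
        time.sleep(0.002)

def update_ai(agents):
    """Run agents until the per-frame AI budget is spent; defer the rest to the next frame."""
    start = time.perf_counter()
    deferred = []
    for agent in agents:
        if time.perf_counter() - start > AI_BUDGET:
            deferred.append(agent)   # out of budget this frame
        else:
            agent.think()
    return deferred

# with a 2 ms "brain", only a couple of these agents actually get to think each frame
leftover = update_ai([Agent() for _ in range(20)])
print(len(leftover), "agents had to wait for the next frame")
```

scale the cost of think() up to anything resembling the cognitive models this thread is talking about, and you can see why game AI stays primitive even where the research doesn't.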

a view of androids as computers taking commands is a very limited and flawed view. to assume that advanced androids would have to act as limited as current computers is akin to saying that computers would have to act as limited as calculators. it also raises the question of why androids would have to take orders at all. it is not a requirement that building a true sentient AI would involve forcing it to follow orders.

there are also many objections to the idea that a logic rules-based system cannot be intelligent and sentient. for one, there are many models of the mind in philosophy that are strictly logical. who is to say that we don't obediently follow the orders of our own past? being deterministic does not necessarily mean that one can't have free will:
http://en.wikipedia.org/wiki/Elbow_Room

if an android believes that they are sentient and acts as a sentient being, does this not make them sentient? not to be cliche or oversimplify, but 'cogito ergo sum': i think, therefore i am. there is no real external method of identifying sentience. the Turing Test is not really an effective test; in fact, people on the autistic spectrum and other people with different kinds of minds would likely fail it. 'Do Androids Dream of Electric Sheep?' (the basis for Blade Runner) is another good insight into this situation. even the main character, Rick Deckard, may or may not be an android himself; this is left purposely in doubt.

furthermore, who is to say that intelligence and sentience have to resemble our own? androids may well develop in different ways from humans, and they may turn out to be unique beings in their own right. however, i fear that there will still be people who will choose to hate something different that they do not fully understand. we have seen and continue to experience discrimination of all forms among ourselves.

i think the most important question is: if a being is sentient, should they not be treated as sentient beings? if an android has intelligence, emotion and expression from an external point of view, then i believe they should be treated equally.

i am glad to see that this show has provoked such thought about AI and ethics. as a person that continues to research these areas, i see this time coming, maybe even within my lifetime. we must continue to study and ponder these issues so that we will be prepared for this very possible future.

if you got this far, thank you very much for hearing out a dusty old pompous researcher.
1359 cr points
114 / M / Inside your grand...
Posted 10/2/10 , edited 10/2/10
I believe that the aim of the series was to question whether there is really any more than a subtle difference between humans and androids, and of course what is accomplished by differentiating between them. In the series, Nagi was apparently involved as a child in an accident connected with the 'robot ethics community', which is hinted to have severely injured her.