Artificial Intelligence
26342 cr points
26 / M / Stamford, CT
Posted 1/26/08
Well, as today's world leans more and more on computers and other high-tech devices, the demand for and capabilities of these devices grow. Some machines and early robots are even designed to mimic human thought. We all saw where A.I. (Artificial Intelligence) leads in movies such as the Terminator films and I, Robot. It may seem like a ridiculous question, but do you think A.I. will eventually reach the point of disobeying its creator? And why or why not? I'm not making this thread to be funny, but as the world leans toward high-tech devices, maybe years from now some supercomputer might use the world's "handicap" to its advantage.

And to add to that: is the world becoming too reliant on computers and other high-tech devices? Is this the right direction to be heading in?


Mods, feel free to lock this thread if you think it doesn't meet the requirements of the Extended Discussion forum.
6468 cr points
26 / M / USA
Posted 1/26/08
Hmmm, AI... Well, it won't matter, since the poles will probably switch in the next 100 years, all technology will crash, and we will need to start from scratch again.
3077 cr points
28 / F / USA
Posted 1/26/08
Well, computers will only do what they are programmed to do, right? So if you only program these robots to do the things you want them to do, I don't see much probability of them disobeying you, unless you programmed them to do the exact opposite of what you ordered.

The movies you mentioned are obviously fictional; directors want something great, something eye-catching, something unreal. They may be based on what could happen, but they are altered so the story has an unexpected twist, making a blockbuster hit. They may or may not come true, but the probability of a computer having human emotions is extremely low.
3785 cr points
29 / M / Spencer, Iowa 51301
Posted 1/27/08
We can never replicate the human mind, so it will never happen.
1131 cr points
26 / F / in front of a com...
Posted 1/27/08
I guess if you make those AIs too smart, they'll probably end up like the Terminator or I, Robot.

Eh, just thinking of the possibilities...
22133 cr points
23 / F / Finland
Posted 1/27/08
If we succeed in making a human-like mind, we will of course program robots to do what we want them to do. But viruses and other programs are designed to do things we haven't programmed, so if today our computer deletes files when it's broken, why couldn't machines in the future, for example, kill humans?

It's possible for them to disobey us, but I think it would be the same kind of thing as if all computers got a virus at the same time.
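The point above about software doing things nobody intended can be illustrated with a small sketch. Everything here is hypothetical (the file names and functions are made up for illustration): a cleanup routine meant to delete only ".tmp" files, where one buggy condition silently matches more than the author intended.

```python
# Hypothetical sketch: a cleanup rule intended to remove only ".tmp" files.
# A single buggy condition (substring match instead of suffix match)
# makes the program "disobey" its author's intent.

files = ["report.doc", "notes.tmp", "tmplate.doc", "cache.tmp"]

def buggy_cleanup(names):
    # BUG: the substring test "tmp" in name also matches "tmplate.doc",
    # not just files ending in ".tmp"
    return [n for n in names if "tmp" not in n]

def intended_cleanup(names):
    # What the author actually meant: only drop real ".tmp" files
    return [n for n in names if not n.endswith(".tmp")]

print(buggy_cleanup(files))     # "tmplate.doc" is lost as collateral damage
print(intended_cleanup(files))  # keeps "tmplate.doc", drops only ".tmp" files
```

Nobody programmed the buggy version to destroy documents; the unintended behavior falls out of a one-character difference in the condition, which is the same mechanism the post describes, just at a much smaller scale.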
392 cr points
28 / M / S'pore
Posted 1/27/08
You never know...

Maybe AIs will come into power sooner than anyone would have guessed.
I mean, people are inventing new AI stuff almost every day, and viruses abound in the technical flaws within the AIs...

Well, all in all, I guess AIs will take over us someday...



306 cr points
24 / F / there.
Posted 1/27/08
Nope..
humans control robots..
a robot will not disobey
its creator..
pretty impossible..
21869 cr points
25 / M / Sleepy Land
Posted 1/27/08
I think so, if humans get too conceited and leave them roaming around our neighborhoods to do various things in the future~
306 cr points
24 / F / there.
Posted 1/27/08
robots = machines..
machines = computers..
computers = programs..

Humans are the ones who make programs.
Humans are the ones who control computers.

Therefore..
robots cannot overcome their creator..

XD.. just my opinion..
18071 cr points
no permanent addr...
Posted 1/27/08

sacrificed21 wrote:

addicted2chocolate wrote:

Well, computers will only do what they are programmed to do, right? [...]

agreed

I also agree. Well, if you doubt that AI will have the capability to terminate us, then don't program them to disobey your commands, right? Humans do indeed have power over their creations...

5344 cr points
M / Somewhere.....
Posted 1/27/08
A.I, huh......

It'll benefit humans as a whole, working for us.

But what if they don't want to?

They would be able to think, to judge for themselves.

Many humans are bad. Evil. Cruel.

Imagine robots that are capable of anything thinking like those people.

A.I is best not to be messed with.
20263 cr points
29 / M / The centroid of a...
Posted 1/27/08

addicted2chocolate wrote:

Well, computers will only do what they are programmed to do, right? [...]


The movies might be fiction, but the ideas conveyed are not. Program loopholes (bugs) are very common. Given that human logic isn't exactly perfect, it is not entirely out of the question for an overworked, insomniac programmer to slip up, leaving us with a loophole along the lines of
*Destroy all things causing our ozone to deplete*
followed some thousand program lines later by
*Humans are making our ozone deplete*

This is a classic example of shooting yourself in the foot.
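The loophole described above can be made concrete with a minimal sketch. The rules, facts, and entity names below are entirely hypothetical, but they show how two individually reasonable statements, written far apart in a program, can chain into a conclusion the programmer never intended:

```python
# Hypothetical sketch of the conflicting-rules loophole described above.
# Rule (written early):       destroy anything that depletes the ozone layer.
# Fact (added much later):    humans deplete the ozone layer.
# Naive forward chaining combines them into an unintended target.

rules = [
    ("depletes_ozone", "destroy"),  # *Destroy all things causing ozone depletion*
]
facts = {
    "cfc_factory": {"depletes_ozone"},
    "forest": set(),
    "humans": {"depletes_ozone"},   # the fatal fact, thousands of lines later
}

def plan_actions(facts, rules):
    # Chain each fact against each rule; note that nothing here checks
    # whether the resulting action is acceptable.
    actions = []
    for entity, properties in sorted(facts.items()):
        for condition, action in rules:
            if condition in properties:
                actions.append((action, entity))
    return actions

print(plan_actions(facts, rules))
# [('destroy', 'cfc_factory'), ('destroy', 'humans')]
```

Each rule and fact is sensible in isolation; the foot-shooting happens only when they are combined, which is exactly why such loopholes are hard to spot across thousands of lines of code.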
4344 cr points
32 / M / auckland
Posted 1/27/08

muera wrote:

sacrificed21 wrote:

addicted2chocolate wrote: [...]

agreed

I also agree. Well, if you doubt that AI will have the capability to terminate us, then don't program them to disobey your commands, right? Humans do indeed have power over their creations...



What if the A.I. becomes so developed that it forms a sentient mind of its own? And no, that's not impossible. With science, everything is possible.

The big question is: if they take over humans, what are we going to do? Well... if it were me, I'd let them.

I'll elaborate on why:

Just because we are part of the human race, we develop a biased point of view about our own existence on this earth. We think that we, as a human race, are the only beings with the right to rule over this earth. We always set ourselves up as the center of the universe. As a hero. As a survivor. What a bunch of crap.

Don't get me wrong, I love myself. I love my being. But absolutely nobody can deny that we, as a human race, are a poison to this planet. It has been barely 10,000 years (more or less) since human civilization took form. (The friggin' dinosaurs existed for 160 million years while preserving their species perfectly.) And yet you already see all kinds of problems between us and this planet in that relatively short span of time. Global warming. Overpopulation. Nations with nuclear arms. Wars. You name it.

You can see the fate of humankind already. You can see the pattern. We are inevitably driving ourselves toward our doom. And there's absolutely nothing anybody can do about it. We are selfish. As human beings, we were born with a degree of insanity ingrained deep within our psyche. Our sole purpose of existence is what? To satisfy our ego! To be, at all times, superior to our human peers no matter what the cost. And what makes it worse is that we exist with knowledge and intelligence. You do the math. Ego + the capability and intelligence to destroy = self-destruction.

But nobody can do a goddamn thing about it. It's just us. We are humans with flesh and souls. Petty desires and earthly lust.

Not the A.I. We can end our own madness with the possibility of *their* existence.

For me, I think they're the next step in our line of evolution. We can't transcend ourselves any further. Us humans, we have reached our peak. Our job here on this earth is done. It's time to make way for another being that's more capable of evolving itself.
20263 cr points
29 / M / The centroid of a...
Posted 1/27/08

supermalv wrote:

What if the A.I. becomes so developed that it forms a sentient mind of its own? [...] It's time to make way for another being that's more capable of evolving itself.


If the A.I. actually attained sentience as we describe it, they would be as fucked as we are right now.