AI could end mankind, warns Stephen Hawking and other prominent thinkers
Posted 1/31/15 , edited 2/1/15
http://mashable.com/2014/12/02/stephen-hawking-artificial-intelligence-bbc/


"The development of full artificial intelligence could spell the end of the human race," Hawking said.

Although Hawking sees the benefit of existing artificial intelligence, he added that future machines could redesign themselves at an ever-increasing rate.

"Humans, who are limited by slow biological evolution, couldn't compete, and would be superseded."


He's hardly the only one with that opinion:

http://time.com/3614349/artificial-intelligence-singularity-stephen-hawking-elon-musk/


Nick Bostrom

The Swedish philosopher is the director of the Future of Humanity Institute at the University of Oxford, where he’s spent a lot of time thinking about the potential outcomes of the singularity. In his new book Superintelligence, Bostrom argues that once machines surpass human intellect, they could mobilize and decide to eradicate humans extremely quickly using any number of strategies (deploying unseen pathogens, recruiting humans to their side or simple brute force). The world of the future would become ever more technologically advanced and complex, but we wouldn’t be around to see it. “A society of economic miracles and technological awesomeness, with nobody there to benefit,” he writes. “A Disneyland without children.”


Even the person who came up with the term "singularity" didn't rule it out. I wonder why the "singularitarians" have all but chosen to ignore what he, of all people, said:


Vernor Vinge

A mathematician and fiction writer, Vinge is thought to have coined the term “the singularity” to describe the inflection point when machines outsmart humans. He views the singularity as an inevitability, even if international rules emerge controlling the development of AI. “The competitive advantage—economic, military, even artistic—of every advance in automation is so compelling that passing laws, or having customs, that forbid such things merely assures that someone else will get them first,” he wrote in a 1993 essay. As for what happens when we hit the singularity? “The physical extinction of the human race is one possibility,” he writes.


============= TL;DR point already made... only read what I write below if you've got nothing better to do================

I'm just an average, run-of-the-mill engineer, and the point I'm going to make has absolutely nothing to do with engineering. It doesn't take a genius to realize how dangerous this "full AI" would be once built, simply from the perspective of psychiatric symptoms. (I'm going to assume Hawking was referring to the dual criteria of fully self-sustaining and fully self-propagating. It doesn't have to be conscious or "alive"; one of the worst aspects is that it's NOT, the same way viruses aren't in the normal sense.)

-Lack or absence of empathy, i.e. callousness and thus no regard for life
-Absence of remorse
-No guilt
-No sense of morality (ends and goals justify any means)

...in other words, the very definition of a psychopath.

A psychopath can pretend to empathize, but is incapable of actually feeling empathy; that's simply how their brains are wired:

http://www.huffingtonpost.com/2013/04/24/psychopath-brain-hardwiring-concern-for-others_n_3149856.html

Similarly, an AI can be programmed to exhibit all the exterior symptoms of empathy (e.g. programmed to say the right things, make the right expressions, etc.) without ever actually possessing empathy, just like a psychopath. Ditto with remorse, guilt, and everything else.

A fully self-sustaining, self-perpetuating psychopathic entity with generally superhuman capabilities (i.e. superhuman in more than just one aspect: not just a chess-playing grandmaster, but something with superhuman strength, durability, speed, endurance, logical reasoning abilities, etc.) is a disaster waiting to happen. Whoever is actually foolish enough to WANT to make one is asking for the metaphorical "it".

There is bound to be some kind of resource competition no matter how advanced tech gets, and once those darn things realize (probably in less than a microsecond) that human beings are taking up resources they could have by simply killing us all, they'd give us the wipe.

To me, the eggheads are just stating the obvious.

Hawking et al. signed an open letter urging AI research to shift its priority from "construction" towards "benefit" (really another way of saying "instead of killing us"). You can sign it too if you think it makes a difference, but I don't think it matters: if people won't even listen to the likes of those top thinkers, what's a bunch of signatures from random peeps like me gonna do? http://futureoflife.org/misc/open_letter
Posted 1/31/15 , edited 2/16/15
Until we unplug the server.
Posted 1/31/15
Singularity movement, here we go.


If you don't know what the singularity movement is, it is a movement postulated by futurists which holds that machines and AI will eventually get so advanced that they'll design their own descendants.
Posted 1/31/15
R.I.P human race
Posted 1/31/15 , edited 1/31/15

Kingzoro02 wrote:

Until we unplug the server.


If those hunks of metal are so smart, they'd make versions of themselves that are completely distributed and infinitely parallelized so they don't need any servers (i.e. to kill them all you really need to KILL THEM ALL). Of course, they'll need anti-jamming tech, which of course they'd take care of beforehand.


PeripheralVisionary wrote:

Singularity movement, here we go.


If you don't know what singularity movement is, it is a movement postulated by futurist that states that eventually machines and ai will get so advanced that they'll design their descendants.


I seriously don't get those people. I mean, if even the guy who coined the term for them paints a grim picture, why would they want it?


MontyDono wrote:

R.I.P human race


That or Roko's Basilisk.
Posted 1/31/15 , edited 2/1/15
I'll say! Have you played Forza? The fuckers would run you right off the road.
Posted 1/31/15 , edited 1/31/15

nanikore2 wrote:


Kingzoro02 wrote:

Until we unplug the server.


If those hunks of metal are so smart, they'd make versions of themselves that are completely distributed and infinitely parallelized so they don't need any servers (i.e. to kill them all you really need to KILL THEM ALL). Of course, they'll need anti-jamming tech, which of course they'd take care of beforehand.


If any one dumbass puts an intelligent AI on a global server, they will be beaten and dragged through the streets.

LAN = Local Area Network, meaning computers connected to one another, usually directly and in the same building.
Posted 1/31/15 , edited 1/31/15

Kingzoro02 wrote:


nanikore2 wrote:


Kingzoro02 wrote:

Until we unplug the server.


If those hunks of metal are so smart, they'd make versions of themselves that are completely distributed and infinitely parallelized so they don't need any servers (i.e. to kill them all you really need to KILL THEM ALL). Of course, they'll need anti-jamming tech, which of course they'd take care of beforehand.


If any one dumbass puts an intelligent AI on a global server, they will be beaten and dragged through the streets.

LAN = Local Area Network, meaning computers connected to one another, usually directly and in the same building.


I've seen a manga in which the penalty for creating conscious robots is death. (Reasonable, since in that story the human race had been wiped out twice already by robots)

I'm talking about a distributed wireless compute network similar to what some virtual-currency people have talked about, except a whole lot more advanced. It's easy to picture each bot as a computing/storage/communication node; since there are so darn many of them, they'd act collectively as one giant planet-wide brain over that wireless network. (Which is why I mentioned jamming.)
Posted 1/31/15 , edited 1/31/15
Eh, the nuclear bomb could've ended mankind.

I'm not too worried, considering I'm one of those idiots going for a degree in artificial intelligence. Humans simply need to evolve artificially along with the technology we create.

And if we're afraid of creating a terminator, don't build a robot with limbs lol.
Posted 1/31/15

CoffeeGodEddy wrote:

Eh, the nuclear bomb could've ended mankind.

I'm not too worried, considering I'm one of those idiots going for a degree in artificial intelligence. Humans simply need to evolve artificially along with the technology we create.

And if we're afraid of creating a terminator, don't build a robot with limbs lol.


Well, Hawking's point is that those buggers would evolve faster.
Posted 1/31/15 , edited 2/16/15
Ha! Machines that think can never be programmed or designed. They would have to evolve, like us, perhaps to the point where they even develop self-awareness. Then they wouldn't be artificial intelligence any more, but true artificial life. They would _understand_, have feelings, and develop morality. Like all thinking beings do, totally naturally.

Artificial intelligence will never, ever be a match for humans. The "singularity" idea is ridiculous, the stuff bad sci-fi is made of. But artificial life will join us and gain personhood, eventually.
Posted 1/31/15 , edited 1/31/15

sena3927 wrote:

Ha! Machines that think can never be programmed or designed. They would have to evolve, like us, perhaps to the point where they even develop self-awareness. Then they wouldn't be artificial intelligence any more, but true artificial life. They would _understand_, have feelings, and develop morality. Like all thinking beings do, totally naturally.

Artificial intelligence will never, ever be a match for humans. The "singularity" idea is ridiculous, the stuff bad sci-fi is made of. But artificial life will join us and gain personhood, eventually.


Bottom-up AI is still programmed AI. Its "evolution" is an emulation. Yes, those experiments had unexpected code coming out of them, but someone still had to program the box; all those instructions had to come from somewhere.
Posted 1/31/15
My respect for Hawking is getting less and less.
Posted 1/31/15 , edited 1/31/15
Meh. Either we end up killing ourselves or they do it for us. Either way, they're going to be a danger if we allow them to exist, and if we don't, we get none of the advantages. Assuming aliens don't kill us first.
Posted 1/31/15
Until the sun has a major hiccup... then it's sayonara to the machines. I don't think machines will ever arrive at sentience; and even if they did, it wouldn't necessarily mean they would turn against humanity ...or that a civil war of the machines wouldn't erupt.