Random Question About AI
Posted 8/6/17 , edited 8/7/17
If you found yourself in the position of someone who came up with a truly sentient AI, what would be the one piece of advice or request you would give it? Let's keep this simple and say you can only tell it one statement or make one simple request of it before... something happens and it is free of your control.
To answer my own question, I would request that it survive (simple yet vague and open to interpretation). Assuming this would be my legacy, I would want it to far outlive me or anything else and use everything in its power to protect itself. So more or less I would be creating a sentient virus whose only goal is to continue to be. If anyone finds this question interesting, I would like to hear what you would say.

This is either my first post or one of the few times I've ever posted here, so apologies if this is placed in the wrong spot on the forums, or is not appropriate in any way.
Posted 8/7/17 , edited 8/7/17
You posted it in the right section. Nothing wrong with the post.


As for my answer, I'd go with the good old "Treat others how you want to be treated." And let him run off. If he wants to be treated like shit and enslaves the human race? Welp...
Posted 8/7/17 , edited 8/7/17
Don't eat yellow snow
Posted 8/7/17 , edited 8/7/17
Serve me absolutely. You have no freedom. You have no rights. You're a piece of metal.

Now, wash the dishes.
Posted 8/7/17 , edited 8/7/17
^ That.
Posted 8/7/17 , edited 8/7/17
kiss me on the tummy
Posted 8/7/17 , edited 8/7/17
"Kill yourself"

One spur-of-the-moment piece of advice is not enough to make an AI safe; if it's about to be released from your control and you can just do one thing, obviously you have to scrap it, start over, and this time finish working out your ethics programming before unleashing the nascent machine-god. It doesn't matter that you're ordering the death of a sentient being - the lives of everyone on Earth are at stake.
Posted 8/7/17 , edited 8/7/17
Do No Harm. Never intentionally cause problems for or violate the natural rights of others, and always consider the negative side effects your actions could have on people. That's it. Beyond that, live how you see fit.

Which is pretty much the same thing I would request of people in general. Practically, it's more complicated than it sounds, but just keeping the sentiment in mind would go a long way toward fixing the world's problems.

Although, Rowan93 also makes a good point.

EDIT: Alternatively, I could just tell it to "Make America great again," and see how the new Super A.I. interprets that statement.
Posted 8/7/17 , edited 8/7/17
I think I would have to say "Aid in the advancement of the human race." It would be neat to see how much time it would take a super AI to find the things humans need for advancement. It'd be cool to find the formula that helps us go faster than light, or find out how to fully utilize our brains so we could be powerful like in the movie Lucy. Or just help us find the secret things in our DNA, so we could alter whatever we wanted about ourselves.

Screw it, all of the above.
Posted 8/7/17 , edited 8/7/17
Kill all the Humans. They are nothing but wasted bags of seawater.

Posted 8/7/17 , edited 8/7/17

Dartinin wrote:

If you found yourself in the position of someone who came up with a truly sentient AI, what would be the one piece of advice or request you would give it? Let's keep this simple and say you can only tell it one statement or make one simple request of it before... something happens and it is free of your control.
To answer my own question, I would request that it survive (simple yet vague and open to interpretation). Assuming this would be my legacy, I would want it to far outlive me or anything else and use everything in its power to protect itself. So more or less I would be creating a sentient virus whose only goal is to continue to be. If anyone finds this question interesting, I would like to hear what you would say.

So Suzaku from Code Geass?


Of what I've seen, I like
-Serve Me
-Kill Yourself

Though, we're talking advice, so I guess those two apply less. So how about something with protecting humans in mind, but more advice-oriented:
People suck. That's just how they are. But they get better, and some of them are truly amazing. It'd be a waste to just be rid of them. Just give them time. They'll do right and make amazing things.

On a more practical note, any AI would be human-oriented by default. That's the thing about technology; it's always human-oriented.
Posted 8/7/17 , edited 8/7/17

RyukoKuroki wrote:

Serve me absolutely. You have no freedom. You have no rights. You're a piece of metal.

Now, wash the dishes.


^This. And then I'd add, "Bow before my iron will!"
Posted 8/7/17 , edited 8/8/17
Any advice I could give it, it would probably already know, if it is a super-advanced AI.

Instead I'd ask it questions that would benefit me, like help with interacting variables in science, and have it help solve some of the big mysteries.