Consciousness and Machines
Posted 3/18/15

Metazoxan wrote:

There is one major issue with this

You see, one of the amazing things about our brains is how they adapt and grow. More so when young, but even later in life it happens a little. The brain adapts and grows over time and retains everything. We may personally forget things, but it's always somewhere in our heads. So when talking about "artificial brains" you are talking about a machine that is not only capable of storing the brain's electrical signals but can also grow and adapt in order to keep up with all brain processes and store all memories. This isn't as simple as a mechanical limb (which isn't actually simple, but in comparison to a brain it is), which you just have to make able to respond to signals sent through the body and move properly. So I'm not sure if we could ever reach the point of creating a proper artificial substitute for the brain. Anything we could make would just be a poor substitute for what our actual brains can do.


It is called a neural network, and we already have them; all of the popular digital assistants use them. You aren't reproducing the hardware, you are reproducing the effect of it with software. This emulation of the human brain has been in development for decades now and is coming along at a fair pace. So yes, when you ask Google now to find you a good Japanese place where you live, it is using a network that functions similarly to your brain. Yes, it learns and grows just like your brain. Right now the only real limit on them long term is hardware, and that limit is slowly vanishing.

General information on man-made neural networks
http://en.wikipedia.org/wiki/Artificial_neural_network

MIT robotics lab Kismet project
http://www.ai.mit.edu/projects/humanoid-robotics-group/kismet/kismet.html

Google's self-taught Atari game AI
http://www.gamespot.com/articles/google-creates-ai-that-can-play-atari-games-better/1100-6425557/
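
To make that concrete, here is a toy sketch (plain Python with NumPy, nothing remotely like the scale of what Google runs) of a tiny artificial neural network learning XOR; the "learning and growing" is nothing more than repeated small corrections to its connection weights.

import numpy as np

# Toy 2-layer neural network learning XOR by gradient descent.
# A minimal illustration only, not representative of production systems.
rng = np.random.default_rng(0)

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)  # inputs
y = np.array([[0], [1], [1], [0]], dtype=float)              # targets (XOR)

W1 = rng.normal(size=(2, 4)); b1 = np.zeros(4)   # input  -> hidden weights
W2 = rng.normal(size=(4, 1)); b2 = np.zeros(1)   # hidden -> output weights

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for step in range(20000):
    # forward pass: the network's current guess
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)

    # backward pass: nudge every weight a little to shrink the error ("learning")
    d_out = (y - out) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 += 0.5 * h.T @ d_out;  b2 += 0.5 * d_out.sum(axis=0)
    W1 += 0.5 * X.T @ d_h;    b1 += 0.5 * d_h.sum(axis=0)

print(out.round(2))  # should end up close to [[0], [1], [1], [0]]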
Posted 3/18/15

GayAsianBoy wrote: There are just some things humans cannot re-create, and I believe consciousness is one of those things.

Disagree.

The entirety of our consciousness is housed within that ~1.5 kg of matter that constitutes our brain. And all its functions are under the domain of physics. All. Just because we don't yet understand how something like consciousness emerges from the complicated interactions within that mass of matter doesn't mean we won't someday.

It's not a question of whether it is actually possible to create an 'artificial' consciousness; of course it is possible. If it were not, the human brain wouldn't be able to exist either. We're just not there yet, whether in building the damn thing or even understanding it. But far enough into the future, barring WWIII or similar, we will be. No buts about it.
Posted 3/18/15 , edited 3/18/15

dsjb wrote:


Metazoxan wrote:

There is one major issue with this

You see, one of the amazing things about our brains is how they adapt and grow. More so when young, but even later in life it happens a little. The brain adapts and grows over time and retains everything. We may personally forget things, but it's always somewhere in our heads. So when talking about "artificial brains" you are talking about a machine that is not only capable of storing the brain's electrical signals but can also grow and adapt in order to keep up with all brain processes and store all memories. This isn't as simple as a mechanical limb (which isn't actually simple, but in comparison to a brain it is), which you just have to make able to respond to signals sent through the body and move properly. So I'm not sure if we could ever reach the point of creating a proper artificial substitute for the brain. Anything we could make would just be a poor substitute for what our actual brains can do.


The ability to change and adapt is not necessarily something we couldn't give a computer. I agree it's complicated and beyond our current capabilities, but 100 years ago who could have predicted we would be as far along as we are today? Alternatively, perhaps the future lies in living organic computers.


I like to use the example of the lab-grown brain.

If we figured out a way to grow perfectly functional brains in a laboratory and had a way to 'transfer' memories from one old brain to the new brain, there would probably be little dispute over whether the person receiving the new organ is still himself or whether he is actually conscious.

So, if we create an artificial brain that performs the same duties as a biological brain, but is simply made with different materials, where is the difference?

On a somewhat unrelated note, I also read about a certain "brain-in-a-box" scenario. It happened many years ago. A man doing experiments on monkeys managed to remove a brain from a monkey and keep it alive and well under the skin of another monkey by diverting that monkey's blood flow into the removed brain.

I wonder what being conscious in that state would do to you as a "brain-in-a-box." No sensory stimuli. Just pure consciousness with no way to express that you are conscious.
Posted 3/18/15
I feel like the brain in some terms exists but also is very.. uhh.. ya know.. complex. Replicating the brain is like.. impossible.
Posted 3/18/15 , edited 3/18/15
My thoughts on it are weird....

I think you can replace parts till it's all machine as long as there is a temporal continuity to the changes. I don't think uploading someone into a machine will work, and I don't think cloning someone and uploading their mind into a meat-computer will work. Temporal continuity is somehow key.
Posted 3/18/15
Don't really think there is a difference.
The brain is the most advanced computing hardware (wetware) on the face of the earth. I guess it's just a matter of design and scale and programming.
Posted 3/18/15 , edited 3/18/15
Yeah, it's considered that clones and twins and complete replacements don't have the same consciousness. A machine-clone is probably considered a separate consciousness as well, if we consider it to have one.

Brings up a classic question of philosophy, though: If you take a ship and keep on replacing parts until none of the original components remain, is it still the same ship? Many people are split on this.
Posted 3/18/15

Morbidhanson wrote:

Yeah, it's considered that clones and twins and complete replacements don't have the same consciousness. A machine-clone is probably considered a separate consciousness as well, if we consider it to have one.

Brings up a classic question of philosophy, though: If you take a ship and keep on replacing parts until none of the original components remain, is it still the same ship? Many people are split on this.


Exactly. And I say it is, as long as it has continuity through the dimension of time. If you replace those parts slowly, maybe 5% a year, it's the same ship the first year, the second year, etc., until you finally replace all of them. It's still the same ship with all the new parts, and if you take all the old parts and put them back together, that's not the original ship; the original parts now construct a different ship.

hence the temporal continuity argument that I've kinda developed on the spot here...

Besides, our bodies ALREADY do this.

http://www.livescience.com/33179-does-human-body-replace-cells-seven-years.html

Some cells may still remain with us till we die, but very few. You could argue that those are the "key" cells in the body, but really, it's a weak argument. Besides, our brains and our personalities never remain the same either. They add new neural pathways, memories, etc. We're constantly changing, and the only thing that holds the "self" to the self is the fact that there is a continuous line that can be drawn through time identifying it as its "self".
Posted 3/18/15 , edited 3/18/15
I guess the thing I'm most curious about is whether machines can ever possess true consciousness as we know it.

Will machines be forever relegated to only acting based on programming and AI? Do you think machines can eventually do things like.....appreciate art and nature like people can?

Or are people essentially just "biological machines" and there is no such thing as consciousness?
Posted 3/18/15
Libet's experiments really shed some light on human consciousness. Long story short, when participants were asked to make a decision like clicking a button, the researchers found a small spike in brain activity before the spike that showed a conscious decision had been made. Our subconscious plays a much larger role than we think.


serifsansserif wrote:


Morbidhanson wrote:

Yeah, it's considered that clones and twins and complete replacements don't have the same consciousness. A machine-clone is probably considered a separate consciousness as well, if we consider it to have one.

Brings up a classic question of philosophy, though: If you take a ship and keep on replacing parts until none of the original components remain, is it still the same ship? Many people are split on this.


Exactly. And I say it is, as long as it has continuity through the dimension of time. If you replace those parts slowly, maybe 5% a year, it's the same ship the first year, the second year, etc., until you finally replace all of them. It's still the same ship with all the new parts, and if you take all the old parts and put them back together, that's not the original ship; the original parts now construct a different ship.

hence the temporal continuity argument that I've kinda developed on the spot here...

Besides, our bodies ALREADY do this.

http://www.livescience.com/33179-does-human-body-replace-cells-seven-years.html

Some cells may still remain with us till we die, but very few. You could argue that those are the "key" cells in the body, but really, it's a weak argument. Besides, our brains and our personalities never remain the same either. They add new neural pathways, memories, etc. We're constantly changing, and the only thing that holds the "self" to the self is the fact that there is a continuous line that can be drawn through time identifying it as its "self".


I've heard people compare the Ship of Theseus to the human body before, but I'm not convinced. If we agree that consciousness is a result of our organic brains and persists because of them, then it's the neurones and the CNS that we should focus on rather than these insignificant kidney/blood/bone cells that have no direct influence on it. Ignoring recent advancements in neurogenesis, you are born with a fixed number of brain cells that in theory are components of consciousness. Removal or damage of these cells can strip away your self-awareness and all that constitutes it. Just something to note.
Posted 3/18/15

mistletain wrote:

Libet's experiments really shed some light on human consciousness. Long story short, when participants were asked to make a decision like clicking a button, the researchers found a small spike in brain activity before the spike that showed a conscious decision had been made. Our subconscious plays a much larger role than we think.


serifsansserif wrote:


Morbidhanson wrote:

Yeah, it's considered that clones and twins and complete replacements don't have the same consciousness. A machine-clone is probably considered a separate consciousness as well, if we consider it to have one.

Brings up a classic question of philosophy, though: If you take a ship and keep on replacing parts until none of the original components remain, is it still the same ship? Many people are split on this.


Exactly. And I say it is, as long as it has continuity through the dimension of time. If you replace those parts slowly, maybe 5% a year, it's the same ship the first year, the second year, etc., until you finally replace all of them. It's still the same ship with all the new parts, and if you take all the old parts and put them back together, that's not the original ship; the original parts now construct a different ship.

hence the temporal continuity argument that I've kinda developed on the spot here...

Besides, our bodies ALREADY do this.

http://www.livescience.com/33179-does-human-body-replace-cells-seven-years.html

Some cells may still remain with us till we die, but very few. You could argue that those are the "key" cells in the body, but really, it's a weak argument. Besides, our brains and our personalities never remain the same either. They add new neural pathways, memories, etc. We're constantly changing, and the only thing that holds the "self" to the self is the fact that there is a continuous line that can be drawn through time identifying it as its "self".


I've heard people compare the Ship of Theseus to the human body before, but I'm not convinced. If we agree that consciousness is a result of our organic brains and persists because of them, then it's the neurones and the CNS that we should focus on rather than these insignificant kidney/blood/bone cells that have no direct influence on it. Ignoring recent advancements in neurogenesis, you are born with a fixed number of brain cells that in theory are components of consciousness. Removal or damage of these cells can strip away your self-awareness and all that constitutes it. Just something to note.


Good point. There is something about a living being that we feel cannot be replaced as readily as ship parts. Although there are some parts we can replace without much controversy, such as limbs, the brain is a very mysterious organ indeed.
Posted 3/18/15 , edited 3/18/15
When I was a kid, one of my wishes was to get a robot wife when I got older. (I was really attached to Saber Marionette J)


I still do
Posted 3/19/15

Gracias2 wrote:


GayAsianBoy wrote: There are just some things humans cannot re-create, and I believe consciousness is one of those things.

Disagree.

The entirety of our consciousness is housed within that ~1.5 kg of matter that constitutes our brain. And all its functions are under the domain of physics. All. Just because we don't yet understand how something like consciousness emerges from the complicated interactions within that mass of matter doesn't mean we won't someday.

It's not a question of whether it is actually possible to create an 'artificial' consciousness; of course it is possible. If it were not, the human brain wouldn't be able to exist either. We're just not there yet, whether in building the damn thing or even understanding it. But far enough into the future, barring WWIII or similar, we will be. No buts about it.


Sometimes we can understand something, but we might not have the resources to do it...
I totally believe that one day humans will be able to clone somebody's brain and transplant it into a machine...

But to make a machine become self-aware? It's in the realm of impossibility. Of course I believe that we can program an android to become almost human and have unpredictable responses and be able to do things on its own; but for it to question its own existence? To develop its cognitive skills on its own without "updates" or "patches" from the programmer?? Seems impossible...
Posted 3/19/15

GayAsianBoy wrote:


Gracias2 wrote:


GayAsianBoy wrote: There are just some things humans cannot re-create, and I believe consciousness is one of those things.

Disagree.

The entirety of our consciousness is housed within that ~1.5 kg of matter that constitutes our brain. And all its functions are under the domain of physics. All. Just because we don't yet understand how something like consciousness emerges from the complicated interactions within that mass of matter doesn't mean we won't someday.

It's not a question of whether it is actually possible to create an 'artificial' consciousness; of course it is possible. If it were not, the human brain wouldn't be able to exist either. We're just not there yet, whether in building the damn thing or even understanding it. But far enough into the future, barring WWIII or similar, we will be. No buts about it.


Sometimes we can understand something, but we might not have the resources to do it...
I totally believe that one day humans will be able to clone somebody's brain and transplant it into a machine...

But to make a machine become self-aware? It's in the realm of impossibility. Of course I believe that we can program an android to become almost human and have unpredictable responses and be able to do things on its own; but for it to question its own existence? To develop its cognitive skills on its own without "updates" or "patches" from the programmer?? Seems impossible...


How is it in the realm of impossibility?

Self-patching programs already exist, though the terminology isn't as fixed; these functions fall under self-modifying code and metaprogramming (see the toy sketch at the end of this post).
Developing cognitive and social skills in a dynamic fashion has also been done, in this case by machines mimicking a humanoid body; they developed their own language without external "programming".

Even learning to fly from scratch, without any external reference to its own form or function, has been done, and that was more than a decade ago.

We can quickly agree that fully self-aware and sentient AI is far off, but to call it impossible is a bit of a stretch.
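
Since I mentioned self-modifying code and metaprogramming above, here is a toy illustration of the "self-patching" idea in Python (just a sketch, far simpler than anything a real system does): the program builds replacement code for one of its own functions as plain text and swaps it in while running.

# Toy "self-patching" sketch: the program generates a new version of one of its
# own functions at runtime and rebinds it. A minimal illustration only.

def greet():
    return "hello"

def patch(function_name, new_source):
    """Compile new_source and rebind the named function in this module."""
    namespace = {}
    exec(new_source, namespace)          # build the replacement from plain text
    globals()[function_name] = namespace[function_name]

print(greet())                            # -> hello

# The program writes and installs a new version of greet() for itself...
patch("greet", "def greet():\n    return 'hello, I patched myself'")

print(greet())                            # -> hello, I patched myself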
Posted 3/19/15

mistletain wrote:

Libet's experiments really shed some light on human consciousness. Long story short, when participants were asked to make a decision like clicking a button, the researchers found a small spike in brain activity before the spike that showed a conscious decision had been made. Our subconscious plays a much larger role than we think.


serifsansserif wrote:


Morbidhanson wrote:

Yeah, it's considered that clones and twins and complete replacements don't have the same consciousness. A machine-clone is probably considered a separate consciousness as well, if we consider it to have one.

Brings up a classic question of philosophy, though: If you take a ship and keep on replacing parts until none of the original components remain, is it still the same ship? Many people are split on this.


Exactly. And I say it is, as long as it has continuity through the dimension of time. If you replace those parts slowly, maybe 5% a year, it's the same ship the first year, the second year, etc., until you finally replace all of them. It's still the same ship with all the new parts, and if you take all the old parts and put them back together, that's not the original ship; the original parts now construct a different ship.

hence the temporal continuity argument that I've kinda developed on the spot here...

Besides, our bodies ALREADY do this.

http://www.livescience.com/33179-does-human-body-replace-cells-seven-years.html

Some cells may still remain with us till we die, but very few. You could argue that those are the "key" cells in the body, but really, it's a weak argument. Besides, our brains and our personalities never remain the same either. They add new neural pathways, memories, etc. We're constantly changing, and the only thing that holds the "self" to the self is the fact that there is a continuous line that can be drawn through time identifying it as its "self".


I've heard people compare the Ship of Theseus to the human body before, but I'm not convinced. If we agree that consciousness is a result of our organic brains and persists because of them, then it's the neurones and the CNS that we should focus on rather than these insignificant kidney/blood/bone cells that have no direct influence on it. Ignoring recent advancements in neurogenesis, you are born with a fixed number of brain cells that in theory are components of consciousness. Removal or damage of these cells can strip away your self-awareness and all that constitutes it. Just something to note.


I was thinking about it and you kinda have a point.

It still does not rule out AI in machines, though. With evolvable hardware, the "unused pieces" being removed can also lead to the system breaking down sometimes. (And honestly, I think evolving hardware is the way to go.)
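
Since evolvable hardware came up, the evolutionary loop behind it is simple enough to sketch in a few lines of Python. This is only the software analogue with a made-up bitstring target; real evolvable hardware evolves circuit configurations against measured physical behavior, which is where those seemingly "unused" but essential pieces can come from.

import random

# Minimal evolutionary loop (software analogue of evolvable hardware):
# evolve a bitstring toward a target "design". The target here is made up.
TARGET = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 1, 0, 0, 1, 0, 1]

def fitness(candidate):
    # how many bits already match the target configuration
    return sum(1 for a, b in zip(candidate, TARGET) if a == b)

def mutate(candidate, rate=0.05):
    # flip each bit with a small probability
    return [1 - bit if random.random() < rate else bit for bit in candidate]

population = [[random.randint(0, 1) for _ in TARGET] for _ in range(30)]

for generation in range(200):
    population.sort(key=fitness, reverse=True)
    if fitness(population[0]) == len(TARGET):
        break
    # keep the best designs and refill the population with mutated copies of them
    survivors = population[:10]
    population = survivors + [mutate(random.choice(survivors)) for _ in range(20)]

print(generation, fitness(population[0]), population[0])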