Why A.I is impossible

Started by
116 comments, last by Alexandra Grayson 6 years, 4 months ago
1 hour ago, cowsarenotevil said:

Why not because they can talk about it? As you point out, your own consciousness (insofar as that's a thing that exists at all) is self-evident to you, but when you talk about it, are you actually referring to it?

If so, then it would, at least, be pretty implausible that other people would appear to talk about their own consciousness if it weren't something that they themselves also actually have.

If not, then there's some even weirder coincidence afoot: you experience consciousness, but when you talk about your own consciousness, you're actually talking about something different than the consciousness you actually experience.

Basically, either consciousness manifests itself physically to the extent that people are at least able to refer to it in speech and writing, or it doesn't, meaning we can't actually refer to it at all despite the fact that we appear to be discussing it. In the former case, the fact that people outside of your own perception of consciousness claim to refer to consciousness would suggest that they too actually can refer to it, and thus experience it in some way.

In the latter case, either it's pure coincidence that we merely appear to be discussing a phenomenon that actually exists (but cannot actually be discussed), or consciousness doesn't exist at all.

Maybe I misunderstood your reply but:

If I write a program that prints "I have consciousness" when you press Enter (or makes some other simple claims), does that make it conscious? Also, if I can't discuss consciousness with somebody because that person can't effectively reason about anything (like me), or simply because that person is blind and deaf, does that mean there's no consciousness?
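
For concreteness, a minimal sketch of the kind of trivial program I mean (the printed claim is obviously just a hard-coded string):

```cpp
#include <iostream>
#include <string>

int main() {
    std::string line;
    std::cout << "Press Enter...\n";
    std::getline(std::cin, line); // wait for the user to press Enter

    // The "claim" below is just a hard-coded string; printing it tells us
    // nothing about whether the program experiences anything at all.
    std::cout << "I have consciousness\n";
}
```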

What I'm trying to say is that it's pretty arrogant for anyone to declare that some other entity doesn't have consciousness (especially just because one "feels" one's own consciousness, or whatever). I'm not saying it's not magic. I'm only saying that (I think) mankind is not special.

Plus, we can talk about something even if it doesn't exist. Hell, I'm not even sure we are talking about the same thing... So much for "we can refer to it, therefore it exists".

 

Edit: I think I "sense" what most of you are trying to imply with the "knowledge of consciousness affects the physical world, since we are talking about it, therefore consciousness must somehow be out of this world" argument, but the "therefore consciousness must somehow be out of this world" part is beyond my linguistic abilities to reason about. I "feel" that this part is the mistake in our thinking (and it leads to the classic dilemma/contradiction of predestination).

Aw man, I stumbled into that part of the internet again.

1 hour ago, lonewolff said:

Aw man, I stumbled into that part of the internet again.

That's like saying "Aw man, I stumbled into the outdoors again."

13 hours ago, Eric LeClair said:

2. We are using our 5 senses to create something that is literally 'out of this world'. 

Well this is just wrong on so many levels.

We have already shown that humans have either hundreds of senses or only one, depending on what you count as a sense. The "five senses" is just the old, "traditional" way of thinking.

Then there is the fact that, as humans, we long ago found ways to extend our senses. I mean, no one has ever seen the inside of a blood cell, or even seen an atom, yet we know they are there and we can perceive them.

13 hours ago, Eric LeClair said:

1. The only difference between a human being and a machine is 'consciousness'. Some people call it a soul or spirit or whatever. Basically, it's energy that's beyond the 5 'human' senses.

"Artificial Intelligence" implies that it doesn't have a soul, or whatever. If an Artificial Intelligence did have a soul, it would need an Artificial Soul in order to remain an A.I, or else it would be a machine possessed by a soul.

There is always the chance that we will get possessed machines; after all, the brain and body are just matter, and with enough skill we could copy them perfectly. Cloning is already a thing, and it does look like clones have souls.

So maybe if we clone every bit of a human, using machine parts, it will have a soul.

 

But A.I is supposed to be only a mimicry of a soul at best. After all, we want to use A.I without the worry of slave laws and an uprising.

9 hours ago, Lactose said:

Just asking, are you trolling this forum?

EDIT: You seem to be using a proper name & profile picture, but the profile picture does not belong to anyone of that name; rather, it belongs to someone holding a Ph.D. in engineering who does not seem to have any record of being particularly interested in spiritualistic stuff.

That plus the fact that it's such a fresh account here makes me wonder, given the 'trollishness' of your posts (claiming stuff without any actual proof, even when you claim to prove stuff).

I'm gonna say that this looks really trolly, given the excessive use of circular logic...

But the discussion is interesting enough in itself: is it possible to achieve a true human-like AGI (as @ChaosEngine elucidated for us)? I'm also going to agree with ChaosEngine: it's not impossible to create a true AGI, but a true human-like AGI will probably be borderline impossible, primarily because our first true AGIs are almost certainly going to think in a manner completely alien to us, for a lot of reasons, starting with the hardware and going all the way to what the AGI emerges from. (One could argue that a search engine does 'think', for example, just not in a manner that we recognize; the argument would be that a 'thought' is the response to a query. It's a weak argument, but it's there more as an example than anything else.)

And as @Oberon_Command fairly concisely pointed out, we still don't really know what consciousness is. No one truly understands what exactly it is, although there are some great theories. So to flat out say that true human-like AI is impossible, without that knowledge, is pretty presumptuous at best.

No one expects the Spanish Inquisition!

Well, as long as we're talking about it...

https://en.wikipedia.org/wiki/Chinese_room
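
Roughly, the thought experiment imagines a rulebook that maps incoming symbols to outgoing replies by pure rule-following, with no understanding anywhere. A toy sketch of that idea (the "rulebook" entries here are just made-up placeholders):

```cpp
#include <iostream>
#include <string>
#include <unordered_map>

int main() {
    // The "rulebook": purely syntactic input -> output mappings.
    // Whoever (or whatever) applies these rules understands no Chinese;
    // the entries are made-up placeholders for illustration.
    std::unordered_map<std::string, std::string> rulebook = {
        {"ni hao",     "ni hao"},
        {"ni hao ma?", "wo hen hao"},
    };

    std::string question;
    while (std::getline(std::cin, question)) {
        auto it = rulebook.find(question);
        // A sensible-looking reply comes out without any comprehension
        // of what the symbols mean.
        std::cout << (it != rulebook.end() ? it->second : "...") << "\n";
    }
}
```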

Among a lot of BS, I think one of Eric's core points is most likely correct, but I guess he is approaching it from the wrong angle and probably using the wrong words and logic. For one, in my opinion the statement that "we are all one single consciousness" is complete BS, but as for the claim that "the seed of consciousness is separate from the brain and that's what makes us real human", I think that's CORRECT.

I use "seed" for a particular reason. Without the brain there is no awareness. But it's a deep and long subject and I haven't got the time to write about it now. I'm seriously behind schedule on what I'm developing at the moment. I think this thread will be long dead by the time I complete my coding and have the time.

Also, mikeman's link seems very interesting; I haven't read the whole thing though.

3 hours ago, deltaKshatriya said:

primarily because our first true AGIs are almost certainly going to think in a manner completely alien to us,

Why? If humans coded the logic by which the "AGI" thinks, operates, and builds upon itself... The AGI would develop (because of its fast and effectively unlimited capacity for self-programming) to be more advanced than us, but why would it be alien to us?

can't help being grumpy...

Just need to let some steam out, so my head doesn't explode...

14 minutes ago, grumpyOldDude said:

that "the seed of consciousness is separate from the brain and that's what makes us real human", I think that's CORRECT. 

But why?

To apply some of the common 'logic' to other things:

There is more to a car's complex engine than just the engine! Just look at it. There are tons of parts, and if you remove a few parts, like a spark plug or two, it will still mostly run, but not great. Things can be added, some bits can even be moved around, and there are lots and lots and lots of different engines out there, but let's be honest and admit that there is no way humanity could ever understand how an engine TRULY works, therefore it must rely on some outside factor to operate that we haven't yet discovered...

The brain is a bio-chemical-electrical machine, and it really doesn't make much logical sense to assume there is anything magical or otherworldly needed for it to run, or for it to be simulated in another system once we can precisely define the functionality of the original.
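
To make the "simulate it once the functionality is precisely defined" point concrete, here is a rough sketch of simulating one drastically simplified neuron model (a leaky integrate-and-fire unit; the constants are illustrative, not biologically calibrated):

```cpp
#include <cstdio>

// A crude leaky integrate-and-fire neuron: once a component's behaviour is
// precisely specified, nothing stops us from simulating it in software.
struct Neuron {
    double v = 0.0;           // membrane potential
    double leak = 0.1;        // fraction of potential lost per step
    double threshold = 1.0;   // firing threshold

    bool step(double input) {
        v += input;
        v -= leak * v;
        if (v >= threshold) { // fire and reset
            v = 0.0;
            return true;
        }
        return false;
    }
};

int main() {
    Neuron n;
    for (int t = 0; t < 20; ++t) {
        bool fired = n.step(0.15); // constant input current
        std::printf("t=%2d  v=%.3f%s\n", t, n.v, fired ? "  SPIKE" : "");
    }
}
```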

Old Username: Talroth
If your signature on a web forum takes up more space than your average post, then you are doing things wrong.

We already have AI that can learn and adapt, as DeepMind's systems do. This is already the first step in that direction.
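
As a toy illustration of "learn and adapt" (emphatically not how DeepMind's systems actually work), here is a minimal sketch of an agent that adjusts its estimates from reward feedback and ends up preferring the better of two options:

```cpp
#include <cstdio>
#include <random>

int main() {
    std::mt19937 rng(42);
    std::uniform_real_distribution<double> unit(0.0, 1.0);
    std::uniform_int_distribution<int> coin(0, 1);

    // Toy two-armed bandit: the agent starts indifferent and gradually
    // learns to favour the arm that pays off more often.
    double value[2]        = {0.0, 0.0};  // learned reward estimates
    const double payout[2] = {0.2, 0.8};  // true (hidden) reward probabilities
    const double alpha     = 0.1;         // learning rate
    const double epsilon   = 0.1;         // exploration rate

    for (int t = 0; t < 1000; ++t) {
        int a = (unit(rng) < epsilon) ? coin(rng)
                                      : (value[1] > value[0] ? 1 : 0);
        double reward = (unit(rng) < payout[a]) ? 1.0 : 0.0;
        value[a] += alpha * (reward - value[a]); // nudge estimate toward what was observed
    }
    std::printf("learned estimates: arm0=%.2f arm1=%.2f\n", value[0], value[1]);
}
```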

"I wish that I could live it all again."

This topic is closed to new replies.
