
>Is it based on some underlying assumption that we have some hardwired, a priori subjective experience of red and blue colors, and only later associate these with perceptions of blood, roses, and ripe apples, or of sky and water, accordingly?

The point is that we don't know. We know that we have a subjective experience of an image, where the 'red' parts correspond to red-light stimuli, but it's impossible for me as an individual to know what your subjective experience of an image looks like. If someone 'smelled' subjective images, it would be impossible to tell the difference as long as they still 'smell' red light as red.

>That might be just an illusion and our color perception is learned, so blue is just the color of sky and water, and nothing more.

But where does that 'blue' experience come from?

>Some people do (synesthesia), but generally lack of such experience mixes can be explained by different part of the brain getting different inputs, and impossibility of e.g. auditory stimulus to generate the same response as seeing red color would do.

Sure, I'm just asking why there are different experiences to begin with. Why do we experience smell, sound, etc. the way we do?



Sure, my answer to that is that subjective experiences are learned in a chaotic process, so the neurons representing the same color might sit in different parts of our visual cortices. At the same time we learn patterns that exist in the real world, so our experiences will most probably form similar associations, e.g. associating the taste of mint with green or the smell of strawberries with red.

> But where does that 'blue' experience come from?

It is learned by interacting with blue things and trying to find patterns, common qualities among them; such an association of possible outcomes creates a subjective concept, a qualia of "blueness".

> Sure, I'm just asking why there are different experiences to begin with.

Here I don't have a good explanation. Maybe it's a matter of different wiring in our brains; maybe it comes from the fact that you can predict how things are going to look to your left eye by looking at them with your right eye, or how things are going to feel against your face after touching them with your hand - allowing you to create abstractions between such perceptions that are not transferable across senses.


> but it's impossible for me as an individual to know what your subjective experience of an image looks like

This smells like a god of the gaps or argument from ignorance type of argument to me. And it's not impossible, just difficult to execute. All we would have to do is digitize your brain and swap out the responsible part, then you can form a memory of that, swap the original back in and do a comparison. Or something like that. All handwavy science fiction, of course, since we currently lack detailed knowledge of how this works, but that does not imply there's anything special about it, only that one human might be processing the data slightly differently than another human. The same way that one human may assign a different most-common meaning to an ambiguous word than another.


>This smells like a god of the gaps or argument from ignorance type of argument to me

huh? Are you saying that 'we don't know how subjective experience works' is an argument from ignorance?

>All we would have to do is digitize your brain and swap out the responsible part, then you can form a memory of that, swap the original back in and do a comparison.

I'm not sure what that is supposed to mean. What is 'the responsible part' and what would swapping it out achieve? I'm still only going to have my subjective experience.


> huh? Are you saying that 'we don't know how subjective experience works' is an argument from ignorance?

I'm saying that the lack of knowledge does not imply that there's anything special about it that wouldn't also arise naturally in an NN approaching even animal intelligence.

> What is 'the responsible part' and what would swapping it out achieve? I'm still only going to have my subjective experience.

I assume that "subjective experience" has some observable consequences, of which you can form memories. Being able to swap out parts of a brain will allow you to have a different subjective experience and then compare them. It is an experimental tool. I don't know what you will observe since that experiment has not been performed.


>Only that there's something special about subjective experience that wouldn't arise naturally in an NN approaching even animal intelligence.

That isn't at all what I've said. I'm saying that 'qualia' exist and that we have no clue how they arise. Maybe they arise from complicated enough systems, maybe they don't. Hell, maybe panpsychists are right and even a rock has some sort of consciousness. My issue is with people who are confident that a big enough NN necessarily has consciousness.

>I assume that "subjective experience" has some observable consequences, of which you can form memories. Being able to swap out parts of a brain will allow you to have a different subjective experience and then compare them. It is an experimental tool. I don't know what you will observe since that experiment has not been performed.

Unless you presuppose that there is some part that completely determines subjective experience (I don't think it'd even be possible to identify such a part if it existed), I don't see how that would work. Yes, you can swap out a part and see that your subjective experience changes, but this tells you nothing about the subjective experience of others.


> I'm saying that 'qualia' exist

If by qualia you mean slight differences in information processing in human brains, then sure. If you mean anything more than that I would like a) a better definition than the one I have given b) some observational evidence for its existence.

> My issue is with people who are confident that a big enough NN necessarily has consciousness.

Not necessarily, just potentially. After all, there will be many inefficient/barely-better-than-previous/outright defective big NNs on the path to AGI.

If you're asking whether an intelligent NN will automatically be conscious then it depends on what we mean by "intelligent" and "conscious". A mathematical theorem prover may not need many facilities that a human mind has even though it still has to find many highly abstract and novel approaches to do its work. On the other hand an agent interacting with the physical world and other humans will probably benefit from many of the same principles and the mix of them is what we call consciousness. One problem with "consciousness" is that it's such an overloaded term. I recommend decomposing it into smaller features that we care about and then we can talk about whether another system has them.

> Hell, maybe panpsychists are right and even a rock has some sort of consciousness.

If we twist words far enough then of course they do. They are following the laws of physics after all, which is information processing: going from one state to another. But then all physical systems do that, and it's usually not the kind of information processing we care that much about when talking about intelligences. Technically correct given the premise, but useless.

> I don't think it'd even be possible to identify such a part if it existed

We're already making the assumption we have the technology to simulate a brain. If you have that ability you can also implement any debugging/observational tooling you need. AI research is not blind, co-developing such tooling together with the networks is happening today. https://openai.com/blog/introducing-activation-atlases/
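
For concreteness, a minimal sketch of what such probing can look like, assuming PyTorch and torchvision (my choice of tools, not anything from the linked post): register a forward hook on a pretrained vision network and capture an intermediate feature layer, the raw material that activation atlases then visualize.

    # Sketch: capture an intermediate activation with a forward hook.
    # Assumes torchvision >= 0.13 (the `weights=` API); the layer name is
    # specific to resnet18 and chosen purely for illustration.
    import torch
    from torchvision import models

    model = models.resnet18(weights="IMAGENET1K_V1").eval()
    captured = {}

    def save_activation(name):
        def hook(module, inputs, output):
            captured[name] = output.detach()
        return hook

    model.layer3.register_forward_hook(save_activation("layer3"))

    with torch.no_grad():
        model(torch.randn(1, 3, 224, 224))  # stand-in for a real image

    print(captured["layer3"].shape)  # torch.Size([1, 256, 14, 14])

This is the debugging/observational tooling in miniature: the network's internal representation is just a tensor you can read out, something we cannot yet do with a biological brain.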


>If by qualia you mean slight differences in information processing in human brains, then sure. If you mean anything more than that I would like a) a better definition than the one I have given b) some observational evidence for its existence.

Subjective experiences, i.e. how I actually experience sense data. There is no real, objective observational evidence, and there can't be. How would you describe taste to a species of aliens that understand the processes that happen during tasting but can't taste anything themselves? It's simply impossible. I know that I have personal, subjective experiences (the 'images I see' are not directly the sense data that I perceive), but I can only appeal to you emotionally to try and make you believe that they exist, operating under the assumption that you too must have these experiences.

>One problem with "consciousness" is that it's such an overloaded term. I recommend decomposing it into smaller features that we care about and then we can talk about whether another system has them.

This entire discussion has been about consciousness in the philosophical meaning i.e. the ability to have some form of subjective experiences.

>If we twist words far enough then of course they do. They are following the laws of physics after all, which is information processing: going from one state to another. But then all physical systems do that, and it's usually not the kind of information processing we care that much about when talking about intelligences. Technically correct given the premise, but useless.

This isn't about twisting words; some people genuinely believe that everything is conscious, with more complex systems being more conscious.

>We're already making the assumption we have the technology to simulate a brain. If you have that ability you can also implement any debugging/observational tooling you need. AI research is not blind, co-developing such tooling together with the networks is happening today

The point is that it's about _subjective_ experiences.


Fish tastes like fish because the taste is a categorizing representation of that sensory input.

What you can do today is start with a feature map. We can do that with colors https://imgs.xkcd.com/blag/satfaces_map_1024.png (do you perceive this color as red?) and we can do that with smells https://jameskennedymonash.files.wordpress.com/2014/01/table... That's a fairly limited representation, but words are an incredibly low-bandwidth interface, not suitable for exporting this kind of information in high fidelity, so we can't. That does not mean it's conceptually impossible. If you wanted to export subjective experience itself then you'd need the previously mentioned debugging interface. Our brains don't have that built in, but software does. I.e. a program can dump its entire own state and make it available to others.
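
As a toy illustration of such a feature map (the anchor colors below are my own rough picks, not the xkcd survey data): classify an RGB triple by its nearest named color. It captures the "do you perceive this color as red?" question as a boundary in a space, nothing more.

    # Sketch: a crude color feature map via nearest named anchor color.
    # Anchor values are illustrative assumptions, not survey data.
    COLOR_ANCHORS = {
        "red":    (255, 0, 0),
        "green":  (0, 128, 0),
        "blue":   (0, 0, 255),
        "yellow": (255, 255, 0),
        "purple": (128, 0, 128),
    }

    def name_color(rgb):
        # Nearest neighbor in raw RGB space; a perceptual space like
        # CIELAB would match human judgments better.
        def dist(a, b):
            return sum((x - y) ** 2 for x, y in zip(a, b))
        return min(COLOR_ANCHORS, key=lambda n: dist(rgb, COLOR_ANCHORS[n]))

    print(name_color((200, 30, 40)))  # -> red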

To me subjective experience seems to be an intermediate representation, deep between inputs and outputs, and due to the various limitations we're bad at communicating it. That doesn't mean there's anything special about it. It is a consequence of compressing inputs into smaller spaces in ways that are useful to that entity.
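
A minimal sketch of that reading, again assuming PyTorch (sizes and architecture are arbitrary): an autoencoder whose bottleneck code is exactly such an intermediate representation, compressed, useful for reconstructing the input, and invisible from outside unless you deliberately read it out.

    # Sketch: a bottleneck as a compressed internal representation.
    import torch
    import torch.nn as nn

    class Autoencoder(nn.Module):
        def __init__(self, n_in=784, n_code=16):
            super().__init__()
            self.encoder = nn.Sequential(nn.Linear(n_in, 128), nn.ReLU(),
                                         nn.Linear(128, n_code))
            self.decoder = nn.Sequential(nn.Linear(n_code, 128), nn.ReLU(),
                                         nn.Linear(128, n_in))

        def forward(self, x):
            code = self.encoder(x)  # the private, compressed representation
            return self.decoder(code), code

    recon, code = Autoencoder()(torch.randn(1, 784))
    print(code.shape)  # torch.Size([1, 16]), far smaller than the input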

> This isn't about twisting words; some people genuinely believe that everything is conscious, with more complex systems being more conscious.

Anything that interacts with the world will have an internal, idiosyncratic representation of that interaction. Even a rock will have momentary vibrations traveling through it that carry some information about the world. One of today's NNs will have feature layers that roughly correspond to concepts of human interest. They're often crude approximations, but that's good enough for some use cases. Animal brains just have more of that.

So in that sense, sure, it's a continuum. But there's nothing mysterious about it.


>Fish tastes like fish because the taste is a categorizing representation of that sensory input.

Yes, but why does the taste of fish have the particular quality it does? Hell, try explaining what fish tastes like without evoking similar tastes.

>What you can do today is start with a feature map. We can do that with colors https://imgs.xkcd.com/blag/satfaces_map_1024.png (do you perceive this color as red?) and we can do that with smells https://jameskennedymonash.files.wordpress.com/2014/01/table... That's a fairly limited representation, but words are an incredibly low-bandwidth interface, not suitable for exporting this kind of information in high fidelity, so we can't. That does not mean it's conceptually impossible. If you wanted to export subjective experience itself then you'd need the previously mentioned debugging interface. Our brains don't have that built in, but software does. I.e. a program can dump its entire own state and make it available to others.

But a feature map doesn't tell you anything about how the space itself works. If you look at that smell chart, you'll see that it uses comparisons, because it's literally impossible for us to explain what smelling is like without saying "well, it's similar to smelling x". Someone born without smell could memorize that chart, understand everything there is to know about smelling, and still not actually know what it's like to smell.

>To me subjective experience seems to be an intermediate representation, deep between inputs and outputs, and due to the various limitations we're bad at communicating it. That doesn't mean there's anything special about it. It is a consequence of compressing inputs into smaller spaces in ways that are useful to that entity.

We're not just bad at communicating it, we're bad at understanding it, because our conventional means of measuring things don't really work for subjectivity. I'm not saying it's "magical", but it's not certain that we even can potentially build tools to interact with it.


> But a feature map doesn't tell you anything about how the space itself works.

The space is what is doing the work. Of course it's vastly more complex than a simple image with a few regions painted into it. There are only implementation details below it. The issue is that we cannot import and export them. With software that is a wholly different matter: representations can be transplanted, fine-tuned, probed, and so on.
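
A sketch of what "transplanted, fine-tuned, probed" means for software, with toy stand-in networks (again assuming PyTorch): copy one network's first layer into another and freeze it, an operation with no biological counterpart today.

    # Sketch: transplant a layer's weights from one network to another.
    import torch.nn as nn

    donor = nn.Sequential(nn.Linear(784, 128), nn.ReLU(), nn.Linear(128, 10))
    host  = nn.Sequential(nn.Linear(784, 128), nn.ReLU(), nn.Linear(128, 10))

    # Swap the host's first layer for the donor's, then freeze it.
    host[0].load_state_dict(donor[0].state_dict())
    for p in host[0].parameters():
        p.requires_grad = False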

> but it's not certain that we even can potentially build tools to interact with it.

I agree that this is all very speculative; we don't have the technology, and it may take a long time until we can actually inspect a human brain. But we may be able to do the same much more easily with artificial intelligences, once they are created.



