> I'm not sure the question of "whether an X can have conscious experience" (where X is computer, dog, fish, human infant...) is even all that meaningful.
I do know that I have a conscious experience, so that's quite meaningful - to me. I cannot check that any other being has similar feelings, so the doubtful question would be whether "an X other than me can have conscious experience"; but people with good manners make the polite assumption that it's also true for other similar beings.
You may have the illusion of a "conscious experience" that is in fact a story told to yourself about how a thing you call "you" is in charge of your thoughts.
That doesn't really work. What is "yourself" in that statement, other than a conscious entity?
How would you write a computer program that "tells that story to itself," such that it actually has an experience of the world, as opposed to just being a machine executing a program without any conscious awareness?
Edit: also, whether we're in charge of our thoughts is a separate question from whether we possess consciousness. Even if we're not in charge of our thoughts, we still have a conscious experience of them.
I've never heard a complete and convincing explanation of what "yourself" could be, but meditating on the extreme unintuitiveness of self-reference and recursion (à la Douglas Hofstadter's I Am a Strange Loop) increases my expectation that a computational explanation is coming, eventually.
I think that's pretty wishful thinking. It's not like we don't have a lot of experience with self-reference and recursion in computational systems. In fact this site is named after that. I don't think the Y combinator is conscious.
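For concreteness, here's a minimal Python sketch of the applicative-order fixed-point combinator in question (the eager-evaluation-friendly variant, since the textbook Y diverges in a strict language). It gives you recursion through pure self-application, with no name ever referring to itself:

```python
# Applicative-order fixed-point ("Y"/Z) combinator: recursion via self-application.
Y = lambda f: (lambda x: f(lambda v: x(x)(v)))(lambda x: f(lambda v: x(x)(v)))

# Tie the knot for factorial; the inner function never names itself.
fact = Y(lambda self: lambda n: 1 if n == 0 else n * self(n - 1))

print(fact(5))  # 120
```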
The conscious experience might be there, but at the same time it could be an entirely deterministic thing. Maybe you and I having this exchange was determined at the instant of the Big Bang.
Being deterministic doesn't make it any less conscious. We're talking perception here, not free will.
In fact, there's pretty good evidence that what we call consciousness is an after-the-fact rationalization of the subconscious brain processes that determine your brain's automatic response to a stimulus (not that this makes them deterministic, but they're certainly not "rational" in the classic sense).
"Your perception of conscious subjectivity" implies consciousness.
Put another way, how would you program a computer to have a perception of conscious subjectivity, as opposed to just blindly and unconsciously executing its instructions?
> "Your perception of conscious subjectivity" implies consciousness.
No, it doesn't! Assuming that by "consciousness" you mean a phenomenon that's not reducible to unconscious particle interactions, which is typically what's meant in philosophical discussions of this topic.
We have some mechanistic theories of consciousness [1]. They basically amount to the same sort of trick a single-core CPU uses to achieve the illusion of parallelism, i.e. context switching between internal and external mental models produces the illusion of consciousness.
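To make the CPU analogy concrete, here's a minimal toy sketch (the names internal_model/external_model are purely illustrative, not taken from the linked theory): a single execution thread context-switches between two tasks, and the interleaved trace looks like both are running at once.

```python
def internal_model():
    for i in range(3):
        yield f"internal model: step {i}"

def external_model():
    for i in range(3):
        yield f"external model: step {i}"

def round_robin(*generators):
    # Cooperative "context switching": only one task ever runs at any instant,
    # but the interleaved output looks simultaneous from the outside.
    tasks = [iter(g) for g in generators]
    while tasks:
        for t in list(tasks):
            try:
                print(next(t))
            except StopIteration:
                tasks.remove(t)

round_robin(internal_model(), external_model())
```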
> Assuming by "consciousness", you mean a phenomenon that's not reducible to unconscious particle interactions
I'd say that's an unfounded assumption, which doesn't come up in the argument you're responding to - even if it's somewhat 'popular' elsewhere.
The argument made is that consciousness is (or includes) a form of perception, not that this perception is independent of mechanistic components. With this definition, your assertion that 'conscious subjectivity is an illusion' is inconsistent, as an illusion is a complex form of perception that requires a consciousness to perceive it.
Following your CPU example, there is parallelism from the point of view of the program being executed, even if it's simulated from a single-core mechanical basis (threads and context-switching).
> I'd say that's an unfounded assumption, which doesn't come up in the argument you're responding to - even if it's somewhat 'popular' elsewhere.
It's not really. Consciousness quite literally does not exist in mechanistic/eliminativist conceptions of consciousness like the one in the link I provided, just like cars don't really exist because they aren't in the ontology of physics. I flagged it as an "assumption" simply because many people don't know this.
> Following your CPU example, there is parallelism from the point of view of the program being executed, even if it's simulated from a single-core mechanical basis (threads and context-switching).
No, there is concurrency but not parallelism.
> just like cars don't really exist because they aren't in the ontology of physics
If I understand you correctly, that's a pretty harsh criterion for existence, isn't it? Even though a car is just a composite of metal atoms in a precise configuration and not a metaphysical entity in itself, you can still use it to drive home. I suppose that makes me a utilitarian.
> No, there is concurrency but not parallelism.
You're right, my bad. I've lost the precision I had back in my college days. Still, that's good enough for the program, just like my consciousness is good enough for me, even if it's entirely mechanistic and doesn't exist in the same way that cars don't exist.
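As a small illustration of "good enough for the program", here's a sketch using asyncio: two coroutines interleave on a single OS thread, so each sees the other making progress even though nothing ever runs in parallel.

```python
import asyncio

async def worker(name: str) -> None:
    # Each await is a context switch: control returns to the event loop,
    # which resumes the other coroutine. Concurrency, not parallelism.
    for i in range(3):
        print(f"worker {name}: step {i}")
        await asyncio.sleep(0)

async def main() -> None:
    await asyncio.gather(worker("A"), worker("B"))

asyncio.run(main())
```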