Is Searle's 'Chinese Room' argument internally consistent? Doesn't it presuppose an agent who can understand the manipulation of symbols? If so, why not conceptualize our consciousness analogously: as a computer's more fundamental structures bestowing on it the capacity to 'run' software?

If the article portrayed our consciousness as analogous to a computer's more fundamental structures, bestowing on it the capacity to 'run' software, then it probably would be internally inconsistent. But it doesn't.

Read another response by Gabriel Segal