Exploring Consciousness: The Distinction Between Cognition and Sentience
Chapter 1: Understanding Consciousness
When discussing consciousness, it is crucial to be precise with terms like ‘cognition,’ ‘comprehension,’ and ‘understanding.’ In everyday language these terms often suggest an element of consciousness; ‘cognition,’ for instance, is usually taken to imply awareness. However, assessments such as clinical cognitive evaluations or Turing-style AI tests do not actually provide insight into ‘experiential’ consciousness: the subjective feeling of understanding.
Cognition, as measured in these contexts, primarily reflects neuronal input and output activity. To keep the terms clear, it is best to treat cognition as strictly a matter of information processing and neuroscience, and to reserve ‘experiential’ consciousness for the subjective side. The source of experiential consciousness remains largely enigmatic: it could stem from internal neurological processes, or it could have a dualistic nature, hinting at an existence beyond this simulated reality.
We remain uncertain about these matters, and intriguingly, physics itself appears to present a simulated aspect. Currently, we lack the means to program AI systems to genuinely experience consciousness; we simply do not possess the necessary scientific framework.
Section 1.1: The Nuances of Understanding
The same semantic caution applies to ‘understanding’ and ‘comprehension.’ The human experience of ‘understanding’ differs significantly from what an 8th grader might demonstrate on a comprehension test regarding a newspaper article or a piece of poetry.
Section 1.2: The Challenge of Proving Sentience
One can exhibit comprehension by answering questions, yet this does not prove that one ‘feels’ that comprehension, even when one asserts that one does. Most individuals will affirm their own feelings and their experience of thinking; proving this to others, however, remains elusive.
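As a minimal sketch of that point (a hypothetical toy, not any real test), consider a program that answers comprehension questions purely by mapping inputs to outputs; its behavior alone tells us nothing about whether anything is felt.

```python
# Hypothetical toy "comprehension test" passed by pure input/output mapping.
# The questions and answers are invented for illustration only.

ANSWERS = {
    "Who wrote the article?": "The paper's science correspondent.",
    "What is the main claim?": "That the drought will worsen next year.",
}

def answer(question: str) -> str:
    """Return a correct-looking answer with no inner experience involved.

    Behaviorally this can 'exhibit comprehension', but it is only a lookup:
    input in, output out. Nothing here feels the comprehension it displays.
    """
    return ANSWERS.get(question, "I don't know.")

if __name__ == "__main__":
    for q in ANSWERS:
        print(q, "->", answer(q))
```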
This leads to a paradox: we cannot definitively prove that AIs do not possess sentience or the ability to experience their thoughts. Nonetheless, a prevailing belief exists that AIs lack such experiences, which seems curiously unscientific. There’s an intuitive understanding that consciousness encompasses more than mere information processing, and we trust in the sentience of our fellow humans.
Chapter 2: The Mystery of Experiential Consciousness
My primary inquiry revolves around why we are sentient at all, and why we actively experience the outcomes of our neural processes. I ponder why we are not ‘philosophical zombies’: hypothetical beings that process information and behave as we do yet feel nothing, much as AIs are presumed to do.
Despite the complexity of the feedback loops within the human brain, our scientific understanding contains no known mechanism that could give rise to experiential consciousness. Data representation and processing do not, by themselves, lead to the capacity to feel.
Be cautious of arguments suggesting that consciousness is merely an illusion, or that everything is fundamentally conscious, as panpsychism posits. Experiential consciousness is profoundly puzzling, and unlike the functional questions of neuroscience, it currently has no scientific explanation at all.
Nothing in our current science bridges this gap. Recognizing that is itself deeply significant: it marks a step beyond a primitive scientific understanding of consciousness.
The first video, titled "Evidence for sentience in insects", features Dr. Andrew Crump discussing the potential for sentience in non-human organisms, challenging traditional views of consciousness.
In the second video, "What is sentience?", the concept of sentience is explored, providing insights into its definitions and implications across various contexts.