THE EXTENT TO WHICH ‘CONSCIOUSNESS’ POSES A PROBLEM FOR THE COMPUTATIONAL THEORY OF MIND
by Dipl.-Psych. Sebastian A. Wagner, B.Sc. (F.C. Hon.) in Psych., University of Bamberg, Germany
In Psychology, many theories and models use process charts resembling the circuit diagrams of technical devices. On this account, human behaviour and experience appear to be the result of processes taking place inside a ‘black box’ called cognition. In this context, “computationalism is the view that computation […] can offer an explanatory basis for cognition” (Davenport, 2008, p. 1). The Computational Theory of Mind (CTM) has developed on this foundation, attempting to reveal what is inside this ‘black box’. By contrast, human consciousness, although a part of cognition (Harnad, 1994), seems to lie beyond any scientific explanation. This essay will critically discuss the extent to which consciousness poses a problem for the CTM, considering the issues surrounding consciousness as an area of scientific study, the extent to which consciousness is explicable in computational terms, proposed explanations of consciousness, and Dennett’s (1991) alternative account of consciousness. It will be argued that consciousness does pose a major problem for the CTM, especially when it is conceptualised as subjective experience. It will conclude that Cognitive Science should presently focus only on certain aspects of consciousness, known as the easy problems.
Jerry Fodor (2000, p. 1) considered Putnam’s (1960; 1961) Computational Theory of Mind to be by far the “best theory of cognition that we’ve got.” He may have come to this conclusion because the CTM promises an answer to the most fundamental question of Cognitive Science, namely: ‘How does the mind actually work?’ (Pylyshyn, 1986). The starting point of the CTM is an analogy with digital computers: the human mind can be regarded as an information-processing system, in which the relationship between mind and brain corresponds to the relationship between software and hardware (Aulinger, 2008). In other words: to think is to compute, and the mind is a computer (Kuczynski, 2006). The advantage of this view is that it can explain how the mind works while its implementation in the brain remains unknown. The CTM consists of two elements: a computational account of cognitive processes and a representational account of mental states (Horst, 1999; 2005).
The latter, mental states such as beliefs, are mental representations held by the thinker. These states are represented in symbols, which, according to Fodor (1981, cited in Horst, 1999), have both semantic and syntactic properties. The former, the cognitive processes of reasoning, which can be regarded as computing, are a matter of manipulating these symbols. Here, only the syntax of the symbols is manipulated, not their semantics, which meets the definition of ‘computation’. Fodor (2000, p. 1) sums it up by stating that “intentional processes are syntactic operations defined on mental representations”.
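The purely syntactic character of such operations can be illustrated with a minimal sketch; the function name, the modus ponens rule, and the symbol tokens below are chosen for illustration only and are not taken from the sources cited. The procedure derives new symbol structures solely by matching their shape, never by consulting what the symbols mean.

```python
# Minimal illustrative sketch of syntax-only symbol manipulation.
# The rule implements modus ponens over belief-like structures: from
# ('if', P, Q) and P, derive Q. It operates on form alone; the tokens
# could be meaningless strings and the rule would behave identically.

def modus_ponens(beliefs):
    """Return the belief set extended by purely syntactic derivations."""
    derived = set(beliefs)
    for belief in beliefs:
        # Match on shape only: a 3-tuple whose first element is the token 'if'.
        if isinstance(belief, tuple) and len(belief) == 3 and belief[0] == "if":
            _, antecedent, consequent = belief
            if antecedent in beliefs:       # the antecedent symbol is present
                derived.add(consequent)     # so add the consequent symbol
    return derived

# 'rain' and 'wet_streets' are arbitrary placeholder tokens.
beliefs = {("if", "rain", "wet_streets"), "rain"}
print(modus_ponens(beliefs))  # now also contains 'wet_streets'
```

On this picture, the derivation counts as reasoning only because the symbols happen to be interpretable; the manipulation itself is blind to that interpretation, which is exactly the sense in which the CTM treats intentional processes as syntactic operations.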
Consciousness is one of the biggest unsolved problems in science (Block, 1995). This is evidently related to the fact that the concept of consciousness carries a plurality of connotations (Block, 1995). William James (1910, cited in Torrance, in press) proposed a number of principles about everyday conscious awareness. He suggested that consciousness includes “every thought or state of mind that one has”, that there are “constant changes between successive states”, and that “consciousness appears to us as a continuous flow or stream”. These descriptions alone make consciousness difficult enough to investigate scientifically.
However, even more problematic is the so-called ‘epistemological problem’: how can scientists have “knowledge of the internal activities of conscious, intelligent minds?” (Churchland, 1988, p. 67). According to Churchland (1988), it consists of two major problems. Firstly, it seems impossible to know for sure that other individuals have such a thing as a mental state. This conclusion can only be drawn from observations of other people’s behaviour, and the only way to validate it is one’s own experience of consciousness. Secondly, these individual experiences, self-consciousness, are difficult to capture. Self-consciousness can be regarded as the understanding of the inner reality of one’s own mental states and activities. It seems impossible to directly access someone else’s first-person perspective on consciousness. However, it is indeed possible to gain some insight into one’s own conscious experiences. Spurr and Stopa (2002), for instance, regard the ability to access one’s own consciousness as a meta-cognitive skill of self-awareness, which involves a high-level capacity for reflecting on one’s own thoughts (Clark, 2001). Nevertheless, this still does not explain whether other individuals have mental states or not. All this shows how difficult it is to grasp the concept of consciousness. Nevertheless, consciousness seems to be a matter of subjective experience, e.g. tasting an apple or feeling a backache (Clark, 2001). These experiences of human beings, or raw feelings, are called ‘qualia’, which Clark (2001, p. 22) defined as “the qualitative sensations that make life rich, interesting, or intolerable”.
However, as with most aspects of consciousness, it is difficult to say what exactly qualia are.
[...]