Theories of Consciousness - STUDENT SOLUTION USA

Guidelines for Short Papers

Each paper must be double spaced, justified on the left side only (turn off right-hand justification), in Times Roman 12-point font (or equivalent), with margins of at least 1 inch. Papers must be no longer than 2 pages; I will stop reading at the bottom of the second page. Do not attempt to summarize every aspect of the paper! This exercise is designed to help you learn how to identify and summarize the core argument(s) in an article accurately and succinctly. You may, but are not required to, offer some critical insight into the assigned reading. This can take the form either of agreeing or disagreeing with the author (or both) and explaining why you agree or disagree. To do this you need to spell out the author’s position enough so that your own remarks have some context. If you wish, you may focus your attention on a particular argument or part of a paper, as long as it is central to the main issue and not a peripheral point. It is your choice which readings you write on, but you must hand each paper in at the class in which we will discuss the material you wrote about. For example, if you wish to hand in an assignment on Flanagan and Polger’s paper, it is due at the beginning of class on February 27. So please pay close attention to the reading schedule.

The reading for this paper is item 21: Excerpt from Seager, William. 1999. Theories of Consciousness: An Introduction and Assessment. London; New York: Routledge (pages 72-84 of the attached PDF).

I will provide a copy of the reading for you to use and read as an attachment. Please follow the guidelines and directions above: you must read the assigned pages and then write the short paper as directed. I have attached both the reading for this short paper and an example of a short paper, written by an A student in our class on a different reading/topic, that you can look at to see how it should look. Thank you, and I’m looking forward to seeing the work you do on this short paper. The reading for this short paper is a PDF and the sample is a Word document.


In his essay “What Is It Like to Be a Bat?”, Thomas Nagel argues that, whatever the validity of physicalism, it remains impossible for humans to understand how physicalism could be true, because a complete physical description of consciousness is not possible. Nagel claims that the explanatory gap is unbridgeable since consciousness is necessarily tied to a particular point of view, and any attempt to understand consciousness objectively is therefore rendered incomplete.

Nagel posits, first, that consciousness is not a strictly human phenomenon; rather, there are other organisms which experience consciousness. From their being conscious, and from that alone, Nagel concludes that there is something it is like to be that organism. Nagel calls “what it is like to be that organism” the subjective character of experience. For Nagel, the reductionists must reduce the subjective character of experience to a physical phenomenon, since, if basic properties and all properties determined by them can be described in terms of physical properties, then the resultant mental properties (consciousness) should likewise be describable in physical terms. However, for Nagel, this is the most difficult task facing the physicalists, since “every subjective phenomenon is essentially connected with a single point of view, and it seems inevitable that an objective, physical theory will abandon that point of view.” (437)

To demonstrate the relationship between subjectivity and point of view, Nagel uses the experience of a bat. Bats use echolocation to perceive certain characteristics of real things (their size, motion, distance from the bat, etc.). While humans are also able to perceive these characteristics, the way in which we do so is radically different from the way bats perceive them, owing to the difference in sensory modalities. Nagel explains that even the limited ability to imagine oneself as a bat would merely amount to imagining oneself behaving like a bat and would not constitute any understanding of what it would be like to be a bat. Further, lacking the bat’s sensory modalities, we do not have the correct experiential terms for describing what it would be like to be a bat. Nagel commits himself only to the claim that humans cannot describe the experience of a bat because we are not the same type of being as a bat; that is, our senses are so different that we humans could never correctly describe the subjective character of a bat’s experience. Nagel later uses the example “Red is like the sound of a trumpet” (449) to show the futility of attempting such descriptions between beings with differing sensory modalities (here, between one with and one without vision). However, Nagel does not claim that no subjective character of experience can ever be described. He admits that humans may be able to understand a description of the subjective character of others’ experience, since we are of the same type (beings with similar sensory modalities). Nagel does not give a definitive dividing line between types but posits only, “The more different from oneself the other experiencer is, the less success one can expect with this enterprise.” (442)

Nagel notes that humans and intelligent Martians, whose physical makeup and sensory modalities are completely different from our own, may be able to come to the same conclusions regarding the physical phenomena that constitute a rainbow, or clouds, or lightning. However, the Martians could never understand the human conception of these things, just as we could never understand a bat’s conception of them. The problem, for Nagel, in understanding the subjective character of experience is that it is ordinarily the role of the sciences to describe reality in objective terms. But since the subjective character of experience is necessarily tied to a particular point of view, any attempt to describe it objectively is a step further away from a correct conception of it. Nagel asks, “Does it make sense … to ask what my experiences are really like, as opposed to how they appear to me?” (448)

Nagel concludes by noting that the pursuit of understanding consciousness may allow for increasingly objective descriptions of consciousness, but that an ultimate understanding of consciousness will not be possible until the question of the relation between the subjective and the objective is first answered.

Bibliography
Nagel, Thomas. “What Is It Like to Be a Bat?” The Philosophical Review, 1974: 435-450.

3

HOT THEORY: THE MENTALISTIC
REDUCTION OF CONSCIOUSNESS

Box 3.1 • Preview

The higher-order thought (HOT) theory of consciousness asserts that a mental
state is conscious if it is the object of a thought about it. Given that we have
some naturalistically acceptable understanding of thoughts independent
of the problem of consciousness, HOT theory promises a mentalistic
reduction of consciousness. Then, the naturalistic account of non-conscious
mind – which is presumably relatively easy to attain – solves the whole
mind–body problem. HOT theory makes substantial assumptions. It assumes
that the mind’s contents divide into the intentional (or representational)
and the non-intentional (qualia, sensations). It assumes that consciousness
requires conceptual thought, and what is more, requires apparently pretty
sophisticated concepts about mental states as such. It assumes that no mental
state is essentially a conscious state. It comes dangerously close to assuming
that consciousness is always and only of mental states. Not all these
assumptions are plausible, and they lead to many objections (e.g. can
animals, to whom the ability to engage in conceptual thought may be
doubted, be conscious; what is an unconscious pain, etc.). Some objections
can be deflected, but problems remain that engage the generation problem
and prevent the mentalistic reduction from going through successfully.

Philosophers have always been attracted by projects aiming to reduce consciousness
to Something Else, even if this reduction might require a more or less radical
reconception of our understanding of consciousness. They have been motivated
by the hope that, as compared to consciousness, the Something Else would prove
more tractable to analysis and would fit more easily into the physicalist world
view (here it is perhaps encouraging that, compared to consciousness, almost
anything else would possess these relative virtues). In the tradition of Descartes,
consciousness was supposed to exhaust the realm of the mind, which itself thus
became something immediately apparent and open to the mind’s own self-inspection
(inasmuch as conscious states of mind were somehow essentially self-
intimating). There is of course something intuitively appealing to such a thesis
but we have long since lost any sense that it must be true and are now happy to
countenance legions of unconscious mental states and hosts of cognitive processes
existing beneath or behind our conscious mental states. As we saw in chapter 1,
even Descartes ended up endorsing a form of the view that finds cognition, or
cognition-like phenomena outside of consciousness. A second traditional idea,
one stemming from the empiricist heritage, is that there are basic or ‘atomic’
elements of consciousness which are pure sensory qualities and from which all
‘higher’ states of consciousness are constructed, either by complex conjunction or
mental replication, or both. Hume, for example, calls these atomic elements the
simple impressions.1 The impressions are the truly immediate objects of
consciousness and their occurrence is supposed to be entirely independent of
thought. The radical proposal of the HOT theories is to deny this last claim. What
if consciousness were in fact dependent upon certain sorts of thoughts which
themselves were part of the now admissible zone of unconscious mentation?

The appealing possibility is that consciousness is somehow a definable
relation holding between certain mental states, where the latter do not already
essentially involve consciousness and, of course, are in themselves less puzzling
than consciousness itself. A mentalistic reduction of consciousness would have
several virtues. The explanation of consciousness in terms of mentality would
avoid the direct explanatory leap from consciousness to the physical, a leap
which has always seemed somewhat to exceed philosophy’s strength. If
consciousness can be reduced to anything at all, it is evidently more plausible
that it be to something already mental than directly to brute matter. Yet mental
states which do not intrinsically involve consciousness can be seen as ‘closer’ to
the natural, physical world, and so this sort of reduction promises to build a
bridge across our explanatory gap, supported by intermediate mental structures
which can be linked to both sides with relative ease.

In order to evaluate such a project we require a precise specification of, first,
the relevant non-conscious mental states and, second, the relation between them
that is to account for consciousness. One such reductive theory, distinguished by
its clarity and detailed presentation, has been advanced by David Rosenthal, first
in ‘Two Concepts of Consciousness’ (1986) and then in a series of papers that
have appeared over the last decade (see for example 1993a, 1993b, 1995). My
aim here is to review Rosenthal’s theory and to argue that, in the end, it fails to
reduce consciousness successfully. I will not claim outright that any theory of the
sort we are considering must similarly fail, but I confess that the wide scope and
extensive development of Rosenthal’s theory makes me doubt whether there are
other theories of this sort which differ significantly from it. Thus I hope my
objections will possess a quite general applicability.2

Rosenthal begins by dividing mental states into the two traditional, and
presumably exhaustive, classes: intentional mental states (e.g. beliefs, hopes,
expectations, etc.) and phenomenal or sensory mental states (e.g. pains, visual
sensations, etc.).3 For now I’ll follow Rosenthal in this distinction, but it is in fact
a substantial assumption which I shall doubt for much of the rest of this book, and
one that is curiously unsupported by the details of the HOT theory. Rosenthal
understands this distinction in terms of a division of mentalistic properties, so:

All mental states, of whatever sort, exhibit properties of one of two
types: intentional properties and phenomenal, or sensory, properties.
. . . Some mental states may have both intentional and phenomenal
properties. But whatever else is true of mental states, it is plain that we
would not count a state as a mental state at all unless it had some
intentional property or some phenomenal property.

(1986, p. 332)

The first demand of theory specification is then met by asserting that no mental states
are intrinsically or essentially conscious. This sweeping assertion would appear to be
necessary to ensure the completeness of the theory, for otherwise there would remain
a species of consciousness – the essential, non-relational sort of consciousness – for
which the theory would offer no account. The claim that mental states are not
intrinsically conscious is most plausible for the intentional states and least plausible
for the phenomenal states, but there are some intuitive grounds for both. It is
undeniable that we frequently ascribe intentional states of which we claim the subject
is not conscious, even as we also claim that these intentional states are part of the
causes and explanation of the subject’s behaviour. As for phenomenal states, Rosenthal
offers this:

Examples of sensory states that sometimes occur without
consciousness are not hard to come by. When a headache lasts for
several hours, one is seldom aware of it for that entire time. . . . But we
do not conclude that each headache literally ceases to exist when it
temporarily stops being part of our stream of consciousness, and that
such a person has only a sequence of discontinuous, brief headaches.

(1986, p. 349)

Of course, this is contentious, for one naturally wants to draw a distinction between
the headache and the persistent condition that underlies it. The ache, which is the
mental component, is indeed discontinuous but we allow the persistence of the
underlying cause to guide our speech, even though the underlying cause is
occasionally blocked from having its usual effect on consciousness. One wants to say
that the ache is a sensing of this underlying condition and this sensing is not
continuous. By analogy, if we are watching a woodpecker move through a dense
wood for an extended time we will not actually be seeing the bird throughout that
time. We nonetheless say that we watched the woodpecker for an hour. However, on
Rosenthal’s side, I should point out that in cases where the headache can be felt
whenever attention is directed towards it we are, I think, rather more inclined to say
that the headache itself persisted even during the time it was not being consciously
experienced. This sort of neglected but continually accessible sensation is quite
common. If, even upon introspection, nothing was felt we would be reluctant to say
that the ache might still ‘be there’, whether or not the underlying condition persisted.
Of course, such considerations do not sever the relation between certain mental states
and consciousness, but they do make that relation more complex.

Box 3.2 • Essential HOT Theory

For α to be a conscious mental state, the subject must have a higher-order
thought about α. But not just any sort of thought, brought about in any sort
of way, will do. Roughly speaking, we can say that for α to be conscious
one must have the ‘properly’ acquired belief that one is in α. So HOT theory
defines consciousness as follows:

α is a conscious state of S if and only if (iff)
(1) S is in the mental state α,
(2) S has an ‘appropriate’ thought about α (we’ll call having this thought ‘being in the state T[α]’; the content of T[α] is something like ‘I am in state α’),
(3) S’s being in α causes S’s being in T[α],
(4) S’s being in α does not cause S’s being in T[α] via inference or sensory information.

Each clause is necessary to avoid potential objections. It follows from HOT
theory that to be conscious of anything is to be conscious of it as something-
or-other. Every state of consciousness is ‘aspectual’. This follows from the
fact that every thought must be, so to speak, structured from concepts. But
it does not follow from HOT theory that anything has an essential conceptual
aspect under which one must be conscious of it. It also follows from HOT
theory that one can’t be conscious without having beliefs (i.e. the appropriate
higher-order thought). But it does not follow that when one is conscious of
a mental state that one is conscious of a belief. To be conscious of such
beliefs requires yet higher-order thoughts about them.

In any case, I don’t want to press this point since HOT theory may offer an
explanation of why we tend to think that consciousness is intrinsic to certain mental
states. This involves the second specification task, the delineation of the relation
between non-conscious mental states that accounts for consciousness. Rosenthal
explains it so:

. . . it is natural to identify a mental state’s being conscious with one’s
having a roughly contemporaneous thought that one is in that mental
state. When a mental state is conscious, one’s awareness of it is,
intuitively, immediate in some way. So we can stipulate that the
contemporaneous thought one has is not mediated by any inference
or perceptual input. We are then in a position to advance a useful,
informative explanation of what makes conscious states conscious.
Since a mental state is conscious if it is accompanied by a suitable
higher-order thought, we can explain a mental state’s being conscious
by hypothesizing that the mental state itself causes that higher-order
thought to occur.

(1986, pp. 335–36)

Thus it is possible to maintain that if we tend to think of certain sorts of mental states
as essentially involving consciousness this can be explained as the mistaking of a
purely nomological link for a ‘metaphysical’ one. It might be, for example, that pains
are normally such as to invariably cause the second-order thought that one is in pain
and that abnormal cases are exceptionally rare (and, needless to say, rather hard to
spot). In fact, this does not seem at all implausible. The machinery of philosophical
distinctions mounted above is then seen as merely a case of philosophical error
forcing us into an unnecessarily complex view of pains. It is literally true, according
to the HOT Theory, that a pain – in possession of its painfulness – can exist without
consciousness of it, but in fact almost all pains will be attended by consciousness of
them, in virtue of causing the appropriate state of consciousness. One might even
hope to account for the strength and constancy of this nomological link by appeal to
its evolutionary usefulness. Rosenthal comes close to making this point (while actually
making another) when he says: ‘. . . people cannot tell us about their non-conscious
sensations and bodily sensations usually have negligible effect unless they are
conscious. So non-conscious sensations are not much use as cues to [bodily] well
being . . .’ (1986, p. 348). Nature would not likely miss the chance to entrench a causal
connection between sensations, whether of pleasure or pain, and consciousness that
is of such obvious biological benefit. Still, I believe that there remain serious difficulties
with this view of the consciousness of phenomenal mental states, but it will take some
effort to bring out my worries clearly.

Before proceeding let me introduce a piece of notation. We will frequently need
to consider both a mental state and the second-order thought to the effect that one is
in the former mental state. I will use Greek letters for mental states and form the
second (or higher) order mental states as follows: the thought that one is in mental
state α will be designated by T[α]. If necessary, we can allow this construction to be
iterated, so the thought that one is in the mental state of having the thought that one
is in the mental state α gets formally named T[T[α]], and so on. This notation allows
a succinct characterization of HOT theory:

For any subject, x, and mental state, α, α is a conscious state iff
(1) x is in α,
(2) x is in (or, more colloquially, has) T[α],
(3) x’s being in α causes x’s being in T[α],
(4) x’s being in α does not cause x’s being in T[α] via inference or sensory information.

Note that for α to be a conscious state, the subject, x, must be in T[α], but x will not
normally be conscious of T[α] as well. This would require x to be in the still
higher-order state T[T[α]]. Such higher-order thoughts are entirely possible but
relatively rare; we are not usually conscious that we are conscious (of some
particular mental state) and HOT theory’s explanation of this is quite satisfying.
HOT theory has many other virtues which are well remarked by Rosenthal himself.
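
[A compact restatement of the definition just given may help keep the orders of thought straight. The following LaTeX sketch is only a reader’s summary of clauses (1)–(4) and of the remark that consciousness of T[α] requires T[T[α]]; it adds nothing to Rosenthal’s theory, and the wording of the clauses is abbreviated.]

\documentclass{article}
\usepackage{amsmath}
\begin{document}
% Reader's summary of the HOT definition, using the text's T[.] notation.
% "alpha is a conscious state of x" unpacks into clauses (1)-(4):
\[
\alpha \text{ is conscious for } x \iff
\underbrace{x \text{ is in } \alpha}_{(1)} \;\wedge\;
\underbrace{x \text{ is in } T[\alpha]}_{(2)} \;\wedge\;
\underbrace{\alpha \text{ causes } T[\alpha]}_{(3)} \;\wedge\;
\underbrace{\text{not via inference or sensory information}}_{(4)}
\]
% The schema iterates one level up: T[alpha] is itself a conscious state
% only if x is also in T[T[alpha]] (with analogues of clauses (3) and (4)).
\[
T[\alpha] \text{ is conscious for } x \iff x \text{ is in } T[T[\alpha]]
\quad (\text{plus the causal clauses, one order higher})
\]
\end{document}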

Still, the definition as it stands fails to mark a crucial distinction the neglect
of which can lead to confusion. We must distinguish between α’s being a conscious
state of the subject x and x’s being conscious of α. Sometimes HOT theorists as
well as objectors appear to be conflating the idea that the subject has a second-
order thought about α which makes α a conscious state with the idea that the
subject is conscious of α in virtue of having the second-order thought. I think it
would be an unfortunate consequence if HOT theory entailed that one could be
conscious only of mental states. Most conscious states have an (intentional)
object; a conscious perception of a cat has the cat as its object and the subject in
such a state is conscious not of his state of consciousness but rather of the cat,
that is, the intentional object of the state of consciousness. In fact, it is very rare
for anyone to be conscious of a mental state, at least if it is a mental state with its
own intentional object, and despite the philosophical tradition it is entirely
mistaken to define consciousness as an apprehension of one’s own mental states.
So in a spirit of improvement and to forestall confusion, we can emend the
definition as follows. If α is a conscious state and the intentional object of α is ε
then we say that the subject is conscious of ε (in virtue of being in the conscious
state α). There may be, and Rosenthal assumes that there are, conscious states that
have no intentional objects. In such cases, saying that α is a conscious state is
equivalent to saying that the subject is aware of α. For example, if we suppose
that pains are ‘purely phenomenal’ states with no intentional objects then to be
conscious of a pain is just the same thing as the pain being conscious. But even
here we must be cautious. To be conscious of a pain in this sense is not to be
conscious of a pain as such. This is a much higher level affair demanding a state
of consciousness whose intentional object is the pain, conceived of as a pain. We
shall shortly see how attention to these distinctions can be important and can fit
rather nicely into the HOT theory.

It is worth digressing here to consider a line of objection to HOT theory
which I think ultimately fails. But the objection is interesting in at least three
ways: it endorses its own radical transformation of our notion of consciousness
and the reply to it reveals some subtle strengths of the HOT theory as well as
bringing out certain features crucial for the defence of a representational view of
consciousness. The attack is mounted by Fred Dretske (1993). Dretske’s objections
fundamentally depend upon a distinction between an experience’s being
conscious and someone’s being conscious of that experience, and the claim that
the former does not imply the latter. If Dretske is right about this we have not only
a powerful challenge to HOT theories, but also a substantial and, I would say, very
surprising extension of our knowledge about consciousness. However, I will try
to show that Dretske’s objections cannot be sustained, revealing on the way some
subtle strengths of HOT theories of consciousness.

Dretske follows Rosenthal’s use of some key concepts in setting forth his
objections. Some states of mind are conscious and some are not: state
consciousness is the sort of consciousness which conscious states enjoy.
Conscious states are always (we think) states of some creature which is conscious:
creature consciousness marks the difference between the conscious and the un-
or non-conscious denizens of the universe. Creature consciousness comes in two
flavours: transitive and intransitive. Transitive creature consciousness is a
creature’s consciousness of something or other; intransitive creature consciousness
is just the creature’s being conscious. Dretske allows that transitive creature
consciousness implies the intransitive form, or

(1) S is conscious of x or that P ⇒ S is conscious. (1993, p. 269)

Furthermore, transitive creature consciousness implies state consciousness:

(2) S is conscious of x or that P ⇒ S is in a conscious state of some sort.
(1993, p. 270)

A further crucial distinction is evident in (1) and (2) – the distinction between
what Dretske calls thing-consciousness and fact-consciousness or the distinction
between being conscious of an object4 and being conscious that such-and-such is
the case.

Dretske’s basic objection to HOT theories, although articulated in a number
of ways, can be briefly stated in terms of some further claims involving these
distinctions. The most significant is that, in a certain sense, state consciousness
does not require creature consciousness. That is, Dretske allows that states can be
conscious without their possessor being conscious of them or conscious that they
are occurring. Consider, for example, someone who is consciously experiencing
a pain. By hypothesis, this is a conscious experience. Dretske’s claim is that it is
a further and independent question whether this person is conscious of the pain
or is conscious that he or she is in pain, and one which need not always receive a
positive answer. If Dretske is correct, then HOT theories would appear to be in
trouble, for they assert an identity between a state’s being a conscious experience
of pain and the possession of the belief that one is in pain.

We must, however, re-emphasize a subtlety of the HOT theory here. The
belief that one is in pain, which according to HOT theories constitutes one’s
consciousness of the pain, does not itself have to be and generally will not be a
conscious state. One would be conscious of this belief only via a third-order state,
namely a belief that one believed that one was in pain. Thus one cannot refute the
HOT theory by claiming that it is possible for one consciously to experience pain
without consciously believing that one is in pain, that is, without being conscious
of a belief that one is in pain. HOT theories cheerfully embrace this possibility.
This is important because Dretske does not seem sufficiently to appreciate this
subtlety. He claims that HOT theories must make a negative answer to the following
question: ‘can one have conscious experiences without being conscious that one
is having them? Can there, in other words, be conscious states without the person
in whom they occur being fact-aware of their occurrence?’ (1993, p. 272). But,
plainly, HOT theories allow an affirmative answer to this question. To have a
conscious experience is, according to the theory, to believe that one is having it
but not necessarily to consciously believe that one is having it. To put the point
more generally in terms of the notation introduced above, to be conscious of α is
to be in the state T[α]; this says absolutely nothing about whether one is in the
state T[T[α]] or not, and it is the latter state that is required for T[α] to be conscious.
So, according to HOT theories we have, roughly,

S is conscious of pain = S believes that he is in pain,

so the correct analysis of fact-awareness must be along these lines:

S is conscious that he is in pain = S believes that he is in f(he is in pain),

where f is some self-ascription function. I would suggest that f(he is in pain)
should be cashed out as something like ‘. . . is in a state characterized by I am in
pain’.5 Of course, normally we are rapidly carried from the conscious pain to the
fact-awareness that we are in pain but this is a feature of our cognitive machinery,
not an analytic truth constraining HOT theories of consciousness. If one considers
animal consciousness the need to separate these states is apparent. HOT theories
must assert that an animal’s being conscious of something is the animal’s having
an appropriate thought. While this is a real difficulty for HOT theories of
consciousness, for there are many who would deny to animals the ability to have
thoughts of any kind and even more who would deny that they have thoughts
about their own mental states, this is not the difficulty Dretske advances.6 It is
natural to say that animals can be conscious of pains but that they cannot be
conscious that they are in pain. However, given that animals can have some,
perhaps quite ‘primitive’, thoughts (and the HOT theory simply must address
animal consciousness in this way), the distinction is successfully accounted for
within HOT theories by the above analysis.

The worry that Dretske may not be taking this subtlety into account is
strengthened by his remark that: ‘HOT theories . . . take an experience to be
conscious in virtue of [its] being the object of some higher-order-thought-like
entity, a higher-order mental state that . . . involves the deployment of concepts.
My concern . . . therefore, was to show that conscious experience required no fact-
awareness . . .’ (1993, p. 279). Since HOT theories allow that experiences can be
conscious in the absence of fact-awareness of these experiences, this line of
attack is, strictly speaking, misguided. It may be that Dretske meant to assert no
more by ‘fact-awareness of p’ than ‘belief that p’, without any implication that
these beliefs are themselves conscious. Such an interpretation would not be foreign
to common usage and would lead immediately to the objection against HOT
theories considered below. But Dretske actually says that ‘consciousness of a fact
[which must surely be fact-awareness] . . . requires a conscious belief that this is a
fact’ (1993, p. 272, my emphasis). HOT theories do not require this, and would
consider it an unnecessary leap to third-order thoughts.

Box 3.3 • Dretske’s Objection

Since HOT theory makes every conscious state the object of a thought
about it, every conscious state has an associated conceptualization of it, as
given in the thought that ‘makes it’ conscious. Dretske objects that it is
possible for there to be conscious experience without any of what he calls
fact awareness. Fact awareness is consciousness of facts, which are conceptual
entities; an example would be an awareness that snow is white. One can be
aware of white snow without being aware that snow is white (one can even
be aware of the whiteness of snow without being aware that snow is white).
But HOT theory does not require any consciousness of facts for there to be
conscious experience; it only demands that there be some conceptual
categorization of the experience which is itself generally not conscious.
Dretske’s basic objection can thus be countered. Dretske can, however,
further deny that every conscious experience requires some
conceptualization of it. However, while one can plausibly argue that no
conscious experience has a mandatory conceptualization, it is very difficult
to show that some conscious experience has no conceptualization. HOT
theory asserts rather that every consciousness is a consciousness as of. . . .
Contrary to Dretske, this seems entirely plausible.

In any case, HOT theories do assert an intimate connection between conscious
experiences and beliefs about those …
