Want to know more about uses of the term "consciousness"?

Try the entry on 'consciousness' in S. Guttenplan (ed) A Companion to
Philosophy of Mind, Blackwell, 1994.  Or the entry 'Consciousness in
Anglo-American Philosophy' in The Encyclopedia of Philosophy Supplement,
Macmillan, 1996.  Here is a version of an encyclopedia article on this topic:


T. H. Huxley said "How it is that anything so remarkable as a state of
consciousness comes about as a result of irritating nervous tissue, is just
as unaccountable as the appearance of the Djin when Aladdin rubbed his lamp."
We have no conception of our physical or functional nature that allows us
to understand how it could explain our subjective experience, or so says
one point of view on consciousness.  This issue has dominated the
discussion of consciousness in recent years.  Neuroscientists have
hypothesized that the neural basis of consciousness is to be found in
certain phase-locked 40 Hz neural oscillations.  But how does a 40 Hz
neural oscillation explain what it's LIKE (in Nagel's memorable phrase) to
be us?   What is so special about a 40 Hz oscillation as opposed to some
other physical state?  And why couldn't there be creatures with brains just
like ours in their physical and functional properties, including their 40
Hz oscillation patterns, whose owners' experiences were very unlike ours,
or who had no subjective experiences at all?  One doesn't have to suppose
that there really could BE creatures with brains just like ours who have
different experiences or no experiences in order to demand an account of why NOT.
But no one has a clue about how to answer these questions.  This is the
heart of the mind-body problem.

        Consciousness in the sense discussed is phenomenal consciousness.
"What's that?", you ask.  There is no non-circular definition to be
offered; the best that can be done is the offering of synonyms, examples
and one or another type of pointing to the phenomenon.  For example, I used
as synonyms `subjective experience' and `what it is like to be us'.  In
explaining phenomenal consciousness, one can also appeal to conscious
properties or qualities, e.g. the ways things seem to us or immediate
phenomenological qualities.  Or one can appeal to examples: the ways things
look or sound, the way pain feels and more generally the experiential
properties of sensations, feelings and perceptual experiences.  I would
also add that thoughts, wants and emotions often have characteristic
conscious aspects, and that a difference in representational content can
make a phenomenal difference.  Seeing something as a cloud differs from
seeing it as a part of a painted backdrop.  What it is like to hear
Bulgarian spoken depends on whether one understands the language.

        We gain some perspective on the explanatory gap if we contrast the issue
of the physical/functional basis of consciousness with the issue of the
physical/functional basis of thought.  In the case of thought, we do have
some theoretical proposals about what thought is, or at least what human
thought is, in scientific terms.  Cognitive scientists have had some
success in explaining some features of our thought processes in terms of
the notions of representation and computation.  There are many
disagreements among cognitive scientists: especially notable is the
disagreement between connectionists and classical "language of thought"
theorists. However, the fact is that in the case of thought, we actually
have more than one substantive research program and their proponents are
busy fighting it out, comparing which research program handles which
phenomena best.  But in the case of consciousness, we have
nothing--zilch--worthy of being called a research program, nor are there
any substantive proposals about how to go about starting one. Researchers
are stumped.  There have been many tantalizing discoveries recently about
neuropsychological syndromes in which consciousness seems to be in some way
missing or defective, but no one has yet come up with a theoretical
perspective that uses these data to narrow the explanatory gap, even a
little bit.

Needless to say, there are many different attitudes towards this problem,
but five of them stand out.  First, we might mention eliminativism, the
view that consciousness as understood above simply does not exist (P. S.
Churchland, 1983; Dennett, 1988; Rey, 1983).  So there is nothing for there
to be an explanatory gap about.  Second, we have various forms of
reductionism, notably functionalism and physicalism.  According to these
views, there is such a thing as consciousness, but there is no singular
explanatory gap, that is, there are no mysteries concerning the physical
basis of consciousness that differ in kind from run-of-the-mill unsolved
scientific problems about the physical/functional basis of liquidity,
inheritance or computation.  On this view, there is an explanatory gap, but
it is unremarkable. (Dennett, 1991 flirts with both this position and
eliminativism.)  A third view is what Flanagan (1992) calls the new
mysterianism.  Its most extreme form is transcendentalism, the view that
consciousness is simply not a natural phenomenon and is not explainable in
terms of science at all.  A less extreme form of new mysterianism is that
of McGinn (1991), which concedes that consciousness is a natural phenomenon
but emphasizes our problem in understanding the physical basis of
consciousness.  McGinn argues that there are physical properties of our
brains that do in fact explain consciousness, but though this explanation
might be available to some other type of being, it is cognitively closed
off to us.  A fourth view, naturalism (Flanagan, 1992, Searle, 1992), holds
that though there may be important differences between a naturalistic
explanation of consciousness and naturalistic explanations of other
phenomena, there is no convincing reason to regard consciousness as
non-natural or unexplainable in naturalistic terms.  This view is suggested
by Nagel's remark that we are like the person ignorant of relativity theory
who is told that matter is a form of energy but who does not have the
concepts to appreciate how there could be chains of reference-links leading
from a single phenomenon to both `matter' and `energy'.  The explanatory
gap exists--and we cannot conceive of how to close it--because we lack the
scientific concepts.  But future theory may provide those concepts.  A
fifth view says the gap is unclosable, but not because we cannot find the
right physical concepts.  Rather, it is unclosable because reductive
explanation requires an a priori functional analysis of the phenomenon to
be explained, and no such analysis can be given of our concepts of
conscious experience (Levine, 1993).  The unavailability of these functional
analyses is no accident: if our only concept of consciousness were one that
could be analyzed functionally, we would need a different concept of
consciousness to capture the features of experience that give rise to the
explanatory gap.

OTHER CONCEPTS OF CONSCIOUSNESS

Thus far I have been talking about phenomenal consciousness.  But there
are other concepts of
consciousness--cognitive or intentional or functional concepts of
consciousness--that are often not distinguished from it, and it is common
for deflationists or reductionists about phenomenal consciousness to
tacitly slide from phenomenal consciousness to one or another of these
cognitive or intentional or functional concepts.  (See Dennett, 1991, and
Block, 1995.)  I will mention three such concepts of consciousness:
self-consciousness, monitoring-consciousness and access-consciousness.

(1) Self-consciousness is the possession of the concept of the self and
the ability to use this concept in thinking about oneself.  There is reason
to think that animals or babies can have phenomenally conscious states
without employing any concept of the self.  To suppose that phenomenal
consciousness requires the concept of the self is to place an implausible
intellectual condition on phenomenal consciousness.  Perhaps phenomenally
conscious states have a non-conceptual content that could be described as
"experienced as mine", but there is no reason to think that this
representational aspect of the state exhausts its phenomenal properties.
After all, if both my experience as of blue and my experience as of red are
experienced as mine, we still need to explain the difference between the
two experiences; the fact that they are both experienced as mine will not
distinguish them. (The `as of' terminology is intended to preclude cases in
which red things don't look red.)

(2) Monitoring-consciousness takes many forms.  One form is "internal
scanning", but it would be a mistake to conflate internal scanning with
phenomenal consciousness.  As Rey (1983) notes, ordinary laptop computers
are capable of internal scanning, but it would be silly to think of one's
laptop as conscious.  Rey favors supposing that internal scanning is
sufficient for consciousness, if there is such a thing as consciousness,
and so he concludes that the concept of consciousness both includes and
precludes laptop computers' being conscious, and hence that the concept of
consciousness is incoherent.  But even if we acknowledge "internal scanning
consciousness",
we should drop the idea that internal scanning is sufficient for
phenomenal consciousness, and so we get no incoherence.

        Another form of monitoring consciousness is that of accompaniment by a
higher order thought.  That is, a conscious state is one that is
accompanied by a thought (grounded non-inferentially and
non-observationally) to the effect that one is in that state.  I favor a
liberal terminological policy, and so I have no objection to this idea as a
concept of consciousness.  But I do object to the idea (Rosenthal, 1986)
that phenomenal consciousness should be identified with
higher-order-thought consciousness.  One way to see what is wrong with that
view is to note that even if I were to come to know about states of my
liver non-inferentially and non-observationally--as some people just know
what time it is--that wouldn't make the states of my liver phenomenally
conscious (see Dretske, 1995).  Another objection is that phenomenal
consciousness does not require the intellectual apparatus that is required
for higher order thought.  Thus, the identification of phenomenal
consciousness with higher order thought shares the over-intellectualism of
the identification of phenomenal consciousness with self-consciousness.
Dogs and babies may have phenomenally conscious pains without thoughts to
the effect that they have those pains.

        A distinction is often made between state consciousness, or
intransitive consciousness, and consciousness-of, or transitive
consciousness (Rosenthal, 1986).  For example, if I say I'm nauseous, I ascribe a kind of
intransitive consciousness to myself, and if I say I am now seeing
something as a mosquito, I ascribe transitive consciousness.  The higher
order thought view purposely collapses these notions.  According to the
higher order thought view, a conscious state (intransitive consciousness)
of mine is simply a state that I am conscious of (transitive
consciousness), and consciousness-of is a matter of accompaniment by a
thought to the effect that I am in that state.  So what it is for a state
of mine to be conscious (intransitively) is for it to be accompanied by a
thought that I am in that state.

        This intentional conflation has an element of plausibility to it, which
can be seen by comparing two dogs, one of which has a perceptual state
whereas the other has a similar perceptual state plus a representation of
it.  Surely the latter dog has a conscious state even if the former dog
does not!  Quite so, because consciousness-of brings consciousness with
it.  But it is the  converse  that is problematic.  State consciousness
makes less in the way of intellectual demands than consciousness-of, and so
the first dog could be conscious without being conscious of anything.

(3) Access-consciousness does not make the intellectual demands of
self-consciousness or higher-order-thought consciousness, and for that
reason, reductionists about phenomenal consciousness would do better to
identify phenomenal consciousness with access-consciousness.  A state is
access-conscious if it is poised for global control.  To add more detail, an
access-conscious representation is poised for free use in reasoning and for
direct control of action and speech.  An access-conscious state is one that
consists in having an access-conscious representation.

        A good way to see the distinction between access-consciousness and
phenomenal consciousness is to note cases of one without the other.
Consider a robot with a computer brain that is behaviorally and
computationally identical to ours.  The question arises as to whether what
it is like to be that robot is different from what it is like to be us, or,
indeed, whether there is anything at all that it is like to be that robot.
If there is nothing it is like to be that robot, the robot is a zombie. If
zombies are conceptually possible, they certainly illustrate
access-consciousness without phenomenal consciousness.  But there is
widespread opposition to the conceptual coherence of zombies.  (See
Shoemaker, 1975, 1981; Dennett, 1991.)  So for illustrating
access-consciousness without phenomenal consciousness, I would rather rely
on a very limited sort of partial zombie.

        Consider blindsight, a neurological syndrome in which subjects seem to
have "blind" areas in their visual fields.  If the experimenter flashes a
stimulus to one of those blind areas, the patient claims to see nothing at
all.  But if the experimenter insists that the subject guess, and the
experimenter supplies a few alternatives, the blindsight patients are able
to "guess" reliably about certain features of the stimulus, features having
to do with motion, location and direction, and they are able to discriminate
some simple forms.  Consider a blindsight patient who "guesses" that there
is an `X' rather than an `O' in his blind field.  The patient has no
access-consciousness of the stimulus (because, until he hears his own
guess, he cannot use the information freely in reasoning or in rational
control of action), and it is plausible that he has no phenomenal
consciousness of it either.  Now imagine something that does not exist,
what we might call super-blindsight.  A real blindsight patient can only
guess when given a choice among a small set of alternatives (`X'/`O',
horizontal/vertical, etc.)  But suppose (apparently contrary to fact) that
a blindsight patient could be trained to prompt himself at will, guessing
what is in the blind field without being told to guess.  Visual information
from the blind field simply pops into his thoughts the way that solutions
to problems sometimes pop into ours or (to use an example given earlier)
the way some people just know what time it is without any special
perceptual experience.  The super-blindsight patient says there is
something it is like to see an `X' in his sighted field, but not in his
blind field, and we believe him.  This would be a case of
access-consciousness without phenomenal consciousness, a sort of partial
zombie.

        Here is an example of the converse of the zombie cases, namely
consciousness without access-consciousness.   It appears that some areas of
the brain specialize in reasoning and rational control of action, whereas
other areas subserve sensation.  If a person's brain has the former areas
destroyed, he is unable to use the deliverances of the senses to rationally
control action, to reason or to report sensibly, but he can still have
experiences.  Such a person has phenomenal consciousness without
access-consciousness.

        Here is a different sort of example.  Suppose that you are engaged in
intense conversation when suddenly at midnight you realize that there is now
and has been for some time a deafening pounding noise going on.  You had
raised your voice in response to the noise, but you had not noticed the
noise or that you had raised your voice.  You were aware of the noise all
along, but only at midnight were you consciously aware of it.  That is, you
were phenomenally conscious of the noise all along, but only at midnight
did you become access-conscious of it.  The period before midnight
illustrates phenomenal consciousness without access-consciousness.
`Conscious' and `aware' are roughly synonymous, so it is natural to use one
for the period before midnight, and both for the period after midnight when
there are two kinds of consciousness present.  The Freudian sense of
unconscious means access-unconscious.  Suppose a person was tortured in a
red room and represses that fact; Freudian theory allows visual images of
the red room that lead to desperate flight when the person is in a
similarly colored room, but mechanisms of repression can prevent thought,
reasoning and reporting about the torture and the room.  The vivid visual
image would be phenomenally conscious but access-unconscious.

        The cases I've mentioned of phenomenal consciousness without
access-consciousness are also counterexamples to the higher order thought
theory of phenomenal consciousness.  If the subject has no access to the
phenomenal state, he can't think about it either.  Before midnight, I have
a phenomenally conscious state caused by the noise but no thought to the
effect that I am in such a state.  The victim of repression has a
phenomenally conscious state that he is unable to think about.

        Akins (1993) has argued against the distinction between a
phenomenal and a
representational aspect of experience.  She keys her discussion to Nagel's
(1974) claim that we cannot know what it is like to be a bat, challenging
the reader to imagine that what it is like to be a bat is just what it is
like to be us--only all those experiences represent totally different
things.  Correctly, she says that you cannot imagine that.  That is
because, as I mentioned earlier, representational differences of a certain
sort make a phenomenal difference.  What it is like to hear a sound as
coming from the left is different from what it is like to hear a sound as
coming from the right. But from the fact that some representational
differences make a phenomenal difference, one should not conclude that the
phenomenal character of a conscious state is exhausted by its
representational content. Note, for example, that there are phenomenal
states that arguably have little or no representational content, orgasm for
example. (But see Tye, 1995 for the opposite view.)


Akins, K. (1993) `A bat without qualities'.  In M. Davies and G. Humphreys
(eds), CONSCIOUSNESS.  Blackwell: Oxford

Block, N. (1995) `On a confusion about a function of consciousness'.  THE
BEHAVIORAL AND BRAIN SCIENCES 18: 227-247.

Churchland, P.S. (1983) `Consciousness: the transmutation of a concept'.
PACIFIC PHILOSOPHICAL QUARTERLY 64: 80-95.

Dennett, D. (1988) `Quining qualia'.  In A. Marcel & E. Bisiach (eds),
CONSCIOUSNESS IN CONTEMPORARY SCIENCE.  Oxford University Press: Oxford

Dennett, D. (1991) CONSCIOUSNESS EXPLAINED.  Little, Brown: New York

Dretske, F. (1995) NATURALIZING THE MIND.  MIT Press: Cambridge

Flanagan, O. (1992) CONSCIOUSNESS RECONSIDERED.  MIT Press: Cambridge

Levine, J. (1993) `On leaving out what it is like'.  In M. Davies and G.
Humphreys (eds), CONSCIOUSNESS.  Blackwell: Oxford

Lycan, W. (1987) CONSCIOUSNESS.  MIT Press: Cambridge

McGinn, C. (1991) THE PROBLEM OF CONSCIOUSNESS.  Blackwell

Nagel, T. (1974) `What is it like to be a bat?'  PHILOSOPHICAL REVIEW 83:
435-450.

Rey, G. (1983) `A reason for doubting the existence of consciousness'.  In
R. Davidson, G. Schwartz, D. Shapiro (eds), CONSCIOUSNESS AND
SELF-REGULATION, vol. 3.  Plenum

Rosenthal, D. (1986) `Two concepts of consciousness'.  PHILOSOPHICAL
STUDIES 49: 329-359.

Searle, J. (1992) THE REDISCOVERY OF THE MIND.  MIT Press: Cambridge

Shoemaker, S. (1975) `Functionalism and qualia'.  PHILOSOPHICAL STUDIES
27: 291-315.

Shoemaker, S. (1981) `Absent qualia are impossible--a reply to Block'.
PHILOSOPHICAL REVIEW 90: 581-599.

Tye, M. (1995) TEN PROBLEMS OF CONSCIOUSNESS.  MIT Press: Cambridge


Ned Block, Professor of Philosophy and Psychology
Department of Philosophy, New York University, Main Building Room 503E
100 Washington Square East, New York NY 10003
TEL: 212-998-8322, Phil Dept: 212-998-8320; FAX: 995-4179
Web site: http://www.nyu.edu/gsas/dept/philo/faculty/block/