Content and Consciousness


by Daniel C. Dennett


  3 G. E. M. Anscombe, Intention, p. 14.

  4 H. Putnam, ‘Minds and Machines’, Dimensions of Mind, ed. S. Hook, New York, 1961, pp. 148–79.

  5 Ibid., p. 154.

  6 Ibid., pp. 155–60.

  7 See, e.g., N. Chomsky, Syntactic Structures, ’s-Gravenhage, 1957; Chomsky, ‘A Review of B. F. Skinner’s Verbal Behavior’, reprinted in The Structure of Language, eds. J. Fodor and J. Katz, Englewood Cliffs, 1964, Sec. XI, pp. 574–8; Chomsky, ‘Three Models for the Description of Language’, I.R.E. Transactions on Information Theory, Vol. IT-2 (Sept. 1956), pp. 113–24; also K. S. Lashley, ‘The Problem of Serial Order in Behavior’, Hixon Symposium on Cerebral Mechanisms in Behavior, ed. L. A. Jeffress, New York, 1951, pp. 112–36; and E. Lenneberg, ‘The Acquisition of Language’ in Fodor and Katz, op. cit., pp. 579–603.

  8 W. Penfield, The Excitable Cortex in Conscious Man, Liverpool, 1958.

  9 In the past, some psychologists have wanted to hold that this phenomenon involved incipient but not quite detectable muscle movement; although this can no doubt occur, it is no longer seen as necessary.

  10 I am indebted to Dennis Stampe for raising these questions about speech acts and intentions.

  11 In the light of this view, Wittgenstein’s view that ‘the verbal expression of pain replaces crying and does not describe it’ may seem on reflection to be correct, except for the suggestion it carries – due to its failure to distinguish the personal from sub-personal level – that a verbal expression of pain (viewed from the personal level) is not intentionally a bona fide report, but rather an outcry of sorts.

  6

  AWARENESS AND CONSCIOUSNESS

  1 I mean ‘speech centre’ in the functional or logical sense developed in Chapter 5, of course, and not in any standard anatomical sense. It is my impression that this definition of awareness is congenial with G. Bergmann’s remarks on awareness in Meaning and Existence, Madison, Wisc., 1960. If I understand him right he too is concerned with what one might call the functional location of the contents of awareness.

  2 These very rough definitions leave many questions unanswered, e.g., what does ‘effective’ mean in (2), and how big a role must an event play in directing behaviour to count? Since I do not intend to build on these definitions but merely use them to illustrate a distinction, there is no point in elaborating answers to these interesting and important questions here. As it is, these unwieldy definitions can serve as reference points for the uses of the terms in what follows, but for the sake of less cumbersome expression I shall revert to the ‘aware of’ form where it is convenient, speaking of a creature being aware1 of something or aware2 of something, letting it be understood that in these cases the creature is aware1 or aware2 that p, where p is a statement informing us about the ‘object’ of awareness.

  3 K. M. Sayre, ‘Human and Mechanical Recognition’ in K. M. Sayre and F. J. Crosson, eds., The Modeling of Mind, Notre Dame, 1963, pp. 157–70.

  4 Recognition without awareness1 would be the natural way to describe what occurs when we scan a list of words for, say, a colour word, discarding all non-colour words without noticing or being aware of what they are. Clearly this must be recognition with awareness2 and without awareness1. See, e.g., U. Neisser, ‘Visual Search’, Scientific American, 210 (June) 1964. On subconscious testing and awareness, see also my ‘Machine Traces and Protocol Statements’, Behavioral Science, Mar. 1968, pp. 155–62.

  5 M. Scriven, ‘The Mechanical Concept of Mind’, Mind, LXII, 1953, reprinted in Minds and Machines, ed. A. R. Anderson, Englewood Cliffs, 1964, p. 33.

  6 K. S. Lashley, ‘Cerebral Organization and Behavior’, The Brain and Human Behavior (Proc. of the Association for Research in Nervous and Mental Disease, Vol. 36), eds. H. C. Solomon, S. Cobb and W. Penfield, Baltimore, 1958, p. 4.

  7 See, e.g., Taylor, op. cit., pp. 58–67, and D. Hamlyn’s review of Taylor in Mind, LXXVI, 1967, p. 130.

  8 Cf. Anscombe, Intention, p. 86, where this conflict is clearly presented but not clearly resolved.

  7

  MENTAL IMAGERY

  1 Optimists who doubt that mental images are still taken seriously in philosophy and even in science are invited to peruse two recent anthologies, R. J. Hirst, ed., Perception and the External World, New York, 1965, and J. R. Smythies, ed., Brain and Mind, Modern Concepts of the Nature of Mind, London, 1965. The wealth of cross-disciplinary confusions over mental images is displayed in both volumes, which both include papers by philosophers, psychologists and neurophysiologists. Neither editor seems to think that much of what he presents is a dead horse, which strengthens my occasionally flagging conviction that I am not beating one. On the other hand there are scientists who have expressed clear and explicit rejections of imagistic confusions. See, e.g., G. W. Zopf, ‘Sensory Homeostasis’ in Wiener and Schadé, op. cit., esp. p. 118, and D. M. MacKay, ‘Internal Representation of the External World’, unpublished, read at the Avionics Panel Symposium on Natural and Artificial Logic Processors, Athens, July 15–19, 1963.

  2 H. B. Barlow, ‘Possible Principles Underlying the Transformations of Sensory Messages’ in Sensory Communication (op. cit.) offers a particularly insightful account of the ‘editorial’ function of afferent neural activity and the depletion of information that is the necessary concomitant of such analysis.

  3 J. M. Shorter, ‘Imagination’, Mind, LXI, 1952, pp. 528–42.

  4 Counter-examples spring to mind, but are they really counterexamples? All the ones that have so far occurred to me turn out on reflection to be cases of imagining myself seeing – with the aid of large mirrors – the inside and outside of the barn, imagining a (partially) transparent barn, imagining looking in the windows and so forth. These are all from a point of view in the sense I mean. A written description, however, is not bound by these limitations; from what point of view is the description: ‘the barn is dark red with black rafters and a pine floor’?

  5 In the unusual phenomenon of ‘eidetic imagery’, the subject can read off or count off the details of his ‘memory image’, and this may seem to provide the fatal counter-example to this view. (See G. Allport, ‘Eidetic Imagery’, British Journal of Psychology, XV, 1924, pp. 99–120.) Yet the fact that such ‘eidetic memory images’ actually appear to be projected or superimposed on the subject’s normal visual field (so that if the subject shifts his gaze the position of the memory image in his visual field remains fixed, and ‘moves with the eye’) strongly suggests that in these cases the actual image of retinal stimulation is somehow retained at or very near the retina and superimposed on incoming stimulation. In these rare cases, then, the memory mechanism must operate prior to afferent analysis, at a time when there still is a physical image.

  6 Penfield, op. cit. Some of Penfield’s interpretations of his results have been widely criticized, but the results themselves are remarkable. It would be expected that hallucinations would have to be the exception rather than the rule in the brain for event-types to acquire content in the first place, and this is in fact supported by evidence. Amputees usually experience ‘phantom limb’ sensations that seem to come from the missing limb; an amputee may feel that he not only still has the leg, but that it is itching or hot or bent at the knee. These phenomena, which occur off and on for years following amputation, are nearly universal in amputees, with one interesting exception. In cases where the amputation occurred in infancy, before the child developed the use and coordination of the limb, phantom limb is rarely experienced, and in cases where amputation occurred just after birth, no phantom limb is ever experienced (see M. Simmel, ‘Phantom Experiences following Amputation in Childhood’, Journal of Neurology, Neurosurgery and Psychiatry, XXV, 1962, pp. 69–78).

  7 Other phenomena less well known to philosophers also favour a descriptional explanation. See, e.g., W. R. Brain’s account of the reports of patients who have their sight surgically restored, in ‘Some Reflections on Mind and Brain’, Brain, LXXXVI, 1963, p. 381; the controversial accounts of newly sighted adults’ efforts to learn to see, in M. von Senden, Raum- und Gestaltauffassung bei operierten Blindgeborenen vor und nach der Operation, Leipzig, 1932, translated with appendices by P. Heath as Space and Sight, the Perception of Space and Shape in the congenitally blind before and after operation, London, 1960; I. Kohler’s experiments with inverting spectacles (a good account of these and similar experiments is found in J. G. Taylor, The Behavioral Basis of Perception, New Haven, 1962); and the disorder called simultanagnosia, M. Kinsbourne and E. K. Warrington, ‘A Disorder of Simultaneous Form Perception’, Brain, LXXXV, 1962, pp. 461–86 and A. R. Luria, et al., ‘Disorders of Ocular Movement in a Case of Simultanagnosia’, Brain, LXXXVI, 1963, pp. 219–28.

  8 Cf. Wittgenstein, ‘But the existence of this feeling of strangeness does not give us a reason for saying that every object we know well and which does not seem strange to us gives us a feeling of familiarity’, op. cit., i. 596. See also i. 597, i. 605.

  9 Muntz, op. cit. and Wooldridge, op. cit., pp. 46–50.

  10 Having found no room for images in the sub-personal account of perception, we can say that ‘mental image’ and its kin are poor candidates for referring expressions in science; having found further that nothing with the traits of genuine images is to be found at the personal level either, we can conclude that ‘mental image’ is valueless as a referring expression under any circumstances.

  11 J. J. C. Smart discusses this and other colour phenomena in Philosophy and Scientific Realism, London, 1963, Ch. IV and cites (with some philosophically negligible distortion) the particularly remarkable findings of E. H. Land in ‘Experiments in Color Vision’, Scientific American, No. 200 (May) 1959, pp. 84–99.

  12 But not infinite. It is easy to make a device (a colour discriminator) that gives the same output (produces the same colour experience) when given a number of different inputs with no distinguishing feature in common. Suppose we choose three such inputs. Then we simply make three receptors each sensitive to just one of these and wire them disjunctively to the output, so that any one of them firing causes the output to fire. A single receptor can be sensitive only to conditions having some physical similarity (unless it in turn branches disjunctively into a number of receptors), so an infinite number of different conditions which could not be sorted into a finite number of families of conditions would require an infinite number of different receptors in the discrimination system.
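  The disjunctive wiring described in note 12 can be sketched in code. This is only a toy illustration of the logical point, not anything in the text: the three stimulus labels and function names below are invented for the example. Three receptors, each sensitive to one physically dissimilar condition, feed disjunctively (an OR) into a single output, so the device gives the same output for inputs that share no distinguishing physical feature.

```python
def make_discriminator(receptors):
    """Build a discriminator whose output fires if any one receptor fires.

    `receptors` is a list of predicates, each sensitive to exactly one
    kind of input; wiring them disjunctively means taking their OR.
    """
    def discriminate(stimulus):
        return any(receptor(stimulus) for receptor in receptors)
    return discriminate


# Three hypothetical inputs with no distinguishing feature in common,
# each detected by its own dedicated receptor:
receptors = [
    lambda s: s == "wavelength-540nm",
    lambda s: s == "pressure-pattern-7",
    lambda s: s == "electrical-pulse-3",
]

same_output = make_discriminator(receptors)
```

Any of the three inputs makes `same_output` fire; anything else does not. The note’s point about finitude falls out of the construction: each physically dissimilar input family needs its own receptor, so infinitely many such families would require infinitely many receptors.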

  13 D. M. Armstrong has advanced this position in discussion with me.

  8

  THINKING AND REASONING

  1 Cf. S. Munsat, ‘What is a Process?’, American Philosophical Quarterly, 1969, pp. 79–83.

  2 Ryle, op. cit., p. 286ff.

  3 Ibid., p. 285.

  4 Striking cases of confusion of level often occur in the casual talk of neurologists and psychophysicists. Once a psychophysicist, in explaining to me his interpretation of certain data concerning afferent analysis in visual perception, said of his human subject, ‘he performs a statistical analysis on the incoming information, making decisions with regard to what is mere noise and what is significant, by trying to maintain a low false-positive rate.’ The person, of course, does nothing of the kind; the person in this instance was just looking for spots on pieces of paper.

  5 J. Bennett, Rationality, London, 1964, p. 117.

  6 See Feigenbaum and Feldman’s anthology, op. cit., esp. Newell and Simon, op. cit. See also for a critical view, H. Dreyfus, ‘Alchemy and Artificial Intelligence’, RAND Memo, P 3244, Dec. 1965, and my rebuttal, ‘Machine Traces and Protocol Statements’.

  7 Cf. A. M. Turing’s famous paper, ‘Computing Machinery and Intelligence’, sec. 7, in Mind, LIX, 1950, pp. 433–60, reprinted in Anderson, Minds and Machines.

  8 Anscombe, Intention, p. 80.

  9 D. Davidson, ‘Actions, Reasons and Causes’, Journal of Philosophy, Vol. LX, No. 23 (Nov. 1963), pp. 685–700.

  10 Anscombe, op. cit., p. 15.

  11 Ibid., pp. 23–4.

  9

  ACTIONS AND INTENTIONS

  1 Anscombe, op. cit., pp. 24–5.

  2 Ibid., p. 82.

  3 Ibid., p. 82.

  4 Ibid., p. 57.

  5 Ibid., p. 25.

  6 This account of intentional actions carries the corollary that that in virtue of which a particular motion is a particular intentional action is in principle – if not in practice – physically determinable in some centralist theory. Although the account here is in substantial agreement with Anscombe’s analysis, she has an argument (pp. 28–9) purporting to disprove this corollary, to establish the unbridgeable Intentionalist gap for intentional actions. For a rebuttal, see my ‘Features of Intentional Actions’, Philosophy and Phenomenological Research, XXIX, 1968, pp. 232–44.

  7 Anscombe, op. cit., pp. 48–9.

  8 Ibid., p. 52.

  9 D. M. MacKay points out that the relatively macroscopic size of neuronal events and the redundancy requirements noted in Chapter 3 rule out any cumulative effect on behaviour of quantum level randomness (‘Brain and Will’, Listener, 57 (1957), pp. 788–9).

  10 Anscombe, op. cit., p. 24.

  11 Ibid., p. 34.

  10

  LANGUAGE AND UNDERSTANDING

  1 Cf. M. Scriven in D. M. MacKay et al., ‘Computers and Comprehension’ (op. cit.): ‘One either knows three threes are nine or one doesn’t. It isn’t the kind of thing one is said to understand’ (p. 33).

  2 E. Edwards, Information Transmission, London, 1964, p. 39.

  3 D. M. MacKay, ‘Linguistic and Non-Linguistic “Understanding” of Linguistic Tokens’, p. 42. The Intentionality of intentions forces a revision of this if we are to be rigorous. Few people, if any, would ever intend an utterance to have a selective function on the range of possible states of the receiver (‘I intended no such thing. I merely was trying to tell him something!’). We can say that the effect which is the necessary condition of his intention being fulfilled is this selective function. I do not want to suggest in any case that this is the definition of the meaning of an utterance, but only that it is an illuminating one.

  4 Ibid., p. 9.

  INDEX

  access see introspection

  act/action 22, 37–8;

  intentional 39, 43, 147, 184–200;

  mental 147, 214

  activity 195–6, 212;

  personal 103–4, 166;

  reasons 187, 191, 193

  adaptiveness see appropriateness

  afferent 51, 58;

  -efferent connection 52, 54–7, 62–5, 68–70, 76–9, 81, 83–7, 91, 93, 102, 106, 186–7;

  ascription 82, 92, 94;

  awareness 137;

  introspection 124;

  mental imagery 154;

  thinking 173

  Allport, G. 228

  ambiguity 9, 60, 62, 79, 83, 90, 143, 177

  analysis/analyticity 8, 14–16;

  actions 184;

  ascription 81, 83, 86, 92–3, 102–3, 105–6;

  awareness 134, 137, 144–5, 147;

  evolution 51, 56, 59;

  imagery 151–2, 154, 159–60, 168–9;

  intentionality 27–8;

  introspection 111, 124–5;

  language 214–15;

  thinking 168–9, 177

  Anscombe, G.E.M. 112–13, 174, 180–1, 184–5, 187–90, 192, 195, 199–200, 220, 225, 227, 230–1

  Anselm 23

  aphasia 74, 120

  appropriateness 40–1, 44;

  ascription 80, 90, 96;

  evolution 50–1, 54, 62–4, 75;

  thinking 171

  Arbib, M. 61, 223

  Aristotle 78, 174–5

  Armstrong, D.M. 229

  attention 45, 65, 111, 132–3, 135, 137, 139–40, 157, 185

  avowal 112, 225

  awareness 128–49, 154–5;

  intentions 184–9, 191–2, 194–8, 200;

  reasoning 164, 168, 171, 173, 176, 180

  Barlow, H.B. 227

  baroque 64, 69

  behaviourism 25, 35–7, 40–2, 45, 69, 82–3, 101

  belief 23, 27;

  ascription 81, 86–8, 91, 100, 107;

  goals 78;

  intentions 34, 37, 40, 44, 184, 187, 191, 201–3;

  mental imagery 159;

  reasoning 164;

  understanding 201–3

  Bennett, J. 167, 230

  Bergmann, G. 226

  Berkeley, G. 3

  bracketing see epoché

  brain 46–7, 51–2;

  ascription 80–4, 86, 88–9, 92–5, 97–8, 101–6, 108;

  consciousness 136–7, 139–41, 144, 146;

  goals 70–1, 77–9;

  intentions 193–6, 200;

  introspection 111–12, 118–20, 125–7;

  mental imagery 150–2, 155;

  reasoning 164, 167, 170, 173–5;

  structures 55–7, 59–68;

  understanding 213–15

  Brain, W.R. 228

  Brentano, F. 21–5, 29, 31, 35, 220

  brute force computing 169–70

  Carnap, R. 221

  category 7–8, 12, 14, 45, 107, 218

  cause 4, 8;

  ascription 85, 88, 91–2, 94;

  goals 70;

  intentions 37–9, 193, 195, 197–8;

 
