From a.sloman@cs.bham.ac.uk Sat Jun 24 17:07:23 2000
Date: Sat, 24 Jun 2000 17:08:12 +0100 (BST)
From: Aaron Sloman
To: "PSYCHE Discussion Forum (Biological/Psychological emphasis)"
Subject: Evolution and consciousness (Was: Re: A computational conundrum)
Cc: Max Velmans , Bernard Baars

Greetings all.

In a few days, at ASSC4, I look forward to attaching faces to many interesting people so far encountered only on this list. With two colleagues, Brian Logan and Matthias Scheutz, I'll be presenting a poster on "Evolvable Architectures for Human Like Minds". It was a late submission and isn't included in the poster abstracts on the web site.

Some comments relating to a recent message from Bernie Baars which is relevant to this topic. He wrote

> I particularly want to endorse Doug Watt's point about considering
> consciousness in the evolved biological brain in light of the following
> announcement on the Society for Philosophy and Psychology listserv, about a
> forthcoming debate in London in which no one seems to be especially confident
> that consciousness is a biological phenomenon at all. Viz:
> ...

Part of the problem is that "consciousness", like "emotion", "knowledge", "understanding", is a *cluster* concept. It refers in an indeterminate fashion to a large collection of different things, different subsets of which are relevant in different contexts.

Unfortunately people think they know intuitively what THEY mean by it when they don't -- just as people used to think they knew intuitively what simultaneity is, because they often experienced simultaneity. Then Einstein showed we didn't know what we meant after all. (The history of mathematics is full of examples of such concepts: "number", "continuity", "infinity", "function", "variable", "point", "line", "set", etc.)

People think "consciousness" is special because they have some sort of direct access to it. But introspection and other types of ostensive definition do not suffice to define any concept with the precision required to formulate questions about the origins of, or necessary or sufficient conditions for, the phenomena, or questions about their properties (which will be different depending which sub-cluster is in question).

We no more know introspectively what we mean by "consciousness" than we know introspectively how we use passive grammatical constructions, or what we mean by "space" or "time", though we are immersed in both. Meaning anything at all is a complex, multi-faceted achievement, dependent on a vast array of mechanisms to which we have no introspective access.

In particular, both being conscious and meaning are not merely *categorical* states, for they both involve a collection of *possibilities* for coping with variations. How you experience something is intrinsically linked with the variety of ways that experience could change and the variety of things you can do with that experience (or it can do to you). If you think you can abstract away from all those relationships and focus on something purely intrinsic and categorical, you are probably deceiving yourself, like a pre-Einsteinian physicist trying to focus attention on the intrinsic properties of simultaneity, or a pre-19th century mathematician thinking he can attend to the essence of continuity by experiencing an example, or philosophers thinking about poor Mary brought up without colour experiences then exposed to redness.
What do such philosophers know about the vast array of visual apparatus Mary has had all along, partly because of her evolutionary history?

A lot of the history of philosophy of mind is based on a failure to grasp all this. Alas, such history is constantly repeated by scientists who rush in where (some) philosophers have learnt to tread cautiously.

In short: people may be wise to be cautious about asking whether something is a biological phenomenon when there's so much confusion about WHICH phenomenon is in question or whether it's ONE phenomenon or a wide range of DIFFERENT phenomena with complex interconnections.

Of course, if we are talking about something that happens in humans and some other animals, then it will very likely be a product of evolution, at least in part. It will also generally be a product of learning, development, cultural influences, etc.

E.g. in some sense puzzlement about Goldbach's conjecture ("every even number greater than 2 is the sum of two primes") is a biological phenomenon. In humans it is crucially dependent on brain processes and these are biological phenomena. (Has anyone been looking for neural correlates of puzzlement about Goldbach's conjecture?)

Even though mathematical puzzlement in humans is implemented in biological mechanisms, it could turn out that other non-biological systems could equally well give rise to the main features of that kind of puzzlement, along with the conceptual and motivational apparatus required to grasp the problem and become puzzled about it. For all we know, such features might be able to be produced in a very different way, not involving biological evolution and brain mechanisms. There could be other machines, of hitherto unknown types, capable of implementing processes that, at an appropriate level of abstraction, have deep similarities to human puzzlement about deep mathematical problems, including their causal interactions (e.g. puzzlement generates searching for explanations, proofs, and disappointment when something that appeared to be a proof turns out to be invalid on closer examination, etc. etc.)

Compare this much simpler case: at a certain level of abstraction, which includes the ability to get correct answers, several kinds of mathematical reasoning are already implementable in machines that are quite unlike brains. Some of them work much better than brains on some of the tasks: e.g. checking proofs, transforming certain logical and algebraic expressions, verifying that propositional logical expressions are consistent, etc. Harold Cohen's computer program Aaron produces wonderful paintings and drawings though it probably has very limited appreciation of their aesthetic qualities.

Despite the likely existence of non-biological cases, and perhaps also cases in alien species on distant planets, we may still wish to understand how the earthly biological sub-cases work! But deep understanding may require treating such biological cases as examples of something more general.

In other words, granted that many of the essential features of whatever we dimly and variously refer to as "consciousness" are no more *inherently* biological than calculation or reasoning or painting are inherently biological, nevertheless the biological versions probably have a host of special features which might be best understood by seeing how they are products of evolution, as well as being products of brain processes, and products of social processes, etc.

So all I am arguing for is a broad-minded approach.
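The last of the examples above -- checking whether a set of propositional formulas is consistent -- needs remarkably little machinery in a machine quite unlike a brain. The following is only an illustrative sketch (the formula encoding, function name and thresholds are invented for the example, not any particular theorem prover): it checks consistency by brute-force enumeration of truth assignments.

    from itertools import product

    def is_consistent(formulas, variables):
        """Return True if some truth assignment satisfies every formula.

        Each formula is a function from an assignment (a dict mapping
        variable names to booleans) to a boolean. Exhaustive enumeration:
        exponential in the number of variables, but fine as an illustration.
        """
        for values in product([False, True], repeat=len(variables)):
            assignment = dict(zip(variables, values))
            if all(f(assignment) for f in formulas):
                return True
        return False

    implies = lambda a, b: (not a) or b

    # {P -> Q, P, not Q} has no satisfying assignment: inconsistent.
    print(is_consistent(
        [lambda v: implies(v["P"], v["Q"]), lambda v: v["P"], lambda v: not v["Q"]],
        ["P", "Q"]))   # False

    # {P -> Q, Q} is satisfiable, e.g. with P false and Q true: consistent.
    print(is_consistent(
        [lambda v: implies(v["P"], v["Q"]), lambda v: v["Q"]],
        ["P", "Q"]))   # True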
Sometimes, by understanding a general class of phenomena we may be better able to understand the specific details of a particular case. If we look ONLY at the special cases we may confuse wood and trees.

For instance, it could be the case that many of the details of biological evolution on THIS planet are not replicated in the products of evolution on other planets, even though some of the broad classes of architectures are, e.g. purely reactive organisms, organisms with a mixture of reactive and deliberative ('what if' reasoning) capabilities, organisms with reactive, deliberative and introspective capabilities, organisms with syntactically rich languages, etc.

A few comments on the interesting conference announcement Bernie circulated.

> A debate at the Institute of Contemporary Arts
> The Mall, London SW1Y 5AH
>
> 5th September, 2000, 7:30-9:00pm
> ...
> Professor Euan Macphail: The evolution of language is the key to
> the evolution of consciousness.

Well, perhaps those kinds of consciousness that involve the use of language and concepts normally learnt through linguistic interaction. What about the fly's consciousness of my rapidly approaching hand? Why do so many people think that "consciousness" refers to one unique thing that has only one evolutionary history?

> Professor Jeffrey Gray: Evolutionary theory must be able to
> explain the existence of consciousness (but I
> haven't the faintest idea how).

First we have to get clearer about what we are talking about, and that involves a very large variety of different sorts of things. You can't hope to explain things about which you are completely muddled. Collecting a wide range of examples of phenomena to be explained can be interleaved with developing theories of ever increasing depth, precision and generality. With colleagues I've been trying to do that over many years, with partial results reported in papers in the Cognition and Affect directory: http://www.cs.bham.ac.uk/research/cogaff/

In particular, to answer Jeffrey Gray's question I think it is helpful to identify classes of information processing architectures that might have evolved at different times, and show how different aspects of those architectures explain different subsets of the phenomena of consciousness (including disorders of consciousness). (E.g. our work shows how different classes of emotions, namely primary, secondary and tertiary emotions, and possibly others not yet clearly distinguished, arise out of different aspects of a combined reactive, deliberative and reflective architecture. This would explain why they can't all occur in all animals.)

Of course, given the impoverished evidence available, and our still limited knowledge of many of the biological and evolutionary mechanisms, this work has to be speculative. But informed and disciplined speculation can help us decide to look for new kinds of evidence that will reduce the need to be speculative. (All deep science is inherently speculative, especially in early phases.)

Any good theory aiming to reduce Jeffrey's puzzlement should explain not only the varieties of consciousness within humans (of all ages, with and without brain damage, etc.) but also help us understand various types of phenomena to be found in the architectures evolved by other animals, which will be partly like ours and partly different. E.g. the field-martin that remembers where it has buried large numbers of nuts, and which ones it has already eaten. Or the chimp that is undecided whether to challenge the dominant male.
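The layering mentioned above (reactive, deliberative, reflective/self-monitoring) can be rendered as a toy sketch. This is not the CogAff model itself: the class names, rules and numbers below are invented purely for illustration of how the layers differ in kind.

    class ReactiveLayer:
        """Fast, fixed condition-action rules: no lookahead, no self-inspection."""
        def react(self, percept):
            return "dodge" if percept == "looming hand" else "carry on"

    class DeliberativeLayer:
        """'What if' reasoning: compares alternative actions before committing."""
        def choose(self, options, evaluate):
            return max(options, key=evaluate)

    class MetaManagementLayer:
        """Self-monitoring: observes and comments on the agent's own processing."""
        def review(self, decision_trace):
            if decision_trace.count("dodge") > 3:
                return "note: dodging repeatedly -- consider changing location"
            return "no comment"

    # A purely reactive organism corresponds to the first layer alone; adding
    # the second and then the third gives the progressively richer classes of
    # architecture sketched in the text.
    fly_like = ReactiveLayer()
    print(fly_like.react("looming hand"))                          # dodge

    deliberator = DeliberativeLayer()
    print(deliberator.choose(["challenge", "wait"],
                             {"challenge": 0.2, "wait": 0.7}.get)) # wait

    monitor = MetaManagementLayer()
    print(monitor.review(["dodge", "dodge", "dodge", "dodge"]))

The point of the sketch is only that the third layer operates on records of the system's own processing, not on the external world, which is the kind of extra mechanism at issue in the next paragraph.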
Can a housefly ever become aware that it is aware of my approaching hand? Not if it lacks a certain type of self-monitoring sub-architecture. Perhaps a new-born human infant also lacks it?

> Professor Stevan Harnad: Consciousness cannot be functional, it
> can just be.

The only things I know of worth referring to as "consciousness" include a large collection of functional phenomena along with others which are side-effects that may or may not be functional, and some that are clearly dysfunctional, e.g. when anxiety, or the embarrassment of being watched closely, stops me thinking clearly about an important task. So "just being" is not the only alternative to being functional!

> Dr Max Velmans: Evolutionary theory can account for the forms of
> consciousness but not its existence.

Certainly evolutionary theory cannot account for the existence of consciousness in all those future robots that will argue about whether humans have qualia.

As for humans, the word "consciousness" can refer to many types of things, and when they occur within any type of human their existence is a product of far more than evolution. The existence of particular instances and varieties of consciousness will generally be determined by the rich and gory details of the context at the time, and evolutionary theory certainly cannot account for that. It might, at best, account for the development of the architectures that make possible a wide variety of types of consciousness. Existence of particular cases needs more.

Back to work on that poster now...

Aaron
====
Aaron Sloman, ( http://www.cs.bham.ac.uk/~axs/ )
School of Computer Science, The University of Birmingham, B15 2TT, UK
EMAIL A.Sloman@cs.bham.ac.uk
PAPERS: http://www.cs.bham.ac.uk/research/cogaff/
TOOLS: http://www.cs.bham.ac.uk/research/poplog/freepoplog.html
Phone: +44-121-414-4775 Fax: +44-121-414-4281/2799

From psa01mv@gold.ac.uk Sat Jun 24 18:42:29 2000
From: psa01mv@gold.ac.uk
Date: Sat, 24 Jun 2000 18:36:02 +0100
To: Aaron Sloman
Subject: Re: Evolution and consciousness (Was: Re: A computational conundrum)

thanks for your thoughts Aaron - I'll give them some thought! Have you read my recent tome by the way - lots in there to interest you (in fact quite a lot about you too). I will attach blurb from publishers - can get it anywhere including Amazon.co.uk.

Best Max

--On 24 June 2000, 17:08 +0100 Aaron Sloman wrote:

[Attachment: promotionlit.doc -- the publisher's blurb follows]

UNDERSTANDING CONSCIOUSNESS
MAX VELMANS (Goldsmiths, University of London)
ISBN 0-415-22492-6 £14.99 pbk
ISBN 0-415-18655-2 £40 hbk

REVIEWS

"This book is excellent. There are lots of books on consciousness, but few which mix the philosophical, psychological and neuroscientific, and even fewer which are written without an axe to grind ... a lovely book ... I'll be recommending it to everyone I see." John Kihlstrom, University of California at Berkeley

"An exceptionally lucid and balanced account of the different approaches to and aspects of the problem. Useful for both scientists and philosophers ... a beautifully written text." Jeffrey Gray, Institute of Psychiatry, London
"... a splendid assessment of and contribution to the debate about consciousness as it is currently being waged between psychologists, philosophers, some neuroscientists and AI people." Steven Rose, The Open University, UK.

"This is a splendid book ... In my view it should have a profound and lasting effect upon the debate as to the nature and function of consciousness, and should stimulate much new thinking and investigation." David Fontana, University of Cardiff, UK and University of Algarve, Portugal.

"complements Chalmers' influential The Conscious Mind in illustrating precisely why and how the problem of consciousness is indeed a hard rather than an easy problem. Like Jaynes' celebrated Origin of Consciousness, Velmans' book will be found informative and stimulating even by those who in the end are not persuaded that it vouchsafes the solution." Stevan Harnad, Southampton University

SYNOPSIS

The mysteries of consciousness have gripped the human imagination for over 2500 years. At the dawn of the new millennium, this book provides solutions to some of the deepest puzzles surrounding its nature and function that are consistent with science, ordinary experience, and common-sense. Drawing on recent scientific discoveries, Max Velmans challenges conventional reductionist thought, providing an understanding of how consciousness relates to the brain and physical world that is neither dualist, nor reductionist. The book should be of interest to psychologists, philosophers, neuroscientists and other professionals concerned with mind/body relationships, and all who care about this subject.

Excerpted from Understanding Consciousness by Max Velmans. Copyright © 2000. Reprinted by permission. All rights reserved.

From the Preface:

Consciousness is personal. Indeed, it is so close to the core of what it is to be human that it has puzzled thinkers from the beginnings of recorded history. What is it? What does it do? How does it relate to the physical world and to the workings of our bodies and brains? At the dawn of the new millennium answers to some of these questions are beginning to emerge. However, there is not one mind-body problem, but many. Some of the problems are empirical, some are conceptual, and some are both. This book deals with some of the deepest puzzles and paradoxes.

A good story has a beginning, a middle and an end, so the book is arranged in three parts. The first part, 'Mind-body theories and their problems', summarises current thinking about the nature and function of consciousness, pinpointing the strengths and weaknesses of the dominant mind-body theories. The international 'consciousness debate' has largely been fuelled by two competing world-views: dualism, which splits the universe into two fundamentally different mental and physical substances or properties, and materialist reductionism, which claims consciousness to be nothing more than a state or function of the brain. While dualism seems to be inconsistent with the findings of materialist science, reductionism seems to be inconsistent with the evidence of ordinary experience. The challenge is to understand consciousness in a way that does justice to both.

Part 2 of this book, 'How to marry science with experience', goes back to first principles. Rather than seeking to defend either dualism or reductionism, we start with a close examination of experience itself. I suggest that if one does this with care, the old boundaries that separate consciousness from the physical world can be seen to be drawn in the wrong place!
This turns the mind-body problem around on its axis and forces one to re-examine how consciousness relates to the physical world, to knowledge and to the detailed workings of the brain. At first glance, these intricate relationships of mind, matter and knowledge seem to form an impenetrable 'world knot'. But, as far as I can tell, it is possible to unravel it, step by simple step, in a way that is consistent with the findings of science and with common sense.

Part 3 of this book provides a synthesis. In it I suggest what consciousness is and does. I also develop a form of 'reflexive monism' which treats human consciousness as just one, natural manifestation of a wider self-conscious universe. Although the route to this position is new, the position itself is ancient. I find this very reassuring. Understanding consciousness requires us to move from the understanding of things we are conscious of, to understanding our role as conscious observers, and then to consciousness itself - an act of self-reflection which requires an outward journey and a return. If the place of return does not seem familiar, it is probably the wrong place.

CONTENTS

Part 1: Mind/body Theories and Their Problems. What is consciousness? Is there a conscious soul in the brain? Are mind and matter the same thing? Are mind and consciousness just activities? Could robots be conscious?

Part 2: A New Analysis - How to Marry Science with Experience. Conscious phenomenology and common sense. Experienced worlds, the world described by Physics, and the thing-itself. Subjective, intersubjective, and objective science. Consciousness, brains, and human information processing.

Part 3: A New Synthesis - Reflexive Monism. What consciousness is. What consciousness does. Self-consciousness in a reflexive universe.

BOOK ORDERS

Psychology Press, International Thomson Publishing Services, Cheriton House, North Way, Andover, Hampshire, SP10 5BE, UK; Tel: +44 (0) 1264 343071; Fax: +44 (0) 1264 343005; E-mail: book.orders@tandf.co.uk

From owner-psyche-b@listserv.uh.edu Sun Jun 25 04:05:58 2000
Date: Sat, 24 Jun 2000 13:07:56 -0400
From: Bernard Baars
Subject: Re: Evolution and consciousness (Was: Re: A computational conundrum)
To: PSYCHE-B@LISTSERV.UH.EDU

Aron Sloman's thoughts are always interesting, always provocative, but, if I may say so, always more informed by computationalist concerns than by a knowledge of the psychology and neurobiology of consciousness. Aron writes:

"Some comments relating to a recent message from Bernie Baars which is relevant to this topic. He wrote

> I particularly want to endorse Doug Watt's point about considering
> consciousness in the evolved biological brain in light of the following
> announcement on the Society for Philosophy and Psychology listserv, about a
> forthcoming debate in London in which no one seems to be especially confident
> that consciousness is a biological phenomenon at all. Viz:
> ...

Part of the problem is that "consciousness", like "emotion", "knowledge", "understanding", is a *cluster* concept. It refers in an indeterminate fashion to a large collection of different things, different subsets of which are relevant in different contexts.
Unfortunately people think they know intuitively what THEY mean by it when they don't -- just as people used to think they knew intuitively what simultaneity is, because they often experienced simultaneity." Aron is right, of course. But the notion that the multiplicity of commonsense meanings of consciousness hasn't occurred to those of us working in the field for the past twenty years is a trifle over the top. In the 1970s Thomas Natsoulas wrote a famous (to many of us) American Psychologist article detailing a dozen or so commonsense meanings of "consciousness." BUT ... and this is important ... he concluded that there was indeed a CORE meaning, and that is, roughly "direct awareness of an object, such as a perceptual object." (Not an exact quote). That meaning of consciousness has come to be the core meaning in the now sizable (and rapidly expanding) neuroscientific literature on the topic. It joins the other core meaning of consciousness as a state (contrasted with slow-wave sleep, general anesthesia, epileptic states of absence, and coma). Those two meanings (the state and conscious perceptual contents) are the least disputable meanings of the word, and as it happens, they are the ones we know the most about empirically. The evidence is overwhelming that both kinds of consciousness are biologically ancient. The reticular formation, needed for the state of consciousness as defined by EEG, goal-directed survival and reproductive behavior, sensory discrimination and generalization, neuromodulation of forebrain neurons, etc., etc., goes back to early vertebrates. Sleep-waking-dreaming is also controlled by brainstem nuclei, which are phylogenetically ancient. We now have a much deeper understanding of how the state of consciousness involves changes in all neurons in the thalamocortical core (in mammals), so that the several levels Walter Freeman talked about some days ago can be understood in a much more unified fashion. There is obviously much to learn, but the progress is unmistakable. On the perceptual side, there has also been remarkable progress in the past 15 years, due to the work of Nikos Logothetis and others. We now have a much deeper understanding of the difference between conscious and unconscious streams of information in visual cortex, under conditions of binocular rivalry. Studies of blindsight, parietal neglect, hippocampal episodic (conscious) memory, etc., have enriched this picture immensely. Visual cortex has a fairly well-understood evolutionary history, and earlier cortex (such as olfaction) has been studied by scientists like Walter Freeman for 50 years in great detail. The fact that they have a specific evolutionary history is simply beyond serious debate. There is much more, but these meanings of consciousness are central and stable in the scientific literature. A forthcoming book edited by the late Jim Newman and myself reprints 65 scientific articles on the various aspects of consciousness. (Baars & Newman (in press), Essential Sources in the Scientific Study of Consciousness. Cambridge, MA: MIT Press/ Bradford Books). It is increasingly important for those of us interested in consciousness to read the basic literature before commenting. Best, Bernie From Aaron Sloman Mon Jun 26 10:40:17 BST 2000 To: "PSYCHE Discussion Forum (Biological/Psychological emphasis)" Subject: Re: Evolution and consciousness (Was: Re: A computational conundrum) Cc: Bernard Baars Bernie thanks for your comments. I apologise for making two mistakes. 
One was leaving out the existential quantifier I should have put in before "scientists", like the one I inserted before "philosophers". I should have written: A lot of the history of philosophy of mind is based on a failure to grasp all this. Alas, such history is constantly repeated by (some) scientists who rush in where (some) philosophers ^^^^^^ have learnt to tread cautiously. (Maybe the first should have been "(many)" and the second should have been "(a few)"!) And since I was responding to a message from you, I should have had the courtesy to acknowledge that your own books, from which I have profited, do present many of the diverse phenomena that are in various contexts referred to by words like "consciousness" and related words. The real context was not so much to attack you as to comment on the impression given by the conference announcement appended to your message, to which you had also referred by saying: > ... > forthcoming debate in London in which no one seems to be especially confident > that consciousness is a biological phenomenon at all. Viz: > ... It looked to me as if part of the reason for that was that the people involved were using the word "consciousness" to pick out different subsets of the large cluster of things that it can refer to in ordinary and scientific or philosophical usage. If you think my diagnosis was wrong, what's yours? I also went on to make some comments about the need to be cautious in thinking that a particular type of *implementation* of consciousness could provide the only referent. It's not clear whether you are disagreeing with that. This is particularly important in the context of a discussion of evolution of consciousness. Evolution, like a creative engineer, often produces diverse solutions to the same general type of problem. Thus there are different varieties of biological vision, in all sorts of animals (including the housefly in my example) and I see no reason to believe that evolution would treat consciousness (or any of its many facets) any differently from vision, locomotion, reproduction, etc. I.e. there are likely to be diverse implementations, including some on remote planets very different in their details, and their evolutionary history, from those in earthly organisms. (There could also be artificial implementations, as with vision, learning, reasoning, locomotion, etc.) [BB] > Aron Sloman's thoughts are always interesting, always provocative, but, if I > may say so, always more informed by computationalist concerns than by a > knowledge of the psychology and neurobiology of consciousness. Just for the record: for about 40 years I have been a regular attender at seminars and lectures by psychologists and brain scientists in various universities (ask members of the school of psychology here in Birmingham!) and one of the reasons I am going to ASSC4 is the hope that I'll continue to learn useful things. I even deliberately opted for workshops with a strong empirical component. However, it's impossible to keep up with everything I have to know about to study minds properly! (Brain scientists also clearly find it impossible to keep up with everything relevant!) [AS] > Unfortunately people think they know intuitively what THEY mean by it > when they don't -- just as people used to think they knew intuitively > what simultaneity is, because they often experienced simultaneity." [BB] > Aron is right, of course. 
But the notion that the multiplicity of commonsense
> meanings of consciousness hasn't occurred to those of us working in the field
> for the past twenty years is a trifle over the top.

Yes. There are exceptions, and I again apologise for leaving out the quantifier.

[BB]
> In the 1970s Thomas
> Natsoulas wrote a famous (to many of us) American Psychologist article
> detailing a dozen or so commonsense meanings of "consciousness." BUT ... and
> this is important ... he concluded that there was indeed a CORE meaning, and
> that is, roughly "direct awareness of an object, such as a perceptual
> object." (Not an exact quote). That meaning of consciousness

Oh dear! "direct awareness" again!

That's exactly the muddled introspectively defined concept that has come out of philosophical discussion and has been subject to devastating criticisms on many occasions. If the scientists you admire choose THAT as their core definition, then I am afraid they are in the scope of my existential quantifier.

I don't think there's any "direct" awareness of anything in any clear sense of "direct". All awareness is *mediated* by a host of very complex processes in diverse mechanisms of which we are not aware.

Of course, if by "direct" you mean simply that the individual concerned is not aware of inferring anything (i.e. it *feels* direct) then since people can be deceived by all sorts of aspects of their own minds, it is not clear that that notion of "directness" picks out a category that is of any explanatory significance. And of course it would also include the mathematical puzzlement of which I am directly aware (from time to time).

Of course, a mathematical puzzle is not a perceptual object, but the phrase you gave was "such as a perceptual object", and I see no reason why you'd want to distinguish consciousness of a mathematical puzzle from consciousness of a rabbit. (Which I assume is a perceptual object: or are the perceptual objects only my "rabbity sense-data" or something equally subject to conceptual criticism?)

(I've just noticed that Marvin Minsky's comment makes a similar point.)

[BB]
> has come to be
> the core meaning in the now sizable (and rapidly expanding) neuroscientific
> literature on the topic. It joins the other core meaning of consciousness as
> a state (contrasted with slow-wave sleep, general anesthesia, epileptic
> states of absence, and coma). Those two meanings (the state and conscious
> perceptual contents) are the least disputable meanings
> of the word, and as it happens, they are the ones we know the most about
> empirically.

I know that there has been much learnt about various modes of perception and also about the difference between those two global states. And there is still much more to be learnt.

Wittgenstein once wrote something like: "In psychology we have experimental method and conceptual confusion". The rise of neural studies of consciousness has extended the validity of this comment to brain science.

However, I don't want to go on criticising brain scientists. I hope that in a cooperative endeavour we can step back and try to come up eventually with a more satisfactory mutually agreed conceptual framework for investigating the phenomena we all agree are fascinating, deep and very important for both theoretical and practical purposes. In part this requires (as I think Arnold Trehub mentioned in a recent post) many careful analytical studies of the phenomenology of the states and processes we are investigating.
Without that we may get their biological functions wrong, and therefore look in the wrong place for both the explanatory mechanisms and the evolutionary origins.

I.e. instead of trying to define a CORE meaning of "consciousness" we need to be clear about the whole variety of related phenomena that the word loosely refers to. We can then try to investigate explanations which account not just for some arbitrarily chosen subset but the integrated system, since it is unlikely that a brain is just a very large collection of disconnected functional modules.

[BB]
> The evidence is overwhelming that both kinds of consciousness are
> biologically ancient. The reticular formation, needed for the state of
> consciousness as defined by EEG, goal-directed survival and reproductive
> behavior, sensory discrimination and generalization, neuromodulation of
> forebrain neurons, etc., etc., goes back to early vertebrates.

So what do you say about my friend the house-fly? Isn't it conscious, in some sense, of my approaching hand? Isn't my hand as much a "direct object of perceptual awareness" for the fly as it would be for you if I tried to punch you in the face? If not why not? (Beware of answers that define "awareness" in terms of a favourite mechanism for producing it.)

If your answer is that the fly isn't aware that it is conscious of the hand, then maybe that's also true of your early vertebrates? Perhaps that self-awareness is an *extra* component that is only present in a subset of the architectures which provide the additional mechanisms required for it?

> ....

[BB]
> There is much more, but these meanings of consciousness are central and
> stable in the scientific literature.

And are you confident that Macphail, Gray and Velmans, whose talk titles were listed in your message, were all using those "central and stable" meanings? I doubt it.

> It is increasingly important for those of us interested in consciousness to
> read the basic literature before commenting.

And (some of) those who write the basic literature need to learn to do conceptual analysis, in order to avoid misleading themselves and their readers when they use terms from ordinary language to summarise their findings and theories.

Remember, we are on the same side: trying to advance and deepen our scientific understanding of something that is far more complex than any of our current theories, yours or mine!

See you in Brussels!

Cheers.
Aaron

From jordan@ucsd.edu Sun Jun 25 05:56:06 2000
Date: Sat, 24 Jun 2000 22:00:50 -0700
From: Jordan Hughes
Organization: Cognitive Neuroscience Lab, UCSD
To: Aaron Sloman
Subject: Re: Evolution and consciousness (Was: Re: A computational conundrum)

Aaron Sloman wrote:
>
> People think "consciousness" is special because they have some sort of
> direct access to it.
>
> But introspection and other types of ostensive definition do not suffice
> to define any concept with the precision required to formulate questions
> about the origins of, or necessary or sufficient conditions for, the
> phenomena, or questions about their properties (which will be different
> depending which sub-cluster is in question).
>
> We no more know introspectively what we mean by "consciousness" than we
> know introspectively how we use passive grammatical constructions, or
> what we mean by "space" or "time", though we are immersed in both.
>
> Meaning anything at all is a complex, multi-faceted achievement,
> dependent on a vast array of mechanisms to which we have no
> introspective access.

Hi Aaron,

Wonderful post! Thank you. I've taken the liberty of forwarding it to the list that I moderate, "consciousness@egroups.com." Unfortunately, I cannot attend ASSC4, but I hope to meet you at some future event.

I'd be delighted if you subscribed to the consciousness list. I'm hoping that the group increasingly includes participation of researchers of your caliber.

All the best,
Jordan Hughes
--
jordan@ucsd.edu
Moderator, Consciousness@eGroups.com
Cognitive Neuroscience Lab
Department of Cognitive Science, 0515
University of California, San Diego
La Jolla, CA 92093

HOME (my snailmail preference):
Jordan Hughes
P.O. Box 2468
Julian, CA 92036-2468
USA
fax: 760-708-6432

From owner-psyche-b@listserv.uh.edu Sun Jun 25 23:43:07 2000
Date: Sun, 25 Jun 2000 07:40:56 -0700
From: Walter J Freeman
Subject: Evolution and consciousness (Sloman and Baars)
To: PSYCHE-B@LISTSERV.UH.EDU

I am bemused by Aaron Sloman's comments and Bernie Baars' response.

I agree with Bernie that there are two "core meanings", for which he proposes the "state" and "conscious perceptual contents". I would add a third: an "operator" that saves us biological animals, when it comes into play after we've done something we hadn't ought to, and that has an einsteinian 'flavour' to it of field theory in curved intracerebral dynamic state space. It's non-trivial that 'consciousness' used to be called 'conscience'. In this I disagree with Stevan Harnad: "Consciousness cannot be functional, it can just be." We and other animals don't survive long without it.

I agree with Aaron that there are multiple facets, like a scintillating diamond, and that C is not [merely] "a biological phenomenon", though not for his reason:

> E.g. in some sense puzzlement about Goldbach's conjecture ("every even
> number greater than 2 is the sum of two primes") is a biological
> phenomenon. In humans it is crucially dependent on brain processes and
> these are biological phenomena. (Has anyone been looking for neural
> correlates of puzzlement about Goldbach's conjecture?)

That is a philosophical red herring, a subtle put-down of us neurocorrelators.

In complement to "core meanings" I would suggest an "encompassing meaning" - . I am required by custom, ethics and law to treat animals humanely [sic!] on the premise that they feel pain. Not so machines - yet. Those engaged in SETI are fully prepared to attribute C to any linguistically competent beings they might contact, whether based in carbon, silicon, sulfur, or some as yet undiscovered stable transuranium elements. Should we care? Not yet. As Stanislav Lem remarked about what he called the 'silence of the universe', they may be around but don't yet trust or respect us. That's how housewives feel about cockroaches. I'm rather taken by them. They're good at what they do, including first-rate bipedal locomotion, a lot better at it than MIT robots are. Are they conscious? Prototypically. Certainly worthy of humane study.
Regards,
Walter

From owner-psyche-b@listserv.uh.edu Mon Jun 26 00:09:51 2000
Date: Sun, 25 Jun 2000 19:04:11 -0400
From: Marvin Minsky
Subject: Re: Evolution and consciousness (Was: Re: A computational conundrum)
To: PSYCHE-B@LISTSERV.UH.EDU

Baars' reply to Sloman would seem to illustrate, rather than to refute, Sloman's analysis:

> Baars: "a CORE meaning, and that is, roughly "direct awareness of an
> object, such as a perceptual object." That meaning of consciousness has
> come to be the core meaning in the now sizable (and rapidly expanding)
> neuroscientific literature on the topic.

This first 'core' seems to fit a frog jumping at a fly - with the gratuitous addition of "direct awareness", which as Sloman noted is the introspective monstrosity that may be the real 'core' of this whole debate. In its curious emphasis on perception, it misses almost everything else that most people would want to include in that 'cluster' or 'suitcase' of 'consciousness' - e.g., reflecting on what you've been thinking about.

> Baars: It joins the other core meaning of consciousness as a state
> (contrasted with slow-wave sleep, general anesthesia, epileptic states of
> absence, and coma). Those two meanings (the state and conscious perceptual
> contents) are the least disputable meanings of the word, and as it
> happens, they are the ones we know the most about empirically.

This second "core" appears to span so much of what people call 'thinking' that it seems only to mean 'the state of not being unconscious' - that is, to not be in a state in which one cannot think.

Baars concludes, "It is increasingly important for those of us interested in consciousness to read the basic literature before commenting." I would propose instead - at least in this context - and with apologies to Santayana: "Those who know history are doomed to repeat it."

From psa01mv@gold.ac.uk Mon Jun 26 11:32:32 2000
Date: Mon, 26 Jun 2000 11:33:21 +0100
From: Max Velmans
To: Aaron Sloman , "PSYCHE Discussion Forum (Biological/Psychological emphasis)"
cc: Bernard Baars
Subject: Re: Evolution and consciousness (Was: Re: A computational conundrum)

Dear Aaron

Have just read your thoughtful comments - and there's a lot that I agree with - the relation of consciousness to evolution is complex. But you do finesse the central issues in my view. Being conscious is in the first instance what it is like to experience something, not what enables one to do something. I lay it all out in my book in depth so I won't attempt sound-bites here. Once you have studied the case against (in my book) I would be happy to debate, elaborate etc.
See you soon in Brussels

With best wishes
Max

*****************************************************************
Dr Max Velmans          direct tel (+44)(0)171 919 7874
Dept of Psychology      office tel (+44)(0)171 919 7871
Goldsmiths              fax (+44)(0)171 919 7873
University of London    email m.velmans@gold.ac.uk
New Cross
London SE14 6NW
England
URL www.goldsmiths.ac.uk/academic/ps/velmans.htm
*****************************************************************

From owner-psyche-b@listserv.uh.edu Mon Jun 26 23:43:39 2000
Date: Mon, 26 Jun 2000 15:26:10 +0100
From: Jeff Dalton
Subject: Re: Evolution and consciousness (Was: Re: A computational conundrum)
To: PSYCHE-B@LISTSERV.UH.EDU

It's true, as Aaron Sloman has been pointing out for some time now, that consciousness is a cluster concept, that there's a danger of misunderstanding what someone means by the word (which subset of the cluster they are talking about), that someone using the word may not - at least at the present stage of our investigations - be able to give it any meaning that is sufficiently precise for certain purposes, and that introspection, and ostensive definition, are beset by significant difficulties.

Moreover, "meaning" is pretty tricky in itself. Philosophers (and others) have been labouring at it for centuries without getting it all worked out. Difficulties can be multiplied endlessly, I suspect.

Nonetheless, people talking about consciousness often are able to understand each other well enough to get some work done. Indeed, that can be so even when the people involved are not aware of the various difficulties Aaron raises. It's important not to lose sight of those facts.

I feel that the investigation of consciousness is at a point where it is not very useful to bring in the manifold difficulties that attend "consciousness" without getting down to specifics. Consider, for instance, Aaron's response to the suggestion that "no one seems to be especially confident that consciousness is a biological phenomenon at all":

    It looked to me as if part of the reason for that was that the people
    involved were using the word "consciousness" to pick out different
    subsets of the large cluster of things that it can refer to in
    ordinary and scientific or philosophical usage.

Well, maybe. But what is the evidence, in what these people actually say, and just how does it actually affect their work? Without such specifics, the above "diagnosis" cannot usefully be evaluated.

From BBaars8788@aol.com Tue Jun 27 16:17:05 2000
Date: Tue Jun 27 11:16:59 2000
From: BBaars8788@aol.com
Subject: Re: Evolution and consciousness (Was: Re: A computational conundrum)
To: , ,
Cc:

There is much I agree with in what Aron says. I've wondered a great deal over the last 20 years why, if you'll forgive me for saying so, we have had a series of extraordinarily brilliant people for literally centuries who have been going around in painfully boring circles when it comes to consciousness. One possibility is that it is simply an inherently difficult or paradoxical or unsolvable problem, or that it requires some extraordinary appeal to things like quantum mechanics that most people haven't thought of.
The other possibility, which I'm increasingly attracted to, is that the mind-body problem is a pseudoproblem that is very seductive but simply wrong-headed. I'm not the only person to suggest this, by the way === John Searle in his recent writing is absolutely devastating about the wrongness of all the standard mind-body philosophy of the past 100 years. I would extend that at least to Descartes and perhaps earlier.

If I may use Max Velmans' statement as an example (again, with apologies to Max) === IF you define the problem of consciousness, as Max does, as essentially a matter of trying to understand your own conscious experience, you can't treat consciousness as a construct that can be studied like any other construct in the history of science. In particular, you can't treat it as a variable because you have nothing to compare it to. There are no single-valued variables.

The solution, of course, is to compare a conscious event to the same or similar event when it is unconscious. The beginning of the last sentence is unconscious now for you, the person reading this. Yet it continues in your memory, or you could not understand this paragraph. This very simple example gives us TWO cases of the same piece of knowledge, a conscious case and an unconscious case. Now we can compare the two, and ask what is this thing consciousness, that it should make the difference between those two empirical cases?

All of my writing of the past twenty years is based on this strategy; but I can't claim priority (and I'm not interested in doing that anyway). William James at times writes about studying THE DISTRIBUTION OF CONSCIOUSNESS as a key (the quote is in my 1988 book). And numerous scientists in the last 50 years have done it, because comparing things is what scientists do.

What I would like to suggest is that the problem with the usual statements about consciousness is a kind of Cartesian solipsism ==== that is, you pretend that you're meditating on your only key to existence, which is that you're thinking. Well, with apologies to Descartes, a great genius, that's bullshit. Descartes, like any other human being, had a rich semantic domain of reference about consciousness, which allowed him to PRETEND to be sitting in solipsistic splendor starting from scratch. But in fact he knew perfectly well what it was like to wake up in the morning, to realize that time had passed, etc., all of which depends upon assuming the existence of an external world not recognized by true solipsists, if there are any. So in fact Descartes, like any other human being, had experience of consciousness as a variable (waking up in the morning, realizing you've been unaware of the passage of events). The whole Cartesian enterprise is a hoax, from that point of view, even granted that it is a sincere hoax to its practitioners.

So we have to get to a situation in which we study the comparison cases. And then we find, just as in the case of the curved earth, or atmospheric pressure, or earth gravitation, that what we're studying is not a CONSTANT but a VARIABLE. Which is a good thing, because if it wasn't a variable we couldn't talk about it. Then Minsky would be right in saying that a term like CONSCIOUS PERCEPTION is nonsense. But it's not, because we can easily today tease out the CONSCIOUS from the PERCEPTION part (see Logothetis, Milner and Goodale, Merikle, Marcel, and many others).

So that's my guess. The whole field has been going in circles because it's been pursuing a very seductive pseudoproblem.
Not unprecedented in intellectual history, by the way. But that's the only way I can explain to myself why these incredibly smart people seem to be saying things that don't make any sense, and can't seem to extract themselves from the trap.

Best wishes,
Bernie

From psa01mv@gold.ac.uk Tue Jun 27 16:39:43 2000
From: psa01mv@gold.ac.uk
Date: Tue, 27 Jun 2000 16:33:19 +0100
To: BBaars8788@aol.com, a.sloman@cs.bham.ac.uk, psyche-b@listserv.uh.edu
Subject: Re: Evolution and consciousness (Was: Re: A computational conundrum)

Bernie

I have promised myself not to get drawn into sound bites - as my book takes 300+ pages to trace a course out of the circular positions that you outline. But your claim that treating consciousness as "conscious experience" leaves one with nothing to compare it to (and therefore no way to study it) really does deserve a bite or two.

One can compare situations where one does have conscious experience with situations where one doesn't - and investigate the functional and structural differences between these situations in the brain. One can investigate the necessary and sufficient conditions for the different FORMS of experience that we have. One can compare the THINGS that one experiences with the things that one does not experience - and build this contrast into a theory of knowledge (for example an analysis of how the phenomenal world relates to the 'nonexperienced' but nevertheless existing energies and events described by physics).

And above all, ANY theory of consciousness has to depart from an accurate phenomenology - that is an accurate description of the experience that it seeks to understand and explain, otherwise it heads off at a smart pace in the wrong direction. Reductive functionalism (which PRESUPPOSES that consciousness is other than it seems - ie isn't really experience at all) does just that.

Look forward to continuing this in Brussels.

Best Max

From a.sloman@cs.bham.ac.uk Tue Jun 27 18:52:39 2000
Date: Tue, 27 Jun 2000 18:53:32 +0100 (BST)
From: Aaron Sloman
To: BBaars8788@aol.com
Subject: Re: Evolution and consciousness (Was: Re: A computational conundrum)
Cc: ,

Hi Bernie,

I'll respond properly when your message has come via the psyche-b list. But in case it doesn't get through till next week I just wanted now to say thanks, and comment that my approach is to try to use as much information as I can get from biology, neuroscience, psychology, phenomenology, philosophy, linguistics, and the best available engineering design techniques and tools to try to design (and later perhaps build) a WORKING system which is like us, not only in the obvious ways, but also in having exactly the sorts of features that will lead it (in some moods) to want to talk about its experiences as "direct", as "unanalysable", as having all the features that philosophers and others have attributed to qualia. I.e. I want to model Max!

> The other possibility, which I'm increasingly attracted to, is that
> the mind-body problem is a pseudoproblem that is very seductive but
> simply wrong-headed.

You are in excellent company. However it is a psychological *fact* that the problem is very seductive, as is the *fact* that for many people the claim that it is simply wrong-headed is totally unconvincing. So those facts have to be explained by a complete theory of the human mind. And I take that task very seriously.
The process of designing models that can actually be made to work like us is both a fascinating design problem and potentially a major contribution to theory, by extending existing descriptive theories into actual runnable models. (So far my models are far too complex for me and my colleagues to implement: not in principle, but because we lack the resources, and in many ways they are still underspecified: I have a lot of question marks about how many of the parts work -- but who doesn't!)

Right now I have a very general framework for explaining how in some organisms, but not all, the mechanisms for generating thoughts about qualia work. I suspect very few animals have those self-reflective mechanisms.

Anyhow, see you shortly. And Max too. I see Max primarily as a philosopher who says very interesting and very clear things, but I simply disagree with them. On account of pressure of time and other priorities, I have not yet read the latest version, in his new book, which may have diverged from what he was saying when we last talked well over a year ago.

Cheers.
Aaron

PS, I have a semi-serious paper called 'What is it like to be a rock?' which is relevant:
http://www.cs.bham.ac.uk/~axs/misc/rock/rock.html

From Aaron Sloman Mon Jun 26 19:45:19 BST 2000
To: stan.franklin@memphis.edu
Subject: Re: cluster concept

Hi Stan,

Thanks for your comments.

> But, I still don't seem to have fully come to grips with your notion of
> a cluster concept. Aren't all of our concepts, except mathematical
> concepts, cluster concepts?

It's not actually my notion: I picked it up long ago when I was a student, though I can't recall who introduced the idea into philosophy. It is closely related to Wittgenstein's notion of a "family resemblance concept", which he illustrated with the difficulty of defining "game" since there are so many different types of things we call games, which don't have anything in common though there are many family resemblances between examples of games, and those resemblances form a network linking all the games. It is also closely related to what F.Waismann called "open texture" (discussed, I think, in his 1965 book The Principles of Linguistic Philosophy, and papers published earlier.)

One problem with explaining "cluster concept" is that it's a cluster concept! The simplest notion is something like this:

If the predicate F is a cluster concept then there is a collection of other predicates F1, F2, F3, ... Fn such that

1. if ALL, or possibly MOST of the Fi apply to x then F(x) is definitely true,

2. if NONE of the Fi, or very FEW of them apply to x, then F(x) is definitely false, but

3. there is NO well defined subset of the Fi, nor any disjunction of conjunctions of subsets of Fi that is synonymous with F.

4. Further there's no well defined set of subsets that are sufficient for F.

5. Usually that's because we have only encountered cases where most of the Fi are present and cases where none, or few, are present, and therefore we have never been required to fix the meaning by deciding on intermediate cases.

6. However when intermediate cases arise, satisfying a subset of the Fi, the attempt to decide whether to classify it as F or not F, is usually bedevilled by the existence of arguments for saying it is F, because it has certain features in common with standard instances of F, and arguments for saying it isn't F because it lacks some "important" features of standard instances of F.
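Conditions 1, 2 and 6 above can be paraphrased in a few lines of code. This is only a toy sketch: the thresholds and the "game"-like feature names are invented for illustration, and the whole point of condition 3 is that no particular cut-off or feature subset is built into the concept itself.

    def classify(x, features, most=0.8, few=0.2):
        """Toy rendering of conditions 1, 2 and 6 above.

        `features` is a list of predicates F1..Fn. The thresholds `most`
        and `few` are arbitrary illustrative choices, not part of the
        analysis of the concept.
        """
        held = sum(1 for f in features if f(x))
        proportion = held / len(features)
        if proportion >= most:
            return "definitely F"          # condition 1
        if proportion <= few:
            return "definitely not F"      # condition 2
        return "contested borderline"      # condition 6: arguments both ways

    # Hypothetical "game"-like features, purely for illustration:
    features = [
        lambda x: x.get("has_rules", False),
        lambda x: x.get("competitive", False),
        lambda x: x.get("played_for_fun", False),
        lambda x: x.get("has_winner", False),
    ]
    print(classify({"has_rules": True, "competitive": True,
                    "played_for_fun": True, "has_winner": True}, features))
    # -> definitely F
    print(classify({"has_rules": True, "played_for_fun": True}, features))
    # -> contested borderline

The borderline case is exactly where the interminable arguments described next tend to arise.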
Usually people then have motives for choosing one decision or the other which have nothing to do with the *facts* but may be driven by their ethical preferences or religious prejudices, or what their friends normally say, or whatever. Thus, arguments about intermediate cases can go on interminably, until people realise that there's nothing to argue about: instead we can define different varieties of F in terms of different collections of the features of instances. (E.g. we can use phrases for the special cases, such as: solo games, team games, competitive games, non-competitive games, board games, ball games, games without rules, etc. etc. ) Of course the situation is typically further complicated by the fact that some of the features in the cluster are themselves examples of the same problem. > > "purely reactive organisms, organisms with a > mixture of reactive and deliberative ('what if' reasoning) capabilities, > > organisms with reactive, deliberative and introspective capabilities, > organisms with syntactically rich languages, etc." > > Isn't each of the major concepts in the excerpt above cluster concepts? > Can't theses nonetheless be used and reasoned about? There's nothing wrong with using cluster concepts without further specification where you are talking about "standard" cases of instances and non-instances. We have to do it much of the time, because so many concepts are cluster concepts. If people start arguing about the precise boundaries of the cases then it becomes desirable to terminate the argument and define a set of different sub-concepts. I have been trying to define "deliberative" in terms of a specific collection of capabilities, then define "reactive" in terms of the absence of those capabilities. But the boundary remains a bit fuzzy. > There's clearly something you understand that I don't. I doubt it! > Perhaps what I > need is examples of concepts that you wouldn't consider cluster > concepts. I suspect "apple", "leaf", "cloud", "thunderstorm", "bicycle", "carbon", "water", "fork", "scissors", "cylinder", "lever", "hole", "screw", "nut", "red", "orange", "yellow", "ball", "tall", "brother", "mother", "grandfather", "eat", "breathe", "jump", "sniff", are not cluster concepts, at least not to any significant degree. ("apple" may be because of things I don't know about varieties of fruit.) They may be *vague* concepts, insofar as some of them depend on features which can vary and their precise boundaries are ill defined. But not all vague concepts are cluster concepts. Some are ambiguous (e.g. "brother" can be used to refer to things other than male siblings). I don't know if that helps. I should attempt to write a paper on all this one of these days.... Aaron