Why fish (likely) don’t feel pain

by Brian Key

What’s it feel like to be a fish? I contend that it doesn’t feel like anything to be a fish. Interestingly, much of our own lives is led without attending to how we feel. We just get on with it and do things. Most of the time we act like automatons. We manage to get dressed in the morning, or walk to the bus station, or get in the car and drive to the shops without thinking about what it feels like. Consequently, much of what we do is accomplished non-consciously.

There is an enormous amount of neural processing of information in the brain that never reaches our consciousness (that is, we never become aware of it and hence are unable to report it). I propose that fish spend all of their lives without ever feeling anything. In a recent paper in the academic journal Biology & Philosophy (Key, 2015) I discussed this idea in relation to the feeling of pain. I argued (as have others; Rose et al., 2014) that there is no credible scientific evidence for fish feeling pain.

In this article I address the question of whether fish feel pain using a slightly different approach. I propose that by defining how the human brain processes sensory stimuli in order to feel pain, we can identify a set of minimal neural properties that any vertebrate must possess in order to have, at least, the potential to experience pain. As an introduction to this argument, I first highlight anthropomorphism as a major stumbling block that prevents many people from recognizing that fish do not feel pain, and then I discuss the difference between noxious stimuli and pain, since these terms are often conflated.

Resisting anthropomorphic tendencies

Grey wolves hunt as a pack. They carefully select their prey, and then perform a series of highly coordinated maneuvers as a team in order to corral their target. Initially, each wolf maintains a safe working distance from other members of the pack as well as from their prey. They are relentless and seemingly strategic, with an overall goal of driving the agitated prey towards one wolf. A cohesive group mentality emerges that suggests logic, intelligence and a willingness to achieve a common goal. Eventually one wolf comes close enough to lock its jaws on a rear leg of the prey, before wrestling it to the ground. The rest of the pack converges to share in the kill. There appears to be a purpose to their collective behavior that ensures a successful outcome.

But is everything as it seems? A team of international scientists from Spain and the U.S.A. has simulated the behavior of a hunting pack of wolves using very simple rules (Muro et al., 2011; Escobedo et al., 2014). Their computer models do not rely on high-level cognitive skills or sophisticated intra-pack social communication. The complex spatial dynamics of the hunting group emerges by having the computer-generated wolves obey simple inter-wolf and wolf-prey attractive/repulsive rules.

For instance, much of the hunting strategy can be reproduced by having each simulated wolf merely move toward the prey while keeping a safe distance from it and from other wolves. In this way the prey is driven towards a single wolf in the pack. Simple rules are all that are needed to generate this hunting behavior. There is no need for sophisticated communication between the wolves, apart from visual contact. There is no need for a group strategy; each wolf can act independently to create what appears to be an elaborate ambush.
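
To make this concrete, here is a minimal sketch of the kind of attraction/repulsion rules described above. It is not the published model of Muro et al. (2011); the update rule and all parameter values are illustrative assumptions, but it shows how encirclement-like behavior can emerge from two purely local rules and nothing else.

```python
# A minimal sketch (not Muro et al.'s actual model) of wolf-pack-like
# encirclement emerging from two local rules. All parameters are
# illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
wolves = rng.uniform(-10, 10, size=(5, 2))   # wolf positions (x, y)
prey = np.array([0.0, 0.0])                  # prey position

ATTRACT = 0.05    # strength of movement toward the prey
SAFE_PREY = 2.0   # preferred working distance from the prey
SAFE_WOLF = 1.5   # preferred distance from other wolves
FLEE = 0.04       # prey speed away from its nearest pursuer

def step(wolves, prey):
    new_wolves = wolves.copy()
    for i, w in enumerate(wolves):
        # Rule 1: close in on the prey, but hold a safe working distance.
        to_prey = prey - w
        d = np.linalg.norm(to_prey)
        if d > 1e-9:
            new_wolves[i] += ATTRACT * (d - SAFE_PREY) * to_prey / d
        # Rule 2: keep a safe distance from every other wolf.
        for j, other in enumerate(wolves):
            if i == j:
                continue
            away = w - other
            d2 = np.linalg.norm(away)
            if 1e-9 < d2 < SAFE_WOLF:
                new_wolves[i] += ATTRACT * (SAFE_WOLF - d2) * away / d2
    # The prey simply flees its nearest pursuer; no wolf "knows" this.
    nearest = wolves[np.argmin(np.linalg.norm(wolves - prey, axis=1))]
    flee = prey - nearest
    nd = np.linalg.norm(flee)
    new_prey = prey + (FLEE * flee / nd if nd > 1e-9 else 0.0)
    return new_wolves, new_prey

for _ in range(500):
    wolves, prey = step(wolves, prey)

# After many steps the wolves form a loose ring around the fleeing prey,
# with no communication or group plan encoded anywhere in the rules.
print(np.round(np.linalg.norm(wolves - prey, axis=1), 2))
```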

The lesson is clear. Watching and analyzing animals behaving, either in the wild or in captivity, is fraught with tendencies to describe underlying causes of actions and reactions in terms of human experience. This human-centered explanation of behavior is referred to as anthropomorphism: when humans observe animals responding to a sensory stimulation in a way that reflects how they would react, there is often a strong desire to invoke anthropomorphic explanations.

One can easily imagine that a group of humans closing in on prey would either communicate amongst themselves or learn from experience how the others think, and hence how they would react to different scenarios in order to achieve a common goal. Because humans so easily reflect on their own behavior, human-like qualities are bestowed on animals spontaneously. For example, when a fish squirms after it is hooked, there is a natural tendency to imagine the pain that the fish is feeling. It seems intuitive. A hook in your mouth would hurt, so why wouldn’t a fish feel the same?

Our anthropomorphic and sometimes intuitive view of the world is not, however, always helpful in understanding the behavior of animals (particularly those that are not our close relatives; de Waal, 2009). Yet even scientists at the top of their profession adopt anthropomorphism, a line of thinking that can camouflage biologically and evolutionarily more plausible explanations of animal behavior. (Not everyone agrees, however; readers are referred to an essay by Marc Bekoff: Bekoff, 2006.)

Why do humans so easily fall victim to anthropomorphism? One could argue that we are hard-wired for empathy and hence anthropomorphism, especially given the role of a specialized set of neurons in the cortex (so-called mirror neurons) and subcortical regions which appear to non-consciously drive these behaviors (Corradini and Antonietti, 2013; Gazzola et al., 2007; Heberlein and Adolphs, 2004).

Defining key terms

One of the common queries raised by discerning readers is this: if fish don’t feel pain, why do they squirm, flap and wriggle about in distress when they are lifted out of the water? Why do they fight so hard to escape a fisherman’s line? It is a simple and emotionally powerful anthropomorphic argument. That is, if a hook were pierced through your lip and someone yanked on it, wouldn’t you struggle to escape and free yourself, just like a fish?

Maybe not. A wild horse submits to a leash within minutes. A bear trapped in a foot snare soon gives up its struggle. Why does a fish continue to fight in the face of supposedly extreme pain (in some cases, as in big-game sport fishing, fish will relentlessly fight against the hook for 1-2 hours)? An alternative view is that fish do not feel pain.

There are two terms that need defining here: fish and pain. When I refer to fish, I am referring only to bony ray-finned fish, since they are the most common experimental fish model and the fish most people are familiar with (these are fish with bony skeletons and fins supported by bony rays or spines). Like all fish, they breathe with gills. Whales, porpoises, dolphins, seals, otters and dugongs are not fish. These animals are marine mammals; they possess lungs rather than gills.

Pain is a term that many readers will not have difficulty in understanding. Everyone has some vivid recollection of it, after touching something hot or smashing a thumb with a hammer. However, we must be very clear in our definition given the claim that fish do not feel pain. Pain is the subjective and unpleasant experience (colloquially referred to as a “feeling”) associated with a mental state that occurs following exposure to a noxious stimulus.

The mental state is the neural activity in the brain that is indirectly activated by the stimulation of peripheral sensory receptors. A noxious stimulus is one that is physically damaging to body tissues (e.g., cutting, cold and heat) or causes the activation of peripheral sensory receptors and neural pathways that would normally be stimulated had the body been physically damaged.

Gentle touch and warm water are not noxious stimuli. They neither cause physical damage to tissues nor activate sensory receptors and nerves normally stimulated by physical damage. It should be noted that pain is not a necessary consequence of noxious stimuli. For example, there are many anecdotes of people who have experienced traumatic accidents resulting in severe body tissue trauma without feeling any immediate pain. This means that it is possible to cut your skin without feeling pain.

Some basic neurobiological concepts

To feel pain requires that you are aware or conscious of your own mental state. To be aware first requires that you attend to the stimulus. A simple demonstration of this concept is to ask you to feel the pressure on your ischial tuberosities (the bony parts of the pelvis that you sit on) when you are seated. Before I directed your attention to your backside you were probably not aware of it, but immediately afterwards you became conscious of the feeling of your seated position. To feel a sensory stimulus requires attention to that stimulus (in this case, pressure on the ischial tuberosities).

Awareness of the mental state associated with peripheral stimulation of sensory receptors arises as a result of the process of attention. This is called the top-down attentional system since it involves the frontal lobes, supposedly the highest hierarchical level in the brain (Collins and Koechlin, 2012). However, attention is not always under conscious or voluntary top-down control. It is possible for the sensory stimulus itself to non-consciously activate attentional processes in what is referred to as the bottom-up attentional system (Driver and Frackowiak, 2001). A relevant and simple example would be to accidentally stand on a sharp object while walking. In this case the noxious stimulus activates attentional circuitry and causes awareness (pain, in this example). In humans, the cerebral cortex in the frontal and parietal lobes of the brain is intimately involved in attending to input from our sensory receptors. In summary, feeling pain requires the activity of neural circuits associated with attention.  Once the brain is attending to a sensory stimulus then it becomes possible to subjectively experience a specific sensation.

These top-down and bottom-up attentional mechanisms are not specific to feeling pain. Much of our understanding of their contribution to the processing of sensory stimuli comes from the visual system (Corbetta and Shulman, 2002; Buschman and Miller, 2007). What is pertinent to our discussion is that the top-down and bottom-up attentional mechanisms depend on specific neural activity in the frontal and parietal areas of the cerebral cortex, respectively.

What is the cerebral cortex?

In everyday language the cerebral cortex is the “grey matter.” This grey matter is a thin outer covering of the mammalian brain that typically consists of 3-6 discrete horizontal layers of neurons and their processes. These layered neurons are interconnected vertically to create minicolumns or canonical microcircuits that are repeated across the whole surface of the brain. Each of these minicolumns is interconnected horizontally to produce a massively powerful processing machine.

These canonical microcircuits can be likened to integrated circuits or microprocessor chips in computers. As computers have evolved, more and more circuits have been added to their chips (you may remember the progression in personal computer evolution from 286 to 386 to 486 to Pentium and Core chips). The cerebral cortex has evolved by both increasing the complexity of the canonical microcircuit from 3 layers to 6 layers of neurons (the latter is called the neocortex) and by adding more and more of these “chips,” leading to an expanded surface area of the cortex (Rakic, 2009).

Pain is in the cerebral cortex

Pain causes elevated electrical activity in at least five principal regions of the human forebrain: the anterior cingulate cortex (ACC), the frontal and posterior parietal cortex, the somatosensory regions SI and SII, the insular cortex, and the subcortical amygdala. These five regions form a core, interconnected circuit that is referred to as the pain matrix (Brooks and Tracey, 2005).

However, just because there is electrical activity in a particular brain region during pain does not mean that that region is responsible for the sensation. For example, while the amygdala is active during pain, it is involved in modulating the pain (as well as many other things) rather than producing the feeling of pain. This has been clearly demonstrated in ablation studies in both rats and rhesus monkeys. These animals continue to quickly withdraw their tails from a noxious heat stimulus even after bilateral ablation of the amygdala (Manning and Mayer, 1995; Manning et al., 2001; Veinante et al., 2013). Consequently, it is reasonable to remove this subcortical region from the matrix responsible for feeling pain.

On this criterion, the ACC also does not belong to the feeling-pain matrix. Lesion of the nerve fibers arising from the ACC is called cingulotomy and has been practiced clinically for over 50 years to relieve intolerable pain. However, patients continue to feel pain after this surgery — they just no longer seem to be bothered by the presence of their pain (Foltz and White, 1962). Thus, the ACC is not responsible for feeling pain per se. The frontoparietal nexus is likewise associated with attention to pain rather than the actual feeling of pain (Lobanov et al., 2013).

There is compelling evidence that SI, SII and the insular cortex are the essential components of the pain experience. For example, there is an interesting clinical case of a patient who had an ischemic stroke that selectively damaged a small portion of SI and SII on the right side of the brain (Ploner et al., 1999). This patient could no longer perceive any acute pain in response to noxious thermal stimuli or pinprick to the left hand (Ploner et al., 1999). In addition, numerous other clinical studies have revealed that when cortical lesions involve a substantial portion of SI, patients no longer experience any pain (Vierck et al., 2013). Likewise, patients with lesions of the SII-insula cortex have been shown to either lack the sensation of pain (Biemond, 1956) or have altered pain perception (Starr et al., 2009; Veldhuijzen et al., 2010; Garcia-Larrea, 2012a and 2012b).

Another important test of whether a brain region is responsible for pain is to selectively stimulate that region with electrical current. Only two cortical regions have ever been shown to cause pain when electrically stimulated (Mazzola et al., 2012): SII and the insula, making these two regions the most critical components of the feeling-pain matrix (Garcia-Larrea, 2012a, 2012b).

What does conscious processing of noxious stimuli involve?

I have already described above that the brain must have attentional mechanisms in order to feel pain. But what else does the brain need to do in order to experience pain? Since pain is, by its very definition, the conscious processing of neural signals arising from noxious stimuli, we should, in the first instance, ask what conscious processing in the human brain does. Ideally, if we can identify what conscious processing accomplishes, we should be able to relate this to specific neural architectures. Once these architectures are characterized, they can then be used as biomarkers for the likelihood that a nervous system feels pain.

Conscious processing is dependent on at least two non-mutually exclusive processes: signal amplification and global integration over the cerebral cortex (Dehaene et al., 2014). Why are these processes so important? Amplification provides a mechanism to increase signal-to-noise ratio and to produce on-going neural activity after the initial sensory stimulus has ceased (Murphy and Miller, 2009). Global integration ensures the sharing and synchronization of neural information so that the most appropriate response is generated in the context of current and past experiences.

Recently, the amount of information transferred across distant sites within the cortex has been quantified using electroencephalography. These quantitative values have been successfully used to distinguish between conscious, minimally-conscious and non-conscious patients (Casali et al., 2013; King et al., 2013). Thus, global integration is a critical defining feature of conscious processing.
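
The studies cited above rely on sophisticated measures (such as weighted symbolic mutual information) applied to real EEG recordings. Purely to illustrate the underlying idea, namely that the amount of information shared between distant recording sites can be quantified, the following sketch computes a crude pairwise mutual-information estimate on synthetic signals. The signals, the binning and all parameter choices are assumptions, not a reproduction of the methods used in those studies.

```python
# A crude illustration of quantifying "information sharing" between two
# recording sites: pairwise mutual information between discretized
# signals. This only stands in for the more sophisticated measures
# (e.g., weighted symbolic mutual information) used in the cited work;
# the synthetic signals and the binning are assumptions.
import numpy as np

def mutual_information(x, y, bins=8):
    """Histogram-based estimate of mutual information (in bits)."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])))

rng = np.random.default_rng(1)
t = np.linspace(0, 10, 2000)
shared = np.sin(2 * np.pi * 1.5 * t)          # a common driving signal

# "Integrated" pair: both channels reflect the shared signal plus noise.
a = shared + 0.3 * rng.standard_normal(t.size)
b = shared + 0.3 * rng.standard_normal(t.size)

# "Disconnected" pair: independent noise only.
c = rng.standard_normal(t.size)
d = rng.standard_normal(t.size)

print(f"MI, integrated pair:   {mutual_information(a, b):.2f} bits")
print(f"MI, disconnected pair: {mutual_information(c, d):.2f} bits")
```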

What neural architectures enable the cortex to perform signal amplification and global integration? Both of these processes rely on the global propagation of neural information over the cortical surface. Such propagation is achieved by extensive lateral interconnections (axon pathways) between cortical regions. These cortical regions must be reciprocally linked by axons transmitting both feedforward excitatory and feedback excitatory and inhibitory activities (Douglas et al., 1995; Ganguli et al., 2008; Murphy and Miller, 2009).
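
As a toy illustration of how recurrent excitation combined with feedback inhibition can amplify a brief input and keep activity going after the stimulus has ceased, consider the following two-population rate model. It is only a sketch in the spirit of the models cited above; the weights and time constants are illustrative assumptions, not values taken from Murphy and Miller (2009).

```python
# A toy excitatory/inhibitory rate model: a brief input is amplified by
# recurrent excitation and shaped by feedback inhibition, and activity
# persists (while decaying) after the stimulus ends. All weights and
# time constants are illustrative assumptions.
import numpy as np

tau = 10.0              # time constant (ms)
dt = 0.1                # integration step (ms)
w_ee, w_ei = 4.0, 3.9   # recurrent excitation, inhibition onto E
w_ie, w_ii = 4.0, 3.9   # excitation onto I, recurrent inhibition

E = I = 0.0
trace = []
for t in np.arange(0.0, 200.0, dt):
    stim = 1.0 if t < 20.0 else 0.0            # brief sensory input
    dE = (-E + w_ee * E - w_ei * I + stim) / tau
    dI = (-I + w_ie * E - w_ii * I) / tau
    E += dt * dE
    I += dt * dI
    trace.append(E)

print(f"peak excitatory response: {max(trace):.2f} (input amplitude 1.0)")
print(f"activity 20 ms after stimulus offset: {trace[int(40.0 / dt)]:.2f}")
```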

In the sensation of pain, amplification involves long-distance attentional pathways associated with the fronto-parietal cortices and their interconnections with the feeling-pain matrix (Lobanov et al., 2013). The SI and SII sensory cortices possess topographical maps of the body that process information associated with the somatosensory system (see Key, 2015). Slight offsets of these maps (at least in human SI) for different sensations have been proposed to allow integration of different qualities (e.g., touch and nociception; Mancini et al., 2012; Haggard et al., 2013).

This idea has gained considerable support from recent high resolution mapping in primates (Vierck et al., 2013). It is now clear that different sub-modalities of pain, such as sharp-pricking pain and dull-burning pain, are mapped in different subregions of SI. Moreover, lateral interactions between these subregions significantly alter their relative levels of neural activity (Vierck et al., 2013).

Somatotopic maps of noxious stimuli also exist in the anterior and posterior insular cortex (Brooks et al., 2005; Baumgartner et al., 2010). Separate somatotopic maps are present for pinprick and heat noxious stimuli within the human anterior insular cortex (Baumgartner et al., 2010). This segregation of sensory inputs allows these sub-modalities to be integrated both with each other and with the emotional and empathetic information that reaches the anterior insular cortex (Damasio et al., 2000; Baumgartner et al., 2010; Gu et al., 2012; Gu et al., 2013; Frot et al., 2014).

Amplification and global integration are also dependent on the local microcircuitry in each cortical region (Gilbert, 1983). The local cytoarchitecture of the cortex (the presence of discrete lamina and columnar organization) is capable of simultaneously maintaining both the differentiation and the spatiotemporal relationships of neural signals. For example, separate features or qualities of sensory stimuli can be partitioned to different lamina while the columnar organization enables these signals to be integrated. Both short- and long-range connections between columns provide additional levels of integration.

The six-layered neocortex is well suited for this neural processing. Signals from the thalamus terminate in layer 4 and are then passed vertically to layer 2 within a minicolumn. Activity is then projected to layer 5 within the same minicolumn. Strong inhibitory circuits involving interneurons refine the flow of information through this canonical microcircuit (Wolf et al., 2014). The layer 2 neurons project to other cortical regions (local and long-distance), while layer 5 neurons project to subcortical regions.
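
A highly simplified sketch of this laminar relay is given below. It compresses the canonical microcircuit into a few lines: thalamic input arrives in layer 4, is relayed to layer 2 and then to layer 5, with local inhibition sharpening each step. The gain function, weights and variable names are illustrative assumptions, not a model of any particular cortical area.

```python
# A deliberately simplified sketch of the laminar signal flow described
# above. Weights, the gain function and the single-number "signals" are
# illustrative assumptions.
import numpy as np

def relu(x):
    # Threshold-linear gain, standing in for neuronal firing rates.
    return np.maximum(x, 0.0)

def minicolumn(thalamic_input, w_relay=1.2, w_inhib=0.4):
    """Relay a thalamic signal through one hypothetical minicolumn.

    Returns (layer-2 output to other cortical regions,
             layer-5 output to subcortical targets).
    """
    l4 = relu(thalamic_input)            # thalamic afferents terminate in layer 4
    l2 = relu((w_relay - w_inhib) * l4)  # layer 4 -> layer 2, sharpened by local inhibition
    l5 = relu((w_relay - w_inhib) * l2)  # layer 2 -> layer 5 within the same minicolumn
    return l2, l5

cortico_cortical, subcortical = minicolumn(thalamic_input=1.0)
print(f"layer-2 output to other cortical regions: {cortico_cortical:.2f}")
print(f"layer-5 output to subcortical targets: {subcortical:.2f}")
```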

Taken together, if the signal is strong enough and if sufficient information is transferred and integrated, then the feeling of pain emerges (at present, how this occurs remains a mystery).

In summary, to the best of our knowledge, for any vertebrate nervous system to feel pain it must be capable of transferring and integrating a certain level of neural information. I contend that such a nervous system must have, at least, the following organizational principles:

1. An attentional system to amplify neural information;

2. Distinct topographical coding of different qualities of somatosensory information;

3. The integration of different somatosensory information both between modalities (e.g., touch and pain) and within a single modality (sharp versus dull pain);

4. Higher-level integration of noxious signaling with other relevant information (e.g., emotional valence). This requires significant long-range axonal pathways (feedforward and feedback) between brain regions integrating this information;

5. Laminated and columnar organization of canonical neural circuits to differentiate between inputs and to allow preservation of spatiotemporal relationships. The lamina must be capable of processing inputs as well as outputs to either higher or lower hierarchical regions while maintaining meaningful representations of the neural information. The lamina must possess strong local inhibitory interneuron circuits to filter information;

6. Strong lateral interconnections (both local and long-distance) between minicolumns to maintain the integrity and biological relevance of processing in relation to the initial stimulus.

I propose that each of these features is necessary but not sufficient for pain in vertebrates. On this basis it should be concluded that fish lack the prerequisite neuroanatomical features necessary to perform the neurophysiological functions responsible for the feeling of pain. Fish lack distinct topographical coding and spatiotemporal integration of different somatosensory modalities; they lack the higher-order integration of somatosensory information with other sensory systems; and they lack a laminated and columnar organization of somatosensory information. What, then, does it feel like to be a fish? The evidence best supports the idea that it doesn’t feel like anything to be a fish. They are non-conscious animals that survive without feeling; they just do it. There is nothing heretical about this idea. For much of our lives, we humans also exist non-consciously.

_____

Brian Key is a Professor of Developmental Neurobiology in the School of Biomedical Sciences, University of Queensland. He is the Head of the Brain Growth and Regeneration Lab there. The Lab is dedicated to understanding the principles of stem cell biology, differentiation, axon guidance, plasticity, regeneration and development of the brain.

References

Baumgartner, U., Iannetti, G.D., Zambreanu, L., Stoeter, P., Treede, R-D. and Tracey, I. (2010) Multiple somatotopic representations of heat and mechanical pain in the operculo-insular cortex: a high-resolution fMRI study. J. Neurophysiol. 104:2863-2872.

Bekoff, M. (2006) Public lives of animals. Troubled scientist, pissy baboons, angry elephants, and happy hounds. J Conscious. Studies 13, 115-131.

Biemond, A. (1956) The conduction of pain above the level of the thalamus opticus. Arch. Neurol. Psychr. 75:231-244.

Brooks, J. and Tracey, I. (2005) From nociception to pain perception: imaging the spinal and supraspinal pathways. J. Anat. 207:19-33.

Brooks, J.C.W., Zambreanu, L., Godinez, A., Craig, A.D. and Tracey, I. (2005) Somatotopic organization of the human insula to painful heat studied with high resolution functional imaging. NeuroImage 27:201-209.

Buschman, T.J. and Miller, E.K. (2007) Top-down versus bottom-up control of attention in the prefrontal and posterior parietal cortices. Science 315:1860-1862.

Casali, A.G., Gosseries, O., Rosanova, M., Boly, M., Sarasso, S., Casali, K.R., Casarotto, S., Bruno, M-A., Laureys, S., Tononi, G. and Massimini, M. (2013) A theoretically based index of consciousness independent of sensory processing and behaviour. Sci. Transl. Med. 5:198ra105.

Collins, A. and Koechlin, E. (2012) Reasoning, learning, and creativity: frontal lobe function and human decision making. PLoS Biol. 10(3):e1001293.

Corbetta, M. and Shulman, G.L. (2002) Control of goal-directed and stimulus-driven attention in the brain.  Nature Rev. Neurosci. 3:201-215.

Corradini, A. and Antonietti, A. (2013) Mirror neurons and their function in cognitively understood empathy. Conscious Cogn. 22:1152-1161.

Damasio, A.R., Grabowski, T.J., Bechara, A., Damasio,H., Ponto, L.L.B., Parvizi, J. and Hichwa, R.D. (2000) Subcortical and cortical activity during the feeling of self-generated emotions. Nature Neurosci. 3:1049-1056.

Dehaene, S., Charles, L., King, J-R. and Marti, S. (2014) Toward a computational theory of conscious processing. Curr. Opin. Neurobiol. 25:76-84.

de Waal, F.B.M. (2009) Darwin’s last laugh. Nature 460, 175.

Douglas, R.J., Koch, C., Mahowald, M., Martin, K.A.C. and Suarez, H.H. (1995) Recurrent excitation in neocortical circuits. Science 269:981-985.

Driver, J. and Frackowiak, R.S.J. (2001) Neurobiological measures of human selective attention. Neuropsychologia 39:1257-1262.

Escobedo, R., Muro, C., Spector, L. and Coppinger, R.P. (2014) Group size, individual role differentiation and effectiveness of cooperation in a homogenous group of hunters. J. R. Soc. Interface 11:20140204.

Foltz, E.L. and White, L.E. (1962) Pain “relief” by frontal cingulumotomy. J. Neurosurg. 19:89-100.

Frot, M., Faillenot, I. and Mauguiere, F. (2014) Processing of nociceptive input from posterior to anterior insula in humans. Hum. Brain Mapp. 35:5486-5499.

Ganguli, S., Bisley, J.W., Roitman, J.D., Shadlen, M.N., Goldberg, M.E. and Miller, K.D. (2008) One-dimensional dynamics of attention and decision making in LIP. Neuron 58:15-25.

Garcia-Larrea, L. (2012a) Insights gained into pain processing from patients with focal brain lesions. Neurosci. Lett. 520:188-191.

Garcia-Larrea, L. (2012b) The posterior insular-opercular cortex and the search of a primary cortex for pain. Clin. Neurophysiol. 42:299-313.

Gazzola, V., Rizzolatti, G., Wicker, B. and Keysers, C. (2007) The anthropomorphic brain: the mirror system responds to human and robotic actions. Neuroimage 35:1674-1684.

Gilbert, C.D. (1983) Microcircuitry of the visual cortex. Ann. Rev. Neurosci. 6:217-247.

Gu, X., Gao, Z., Wang, X., Liu, X., Knight, R.T., Hof, P.R. and Fan, J. (2012) Anterior insular cortex is necessary for empathetic pain perception. Brain 135:2726-2735.

Gu, X., Liu, X., Van Dam, N.T., Hof, P.R. and Fan, J. (2013) Cognition-emotion integration in the anterior insular cortex. Cereb. Cortex 23:20-27.

Haggard, P., Iannetti, G.D. and Longo, M.R. (2013) Spatial sensory representation in pain perception. Curr. Biol. R164-R176.

Heberlein, A. and Adolphs, R. (2004) Impaired spontaneous anthropomorphizing despite intact perception and social knowledge. PNAS 101:7487-7491.

Key, B. (2015) Fish do not feel pain and its implications for understanding phenomenal consciousness. Biol. Philos. DOI 10.1007/s10539-014-9469-4.

King, J-R., Sitt, J.D., Faugeras, F., Rohaut, B., Karoui, I.E., Cohen, L., Naccache, L. and Dehaene, S. (2013) Information sharing in the brain indexes consciousness in noncommunicative patients. Curr. Biol. 23:1914-1919.

Lobanov, O.V., Quevedo, A.S., Hadsel, M.S., Kraft, R.A. and Coghill, R.C. (2013) Frontoparietal mechanisms supporting attention to location and intensity of painful stimuli. Pain 154:1758-1768.

Mancini, F., Haggard, P., Iannetti, G.D., Longo, M.R. and Sereno, M.I. (2012) Fine-grained nociceptive maps in primary somatosensory cortex. J. Neurosci. 32:17155-17162.

Manning, B.H. and Mayer, D.J. (1995) The central nucleus of the amygdala contributes to the production of morphine antinociception in the rat tail-flick test. J. Neurosci. 15:8199-8213.

Manning, B.H., Merin, N.M., Meng, I.D. and Amaral, D.G. (2001) Reduction in opioid- and cannabinoid-induced antinociception in rhesus monkeys after bilateral lesions of the amygdaloid complex. J. Neurosci. 21:8238-8246.

Mazzola, L., Isnard, J., Peyron, R. and Mauguiere, F.  (2012) Stimulation of the human cortex and experience of pain: Wilder Penfield’s observations revisited. Brain 135:631-640

Muro, C., Escobedo, R., Spector, L. and Coppinger, R.L. (2011) Wolf-pack (Canis lupus) hunting strategies emerge from simple rules in computational simulations. Behav. Processes 88:192-197.

Murphy, B.K. and Miller, K.D. (2009) Balanced amplification: a new mechanism of selective amplification of neural activity patterns. Neuron 61:635-648.

Ploner, M., Freund, H.-J. and Schnitzler, A. (1999) Pain affect without pain sensation in a patient with a postcentral lesion. Pain 81:211-214.

Rakic, P. (2009) Evolution of the neocortex: perspective from developmental biology. Nat. Rev. Neurosci. 10:724-735.

Rose, J.D., Arlinghaus, R., Cooke, S.J., Diggles, B.K., Sawynok, W., Stevens, E.D. and Wynne, C.D.L. (2014) Can fish really feel pain? Fish Fisher. 15:97-133.

Starr, C.J., Sawaki, L., Wittenberg, G.F., Burdette, J.H., Oshiro, Y., Quevedo, A.S. and Coghill, R.C. (2009) Roles of the insular cortex in the modulation of pain: insights from brain lesions. J. Neurosci. 29:2684-2694.

Veinante, P., Yalcin, I. and Barrot, M. (2013) The amygdala between sensation and affect: a role in pain. J. Mol. Psychiatr. 1:9

Veldhuijzen, D.S., Greenspan, J.D., Kim, J.H. and Lenz, F.A. (2010) Altered pain and thermal sensation in subjects with isolated parietal and insular cortical lesions. Eur. J. Pain 14:535.e1-535.e11.

Vierck, C.J., Whitsel, B.L., Favorov, O.V., Brown, A.W. and Tommerdahl, M. (2013) Role of primary somatosensory cortex in the coding of pain. Pain 154:334-344.

Wolf, F., Engelken, R., Puelma-Touzel, M., Weidinger, J.D.F. and Neef, A. (2014) Dynamical models of cortical circuits. Curr. Opin. Neurobiol. 25:228-236.

127 thoughts on “Why fish (likely) don’t feel pain”

  1. SocraticGadfly,
    Three things. First, I don’t think your response to my comment was very clear. I asked you what you mean by “subjective end” with regards to access consciousness, but your reply is just that access consciousness isn’t as objective as Ned Block thinks. But what do you mean by that? Do you mean that it doesn’t exist or that it’s uninformative? Moreover, you didn’t really “rephrase” what you said, but rather you provided some kind of comparison between Block’s distinction between access consciousness and phenomenal consciousness and the distinction between semantic and episodic memory. I take the comparison to be some kind of analogy to make a point, but it doesn’t really clarify your objection and I’m not sure what the analogy is supposed to be or what conclusion you’re trying to get at.

    Second, I think your proposal to use Ockham’s razor against Block’s distinction is problematic for several reasons: (1) Ockham’s razor works when positing something goes beyond necessity, but you haven’t explained why positing access consciousness goes beyond necessity. (2) You aren’t clear about which part of the distinction we’re supposed to get rid of: access or phenomenal or both? (3) Ockham’s razor works when an extra entity is being posited, but Block didn’t posit access consciousness as some extra “thing” or phenomenon. Instead, it’s an explanatory concept he uses to explain what many existing cognitive mechanisms (i.e. attention, self-reportability, discriminating stimuli, and others) have in common. At best, you can point out that access consciousness doesn’t do this work very well.

    Third, I really don’t think reading the wiki page is enough to inform you about the surrounding debate on access consciousness and phenomenal consciousness. I suggest that you try to read the literature cited by Daniel Tippens in order to understand why some people like Ned Block think the distinction is informative instead of being dismissive about it.

    John Smith,
    I know Daniel Tippens to be someone who really tries to understand someone’s position, interpret it charitably, and then provide a clear objection as a way to take that person seriously. It’s unfortunate that you had to be hostile, since I think there’s a potential to have a fruitful discussion.

  2. Brian Key

    Another tip for you, if I might be so bold, filling the gaps further. You mention anthropomorphism, and I mentioned in my first post how it would be difficult to determine from experiment whether and how fish actually experience their neural “finalizations” as “feelings”. As noted in my second post, I do directly correlate and EQUATE the experience with the neural event, as single neurons firing synchronously. The anthropomorphic argument would be to say that our knowledge of our own cortical arrangements for integrated awareness, and subjects’ well-respected reports of what they experience, do not adequately serve as a model for other cortical arrangements. We know both the subjective experience and the neural firings for humans, but only the firings and “behavioral” changes observed in the animal.

    So, it is a simple argument based on human brain structure, and the capacity for fine modulation to the extent of extreme suppression for some “feelings”. We have massive amounts of glial cells, along with other insulators and potential suppressors and enhancers in junctions. Now, as I said, regions and junctions would all use single neurons synchronously across a brain, but partition to TEND to favor hormone preserve or nerve network capacities that each neuron contains. What we might have in animals is a bit like the speculation by physics with its Higgs Particle (a heavy void, not an empty void, with even heavier particles moving in it and unaware of it by relative measurement). Other animals might in fact feel MORE than humans, by being far less modulated and far less intricately integrated using that modulation. Humans might emphasize suppression to unify an intricate experience that includes complex thought-arrangements. That would be our burden, LESS feelings than other animals (per cubic centimeter of anatomy, if you like). Prepare yourself for the possibility (only a fairly speculative but interesting argument) that humans are a bad example to compare to animals, and that the absence of clear subjective REPORTS of pain by a fish does not derogate from its feeling pain, or any feeling, but quite “raw” for a fish.

  3. (Continued)

    You might be thinking (or not) that my comment above somehow contradicts the idea of feelings building from basic to complex in humans and all other animals, as a common neural process across species. In fact, it’s all quite consistent, but you need to factor in the reality of a continual closed cycle of neural flow – to the brain from functional sites, then back to them by MODULATED outputs complete with a given state of suppression (Sensory Attenuation) to be immediately detected as suppressed when they come right back as inputs (along with what they collect from the world). It’s a contained continual cycle that evolved to human cortical structure from basics, including consistent fish structures, meaning that we never feel like a fish. We always cycle modulated outputs using a neocortex to receive them, as an evolved structure stuck in modulation.

    Consequently, when basic builds to complex as neurons continually flow, that is basic as received from modulated outputs, continually for humans – modulated basics as opposed to raw fish basics. It is a self-contained cycle for the experience of feelings, stuck in raw & basic for a fish, or risen to modulated & complex for humans. It is a locked-in cycle for humans using a neocortex that fish do not have, and lower levels leading up to it would be locked into its modulation, continually. Humans would not have basic feelings at the level of a fish, so much as locked-in modulated feelings building to complex. See how this model works?

  4. The following are the main points I understand the author to be making:

    1) Some animals are conscious of feelings of pain.

    2) Specific neuroanatomical structures are responsible for consciousness, feelings, and pain.

    3) Some animals, fish in this case, that do not possess those specific structures, are not conscious, and do not feel pain.

    4) Therefore it doesn’t feel like anything to be a fish.

    But why rule out that any other neuroanatomical structures in fish can be considered to support different or more primitive forms of pain or consciousness?

    I prefer how the following piece states the case:
    “The new study severely doubts that fish are aware of pain as defined by human terms”
    “There is still no final proof that fish can feel pain”
    “These findings suggest that fish either have absolutely no awareness of pain in human terms or they react completely different to pain”
    http://www.sciencedaily.com/releases/2013/08/130808123719.htm

    “That is, fish experience “fish pain”, not human or “human-like” pain. This position is rather difficult to challenge when the “fish pain” definition can be argued to absurdity to be correlated with neural activity anywhere in the fish nervous system”

    Yes.

    “However, I contend here and elsewhere that a fish is not conscious (i.e. aware) of noxious stimuli. If it is not conscious, then it can neither feel “fish pain” nor feel “human-like” pain.”

    ‘Fish consciousness’ can also be set up in the same way, so I don’t think that arguing from the non-existence of fish consciousness helps.

  5. “To feel pain requires that you are aware or conscious of your own mental state.”

    If *that’s* what is meant by “feel pain” I doubt anyone would disagree, but then the author might as well have just written “fish don’t introspect” and be done with it!

  6. SocraticGadfly,
    “these things complexly bootstrap each other, like “good” emergent properties often can and will do. You mention feedback loops, in distinguishing consciousness from thought,”

    Exactly. And think how those loops circulate primal activities and forms throughout the process. Consider the ways magnetic attraction and repulsion, from feelings of love and hate to the more evolved and static concepts of good and bad, permeate our feelings and thoughts. Then the ways we mix and match them, creating endless configurations, from basic black and white, to the most nuanced subtleties.
    I find John Smith’s argument quite interesting, that we have evolved modulating processes to blunt the basic impulses and so fish would sense a much more basic and raw expression of pain than we would. Of course, the negative for us, is that we have the ability to fear a broad range of possible situations, far beyond immediate physical harm.
    I think the discussion of access and phenomenal consciousness is an example of how we instinctively categorize/frame/quantify/digitize the details and lose sight of the process creating them. So then we have this debate over what is conscious, as though it can be quantified, when we have no idea how to define it beyond the thoughts it manifests.
    Personally I work in a job which requires a lot of instinct, such that it only interferes to overthink it and so I have various mental tricks to keep that linear over consciousness occupied, while the multiple feedback loops keep everything functioning properly. Thinking about these discussions is one of them. Given that much of this is dealing with other people and race horses, to say there is no real consciousness beyond the narrative retrospection is utter nonsense. The phenomenal consciousness is profoundly hyperlinked and access consciousness is only an extremely linear and simplistic reduction of this, like the history we extract from the actual events.

  7. Fish are perhaps too hard. There are several papers (also from UQ!) addressing whether insects are conscious and whether they feel pain, with the general conclusion that they probably aren’t and don’t. One telling observation is that they continue normal activities despite severe injury.

    In the case of fish, the Rose et al. (2014) paper cited in the references is a response to work by Sneddon and others (and the book by Braithwaite, _Do Fish Feel Pain?_, 2010) on behavioural markers of pain (the animal learns to avoid, non-simple behavioural response to a stimulus). Sneddon et al. (2003) present experimental results that those authors interpreted as evidence of pain, i.e. rubbing the affected part along the ground, “rocking”, behaviours reduced by administration of morphine. Replications of this work did not see similar behaviour, and the authors of those papers commented that the fish go back to eating soon after (for example) having acetic acid injected into their jaw. In parallel to this, there is a certain amount of literature on stress in fish, where there are detectable longer-term physiological effects – but this does not necessarily mean that suffering is involved.

    In the case of sheep, the brain is certainly closer to the human, but behavioural correlates of pain are masked – it is suggested that prey animals hide such signs of suffering (they have to be Stoics :)). But measures of stress (cortisol) show acute changes, and from an animal welfare point of view, painful procedures may be modified in the future, just as in the case of human infants Valerie Hardcastle cited above.

  8. Once again the issue of fish feeling pain with other parts of the brain has been raised. In my papers I have specifically addressed what a vertebrate nervous system must have in terms of “hardware” in order to be capable of experiencing pain. This approach allows one to infer that the fish brain lacks the necessary prerequisite neuroanatomy for feeling pain. There is no evidence of some unique or “primitive” form of fish pain that arises from a unique region of the fish brain. The classic way of determining how different parts of the vertebrate brain function (either in fish or humans) is to lesion that part of the brain (e.g. by physically removing it). From such studies, we know that there is no unique part of the fish brain, and we know that the function of similar parts is conserved across vertebrate species.

    I contend that “structure determines function” and that these claims need to be in line with the basic principles of evolutionary biology.

    I would also like to make a point about “analogies”. I use analogies to either highlight a specific point or to have readers examine an issue in a different way. Analogies are not the main argument; they are just a tool. If the analogy didn’t work for you, then it didn’t work. The wolf studies I highlighted were merely an attempt to make you think differently when observing animal behaviour – they were not meant to suggest that wolves do not feel pain.

  9. BK: “Pain is in the cerebral cortex”

    Various neurological observations challenge corticocentric views of “pain in the brain” and undermine BK’s view that “pain is in the cerebral cortex” and his use of the “Pain Matrix” concept.

    The most compelling observation is that direct electrical stimulation of the cortical convexity (e.g., using transcranial magnetic stimulation), including cortical regions activated by noxious stimuli, rarely produces pain in awake patients (Libet, 1973; Penfield & Rasmussen, 1955). This contrasts with stimulation of cortical regions associated with vision, audition, olfaction, and innocuous (non-noxious) touch, which reliably produce the corresponding experiences.

    In response to these findings, it may be argued that unlike the other senses, several regions of cortex must be co-activated to produce pain. However, in epileptic seizures cortical discharge is typically widespread and it is extremely rare for epilepsy to present auras that are painful (Nair et al. 2001). It is also relevant to note that in contrast to the cortex, pain is produced by focal (microelectrode) stimulation in certain areas of the thalamus and brainstem (Dostrovsky 2000; Yen & Lu, 2013).

    Insular cortex is a core node of the “pain matrix” and is mentioned by BK as essential to pain experience. However, Pereira et al. (2005) found that direct stimulation of the meninges and blood vessels that overlie the insular cortex produces pain. These structures are supplied with nociceptive innervation by the trigeminal ganglion. Thus, pain reports following insular stimulation may not actually be due to activation of the insular cortex, but to (simultaneous) stimulation of local non-neural tissues.

    Finally, concerning the “Pain Matrix”, some researchers have argued that the fraction of the neuronal activity measured using currently available macroscopic functional neuroimaging techniques (e.g., EEG, MEG, fMRI, PET) in response to transient nociceptive stimulation is likely to be largely unspecific for nociception (Iannetti & Mouraux, 2010; Iannetti et al., 2013). If this is correct, the term “Pain Matrix” should be used with greater care, because it misleadingly implies that the recorded responses are specific for pain.

    References
    Dostrovsky, J. O. (2000). Role of thalamus in pain. Progress in Brain Research 129:245-57.

    Iannetti, G.D. and Mouraux, A. (2010). From the neuromatrix to the pain matrix (and back). Experimental Brain Research, 205:1-12.

    Iannetti, G. D., Salomons, T. V., Moayedi, M., Mouraux, A., & Davis, K. D. (2013). Beyond metaphor: contrasting mechanisms of social and physical pain. Trends in cognitive sciences, 17(8):371-378.

    Libet, B. (1973). Electrical stimulation of cortex in human subjects, and conscious sensory aspects. In: Handbook of sensory physiology, vol. II, ed. A. Iggo, pp. 743-90. Springer-Verlag.

    Mazzola, L., Isnard, J. & Mauguiere, F. (2006). Somatosensory and pain responses to stimulation of the second somatosensory area (SII) in humans: A comparison with SI and insular responses. Cerebral Cortex 16:960-68.

    Nair, D. R., Najm, I., Bulacio, J. & Luders, H. (2001). Painful auras in focal epilepsy. Neurology 57:700-702.
    Penfield, W. & Rasmussen, T. (1955). The cerebral cortex of man. MacMillan.

    Yen, C. T., & Lu, P. L. (2013). Thalamus and pain. Acta Anaesthesiologica Taiwanica, 51(2):73-80.

  10. Hi Brian, that was a very interesting article with some excellent detail regarding the neuroanatomy of pain. I enjoyed it very much. Unfortunately I do not wholly agree with the conclusions and see others have raised similar concerns (particularly Daniel Tippens and marclevesque). I’ll try to avoid complete repetition.

    In this essay you developed a model of how pain is generated and dealt with in the human brain (and arguably for some other vertebrates). It is certainly plausible to point to the absence of elements from that model as evidence that fish do not “feel pain.” But some questionable underlying assumptions prevent drawing any hard conclusions…

    1) Is the singular model of neuroanatomy presented the only one capable of generating the feeling of pain? Given convergent evolution, there may very well be different architectures to produce similar (though I would agree not exact) feelings. There was no argument/evidence put forward to exclude such a possibility. This is especially true for less complicated systems, which are dealing with less information processing in general. All of your criteria are arguably necessary for a neural system that has/requires advanced attention control, with a capability for fine spatio-temporal, modality discrimination. That does not mean all vertebrates require such complex ‘analyses’ about the world. The only reason for neural amplification structures (to use one example) is if there are a lot more items to draw one’s attention (and require computational power) than pain/pleasure.

    2) Does feeling pain require consciousness (sustained awareness)? It certainly would not be the exact same experience (prolonged, integrated with other experiences) without consciousness, but that does not mean pain is not felt in some temporally limited fashion. That they move in response to damage can very well indicate an immediate sort of feeling. It just wouldn’t ‘stay with them’ in the same way. Though Dr Key makes the point that the long-term health of the system itself can be affected (stressed).

    3) Lack of pain does not mean lack of other unpleasant sensations. I was at a conference where a neuroscientist reported that severing a nerve to prevent feelings of pain resulted in the replacement of pain with emotional distress (extreme fear, panic, etc). The human body contains separate ‘messaging’ systems about one’s wellbeing. Is it not possible that fish can still feel fear, panic, etc? If they have an amygdala (and relevant neurochemistry) that would seem possible. Regardless of feeling pain, the general feeling of ‘fight or flight’ is not always pleasant, especially ‘flight’.

    It would be more convincing to me to see what we know about fish, rather than humans. Specifically do they have nociceptors (I checked and it seems they have a low % which supports your case) and how do they interact with the fish brain? Do signals lead to reflex-like muscle activity, or do they involve more complex (than reflex) neural activity in the brain including release of chemicals related to stress?

  11. Brian Key claims one needs a cortex to suffer pain. Reptiles and birds have no cortex, and they suffer pain.

    http://www.wiringthebrain.com/2010/09/ancient-origins-of-cerebral-cortex.html

    The cortex is over-valued: modulation by glial cells occurs along axons, for example.

    It has been notoriously difficult to find out how birds’ brains work. Still, some bird species score possibly higher in some mental ways than any primate, but man.

    Attributing to animal brains the same general purpose that our brains have is just common sense. It is not forming the world according to man (anthropomorphizing). It is just the most natural explanation, and the most economical one, too (“Occam’s Razor”).

    Starting an essay by telling us one can think of wolves differently, like machines, exhibits a mood to impel on us the notion that animals are machines. When human hunters go out after game, they use the same tactic, as described by Brian, not because we can think of them as a simple computer program, but because it is the smartest strategy to follow.

    Common sense is found in computer programs, wolves and humans, because sense is common.

    And brains are into making sense. By the way, dear Brian, computer programs are written by humans, and, apparently, wolves. This is all you have demonstrated.

    In the link I gave in an earlier comment, groupers are found to recruit complementary predators to hunt. Other fishes do this. The idea is to find a predator such as a Moray Eel to get in cracks and caves. The eel understands this, and the grouper makes a suggestive dance and mimic to get the eel into action.

    Since I wrote the initial article linked above, other species of fish have been found to also suggest transpacific cooperation to fetch food.

    Any trout fisher will tell you that old trout are very smart. You can put the juiciest morsel in front of them; once they know it’s an ape who proposes dinner, they won’t bite.

    SocraticG: I had indications of extreme intelligence, and plotting capability, on the part of parrots.
    Parrots don’t have cortexes.

    I know a very dirty look when I see one. I actually beat a prompt retreat from my perch, as I got seriously worried that the lioness would come around to get even.

    The question here is what does it mean to “understand”, and to “know”. Once again, those familiar with clever animals know very well that they are capable of “knowing”, and “understanding”. Border Collies have been scientifically documented to be able to learn hundreds of words, in several languages. A wild eagle can be taught to land on a hang glider, or to kill a wolf for its human companion…

    Robin: I added a significant detail on the octopus. It was clearly a last act. It required a special effort whose emotional motivation was of a moral character, as it had no survival value whatsoever.
    I agree with your remarks. I have seen birds indulge in very smart behavior. Including the hummingbirds I presently feed.

  12. Brian,
    While there seems to be general agreement that fish have very limited, if not non-existent, pain-sensing functions, I’m still not sure how the connection is then drawn that they must be wholly non-conscious, given they have other sense organs. It does seem anthropomorphic to assume any creature lacking sensory functions related to humans must be effectively a zombie.
    One of the threads running through this discussion and similar ones is whether phenomenal consciousness is really on the same level with analytic or introspective consciousness, and I would argue anyone holding that position simply hasn’t been analytic or introspective enough. We have complex brains in order to process information. As such it is an evolved trait to serve in our survival. Consider how it functions: by taking in large amounts of external input and processing useful scenarios and causal processes to enable us to navigate our environment. In some ways, this resembles what the digestive system does, in order to extract energy from the environment. Both break down the input and extract what is necessary, discarding the rest. Which then allows us to progress on our path. Given that academics are in the business of analyzing information, is it surprising there is a general assumption in the community that this is a higher order function than simply taking it in? The problem with this premise is that, taking the whole of society as a large organism, this isn’t necessarily the case. In government and the business world, think tanks and corporate boards don’t necessarily run things, but serve as a brain trust for those who do. There is that constant feedback loop between the details and the decision making, and a significant aspect of the decision making process is knowing when not to make decisions. Not to jump the gun. Not to make judgements unnecessarily. Not to overthink the details and distract from the broader reality. Obviously analysis is necessary, even for the more basic life forms, but when it impedes one’s broader phenomenal perceptions, it simply becomes another blind spot. A lack of that very phenomenal sensitivity by which you would define consciousness.
    My last. Cheers.

  13. Philonous: First, the SEP isn’t always the be-all and end-all for a handy reference. As I noted on (I think) Massimo’s most recent Stoicism essay, the SEP doesn’t even have an entry on Cynicism. I’ve found it wanting one or two other times as well.

    I otherwise think the Wiki page has more than enough information, beyond what I already know on the subject, to let me have an informed judgment. In fact, I could argue that Wiki is better than the SEP on this specific subissue by the fact that it groups Lycan’s eightfold bifurcation together; the SEP entry doesn’t. Indeed, it doesn’t reference them at all.

    If you don’t like the fact that I don’t agree with you or Daniel on this point, fine. But, I think I’ve made a reasoned assessment.

    And, per previous discussion on this issue, including some by Massimo himself, Wikipedia’s generally not a bad reference on philosophy issues. On the “consciousness” entry, while it’s certainly organized in a different way than SEP, its entry is more than 50 percent the length of SEP’s (I did word counts on both), so it’s not “cursory,” and of course, being Wiki, has plenty of links to other posts for more detail.

    So, I’m not convinced by your objections, and I move on, while also politely asking you not to raise the “Wikipedia objection,” or to reify the SEP based specifically on a “Wikipedia objection.”

    Brodix: I’d at least partially buy into this idea from your latest post, without calling phenomenal perception “conscious.” Rather, I’d say that as our consciousness develops (people here know I’m non-polar on consciousness; it’s not an on-off) we’d become more aware of phenomenal perceptions, we’d put “interpretations” on them, etc., which would then in turn influence how future said perceptions would interact with and drive consciousness. And, mayhaps fish indeed aren’t totally nonconscious, so I am OK with modifying earlier statements.

    Patrice: Parrots in particular and birds in general are known to have homologues to the cortex. How this may relate to pain probably needs some fleshing out. http://www.uchospitals.edu/news/2012/20121001-neocortex.html

    Human hunters consciously discuss their plans for stalking prey before setting out; wolves do not. Therefore, there’s no such parallel as you imply.

    Brian: Thanks for getting us back to the meat. Yes, if “lower” animals do not have certain brain functions, or they don’t operate in certain ways, they cannot experience pain. It’s not anthropomorphizing. Now, to define “pain” as we do may be anthropomorphizing, but that’s a different issue.

    That leads to what others have hinted at — what is it like to be a fish?

    I suggest this is far different from Nagel’s famous essay. With a nearly nonconscious fish, it’s probably not “like” anything. Since fish cannot make conscious, non-instinctual judgments of difference (bats at least maybe can), a fish can’t consciously compare itself to a fellow member of the same species, let alone other species, genera, etc. It’s certainly not something we can “map” from our own background.

  14. This has been illuminating. Fish, unsurprisingly now, do not have human feelings. They experience fishy feelings perhaps, or perhaps not. The neural correlates of fish existence are a long way off.

    There is another dimension of human pain that was not touched upon, which opens the door to the possibility that fish do experience pain. The review above deals exclusively with somatic pain. Humans also experience visceral pain, the neural correlates of which are completely unknown, to the best of my knowledge, which is not authoritative.

    Ischemia (lack of blood supply and oxygen) of heart or skeletal muscle causes pain. The worst pain of all is supposed to be ureteral colic, presumably due to increased tension or pressure in the wall of the tube as a stone is being expelled. Inflammation of the lung is painless; inflammation of the pleura is very painful. A broken bone is very painful; death of bone is not immediately so.

    The pathways and neural correlates of these pathways are not well known, I think, and may, therefore, be applicable to fish.

  15. At the risk of making a complete fool of myself, let me say at the outset that I am completely ignorant of the science in this matter. Like Marko, I must confess to being overwhelmed by “usage of technical jargon” in the article. But, given the OP’s thesis, perhaps this was unavoidable.

    The subject matter, the article, and the commentary have all been fascinating to me.

    Just to pick a couple of comments that I tend to favor:

    Marko Vojinovic: “You need to define the concept of pain in a hardware-independent way, and then argue that human hardware enables them to experience it, while fish hardware does not. Defining ‘pain’ via the presence of appropriate hardware is the very definition of ‘anthropocentric’.”

    Bill Skaggs: “But here’s the thing. That solution just doesn’t work. It is wrong even for mammals and other people, and if you try to apply it to octopuses or lobsters, it doesn’t produce anything except bafflement.”

    Yes, the anthropocentrism and the bafflement. It is unclear to me whether your criteria attenuate or amplify what can be said about the experience of pain. It just seems to reduce to an argument about how complexly we can describe what we observe when we believe an event could possibly produce pain in a sentient being.

    I discussed this with my wife since she was within shouting distance. Hypothetical experiment: A wood splinter is inserted into a child and an adult. The splinters are identical in size and composition and are inserted at identical points on the child and adult. An MD is present and asks each to describe the level of pain experienced on a scale of 1 to 5, with 5 representing the highest range.

    My wife immediately says, "That's not helpful; a fish can't describe a threshold of pain to an MD." Good point. "Besides, the child's psychological response will intensify his physiological response to a greater degree than the adult's."

    I tell her she’s not helping. “What is pain?” I ask. She says, “Do you want me to demonstrate by knee kicking you in your privates?”

    Hmm? “That I understand,” I say, “but what about outliers?”

    “What?” she asks.

    "You know, events where it's not clear whether an experience is pleasurable or painful, like erotic asphyxiation."

    “I don’t think we need concern ourselves with human or with fishy erotic asphyxiation.”

    “True. This reminds me of a sorites paradox.”

    “What’s that?”

    “Ugh. It’s complicated. Like explaining what ‘tall’ is or a ‘heap.’ But fish don’t care anyway.”

  16. I think the major point is that the neocortex, or the higher machinery our brains have, allows us to perform the engineering function of feedback, so we can actually exacerbate our pain or, by discipline, actually suppress it. The author is theorizing that animals with simpler nervous systems may exhibit the classic stimulus-response to pain or injury, but lack the ability to amplify it (or suppress it), which higher mammals can do.
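    Roughly, and only as a toy sketch (the gain term and the numbers below are invented for illustration, nothing from the essay), that feedback idea can be pictured as a raw nociceptive signal multiplied by a descending gain that the higher machinery can turn up or down:

    ```python
    # Toy sketch only (invented numbers): a raw nociceptive signal modulated by a
    # descending "gain" standing in for cortical feedback that can amplify or
    # suppress what is ultimately felt.

    def felt_intensity(nociceptive_input, descending_gain):
        """Gain > 1 amplifies the signal, gain < 1 suppresses it."""
        return nociceptive_input * descending_gain

    raw = 5.0                          # arbitrary units of afferent drive
    print(felt_intensity(raw, 1.0))    # baseline reflex-level response: 5.0
    print(felt_intensity(raw, 1.8))    # attention/rumination amplifies: 9.0
    print(felt_intensity(raw, 0.3))    # disciplined suppression: 1.5
    ```

    On this picture, a nervous system with no descending loop simply passes the raw signal through unchanged.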

  17. Victorpanzica, "The author is theorizing that animals with simpler nervous systems may exhibit the classic stimulus-response to pain or injury, but lack the ability to amplify it (or suppress it), which higher mammals can do." Yes, I get that, but don't necessarily accept the "higher machinery" part or what might be meant by "ability." Ability for what? Haven't I read somewhere that when being attacked there comes a point where the sensation of pain in prey shuts down completely as a result of shock and nothing is felt or experienced?

  18. Several of the last posts have turned to more details about neuroanatomy. This is probably not the place to continue in too much detail in this regard. Nonetheless, I will make a few brief comments. For those people with access to the more recent literature (e.g. SimonVS), I will direct them to papers that may help.

    The amazing thing about direct electric stimulation of the brain is that it can produce pain. Given that pain involves global integration, it is somewhat surprising that localized current can produce pain. A crude analogy is expecting to get something coherent on a computer screen by applying electric current to the computer hardware. It was mentioned that current to the thalamus can produce pain. This is not unexpected, as the thalamus is part of the pathway from the periphery to the brain. With regard to the concept of the pain matrix, I refer SimonVS to a recent paper I happen to have on my desk – you can track backwards in time and see some of the really fascinating work on the neural signatures of pain. Please note that there is no pain-specific brain region. The circuitry is shared; it has to be, as our brains would otherwise need to be huge.

    Woo, C-W, Roy, M, Buhle JT and Wager, TD (2015) Distinct brain systems mediate the effects of nociceptive input and self-regulation on pain. PLOS Biology 13:e1002036.

    dbHolmes has raised another neuroanatomy question, and has asked whether other neuroanatomies could generate pain. As I said in my last post we should stick to evolutionary biology principles. Given we are discussing fish and humans within the vertebrate lineage we can make reasonably powerful inferences. There is no evidence that fish have any specialized neuroanatomy that can duplicate the functions of the human cortex with respect to pain.

    dbHolmes also asks whether ongoing pain requires sustained awareness. If you are feeling pain at this present time it is because it has reached your consciousness. Of course, the thing that is causing you pain may not have gone away when you are not feeling it; it is then acting non-consciously.

    Several readers have shifted their thoughts from pain to fear and panic and wanted to know if these are separate pathways. There is a prominent USA "fear" researcher called Joseph LeDoux who has recently admitted that there are widespread misconceptions about what fear is. He nicely articulates that there are two parts to fear – the motor response (e.g. facial grimaces) and the "feeling" response, which is in the cortex.

    Patrice raises the question about birds and pain. In my paper in Biology & Philosophy I clearly indicated that while birds do not have a cortex, they have neuroanatomical features that would theoretically allow them to feel pain. Patrice raises the idea that "common sense" tells us that animal brains have the same general purpose as humans. I challenge readers to go beyond their everyday experiences, because sometimes "common sense" can be misleading.

  19. Brian Key

    “The classic way of determining how different parts of the vertebrate brain function (either in fish or humans) is to lesion that part of the brain (e.g. by physically removing it). From such studies, we know that there is no unique part of the fish brain, and we know that the function of similar parts is conserved across vertebrate species. ”

    Tampering is Jenga – if you knew how the neurons were integrated across the structure, you wouldn't need to pull out a stick. Structure is indeed the key, including the removal of key components to affect a synchronous whole! Structure evolved "as one", we ought to assume, and in any animal it would be specifically, differently integrated despite some comparisons being made. You are moving to higher levels and different structures that you only understand by Jenga. You also overlook the fact that this is all simply "awareness". You cannot so easily isolate pain from how any awareness is created – and you admit openly but very briefly in your essay that we don't know how that "experience" arises, pain or any state of awareness. You just correlate to structure when in fact you should correlate all awareness (perceptual, thought, pain, pleasure, anything other than a flat neural line), because any of it might contribute to the experience directly or indirectly as that line builds to a finalization by neurons.

    The fact that there are cortices and structure across species does not remove the common likelihood that the experience arises fundamentally from individual neurons firing synchronously all over the brain! Pain, or any experience, which humans certainly have, fish might have by THEIR neurons. They see, hear, smell, and yet you say they have no neocortex and so may not be aware that they see, hear & smell? They don't control their moves, and don't fashion their neural service to moves sufficiently to construct thoughts at all about those feelings? No glimmer of an idea that sticks or builds to aid their conditioning – or do you say it's all "automatic" without any glimmer at all of thought construction at a fish level of awareness?

    I need to know from you – what awareness is; how awareness arises; why pain needs to be completely isolated from any other feelings, including vision, hearing & smell (it looks like pain is Jenga in cortical processing, nothing more); why cortical Jenga prone to upset advanced species apple carts is a good model for lower species without their reports; how can any assumptions be made about fish thoughts to have a glimmer of intentional control of their moves in line with useful or possible conditioning? Too many holes.

  20. Brian Key, thanks for your very informative essay. Daniel Tippens, that Lamme paper is an absolute gem; thanks for linking it.

    Asher Kay, I don’t see a need to define pain in a hardware-independent way even if we are talking about aliens. (Which isn’t to say that aliens couldn’t have some other aversive internal state.) But generally – yeah – what you said. Come to notice it, “what you said” seems to be what I always think about your comments.

  21. First, a nitpick: I don't think Key means to restrict his discussion to acanthopterygians (spiny rayed fishes), since most of our fish physiology research is done on soft rayed fishes (actinopterygians, but not acanthopterygians) such as trout, eels, carp and the white rat of fish, zebrafish. Such discussions as this are usually limited to teleosts (modern bony fishes), which are essentially the actinopterygians, but excluding the most primitive: sturgeon and gar.

    Now I offer the bane of reasoned scientific consensus, the anecdote. It is one supportive of his argument, however, so I like to think of it as a practical illustration. I am fishing one day in the vicinity of a number of other anglers when I catch an undersized fish. Unhooking it prior to release, I see a piece of fishing line running from its mouth. Closer inspection reveals the eye of a hook protruding out of its gullet deep in its mouth behind the gill rakers. Deciding to attempt a "good Samaritan" act, I take my long-nose pliers and carefully work the hook out of the fish's esophagus. As the bend of the hook emerges I am amazed to see that there is a minnow, still hooked through the lips. The minnow, while dead, was bright eyed, the fins not frayed, the scales and mucous coating intact, despite the gastric acid of the gut. My conclusion was that the minnow had been dead for less than 30 minutes, more likely well less than 15. Now, those who anthropomorphize pain to fish, reverse the process. While eating dinner you somehow swallow your fork and it becomes impaled in your esophagus, only the tip of the handle visible in the back of your mouth. Your next action? Grab something tasty off the dessert tray and eat it!

  22. I have a lot of educational catching up to do before I am able to completely understand the arguments here; nevertheless, I have a question for the author or anyone else who cares to answer.

    Having kept fish in aquariums and ponds, I notice that they are sometimes hungrier than at other times. If they haven't been fed in a while, they exhibit excited behavior when feeding time comes, and if they have just fed, they will ignore additional food. Also, one can predict what areas of the pond they will move to based on temperature. I always interpreted these actions as their feeling a preference. That is, when they haven't ingested nutrients for a while, they are, in some way, compelled to ingest nutrients. I experience my own hunger as a pain of sorts.

    My questions for the author or anyone else: are fish experiencing anything when they choose to feed as opposed to when they choose not to? What exactly is it that is determining their action if not some sort of sentience – and if this sentience is not 'pain' and 'pleasure'… what is it? We can make automatons which do things based on input, and we know no sentient experience is involved because we know exactly why the automaton behaves as it does; we built it, after all. But until we can say exactly what makes a fish take actions which are conducive to its survival, shouldn't we assume that it is due to some sort of sentient experience? If something behaves as if it is suffering, shouldn't we assume suffering is what is causing the behavior until we know exactly what is causing it? The fact that its bodily anatomy (including its brain) is not constructed in the same way ours is, is not an explanation of its behavior, is it?

  23. Hello Brian,

    I heartily applaud the work which you and others are doing to perhaps someday “quantify” the conscious experience. I don’t know when science will bring us reasonable ways to estimate the magnitudes of our pain, hope, fear, hunger, itchiness, and so on, but such tools should revolutionize the field of philosophy in general if/when we get them. In a sense I see this as the “engineering” side of philosophy, as opposed to its more standard “architecture” side. Observe that the architect must not become too concerned with the “how” portion of building a proposed structure, since this might otherwise taint a given design. Instead the architect says “Here is what I want built, so your task as an engineer is to figure out how to effectively put my plan together (though we will discuss various options if you do find it impractical).” But unfortunately we philosophers do not have any “great architecture” to our credit so far, given that we haven’t yet developed any generally accepted understandings. Therefore perhaps the “engineers” among us will one day sort us out. I personally reside on the “architecture” side however (and even to the extent that I avoid using the term “brain”). Nevertheless I do enjoy casually hearing the technical issues associated with your side (and I’m especially pleased that you personally use “non-conscious” rather than Freud’s old “unconscious” term). I’ll now weigh in on the matter at hand, or “fish consciousness.”

    I must begin by saying that I can't ultimately know that anything other than I myself happens to be conscious — I proudly accept the "Cartesian Materialist" title (which I believe Daniel Dennett invented to derisively label philosophers such as myself). I also presume that I'm not reality's only conscious entity, since I do suspect that there are various other real and conscious humans. Furthermore I do suspect this to be true of mammals, marsupials, and birds as well. In the end I'm inclined to also place fish in this category, since I suspect that their function demands too much autonomy for evolution to have painstakingly designed them to "robotically" deal with their diverse environments — or to lack the punishment/reward dynamic which is associated with consciousness. Nevertheless, the day on which humanity becomes armed with instruments to quantify the sensations which a given conscious entity experiences will please no one more than me! I'd simply love for this question to be empirically resolved. As I've said, all I can really know right now is that I personally happen to be conscious.

    Brian, all architects need good engineers to help them build what they envision. I've been developing my own such theory since I was a college kid, or for about 26 years now. Back then I decided that if so many brilliant philosophers in academia keep failing, then couldn't academia be the problem itself? Regardless, if you are willing to give me a bit of your time (and I do know that you're busy right now), then I'd love for you to assess my own plan, or how I mean to help the ancient field of philosophy become the science upon which psychology, psychiatry, sociology, cognition, and so on finally gain solid foundations from which to practically serve humanity.

    email: thephilosophereric@gmail.com

  24. So simplifying, the author assumes that our mental words (of pains and feelings) refer to mental states and in fact that these states are identical to brain states, and then concludes that because fish have different brain physiology they can’t have those mental states.

    If you define pains as the essay does (as particular kinds of brain states), the conclusion is then quite plausible. However, it is not obvious that this is how mental words are used in ordinary language. (Note that I most definitely am not saying that this “identity theory” is wrong because some kind of “dualism” is right, for me those positions are both wrong for the same kinds of reasons.)

    The claim that we anthropomorphize animals too easily and that this can lead to bad theories about animal behavior is no doubt true. But ascribing pains to some creature in ordinary speech is not really putting forward a theory.

    What these kinds of new empirical results present to us (about the neurophysiology of fish for example) are challenges to our inherited ways of speaking. Perhaps in the future we might develop and adopt a concept of pain that excludes fish from having it. But that is not simply a discovery; it is a redefinition of a sort.

  25. Spirit is not a mankind science; rather, it is something only one's self can discover. Once upon a time someone said, "know thy self and thou shall be master of the universe"….

  26. Thanks for the reference, Socratic. Cells seemingly functioning like mammalian cortical cells were found in some different places in bird brains, and in still another place in reptiles. Maybe. That was in 2012. Fishes?

    Not yet. But considering how smart some fishes are known to be, it’s just a matter of time.

    Brian Key claimed: "Pain is in the cerebral cortex". I was just protesting that assertion. Anybody who has had a parrot knows they can suffer (and occasionally inflict) psychological pain.

    I was shocked to learn that wolves did not discuss hunts before engaging in them. I have also some knowledge of African wild dogs (Lycaons). I must confess I discussed with them the merits of sugar versus meat pieces. You see, I don’t speak just half a dozen human languages, but also wild dog language.

    I met, in the wild, these terrible predators, capable of killing and eating lions. They found my company interesting, so we discussed stuff. You should try it, someday, very instructive.

    I had another go at Mr. Key’s argument on my site:
    https://patriceayme.wordpress.com/2015/02/07/why-how-humans-think/

    I have there a picture of a Sumatran tiger who cut off his leg to stay free, and alive (the leg was found in the trap).

    I also found in the meantime that Mr. Darwin espoused my opinion. Said Charles: "the difference in mind between man and the higher animals, great as it is, certainly is one of degree and not of kind."

    It’s fishy to tell us fish are fishy, and thus quite different. Even Romans felt very differently from us. That fishes do too, is scampi fritti: delicious, but not good for mental health, if abused.

    The real question is whether fishes have the analogue of pain. Some of them (groupers) have the analogue of higher mental functions. To believe they would be very smart without having evolved (an analogue of) pain stretches the imagination.

    Groupers go and discuss with eels (in sign language). That was discovered in the last few years, so I don't expect the knowledge to have percolated to the masses yet.

    Fish ethology stinks to high heavens. Why and how swordfishes use their swords was only discovered in 2014, although these fishes have been caught for millennia.

    Brian: Thanks for the answer (and for the stimulating ideas). Rest assured that I eat with relish enormous servings of common sense every day.

    I just find it enlightening to contradict common sense ONLY when very good reasons offer themselves irresistibly. The bird (and Homo floresiensis) situation shows that the highest mental functions can appear with a very different neuroanatomy.

    Pain is too much of an advantage, for an animal as smart as a grouper, not to be present.
    Groupers, certainly, and even sharks, can be tamed (why some individual sharks become man-eaters while 99.9% do not, studies at Reunion Island have revealed).

    To suppose animals do not have common sense and common feelings is quite opposed to ethological evidence, so it requires much more than different neuroanatomy to be demonstrated.

  27. Dear Wm. Burgess
    Thank you for your insight about fish feeding. Please have a look at this youtube video.

    It is a video of a white blood cell tracking down a bacterium and finally consuming it. Viewed the way most people see it, you will think that the bacterium is in a panic, fleeing from the white blood cell. It tries desperately to escape, zig-zagging to throw off its chaser. The white blood cell is relentless; it is hungry, and it has the bacterium in its sights. Finally the bacterium is cornered and eaten. The chase is over and the white blood cell is contented after eating its prey.

    It is hard to resist these thoughts; it is common sense. While you can invoke the idea of "white blood cell hunger" or "white blood cell consciousness" or "bacterium panic", it really doesn't help us understand the underlying biology of this behaviour. Now let's return to fish. Similarly, you don't have to invoke "fish hunger sensations" to drive feeding behaviour. You will be amazed to know that fish continue to feed even when most of their brains are removed. It is a lower brain/brainstem reflex behaviour. There are actually many behaviours that are "hard-wired" and independent of feelings such as hunger. You could perhaps argue that in many cases survival is dependent on this "hard-wiring".
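    To make that point concrete, here is a minimal toy simulation (purely illustrative; it models neither real white blood cells nor real fish, and the numbers are invented) in which an apparently relentless, goal-directed chase falls out of a single rule, namely step toward the target each tick, with no hunger or panic represented anywhere:

    ```python
    # Purely illustrative toy model (not real cell or fish physiology): a chaser
    # that follows one rule (take a small step toward the target) looks
    # relentless and goal-directed, yet nothing like hunger or panic is modelled.
    import math

    def step_toward(pos, target, speed):
        dx, dy = target[0] - pos[0], target[1] - pos[1]
        dist = math.hypot(dx, dy)
        if dist <= speed:
            return target
        return (pos[0] + speed * dx / dist, pos[1] + speed * dy / dist)

    chaser, prey = (0.0, 0.0), (10.0, 5.0)
    for t in range(60):
        prey = (prey[0] + 0.3, prey[1] + 0.1)    # "prey" drifts along a fixed path
        chaser = step_toward(chaser, prey, 1.0)  # "chaser" just steps toward prey
        if chaser == prey:
            print(f"caught at step {t}")
            break
    ```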

    Dear Eric
    Thank you for your interesting comments, but I have a mountain of work on my desk and 300 undergraduate students demanding my attention. I wish there were more hours in the day. Best of luck.

  28. So the fish is a sort of p-zombie. From Wikipedia: “A philosophical zombie could be poked with a sharp object, and not feel any pain sensation, but yet, behave exactly as if it does feel pain.”
    The neurobiology in the article is over my head, and I’m not qualified to tell whether any of it is correct. I’d rather know where the conclusion stands between the fringe and a scientific consensus.
    But I’ve always argued that p-zombies could exist if we just simulate the human brain on a standard computer or a Turing machine. It would behave like a real brain, but without an actual nervous system it wouldn’t be conscious or feel anything. Has any philosopher or sci-fi author made this argument before?
    If consciousness is not necessary for intelligent and even humanoid behavior, then is consciousness just an evolutionary fluke? Does it mean that we can’t tell whether a seemingly intelligent creature is conscious just by observing its behavior or communicating with it? If we create humanoid p-zombies, should we pretend that they’re actual humans with human rights, or is it ok to force them into slavery and whip them?

  29. Professor Key,

    Thanks for your response, and in particular for addressing the issue of feeling, awareness, attention and consciousness. The main problem I still have is that you seem to be taking it that all these terms are basically synonymous, and thinking of them as an all-or-nothing affair. That is, you seem to be saying that we are either fully aware of something, explicitly thinking about it, and making it the focus of our attention, or else we are purely unconscious automata, zombies, feeling nothing at all. It is perhaps because you take this to be obvious that you think it’s enough to simply restate your position rather than address any of the counter-arguments various commentators have made. Thus you write:

    “If you don’t feel it (not aware of it) then it is simply non-conscious and not a feeling (i.e. not pain). … The important point is that if you are not aware of it, then you can’t feel it (hence my “ischial tuberosity/backside” analogy).”

    Since you don’t offer any further support for your claims here, but only refer us back to your “ischial tuberosity/backside” example, let’s take an analogous example:

    Consider the fact that as you are reading these words your eyes are moving from left to right and then back again at the end of each line of text. You probably weren't consciously aware of this – that is, you weren't paying attention to it – until I pointed it out to you just now. What does this establish? That you were an unconscious automaton, devoid of all feeling, awareness or consciousness, until propelled into consciousness by my sentence? Indeed, since you were not explicitly attending to your own mental state in seeing and reading these words, does it follow that you were not in fact seeing at all, that you were not undergoing any kind of experience whatsoever? After all, if we have to be consciously attending to the mental state of feeling pain in order to feel pain, do we not also have to be consciously attending to our mental state of seeing in order to see, consciously attending to our mental state of hearing a sound in order to hear, etc.?

    Apart from how implausible this looks on its face, I have numerous other difficulties with it, only a couple of which I have space to mention. For example, if feeling pain “requires that you are aware or conscious of your own mental state” and “[m]ost of the time we act like automatons” because we do things “without thinking about what it feels like”, what are these mental states and feelings we are thinking about when we do so? In other words, it seems that you are defining consciousness in terms of second-order awareness, or self-consciousness. But apart from the fact that this leaves it entirely obscure what the first-order states are supposed to be, and threatens an infinite regress, how many species of animal are capable of becoming conscious of their own mental states, or of forming second-order thoughts about their first-order mental states?

  30. Just found an article that seems to follow the lines of Brian Key’s article while at the same time cautioning against anthropocentric or, perhaps better in this case, cortex-centric views:

    “These findings suggest that fish either have absolutely no awareness of pain in human terms or they react completely different to pain. By and large, it is absolutely not advisable to interpret the behaviour of fish from a human perspective.”

    When so expressed, it just seems to me rather underwhelming.

    http://www.sciencedaily.com/releases/2013/08/130808123719.htm

  31. Since it looks like you’re not going to respond again I will not bother stating my remaining objections, but let me just add that it seems to me that, if followed through consistently, the line you are taking would lead to the conclusion Peter Carruthers arrived at in his 1989 paper ‘Brute Experience’, according to which since no non-human animals are conscious, or feel pain (again, since none of them are capable of second-order conscious thoughts about their own first-order mental states) it is morally objectionable to try to alleviate their apparent (but actually non-existent) suffering, since it would be a waste of time and energy better spent on human beings.

    Of course, you do not explicitly state that only human beings experience pain, and would probably explicitly deny it, but given that (like Carruthers) you stipulate that the only animals capable of feeling pain are those that can self-consciously introspect, reflect upon, or form second-order thoughts about their own mental states, it seems to me that you are committed to such a conclusion nonetheless. So, are you so confident in your arguments that you would be willing to condone the unlimited use of non-human animal experimentation on the grounds that (1) no matter how much other animals appear to suffer, only we are capable of experiencing pain, and (2) it would be morally objectionable not to do so whenever it might benefit the only truly sentient creatures on earth: namely, ourselves?

    It seems to me that while one must be on guard not to commit anthropomorphic fallacies, one must also be careful not to land oneself in anthropocentrism of the kind that Carruthers endorses (or at least did so circa 1989), and which, I would argue, you are also committing yourself to here.

    Since I have 200 words left in this (my fourth) comment, I might just add that it seems to me that Professor Key’s post provides a perfect example of why it’s crucially important that suitably qualified philosophers of mind (which most of us here are obviously not) should work together with cognitive scientists, neuroscientists, ethologists et al. on these kinds of issues. Frankly, while I am very grateful to Professor Key for his essay and for participating in this discussion, it seems to me that there is a whole lot of work he still needs to do on the sorts of conceptual questions many of us have raised here (however ineptly by many of us who are not professional philosophers), and that he could do a whole lot worse than consult the best of the work that has been done in the philosophy of mind in order to make further headway on these matters. By the same token, of course, most philosophers of mind really need to work on their neuroscience. But there is also another solution: interdisciplinary collaboration.

    Finally, since Massimo is both a philosopher and a biologist (and might therefore be familiar with the work of biologists and neuroscientists who would disagree with Professor Key on these matters: e.g. Sneddon, Brathwaite, Chandroo et al. on the matter of fish pain, and Damasio, Llinas and many other neuroscientists on the neuroscience of feeling), I’d love to hear his considered opinion on all of this (thanks in advance!).

  32. While I like the summary of the cortical regions that mediate pain in humans (and, likely, primates), I question the extrapolation to other vertebrates. First, let's address the question of what structures in the primitive vertebrate brain gave rise to neocortex and each of its regions; then ask what structures in the fish brain derived from the common ancestor. The likely answer is that the fish pallium is homologous to neocortex and, one could guess, that distributed or localized functions within the pallium could process neocortical functions such as "pain". From my knowledge of fish neurophysiology, we simply don't know enough about processing in fish, or non-mammalian vertebrate forebrain to address these questions. At present, I don't take findings of localized pain processing in S1, S2, insular cortex, etc., as evidence that animals that don't have these identified regions (or 6-layered neocortex) lack the functions of these regions. (An aside: lateral processing in some allocortical areas, such as CA3, may be more extensive than neocortex.)

  33. I think a rock feels no pain but I am not certain of that. Are you? As for fish: they have eyes, and if they have eyes they see, and if they see, they think, and if they think, they know, and if they know then how is it Brian that you do not? Is it your knowledge that stands in the Way. =

  34. We are still stuck at the same point: animals that do not have certain of our, or similar, structures in their brain for processing electro-chemical signals are automatons; non-conscious, unconscious blobs, that just are, no different than stones.

    The other side of the story is that all animals can be observed to exhibit purposive behavior just as we do. The only difference is that we can talk about our experiences and, in a sense, validate them. We know that we ‘feel’.

    We do not ‘know’ whether chimpanzees feel pain or not. Observing them there is no doubt that they do. They appear to be conscious. One can climb down the ladder of complexity, one tiny step at a time. By what criteria could we judge where consciousness disappears? Purposive behavior is present in all life forms. Brian’s video is a nice example.

    My research into the subject suggests there is no point anywhere in the kingdom of life where a line can be definitively drawn: consciousnesses to the left, zombies to the right. Living organisms, by definition, interact with, therefore are aware of, their surroundings, even if in the most minimal way.

  35. I’ll just make my last comment by way of quoting from an article by Adam Calhoun that appeared this past December in “Medium”:

    “Leah Krubitzer asks whether we can understand a nervous system without understanding the rest of the animal. A duck-billed platypus can use its bill to sense electrical signals, a property which is foundational for many of its social behaviors in the water. So how could we possibly understand the brain of the platypus without understanding its duck-bill? Ironically, this means that a brain-centric approach to the brain could end up being misleading. Like a precautionary note from Darwin, the duck-billed platypus reminds us that the brain is a product of its peculiar evolutionary history. This should make us seriously ask how many ways there are to ‘make’ a brain.”

  36. Abe, and everyone, I’ve been too busy with work this past week, and next week looks even worse, so apologies for not having chimed in on this interesting discussion. Broadly speaking, I actually agree with Brian’s take, with all the due caution considering that we have limited knowledge of the pertinent facts, not to mention the considerable degree of conceptual disagreement it is reasonably possible to have on the issue. However, I have asked Brian to write this essay in large part because of a paper of his in press in Biology and Philosophy, which I think examines alternative possibilities very clearly, and proposes a pretty good, if not unassailable, argument for his own. Brian’s article is here: http://link.springer.com/article/10.1007%2Fs10539-014-9469-4

  37. Scientiasalon should invite a competent Wittgensteinian to critique this essay, because it is pretty much a textbook example of the kind of conceptual muddle relating to "pain" that is discussed in those circles; even W himself wrote about these kinds of issues concerning animal pain.

  38. In the comments there were three key objections to Brian Key's hypothesis: one based on our knowledge of the fish brain and the other two based on our knowledge of consciousness.

    1. J Kubie said
    The likely answer is that the fish pallium is homologous to neocortex and, one could guess, that distributed or localized functions within the pallium could process neocortical functions such as “pain”. From my knowledge of fish neurophysiology, we simply don’t know enough about processing in fish, or non-mammalian vertebrate forebrain to address these questions.

    2. Liam Ubert said
    "One can climb down the ladder of complexity, one tiny step at a time. By what criteria could we judge where consciousness disappears? Purposive behavior is present in all life forms … suggests there is no point anywhere in the kingdom of life where a line can be definitively drawn: consciousnesses to the left, zombies to the right."

    3. Daniel Tippens said
    So, it seems possible that you have pointed out all the requirements for access consciousness of pain, but not phenomenal consciousness. In other words, it is possible that fish experience pain, they just have none of the higher level neural processes required for our kind of robust access consciousness.

    We cannot imagine what lower forms of consciousness are like, but we do know they exist. It is likely that the development of consciousness is one of the most profoundly important developments in animal life forms, up there with that of the eye. We will probably never know, but I am guessing that the first, most rudimentary forms of consciousness were the result of the development of the eye, when consciousness could make sense of what the eye revealed. Consciousness would then (concurrently?) have expanded in scope to take into account other sensory information.

    Brian Key's hypothesis is certainly interesting, but I think it unlikely. I find his description clear, well reasoned and very informative, but I question his conclusions. Much more work needs to be done to understand consciousness and fish. At least he is not suggesting that our own consciousness is an illusion or that we lack free will 🙂

  39. Dear Abe Cochrane

    Thank you for raising the visual analogy with regards to attention. To highlight the importance of “attention” in “seeing” I would like you to watch the youtube video. It’s only a short test but it is very important to follow its instructions. I will briefly discuss what this video demonstrates in the text below.

    I would be amazed if you were not amazed by this video. It is quite confronting, as it demonstrates that you (i.e. your cortex) don't always see what your eyes (your retinas) see. The attentional system determines what you are conscious of (in this case, what you see). This is what I mean when I say that you have to attend to pain to feel it (i.e. be conscious or aware of it). When you are not attending to it, then you don't feel it (or see it, in the case of the dancing bear).
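    If it helps, the gating idea can be caricatured like this (a toy sketch only, not a model of the real attentional machinery; the channels and values are made up): several signals are registered at once, but only the attended one is passed on to the level at which it could be seen or felt.

    ```python
    # Caricature only (invented channels and values): several signals are
    # registered, but only the attended channel is passed on to be "seen" or
    # "felt"; the rest remain non-conscious.
    signals = {"dancing bear": 0.9, "counting the passes": 0.8, "ache in the knee": 0.6}

    def reported(attended):
        """Return only the channel currently selected by attention."""
        return {name: value for name, value in signals.items() if name == attended}

    print(reported("counting the passes"))  # the bear and the ache never get through
    print(reported("ache in the knee"))     # shift attention and the ache is "felt"
    ```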

  40. I begin my final comment with a quote from Werner Herzog, from the movie Grizzly Man, specifically because of the issue of predator behavior and mindset:

    And what haunts me, is that in all the faces of all the bears that Treadwell ever filmed, I discover no kinship, no understanding, no mercy. I see only the overwhelming indifference of nature. To me, there is no such thing as a secret world of the bears. And this blank stare speaks only of a half-bored interest in food.

    Bingo. And a bear is smarter than a fish.

    Going beyond that, the fact that a fish, or a bear, has eyes or whatever, has nothing to do with it. I think back to the late 1970s when people anthropomorphized “pet rocks” by painting eyes on them.

    Thomas Taking a stab at a platypus might be harder than trying to do Nagel's bat, as I noted in my previous comment. That said, the author has noted that, although birds don't have a cortex, they have shown analogous evolution. And, on the platypus, if we look at other relatively social animals, and know that they use each other's pheromones to organize their groups by smell, we can probably devise at least some degree of understanding of the platypus' sociality by analogy. (Again, per Brian, we shouldn't take such analogies too far, but I don't think this would.)

    Max No, the fish isn’t a p-zombie. The idea of p-zombies is the idea of an allegedly nonconscious imitation of an ordinary human being, a person who is assumed to be conscious. A fish isn’t imitating anything, let alone something ordinarily assumed to be conscious, and certainly not of a level of consciousness with second-order (or higher) thinking.

    All in all, I see nowhere in this essay where Brian has said that consciousness, second-order thinking or other things are A: purely polarities and/or B: limited to humans. If some people think that, because he doesn't want to anthropomorphize about fish pain, he's almost like a fundamentalist Christian drawing a fence around Homo sapiens, well, he can't stop you from thinking that way, but I see no warrant for it in this piece.

    And, finally, per the long quote that Massimo posts, there’s no need to unduly anthropomorphize animals in order to treat them humanely, as I said two posts ago.

    Beyond that, I think anthropomorphizing, while it may not provide much of a window into animal behavior, does provide a fair-sized one into human psychology.

    Massimo, I throw that out as a suggestion for another essay, in fact. Other than free-will issues, and your piece on Stoic meditation, we’ve really not had much here on psychology.

  41. Brian Key seems to be focusing on attentional mechanisms, arguing, roughly, that attention is a late-evolving feature. While we don't know much about the neurophysiology of attention in non-mammalian vertebrates, I'd argue that their behavior suggests perfectly fine attention. In my way of thinking, most animals do one thing at a time (largely because body, appendages, etc., are limiting). With this limitation, it's functional to have an attention system that selects stuff relevant to the present task and filters out task-irrelevant details. I don't know of anything about fish behavior that suggests they are less task-focused than mammals, primates or humans.

  42. Professor Key,
    Fascinating video, thanks. I accept that there is not a qualitative difference between white cell behavior here and fish behavior. Also, no qualitative difference between this kind of fish behavior and a gazelle (or baboon) running from a lion either. I assume with Darwin, per Patrice's quote above: "the difference in mind between man and the higher animals, great as it is, certainly is one of degree and not of kind."
    I wonder if you agree with Darwin on this, or did something completely new and unprecedented suddenly emerge into existence in the mind of humans (and/or some higher animals) when they started experiencing some sensations that they preferred over others? This question seems central to the discussion to me, and I'd appreciate an answer to this, however brief.

    I'm certainly not a scientist, but I had already read about the Chinese woman who never had a cerebellum and yet was able to move and talk, so I don't find it surprising that fish still feed after part of their central nervous system was removed. I understand that feeding may be 'hard wired' in fish, but I'm wondering why you don't say the same of sensation. Maybe you explain this in your article but it was over my head. If you tell me this is the case, I will accept that.

    In this fish study http://deepblue.lib.umich.edu/bitstream/handle/2027.42/33473/0000878.pdf?sequence=1 Goldfish were able to learn that when a light comes on they have to move to a particular corner of a fish tank to avoid an electric shock. Perhaps, if their brain was removed, they would forget to move to the corner and their bodies would simply produce a non purposeful nervous reaction when shocked. Is this nervous reaction any different a sensation than one that is remembered by a brain? I can see where a brain that remembers a sensation to be avoided can respond by producing physiological reactions which in themselves may amplify the sensation. Does this make a qualitative difference in the sensation that could be called ‘fish pain’? Is there some similar qualitative difference that produces human pain? Or is it all quantitative difference in spite of the difference being, as Darwin put it, ‘great as it is’?
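    Purely for illustration (the learning rule and numbers below are invented, not anything from the linked study), signaled avoidance of this kind can be written down as nothing more than a stimulus-response weight that is strengthened whenever the light is followed by a shock, with no commitment about what, if anything, the fish feels:

    ```python
    # Invented toy model of signaled avoidance (not the cited goldfish study):
    # a single light->cross weight w; each shock strengthens the association,
    # so crossing becomes more likely and shocks become rarer over trials.
    import random

    random.seed(1)
    w = 0.0  # tendency to cross to the safe corner when the light comes on
    for trial in range(1, 16):
        crossed = random.random() < w    # respond with probability w
        shocked = not crossed            # the shock follows the light if no crossing
        if shocked:
            w += 0.3 * (1.0 - w)         # the light-shock pairing strengthens avoidance
        print(f"trial {trial:2d}: crossed={crossed!s:5}  shocked={shocked!s:5}  w={w:.2f}")
    ```

    Whether a weight like w is accompanied by anything that deserves to be called a sensation is exactly the question such a model leaves open.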

    Is there such a being as an unconscious fish? I remember fishing for smelt with my father when I was a child, and when gutting the fish I would thump them just right on the back of their head (it didn't have to be a hard thump) and their mouths would fly open, they would stop wriggling, and they would become perfectly rigid. If left unharmed, they'd eventually 'come to' and start moving around in the bucket again.
    I’m not being argumentative; just doing my part as a student I hope. If you’re too busy to answer, I completely understand that you’ve done your part already with your great article and comments you’ve already contributed. Perhaps someone else would like to chime in.

  43. Timanti wrote:

    Scientiasalon should invite a competent Wittgensteinian to critique this essay, because it is pretty much a textbook example of the kind of conceptual muddle relating to "pain" that is discussed in those circles; even W himself wrote about these kinds of issues concerning animal pain.

    —————–

    I would have, as I have done before. The trouble is that there is virtually no interest, around these parts, in the Wittgensteinian point of view on these sorts of questions. I wind up sounding like some kind of preacher, and no one is convinced.

    Too many here with too scientistic of an outlook for Witt. Also, too analytic (and I say that as an analytic philosopher). Continental philosophers get pretty savaged here, as well, irrespective of how highly their work is regarded in the tradition.

  44. It's difficult to say from Brian Key's work whether he has adequately covered the alternative structures of other species, including fish. Here is another view.

    “The cerebellum-like structures in three groups of fish act as adaptive sensory processors in which the signals conveyed by parallel fibers in the molecular layer predict the patterns of sensory input to the deep layers through a process of associative synaptic plasticity. Similarities between the cerebellum-like structures and the cerebellum suggest that the cerebellum may also generate predictions about expected sensory inputs or states of the system, as suggested also by clinical, experimental, and theoretical studies of the cerebellum.”

    http://www.ncbi.nlm.nih.gov/pubmed/18275284
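    As a rough, home-made illustration of what "adaptive sensory processor" means here (my own toy sketch, not code or data from the paper): a predictor learns the sensory input that reliably follows an internal signal, and only the residual (actual input minus prediction) is passed downstream, so expected, self-generated input is progressively cancelled while novelty still gets through.

    ```python
    # Toy adaptive filter (illustration only, not the paper's model): a weight w
    # learns to predict the sensory input that follows an internal signal x, and
    # downstream layers see only the residual (actual input minus prediction).
    x = 1.0                 # internal signal (e.g., a copy of a motor command)
    expected_input = 0.8    # sensory consequence that reliably follows x
    w, lr = 0.0, 0.2        # prediction weight and learning rate

    for step in range(1, 16):
        prediction = w * x
        sensed = expected_input
        residual = sensed - prediction   # what is passed downstream
        w += lr * residual * x           # associative (delta-rule) plasticity
        print(f"step {step:2d}: prediction={prediction:.3f}  residual={residual:.3f}")

    # Once trained, an unexpected input (say sensed = 1.5) would leave a large
    # residual, so novel stimuli still register while expected input is cancelled.
    ```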

  45. Dear Wm. Burgess

    You ask an interesting question about whether feelings just sprang into existence. I suppose the best way to address this question is to direct you to the Smithsonian Museum of Natural History.

    http://humanorigins.si.edu/evidence/human-evolution-timeline-interactive

    This site contains a nice evolutionary timescale of "humans". Note that the record goes back 7 million years, a mere drop in the bucket compared to when fish and humans are believed to have last shared a common ancestor (~500 million years ago). It ultimately depends on your definition of "sprang".

  46. Just an additional thought as it relates to the variables associated with the problem of pain and consciousness: attention would affect the ability to assess and respond to a situation, as well as the ability to learn. The following quote illustrates the astonishing limitation of attention among rats in learning.

    “Roberts (2002) reviewed studies showing that animals can learn effectively if reinforcement is immediate, but learning diminishes sharply with even brief delays. Thus, in a classic study by Grice (1948), rats learned quickly if the behavior was rewarded immediately, but even a 5-s delay meant that learning required hundreds of trials, and a 10-s delay produced failure to learn even after a thousand trials.” Baumeister and Masicampo, 2010.

    An 'attention span' of less than 10 seconds! That would make doing one's homework a challenge. This shows what an extraordinary gift of evolution human consciousness is. Regarding lower animals as unconscious may be a practical expediency; I just think it is incorrect.
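    A crude, invented sketch of why delay is so costly (the decay constant and learning rate are made up for illustration, not fitted to Grice's data): if the credit available for reinforcing the just-performed response decays over the seconds before the reward arrives, a 5 to 10 second delay leaves almost nothing to learn from on each trial.

    ```python
    # Invented illustration (not Grice's or Roberts' model): the "eligibility" of
    # the just-performed response decays with the delay to reward, so longer
    # delays shrink each learning step and inflate the number of trials needed.
    import math

    def trials_to_learn(delay_s, tau_s=1.5, lr=0.5, threshold=0.95):
        """Trials until the response-reward association reaches the threshold."""
        eligibility = math.exp(-delay_s / tau_s)   # credit surviving the delay
        w, trials = 0.0, 0
        while w < threshold:
            w += lr * eligibility * (1.0 - w)
            trials += 1
            if trials > 1_000_000:                 # effectively never learns
                return None
        return trials

    for delay in (0, 1, 5, 10):
        print(f"delay {delay:2d} s -> trials needed: {trials_to_learn(delay)}")
    ```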

  47. I enjoyed the article overall. I think it is fascinating to see what neuroscience can bring to the table regarding the understanding of our mental lives.

    I do have two problems with two of the claims used by the author to justify his thesis. The first is the claim that we know that certain regions of the human brain are necessary for the experience of pain. The second is that an experience of pain (or any other mental state) requires that the being in question have an awareness of that state (which in turn supposes consciousness as a pre-requisite).

    To the first I will not elaborate in detail (mainly for the sake of space), but I would highly recommend David Lewis's short but wonderful paper "Mad Pain and Martian Pain" on the differences between, on the one hand, our concept and name of pain qua mental state and, on the other, the identification of mental states with neural states. Anyone familiar with Lewis will be aware that he is known for endorsing the latter, but in this paper he also elaborates on how the concept of some mental-state term is defined by its functional role in a network of such concepts. Whatever it is that plays the functional role of pain in humans is a contingent fact about human life. The state we call pain could have been some other state, even though the identification of that state with some neural state is just a statement of identity (which cannot be contingent).

    (I’d be interested in any Wittgensteinian elaboration of this point 🙂 )

    To the second, I have a concern that this point isn't merely anthropocentric, but is a full-blown retreat into Cartesianism and its privileging of the mental as foundational representational states. If this is the case, and what the author is arguing is that experiences are intrinsically 'concept-laden', then I have to disagree. I think that it's the other way around. But as I'm not sure I've understood the position thoroughly enough to press this point, I won't.

  48. A man with arthritis (this is diagnosable) stands up, says, “I feel pain.” No one questions this.

    He takes aspirin, sits down, listens to music, and doesn't say anything about his pain. We ask him, "Has the pain gone away?"

    “There’s sensation in my knees; but this Mahler symphony carries me away.”

    He doesn't feel pain? This sensation in his knees, what is it? Or is it that he needs to keep reporting for our benefit? That seems to be asking too much. He lies to us? There isn't sensation in his knees just because he no longer calls it pain?

    "Pain requires nociceptors and attention." A perfectly adequate report of neurological systems and cognition. But what has it to do with the 5-year-old girl crying because she lost her doll? Is that a kind of "pain," as she reports ("my dolly lost, I hurt"), an emotional stress? Or is she deluded? She wants a response. Should we distract her with a discussion of nociceptors?

    All living things engage in some sensate interaction with their environment. Plants lean toward the sun. We don't want to claim 'sensation' in the primate understanding of the term; they are a different life form, a different system. Yet there is something there that motivates the bending of their forms to a catalyzing agent for nutrition.

    There seems something wrong in reducing the problem of ‘suffering’ to anthropocentric conceptualizations. The wilting plant suffers something – it’s dying – one doesn’t need to evoke “pain” in this description.

    "My dog caught her paw in a trap; she whined; I answered, and released her." Was this wrong? What did the utterer 'owe' the dog in releasing it? Why was this felt necessary?

    Once I saw a doe that had broken its leg jumping a fence. It evidenced no “pain” behavior. It evidenced survival behavior.

    “They shoot horses, don’t they?” – why do they do that? Does that justify suicide per Horace McCoy’s novel? That sounds like a category error, but perhaps not. Perhaps we are concerned about human suicide because we are also concerned with the suffering of other animals.

    There seems to be a basic interest here that the question about the ‘consciousness’ of fish, or the ‘pain’ they might feel completely misses.

    Scientists seem sometimes to want to find facts that support human presumptions about the world. “Go ahead, slaughter fish, fish don’t feel pain.” Very comforting. But that’s really not interesting. A fisherman who says, “I fish because I must feed my family” tells us more about the nature of humanly experienced reality.

    Causing suffering – experiencing suffering – may well be in the nature of being human; perhaps in the nature of being alive. That doesn’t mean we need – or should – find excuses for it. Science is a description; that’s the best it can do. The rest is left to philosophy.

    “Life is impermanent since (it is beset by) many misfortunes like a bubble of water caught by the wind; that one inhales after exhaling and awakens from sleep is wonderful.” – Nagarjuna

Comments are closed.