Shell Game

An item at the CNN website reports that a study from Queen’s University, Belfast suggests that crabs “feel” pain.

The study, by researchers Bob Elwood and Mirjam Appel, examined the behavior of hermit crabs subjected to electric shocks. Hermit crabs, as I am sure you know, live in the abandoned shells of other animals, and what the researchers did was to wire up their shells, give them a jolt, and see how they reacted. If the shock was strong enough, the crab would pop right out, but if it was just below this threshold, the crab would still have a higher likelihood of moving to a new shell when one was offered than crabs who hadn’t been shocked: presumably it remembers the shock and wants to avoid getting another.

What this result indicates, then, is that electric shocks influence the behavior of crabs. The spin here seems to be, however, that hermit crabs consciously experience the shocks — that they suffer — and that therefore we have a moral obligation to avoid causing them pain. While this may also be true, it is a far more audacious claim, far less certain, and raises some difficult questions.

How expensive is consciousness? How valuable is it? What machinery is required to generate it? It is easy enough to see the adaptive value of detecting threats to the body and reacting aversively toward them; were I designing a little hermit-crab robot, one of the systems I’d put in the spec would be a way of detecting when the shell I had chosen was no longer a safe place, and some routine for moving to a new shell when needed. No consciousness would be necessary for this, though; all it would take would be some sensors, and some logical “flags” for them to set in the operating system. If this design were enough to get the job done, expensive extra features like consciousness might not be worth the cost. There’d be nobody “in there” actually suffering; there’d just be a little crab-shaped robot (a “zombie”, in philosophical parlance) moving to a new shell. If consciousness is indeed expensive, and Nature parsimonious, then I would imagine that actual crabs are no more conscious than my little robot.
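The sensor-and-flag design gestured at above can be sketched in a few lines of code. This is purely an illustrative toy, not anything from the study; the thresholds, names, and behaviors are all assumptions invented for the sketch. The point is just that the observed behavior (evacuate on a strong shock, become readier to move after a sub-threshold one) needs nothing but a flag:

```python
# A minimal sketch of the non-conscious "crab robot" described above.
# All thresholds and names are illustrative assumptions.

ESCAPE_THRESHOLD = 0.8    # shock level that forces immediate evacuation
WARINESS_THRESHOLD = 0.4  # sub-threshold shock that merely raises a flag

class CrabBot:
    def __init__(self):
        # One bit of state: has a sub-threshold aversive event occurred?
        self.wary = False

    def sense_shock(self, level):
        """Evacuate if the shock exceeds the escape threshold;
        otherwise set the wariness flag and stay put."""
        if level >= ESCAPE_THRESHOLD:
            return "evacuate"
        if level >= WARINESS_THRESHOLD:
            self.wary = True
        return "stay"

    def offered_new_shell(self):
        """A wary bot accepts a new shell when one is offered;
        an unshocked bot stays where it is."""
        return "move" if self.wary else "stay"
```

Nothing in this loop is "experienced" anywhere; the shocked bot moves house more readily than the unshocked one simply because a boolean was flipped.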

But if consciousness is not expensive — if it arises naturally, say, from even simple nervous systems — then the moral-obligation assertion begins to gain traction. My own feeling, given how labile and gappy even human consciousness is, and how easily deleted, is that consciousness is an expensive feature indeed, and unlikely to be needed in a crab.

But then we must ask: well then, what exactly do we need consciousness for?

It’s all too much for a late Sunday night, I think. Read the story here.

27 Comments

  1. Kevin says

    I’m half-joking, but maybe what the crabs have is a sort of cryptic consciousness.

    Posted March 29, 2009 at 11:38 pm | Permalink
  2. Malcolm says

    Hi Kevin,

    I did see that item over at CE (such an interesting website) – but the question here, I think, is not what kind of subjective consciousness crabs might have, but whether they have any at all.

    Posted March 30, 2009 at 10:41 am | Permalink
  3. bob koepp says

    I don’t know what we need (phenomenal/subjective/qualitative) consciousness for, and since I can’t imagine how it’s generated, I also can’t imagine how to assess its relative costliness. Of course, if we follow a Dennettian line, setting aside questions about phenomenal/subjective/qualitative states, and view consciousness in purely functional terms (as a matter of mapping inputs to outputs for hypothesized information processing modules), then we might well get a handle on what it’s for, and what it costs. But the ultimately “formal models” of I/O relations that we get from the Dennettian approach to consciousness don’t mesh well with moral intuitions about the “badness” of pain — unless we actually think that a tinker toy Turing machine could be programmed to “feel” pain.

    So, just to be on the safe side, I’d recommend that when dispatching crabs, we do the deed in a way that is likely to minimize any possible suffering.

    Posted March 30, 2009 at 10:42 am | Permalink
  4. Malcolm says

    Hi Bob,

    Though I admire much of Dennett’s work, I have never been strongly attracted to hard-core functionalism; I think it likely that beside the functional organization of biological brains, there is something about their material substrate that is responsible for consciousness. (It would be very helpful indeed, needless to say, if we could determine what that is.)

    I expect that what applies to crabs applies to lobsters as well; perhaps going forward I’ll take them out back and shoot them before dropping them in the pot.

    Oh dear… what about oysters?

    Posted March 30, 2009 at 10:55 am | Permalink
  5. JK says

    http://en.wikipedia.org/wiki/Cleve_Backster

    Crabs indeed!

    Posted March 30, 2009 at 11:00 am | Permalink
  6. bob koepp says

    JK – I almost wrote, “… I’d recommend that when dispatching crabs (or even humble plants!), we do the deed in a way that is likely to minimize any possible suffering.”

    Posted March 30, 2009 at 11:07 am | Permalink
  7. Malcolm says

    Hi JK,

    This is an awfully slippery slope. If plants are also going to be brought into the moral-inclusion circle, we are soon going to be reduced to dining on Hostess Sno-Balls.

    Posted March 30, 2009 at 11:12 am | Permalink
  8. bob koepp says

    Malcolm – Many plants have already been brought into the moral-inclusion circle, though not because of concerns about causing unnecessary pain. Instead, a large contingent of “environmental ethicists” claim that natural functions (adaptations) are all that’s needed to underwrite moral considerability. I sometimes wonder if these ethicists appreciate just how close to traditional “natural law ethics” they are venturing.

    As a committed omnivore, I have no moral qualms about killing in order to continue living. I do, however, have a firm belief that killing should be done as quickly as possible, while minimizing the actual tissue damage inflicted on those who are about to become my dinner.

    Posted March 30, 2009 at 11:39 am | Permalink
  9. Malcolm says

    …a large contingent of “environmental ethicists” claim that natural functions (adaptations) are all that’s needed to underwrite moral considerability.

    What’s the argument given there? Just a “what right do we have to interfere” kind of thing?

    It seems to me that the capacity for subjective experience of suffering — i.e., consciousness — is an appropriate threshold for moral consideration.

    Posted March 30, 2009 at 11:46 am | Permalink
  10. JK says

    I really can’t participate meaningfully here, but Malcolm, should you encounter any obnoxious vegans on the subway again, you might mention Mr. Backster’s experiments. That should give even a vegan “food for thought.”

    Posted March 30, 2009 at 12:01 pm | Permalink
  11. bob koepp says

    Malcolm – It’s precisely in order to expand the circle of moral considerability beyond the sentient that these environmental ethicists turn to the presence of biological adaptations as the mark of considerability. If I recall correctly, this move was first made with reference to trees. Presumably, this view brings all of life within the circle of moral considerability.

    The reasoning is that in order to be morally considerable, an entity must have “interests,” without, however, implying that one “takes an interest” in one’s interests. For example, it’s thought to be in the interest of a plant to have adequate nutrients accessible in its immediate environment; and realizing that interest contributes to a plant’s thriving. And adaptations are assumed to provide an objective standard relative to which we can assess whether an organism is thriving.

    Posted March 30, 2009 at 12:26 pm | Permalink
  12. Malcolm says

    Hi Bob,

    Right, of course I am well aware of the notion of “interests”. Indeed, over the years here (and at Bill’s) I have argued that intentionality entered the world precisely as a result of the “interests” of adaptive, living, evolving replicators. So I get that part.

    So, granted, all replicators have interests. But what I want to know is: why does simply “having interests” (i.e. being the kind of thing that can be understood from the “intentional stance”) become an ethical issue, in the absence of conscious experience? A tree, presumably, doesn’t suffer when its interests are thwarted, so why should we care?

    Posted March 30, 2009 at 1:45 pm | Permalink
  13. bob koepp says

    Malcolm – I’m not (yet) ready to endorse the claim that trees, viruses, etc., do have morally relevant interests, but I’m also dubious about the adequacy of a criterion tied to the capacity to suffer. One thing that can be said in favor of the broader criterion is that it recognizes that harms can be inflicted even if there is no suffering. That seems right, to me. If, as seems plausible, a cogent case can be made that harm is a more basic moral consideration than suffering, well, …

    Posted March 30, 2009 at 3:57 pm | Permalink
  14. Malcolm says

    Hi Bob,

    One thing that can be said in favor of the broader criterion is that it recognizes that harms can be inflicted even if there is no suffering. That seems right, to me.

    That does focus the question. Unlike you, though, I am not as inclined to sign on with the idea that harms can be inflicted if there is no suffering.

    Also, if not trees and viruses, then where should we draw the line, if not at consciousness, and why?

    Posted March 30, 2009 at 4:09 pm | Permalink
  15. Cliff says

    Can you believe this?
    My God, what is wrong with people?
    We have children starving around the world, the homeless, people needing medical help, and these LIBERAL WACKOS are trying to “convince us” a stupid crab FEELS THE PAIN… and evidently getting money from someone to spend on such a STUPID and senseless study.
    Sorry folks, but we have lost our values and sneer in the eye of reasoning in the 21st Century!

    Posted March 30, 2009 at 4:36 pm | Permalink
  16. Malcolm says

    The issue here is more philosophical than political, Cliff.

    Posted March 30, 2009 at 4:42 pm | Permalink
  17. bob koepp says

    Malcolm – Drawing lines is tricky and, of course, real lines (unlike their theoretical counterparts) are always somewhat vague. I have no doubt, though, that plants can be harmed (if you knew my history of trying to get the buggers to grow, you’d appreciate my lack of doubt in this matter). Whether the harm that can be done to a plant is a morally relevant sort of harm is another matter — one on which I have not found a satisfyingly stable position. But even if we were to grant that plants (and even dangerous [to you and me] viruses) have “moral standing”, they might still have a “lesser standing” than, say, starving children, or homeless people, or people needing medical care.

    Posted March 30, 2009 at 5:21 pm | Permalink
  18. bob koepp says

    Malcolm – I forgot to address your lack of inclination to dissociate harms from suffering. Presumably, a well-placed bullet can dispatch Jones without his experiencing anything, let alone suffering. Still, I think even a “clean kill” harms the one who ends up dead.

    Posted March 30, 2009 at 5:27 pm | Permalink
  19. Malcolm says

    Hi Bob,

    I quite agree that from the perspective of a plant’s “interests” we can inflict “harm” – i.e., the creation of states of affairs that interfere with those interests. But what I am interested in is, as you acknowledge, the question of moral relevance. If the plant itself is utterly unaware of having any interests, and incapable of any suffering when they are thwarted, then it is hard to see why there is any moral case to be made against harming a plant (at least from the plant’s point of view; I can easily see why we might have an interest for ourselves in preserving forests, etc.).

    Jones, on the other hand, has a subjective, consciously represented interest in not being shot, and even if you kill him instantly and painlessly, that subjectivity makes a morally relevant difference. (He would certainly suffer in prospect at the knowledge of a painless execution coming his way, in a way that a plant, presumably, would not.) But you are right to bring in the notion of gradations of morally culpable infliction of harm; when George shot Lennie he did so just as you dispatched Jones, for this very reason.

    And I expect you would agree that subjective experience of suffering is a non-negligible moral factor: that there is a significant ethical difference between inflicting harm that causes such an experience in a conscious subject, and harm that doesn’t.

    Posted March 30, 2009 at 5:51 pm | Permalink
  20. bob koepp says

    Malcolm – Yes, suffering is always morally relevant, and always negatively so. I just doubt that it’s the only thing of that sort. As for prospective harms, the suffering caused by their contemplation is, again, always morally relevant. That doesn’t mean that what is contemplated is itself a kind of suffering — at least not in the experiential sense in which you use the term.

    Posted March 30, 2009 at 6:34 pm | Permalink
  21. Malcolm says

    bob, I do concede that your Jones case above is an example of a morally culpable act that causes no suffering (leaving aside the suffering of Jones’s widow and creditors).

    I expect you will agree, though, that it is not just Jones’s having interests that are thwarted by your bullet that makes the act reprehensible, but Jones’s conscious awareness of those interests — his hopes, cares, and dreams for a future life that you steal with your bullet.

    It is part of our moral wiring that we, not wishing to be shot ourselves, feel a moral compulsion to treat others that we assume to be like us in this way — i.e., conscious of our yearnings and fears, interests and aversions — as we would hope to be treated in turn. In short, to the extent we can generate an empathic “fellow-feeling” for another being — to the extent that we can imagine what it is like to be that other being — we feel a moral obligation. But it isn’t like anything to be unconscious. There is no subjectivity there.

    Are we, then, morally obligated toward zombies?

    Posted March 30, 2009 at 11:16 pm | Permalink
  22. bob koepp says

    Malcolm – I do agree that it is not _just_ thwarting Jones’s interests that makes the act reprehensible. Where consciousness and “taking an interest” is present, that is, of course, relevant to moral appraisal. But is it not only relevant, but a necessary condition of moral considerability? I doubt it, just like I doubt that empathic fellow feeling is the sine qua non of moral awareness. After all, a fellow’s feelings might not be morally praiseworthy at all, even if I can empathize.

    (BTW, once upon a time, before trendiness replaced careful use of language, what you’re calling empathy went by the name ‘sympathy’.)

    Posted March 31, 2009 at 10:33 am | Permalink
  23. Malcolm says

    “Trendiness” in language? Carelessness?? Ouch. You sure know how to hurt a guy.

    If this is trendiness, you are looking at long, slow trends; the use of “empathy” in this way goes back to 1858. But I will not cop to carelessness here: I see empathy as closer in meaning, in modern usage, to actually putting oneself in the other fellow’s shoes, rather than simply understanding the person and feeling sorry for him. But sure, etymologically sympathy comes from syn and pathos, or “feeling together”.

    I’ll return the favor by splitting a hair on your use of sine qua non: what you suggest in your remarks is that you think the presence of a subjective, experiencing awareness with which we can empathize (or sympathize, if you like) is not a sufficient condition for moral consideration; what I was getting at, though, is that it seems to me that it may well be a necessary condition, which is of course what “sine qua non” means.

    Why should we care about moral responsibility to a subject that isn’t even there? The only counterexamples I can think of would be how we treat those formerly experiencing subjects who have lost their consciousness: those in a coma, say, or even the bodies of the dead. And that comes back to an empathic awareness of how they would have wanted to be treated in such a situation, and what we would wish for ourselves in a similar situation.

    Posted March 31, 2009 at 11:51 am | Permalink
  24. bob koepp says

    Malcolm – Oh my! I’ve caused harm, for which I must apologize.

    Re empathy vs sympathy: I know that this is a rather unsettled thing, but _widespread_ use of ’empathy’ instead of ‘sympathy’ is a relatively recent phenomenon. But linguistic practice aside, if empathy involves actually experiencing the same feelings as the one with whom one empathizes, then I think it’s probably a rare thing. Suppose I see you stub your toe. I certainly don’t feel excruciating pain in my own toe, so I can’t really empathize with you here. I can, however, sympathize with you, since I do know that stubbing one’s toe tends to hurt, and I assume that you don’t like hurting any more than I do. As a reliable underpinning for the moral virtue of compassion, I’ll take sympathy. If empathy is there, that’s welcome, too — but it’s not necessary to get the moral ball rolling.

    Re sine qua non: I think if you look again, you’ll see that I did in fact use it to express the notion of necessity, not sufficiency.

    Re your third paragraph: I only reported that there are ethicists who reject the idea that sentience is a necessary condition for moral considerability. Even if they are wrong about this, I don’t think their error is obvious. Without defending the view in question, let me at least tweak intuitions… Suppose you could be quite certain that no sentient creature would suffer as a consequence. Would it be morally OK to kill a thousand year old tree just to see if you could fell it in a day using only a small hatchet? Would this warrant even a moment’s hesitation?

    Posted March 31, 2009 at 2:00 pm | Permalink
  25. Malcolm says

    Hi Bob,

    It seemed to me that your example was of a situation where empathy was present, but not sufficient for moral consideration, and that therefore empathy was NOT a “sine qua non”. But OK, sympathy, empathy: we’ll leave it to the readers. The point is that the ability to put ourselves in the other’s shoes is at the root of “golden-rule” morality, and we are hard-wired to do it. (It is exactly what psychopaths lack.)

    Sure, I’d be unlikely to cut down that tree. But the tree doesn’t care one way or another. So whose ox is gored here? We are acculturated not to waste resources, not to kill a tree that other sentient beings might appreciate, not to trash the environment, etc., and we also have an anthropomorphic tendency to project our own wishes for golden-rule treatment onto other living things, which I think is the root of the intuition you are trying to pump here. But on any sort of close examination I can see no moral reason not to cut it down if it really only affects the tree.

    I have a friend who argues that the entire foundation of morality is resisting entropy; he’d certainly see a moral argument against felling that tree.

    I have the feeling that you do see a moral argument against cutting the tree down; can you articulate it?

    Posted March 31, 2009 at 2:16 pm | Permalink
  26. bob koepp says

    Malcolm – You ask “whose ox is gored?” Well, nobody’s, which is exactly the point of such an example.

    Actually, I don’t see a moral argument against cutting the tree down, at least not with sufficient clarity to be convinced that such an argument can be properly articulated. I’ve also been over this territory in discussions with “deep ecologists”, and have found their attempts to articulate a proper moral argument unpersuasive. So even though I suspect there’s a moral argument to be made against such pointless destruction of trees, I don’t know what shape the argument should take.

    Posted March 31, 2009 at 4:21 pm | Permalink
  27. Malcolm says

    Thanks, Bob. I don’t either, though I admit I share the intuition you were trying to evoke.

    Posted March 31, 2009 at 4:27 pm | Permalink
