One topic that was discussed was whole brain emulation: the idea of a software model of a specific brain that, run on the appropriate hardware, would produce functionality essentially identical to the original brain. Despite the irrational optimism of some of the folks interviewed, this is fascinating stuff. I think it's plausible but not definitely possible[2], but in any case it is cool to think about, if only for the ethical and existential implications.
What caught my eye was the lack of any ethical discussion of the following proposition:
Randal Koene, a neuroscientist at the European technology firm Fatronik-Tecnalia...offered some reasons for why anyone would want to work so hard to make a whole brain emulation in the first place. Even if it behaved like a generic human brain rather than my or your brain in particular, scientists could still use it to run marvelous new experiments. They might test drugs for depression, Parkinson's and other disorders.
Whoa, hold them horses just a minute. There's a potentially huge ethical problem with this proposition.
Depending on the accuracy of the simulation, it is quite conceivable that this "generic human brain" would experience suffering just as meaningfully as a real human brain. In fact, the more perfect the simulation, the less appropriate it is to compare the experience analogously, and the more appropriate it is to consider the experiences identical.
The best and most recent research indicates that consciousness is a product of the interaction of the various components of our brain. If each of those components were emulated to a high degree, and the emulations allowed to interact, there is no non-supernatural explanation for why the emulations would not also produce a "conscious being" on the same level as you or me. To then subject this conscious being to experimentation without consent or any regard for possible damage or suffering, just because it is "a generic human brain rather than my or your brain in particular," would be ethically on par with creating embryos with a "generic" human genome, letting them be born and develop to adolescence (!), and then performing medical experiments on them on the grounds that they are just "a generic human rather than me or you in particular." In other words, an ethical and moral disaster.
A rather surprising irony arises from the word "non-supernatural" in the previous paragraph. All of these grave ethical concerns about experimenting on generic whole brain emulations get neatly swept under the rug if you accept the ludicrous concept of "ensoulment". These "generic human brains" don't actually have free will or experience suffering, because Jeebus never reached down and put a baby soul inside them, right? So experiment away, and be damned with the "suffering" of these mindless machines!
We could envision a future where, much like some theists today oppose stem cell research because their belief in ensoulment precludes them from rationally evaluating whether a clump of undifferentiated cells can be meaningfully referred to as "human", future nontheists might oppose "whole brain emulation research", while theists support it because, inversely, their belief in ensoulment precludes them from rationally evaluating whether a perfect emulation of consciousness can be meaningfully referred to as "human".
I find this to be a fascinating potential role reversal. Of course, there are similar issues today (e.g. you sometimes hear theistic justifications for ignoring issues about the treatment of animals, since beasts don't have a "soul"), but the strong parallels to stem cell research -- with the roles simply reversed -- make this an issue worth pondering.
[1] I know nobody is going to believe this, but the reason I get Playboy is that my wife secretly subscribed us to it so she could read the articles. Just like the old commercial, right! Except we soon discovered that Playboy's heyday of publishing edgy fiction and high-quality, thought-provoking opinion pieces was long gone. It's just a mediocre-to-poor men's magazine now. Of course, we already paid for the subscription, and despite an irritating and slow-to-change misogynist undertone to the whole magazine, many of the articles are at least good enough to make for sufficient bathroom reading material... so that's how I came upon this.
[2] Why might whole brain emulation be impossible? Well, there could be some subtlety of brain functionality that we are missing, which makes the problem far more complex than we currently imagine (though current advances in neuroscience are making this seem less and less likely every day). On the technology side, Moore's Law will not necessarily hold forever. Actually, it won't hold forever -- "I would've gotten away with it too, if it weren't for you meddling fundamental laws of physics!" -- so the question is just whether or not it will hold for long enough. Even at present, it looks like we are nearing the practical speed limit for a single-threaded processor (Moore's Law continues to hold because of increasing parallelization, but who's to say that whole brain emulation wouldn't require some rip-roaring fast serial process?). Of course, logically an artificial brain must be possible, since there's nothing supernatural involved in "natural" brains, but it may turn out that doing it with computing technology as we presently understand it is a pipe dream. I'm guessing it will be possible after all, but this is anything but a slam dunk.
I, and (judging by the cheers as I responded to this topic) the audience at the Singularity Summit, share your concerns about the ethical treatment of emulated minds.
The point about clinical trials or experiments with emulated brains/minds is not to deny the ethics or morals involved, but to consider the lesser permanent harm that a procedure can have on such an emulation. I pointed this out in response to an explicit question about clinical trials with emulated brains.
There should be no difference for ethical purposes between enlisting the help of an emulated brain or the brain of a biological person with a clinical trial. In either case there is the matter of informed consent to participate. In both cases there are some risks and the possibility of suffering. The great advantage in the case of an emulated brain is that every clinical trial or experiment should be completely reversible, without permanent effects.
At present, we don't have that option -- all clinical trials, by necessity, must be conducted with a risk of permanent consequences to a human or an animal.
Thank you very much for the comment, Dr. Koene! It is gratifying to get a response right from the source.
I should make it clear that my comment about being surprised at the lack of ethical discussion was directed more at the Playboy article than at any assumptions I might have about what you personally have to say about the topic -- I of course did not hear the talk and would have no idea about the latter :)
So with that out of the way, now we're talking. The Playboy article seemed to imply that experimenting on these brain emulations was okay because it emulated a "generic" brain, which it seems we agree is an ethical and moral disaster. Informed consent + reversibility is a much better answer -- but I think that, depending on the mechanism of reversibility, it still raises some interesting existential questions. (Not like any of this work has the potential to do that, eh?)
So, as a hypothetical, let's say the mechanism was to take a "snapshot" of the software state of the emulation prior to the experiment (after obtaining consent, of course) and restoring the snapshot at the conclusion of the experiment. Ethically this is probably fine, but existentially we've got a possible conundrum akin to the "teleporter" thought experiment, i.e. if I scan all of the particles in your body, and then simultaneously create an identical copy at a remote location and destroy the original particles, is that still "you"? Doesn't the "original" you cease to exist, even "die" as it were? And if so, how can I explain exactly why this is or why it matters without invoking a ghost in the machine-type argument? By the same token, is the individual who underwent the experiment the same one that is restored from the software "snapshot"?
I suppose, depending on the technology, one might get around all of this by some sort of gradual reversal process, rather than what amounts to restoring a backup. There could still be some weird existential issues, e.g. if you had some sort of journaling system and then backed it out to the previous state one step at a time, there could still be discontinuities, and in any case "lost" memories (though from the emulation's perspective, it never had those memories to begin with...). And if the model were understood well enough to "fix" any damage without explicitly backing out to previous states, then one wonders why the experiment was necessary to begin with...
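Just to make the snapshot-versus-journaling distinction concrete, here is a purely hypothetical toy sketch in Python. The EmulationState class and the two rollback functions are names I made up entirely for illustration -- nothing here resembles how an actual emulation would be implemented; the point is only the shape of the two approaches.

```python
# Toy sketch only: a made-up "emulation state" with two rollback strategies.
import copy

class EmulationState:
    def __init__(self):
        self.memories = []            # stand-in for whatever state the emulation accumulates

    def experience(self, event):
        self.memories.append(event)

# Strategy 1: snapshot/restore -- save the whole state, run the experiment,
# then hand back the saved copy and discard everything that happened since.
def run_with_snapshot(state, events):
    snapshot = copy.deepcopy(state)   # snapshot taken (after consent, of course)
    for e in events:
        state.experience(e)           # the experiment happens
    return snapshot                   # "restoring" means replacing the state with the copy

# Strategy 2: journaling -- record each change and back it out one step at a time,
# on the same state object, in strict reverse order.
def run_with_journal(state, events):
    journal = []
    for e in events:
        state.experience(e)
        journal.append(e)             # log every change as it is made
    while journal:
        undone = journal.pop()
        assert state.memories[-1] == undone
        state.memories.pop()          # walk the changes back, most recent first
    return state                      # same object, incrementally reversed

if __name__ == "__main__":
    s1, s2 = EmulationState(), EmulationState()
    restored = run_with_snapshot(s1, ["trial dose", "side effect"])
    rolled_back = run_with_journal(s2, ["trial dose", "side effect"])
    print(restored.memories, rolled_back.memories)   # both end up empty, via different routes
```

The uncomfortable bit is right there in the structure: the snapshot path doesn't so much undo the experiment as discard the post-experiment individual in favor of a stored copy, while the journal path walks the same individual back one change at a time. Whether that distinction matters is exactly the existential question above.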
My gut feeling is that if/when technologies that raise these sorts of existential questions become a reality, people will rapidly decide they don't give a shit :D Back to the teleporter thought experiment, the experience of the "teleported" individual after each teleportation would be a seamless continuity of consciousness, i.e. there would be no experience of having died or of having been born anew. Those who believed in a "soul" would almost certainly be convinced they had arrived with their supernatural baggage intact, and most people would stop thinking about it there. The same presumably holds true for any discontinuities in the experience of a brain emulation.
(as a brief aside, I think the scan-all-particles-and-replicate Star Trek-esque teleporter will forever remain a practical impossibility because of the sheer number of particles we're talking about behaving under chaotic conditions -- I merely find it a useful existential thought experiment)
(continued in following comment...)
"There should be no difference for ethical purposes between enlisting the help of an emulated brain or the brain of a biological person with a clinical trial."
Hmmm, well, there is potentially one difference, at least when you are talking about studying brain diseases like Parkinson's, etc... if we can manipulate the brain emulations with enough precision to cleanly back out the results of the experiment (without resorting to a brute force "snapshot" approach), then presumably the brain diseases would all be induced. Informed consent, sure, but if your pitch to the IRB is "We're going to give the participants Parkinson's disease, and they will live with the full symptoms for the duration of the trial, after which we cure them," well... that might be a tough sell.
Thank you again for your comment, and rest assured I am not trying to be closed-minded about this. The idea of reversibility had not occurred to me, and while I don't think it's a panacea for all potential ethical issues raised by intensive experimentation on brain emulations, it at least provides a possible way to make it work.
Of course the answers to any of these questions -- if they are answerable at all -- would be highly influenced by the specifics of the technologies involved, should this become a reality.
A few thoughts:
Is it ethical to make a self-aware, conscious being suffer if you can erase its memory of the suffering? Is a restore/journaling rewind any more ethical than Rohypnol?
The ethics of even creating such a brain would be problematic. How can a consciousness that you can presumably program to do whatever you want, such as give informed consent, really make a free will choice? (oops, I opened up the free will can of worms)
There are many models, and they have a plan.....
So yeah, I steer away from "free will" because I think the term is about as meaningful as "soul"... but I don't think we need to invoke the metaphysical to get at the problem you are pointing out; rather, I think we can stick with just the simple word "freedom".
If a brain implant were developed that would induce a behavioral change on command -- even an apparently innocuous one like limiting frustration or something -- it would of course be unethical to install it in someone without their consent. By the same token, if there were a gene that could be manipulated so that, in the developed person, you could induce a behavioral change on command, it would be unethical to have that gene tweaked in your children. You don't need to invoke free will to say this; this is a basic issue of personal autonomy.
So I think it would be the same with any hypothetical whole brain emulation. There would need to be very strict regulations on how such a being could be manipulated, as well as against constructing it in a way that would allow unwanted future manipulation. This does not mean closing the door on the technology, any more than we ought to close the door on brain implants or genetics because of the scenarios I described earlier.
The problem, I think, with invoking free will is that you could argue that any artificially created being would inherently not have free will because all of its decisions would be the product -- whether directly or indirectly -- of something its creators did. I think that's hogwash. You and I don't have "free will" in that sense either, because all of our decisions are the product of simple phenomena as well, with mechanisms that are inaccessible to us. (In fact, the more I read about recent advancements in neuroscience, the more disturbed I am at just how consciously inaccessible a lot of my true decision-making might be...) As long as there were regulations to prohibit the creators of such brain emulations from deliberately exercising control, I don't see a "free will" issue...
Regarding the first paragraph: yeah, exactly. Imagine a researcher today proposing a study where she will inflict a painful and psychologically damaging procedure on study participants, but hey, it's all okay because they will be given a drug to erase their short-term memory just afterwards. Good luck getting that past the IRB...
I agree with you about "free will". The main reason I don't like to open that can of worms with regard to free will being a myth is that someone will try to counter it by saying, "If I don't have free will, then I can go kill, rape, and steal whatever I want, and I can't be blamed because it's all a result of my biological programming." To which I have to then respond: OK, but we're going to put you in prison in an attempt to correct that programming.
heh, yeah... just like the, "If there's no God, why don't you go around raping and pillaging?" I mean, really?!?!? People really want to make that argument??? hehehe.....
In any case, all of us (except maybe Calvinists) go around acting like we have something that might sloppily be termed "free will", just as we also act like we have a "soul". I wrote before in a comment on another blog that I have come to accept that the idea of an atomic self, i.e. a "me" that is inseparable, is clearly an illusion based on what we know about the interaction of the different regions of the brain -- but also that it's a damn useful illusion, and not one I plan on giving up any time soon! In fact, I daresay it is impossible not to operate on the assumption of an atomic self in our day-to-day lives.
The fact that in some sense we are literally automatons should not take away from our feelings of meaning and wonder and responsibility. The experience of it is all still there, and explaining the underlying mechanism shouldn't alter that experience any more than "unweaving the rainbow" should.
But I guess it's really hard to accept for some people. Including my wife, for that matter; while I find an atheistic worldview liberating, she finds it depressing -- despite being just as much an atheist as me. Go figure.
Hi James,
I'm one of the neuroengineers in San Sebastian, Spain, who work under Randal Koene. We thought the dialogue & blog were great, but *love* that it came out of Playboy. We want to get a copy and post it next to the other press about our group. What issue was it? A scanned page would be great! You could contact me offline about this if you want. Really like the blog. Stefan Carmien, Fatronik
Stefan,
Alas, my wife just recycled it a few days ago, so I won't be able to provide a scan :( It was the year-end issue for 2009 (so actually the January 2010 issue, I guess?) with Tara Reid on the cover.
http://www.pbcovers.com/pbcovers.php?c=us&y=2010&cover=us_201001a
Sorry I couldn't be of more help!