Comments (9) on the No Jesus, No Peas post "A future technology that could produce a surprising role reversal for theists and nontheists", by James Sweet (http://www.blogger.com/profile/17212877636980569324).

James Sweet | 2010-01-11 05:30
Stefan,

Alas, my wife just recycled it a few days ago, so I won't be able to provide a scan :( It was the year-end issue for 2009 (so actually the January 2010 issue, I guess?) with Tara Reid on the cover.

http://www.pbcovers.com/pbcovers.php?c=us&y=2010&cover=us_201001a

Sorry I couldn't be of more help!

Stefan Carmien (http://www.scarmien.com) | 2010-01-11 02:51
Hi James,

I'm one of the neuroengineers in the group in San Sebastián, Spain, that works under Randal Koene. We thought the dialogue and blog were great, but we *love* that it came out of Playboy. We want to get a copy and post it next to the other press about our group. What issue was it? A scanned page would be great! You could contact me offline about this if you want. Really like the blog.

Stefan Carmien, Fatronik

James Sweet | 2010-01-06 07:23

heh, yeah... just like the, "If there's no God, why don't you go around raping and pillaging?" I mean, really? People really want to make that argument? hehehe...

In any case, all of us (except maybe Calvinists) go around *acting* like we have something that might sloppily be termed "free will", just as we also act like we have a "soul". I wrote before, in a comment on another blog, that I have come to accept that the idea of an atomic self, i.e. a "me" that is inseparable, is clearly an illusion based on what we know about the interaction of the different regions of the brain -- but also that it's a damn useful illusion, and not one I plan on giving up any time soon! In fact, I daresay it is impossible *not* to operate on the assumption of an atomic self in our day-to-day lives.

The fact that in some sense we are literally automatons should not take away from our feelings of meaning, wonder, and responsibility. The experience of it is all still there, and explaining the underlying mechanism shouldn't alter that experience any more than "unweaving the rainbow" should.

But I guess it's really hard for some people to accept.
Including my wife, for that matter; while I find an atheistic worldview liberating, she finds it depressing -- despite being just as much an atheist as me. Go figure.

Karl Withakay (http://blog.cordialdeconstruction.com) | 2010-01-06 07:07

I agree with you about "free will". The main reason I don't like to open that can of worms regarding free will being a myth is that someone will try to counter it by saying, "If I don't have free will, then I can go kill, rape, and steal whatever I want, and I can't be blamed because it's all a result of my biological programming." To which I have to respond: OK, but we're going to put you in prison in an attempt to correct that programming.

James Sweet | 2010-01-06 06:58

So yeah, I steer away from "free will" because I think the term is about as meaningful as "soul"... but I don't think we need to invoke the metaphysical to get at the problem you are pointing out; rather, I think we can stick with just the simple word "freedom".

If a brain implant were developed that would induce a behavioral change on command -- even an apparently innocuous one like limiting frustration -- it would of course be unethical to install it in someone without their consent. By the same token, if there were a gene that could be manipulated so that, in the developed person, you could induce a behavioral change on command, it would be unethical to have that gene tweaked in your children.
You don't need to invoke free will to say this; it is a basic issue of personal autonomy.

So I think it would be the same with any hypothetical whole brain emulation. There would need to be very strict regulations on how such a being could be manipulated, as well as on constructing it in a way that would allow unwanted future manipulation. This does not mean closing the door on the technology, any more than we ought to close the door on brain implants or genetics because of the scenarios I described earlier.

The problem, I think, with invoking free will is that you could argue that any artificially created being would inherently not have free will, because all of its decisions would be the product -- whether directly or indirectly -- of something its creators did. I think that's hogwash. You and I don't have "free will" in that sense either, because all of our decisions are the product of simple phenomena as well, with mechanisms that are inaccessible to us. (In fact, the more I read about recent advances in neuroscience, the more disturbed I am at just how consciously inaccessible a lot of my true decision-making might be...) As long as there were regulations prohibiting the creators of such brain emulations from *deliberately* exercising control, I don't see a "free will" issue.

Regarding the first paragraph: yeah, exactly. Imagine a researcher today proposing a study where she will inflict a painful and psychologically damaging procedure on study participants, but hey, it's all okay because they will be given a drug to erase their short-term memory just afterwards. Good luck getting *that* past the IRB...

Karl Withakay (http://blog.cordialdeconstruction.com) | 2010-01-05 13:52
A few thoughts:

Is it ethical to make a self-aware, conscious being suffer if you can erase its memory of the suffering? Is a restore/journaling rewind any more ethical than Rohypnol?

The ethics of even creating such a brain would be problematic. How can a consciousness that you can presumably program to do whatever you want, such as give informed consent, really make a free-will choice? (Oops, I opened up the free-will can of worms.)

There are many models, and they have a plan.....

James Sweet | 2010-01-05 06:55

"There should be no difference for ethical purposes between enlisting the help of an emulated brain or the brain of a biological person with a clinical trial."

Hmmm, well, there is potentially one difference, at least when you are talking about studying brain diseases like Parkinson's: *if* we can manipulate the brain emulations with enough precision to cleanly back out the results of the experiment (without resorting to a brute-force "snapshot" approach), then presumably the brain diseases would all be induced. Informed consent, sure, but if your pitch to the IRB is "We're going to *give* the participants Parkinson's disease, and they will live with the full symptoms for the duration of the trial, after which we cure them," well... that might be a tough sell.

Thank you again for your comment, and rest assured I am not trying to be closed-minded about this.
The idea of reversibility had not occurred to me, and while I don't think it's a panacea for all the potential ethical issues raised by intensive experimentation on brain emulations, it at least provides a possible way to make it work.

Of course, the answers to any of these questions -- if they are answerable at all -- would be highly influenced by the specifics of the technologies involved, should this become a reality.

James Sweet | 2010-01-05 06:54

Thank you very much for the comment, Dr. Koene! It is gratifying to get a response right from the source.

I should make it clear that my comment about being surprised at any lack of ethical discussion was directed more at the Playboy article than at any assumptions I might have about what you personally have to say about the topic -- I of course did not hear the talk and would have no idea about the latter :)

So with that out of the way, *now* we're talking. The Playboy article seemed to imply that experimenting on these brain emulations was okay because it emulated a "generic" brain, which it seems we agree would be an ethical and moral disaster. Informed consent plus reversibility is a *much* better answer -- but I think that, depending on the mechanism of reversibility, it still raises some interesting existential questions. (Not like any of this work has the potential to do *that*, eh?)

So, as a hypothetical, let's say the mechanism was to take a "snapshot" of the software state of the emulation prior to the experiment (after obtaining consent, of course) and restore the snapshot at the conclusion of the experiment.
Ethically this is probably fine, but existentially we've got a possible conundrum akin to the "teleporter" thought experiment: if I scan all of the particles in your body, and then simultaneously create an identical copy at a remote location and destroy the original particles, is that still "you"? Doesn't the "original" you cease to exist, even "die", as it were? And if so, how can I explain exactly why this is, or why it matters, without invoking a ghost-in-the-machine-type argument (http://en.wikipedia.org/wiki/Ghost_in_the_machine)? By the same token, is the individual who underwent the experiment the same one that is restored from the software "snapshot"?

I suppose, depending on the technology, one might get around all of this by some sort of gradual reversal process, rather than what amounts to restoring a backup. There could still be some weird existential issues; e.g., if you had some sort of journaling system and then backed it out to the previous state one step at a time, there could still be discontinuities, and in any case "lost" "memories" (though from the emulation's perspective, it never had those memories to begin with...). And if the model were well enough understood to "fix" any damage without explicitly backing out to previous states, then one wonders why the experiment was necessary to begin with...

My gut feeling is that if/when technologies that raise these sorts of existential questions become a reality, people will rapidly decide they don't give a shit :D To return to the teleporter thought experiment: the experience of the "teleported" individual after each teleportation would be a seamless continuity of consciousness, i.e. there would be no experience of having died or of having been born anew. Those who believed in a "soul" would almost certainly be convinced they had arrived with their supernatural baggage intact, and most people would stop thinking about it there.
The same presumably holds true for any discontinuities in the experience of a brain emulation.

(As a brief aside, I think the scan-all-particles-and-replicate, Star Trek-esque teleporter will forever remain a practical impossibility because of the sheer number of particles we're talking about behaving under chaotic conditions -- I merely find it a useful existential thought experiment.)

(continued in following comment...)

Randal A. Koene (http://rak.minduploading.org) | 2010-01-05 01:50

I, and (judging by the cheers as I responded to this topic) the audience at the Singularity Summit as well, share your concerns about the ethical treatment of emulated minds.

The point about clinical trials or experiments with emulated brains/minds is not to deny the ethics or morals involved, but to consider the lesser permanent harm that a procedure can have on such an emulation. I pointed this out in response to an explicit question about clinical trials with emulated brains.

There should be no difference for ethical purposes between enlisting the help of an emulated brain or the brain of a biological person in a clinical trial. In either case there is the matter of informed consent to participate, and in both cases there are some risks and the possibility of suffering. The great advantage in the case of an emulated brain is that every clinical trial or experiment should be completely reversible, without permanent effects.

At present, we don't have that option -- all clinical trials by necessity must be conducted with a risk of permanent consequences to the human or animal involved.
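[Editor's sketch] The two reversal mechanisms debated in this thread (a snapshot/restore "backup" versus a journaled step-by-step rewind) can be illustrated with a toy model. This is purely illustrative; the class and method names (`ToyEmulation`, `snapshot`, `rewind_one`, and the dictionary "state") are invented for the example and stand in for vastly more complex machinery:

```python
# Toy illustration of two reversibility mechanisms for an emulated system:
# (1) full-state snapshot/restore, an abrupt jump back to a saved state;
# (2) a journal of changes that can be backed out one step at a time.
# All names here are hypothetical; this models nothing about real brains.
import copy

class ToyEmulation:
    def __init__(self, state):
        self.state = dict(state)
        self._journal = []          # log of (key, previous_value) entries

    # --- Mechanism 1: snapshot/restore ---
    def snapshot(self):
        return copy.deepcopy(self.state)

    def restore(self, snap):
        # One discontinuous jump back; everything after the snapshot is gone,
        # including the journal of what happened in between.
        self.state = copy.deepcopy(snap)
        self._journal.clear()

    # --- Mechanism 2: journaled change plus step-by-step rewind ---
    def apply(self, key, value):
        self._journal.append((key, self.state.get(key)))
        self.state[key] = value

    def rewind_one(self):
        # Back out only the most recent change: a gradual reversal, though
        # each step still discards something the emulation "experienced".
        key, previous = self._journal.pop()
        if previous is None:
            del self.state[key]
        else:
            self.state[key] = previous

em = ToyEmulation({"mood": "calm"})
snap = em.snapshot()
em.apply("mood", "distressed")
em.apply("symptom", "tremor")
em.rewind_one()                     # gradual: undo the tremor first
assert em.state == {"mood": "distressed"}
em.restore(snap)                    # or abrupt: jump back to the snapshot
assert em.state == {"mood": "calm"}
```

Note how the toy makes the thread's distinction concrete: `restore` produces exactly the discontinuity the "teleporter" conundrum worries about, while `rewind_one` is the "journaling rewind" that is gradual but still erases memories from the subject's timeline.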