Uploading Consciousness to the ‘Net

Do you think it will ever be possible to upload one’s consciousness to the ‘Net? I mean, like Soulkiller-style engram representations of self? Or something like the Altered Carbon cortical stacks?

The human brain is so complex, and oftentimes has aberrations or mutations that make each individual truly unique. And we haven’t even come close to actually mapping a human brain in its entirety as of yet, much less discovering what it is about a person that makes them “them”. I feel like until we can do both of those things, uploading myself to the ‘Net is not an achievable dream.

And what about emotions? How do you codify essentially irrational decisions and feelings into a logical model?

What if the “soul” or the “ghost” or whatever you want to call it actually exists? Some of us believe that it is that component that makes you “you”. How does that get translated into an engram? Is it even possible? If it’s something that is somehow tied to a body, would an engram lack any sort of empathy, or some crucial piece of its humanity?

All of these are interesting questions that I don’t expect to find the answers to in my lifetime, but I spend a lot of time thinking about them. It’s a nice dream to have, to be “myself” but not tied to a body that limits me in so many ways.

Thoughts?

3 Likes

I don’t think it will ever be possible, but it would be really, really interesting if it did happen. Could you imagine how scary it could be, though, uploading yourself to the 'net?
I mean, today’s 'net is generally corpo-owned, with big centralized websites. Literally giving your life to one of them would be awful, but probably better than sitting on your own server and trying to do the security side of things yourself. Cyber attacks would be so much scarier.

3 Likes

Well, I think I’m making the assumption that the ‘net would have to be a truly free and open one, like an essential, basic utility in all the cyberpunk novels. I wouldn’t give my “self” to any corpo in that manner, that’s for sure. I don’t want Soulkiller to become reality. :skull_and_crossbones: And I’m also not super keen on having to be wary of R.A.B.I.D.S. from beyond the Blackwall that are attempting to absorb or destroy me just because that’s what they’re programmed to do. :face_with_peeking_eye:

If you’d be willing to share, I’m curious to know some of the reasons why you think that uploading consciousness will never be possible. If you don’t want to share, that’s okay. :slight_smile:

It would be nice if that’s what the 'net turned out to be! I mean, I still wouldn’t try to make myself digital, since I’d rather be human, but it definitely would make it a lot more viable.
I guess I don’t really have any solid reasons why I don’t think it’ll ever be possible. Mostly it’s just that the brain is so complex, plus the ethical debate that would surround it? But both of those could eventually be solved.
I suppose it’s a guess more than anything.

1 Like

I’ve always wondered what difference the perception of time / processing speed would make. Not everything we think can be compared to a computer in useful, direct terms, but to skip past the inconveniences of the engineering problems:

How would you think differently with information and calculations being immediately available? Even if your thought processes themselves weren’t accelerated, you’d never spend any effort on these types of problems, and I think that would significantly change one’s relationship to that kind of ‘problem’.

I guess we’re part way there with the whole ‘computer in my pocket’ nature of things, and we can maybe make some extrapolations there, but as I think you’re getting at with this post: What we have now is not the top of the mountain, not even close.

I think total upload would make a totally different being. So much of our thoughts, personalities, and concerns are directly born from the consequences of being meat. Removing that radically changes the core essence of what it is to be a human. Not to suggest that this new being is a bad thing or lacks personhood or the like… but in the same way that I cannot know what it is to be a Black Person, what hope do I have of understanding a Person Without A Body?

I asked more questions than I answered, but enjoy. ;D

2 Likes

(And if anyone is up for some horrific interpretations of this idea, I highly suggest the short story ‘Mayfly’: https://www.rifters.com/real/shorts/Watts_Murphy_Mayfly.pdf )

I like Hawkins’ theory about the possible state of consciousness in the form of a digital self. Hawkins once said that mankind has managed to create complex virtual worlds, some approaching photorealism, in a very short time (~30 years). What if other virtual worlds were created within these virtual worlds, but much faster than before, because the basis is already known? We could already be iteration number x of a virtualized world and not realize it, because the systems are so advanced. A wonderful idea from Hawkins that regularly melts my brain :sweat_smile:

I read a fiction book once (the title of which I’ve been trying to recall, so far without success) that explored this idea. The main premise of the story was that the accelerated processing power, and the ability to split oneself into multiple copies for parallel processing, always led to insanity. All of the AIs would end up going insane, at varying rates as the available computing power increased, for reasons like:

  • the ability to absorb, collate, and analyze all available information in nanoseconds led to extreme boredom and hallucinations when there was nothing else to learn

  • the ability to predict all possible outcomes and choose the desired path from point A to B led to a complete disdain for other intelligences that could not do the same (inability to tolerate irrational and unpredictable behavior)

  • having to wait for (effectively) eons between responses when interacting with humans in meatspace made those interactions irrelevant because the AI outgrew the human by many times in the space between a single call and response. Imagine a conversation between two babies where one baby matures 10x faster than the other.

  • any creative endeavors became incomprehensible gibberish to humans beyond 4 dimensions, and were only understandable by other AIs of sufficiently advanced generation

  • splitting and recombining replicas of the self to accomplish parallel processing could lead to corruption when a child process was merged back into the main entity, if any of the sub-selves had evolved in an incompatible way due to a multitude of factors, some as innocuous as simple clock skew or a temporary network outage (there’s a toy sketch of this failure mode at the end of this post)

If I can recall what book that was (or if anyone else here happens to know), I’ll be sure to post it here. It was really quite interesting to think about how “unlimited processing power” could actually be too much, and that a consciousness ultimately needs boundaries to maintain sanity.
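
To make that last bullet a bit more concrete, here’s a toy Python sketch of the fork-and-merge problem as I picture it. Everything in it (the `fork`/`merge` helpers, the `MergeConflict` type, the "engram-0" state) is invented purely for illustration; it’s not from the book, just the generic “replicas diverged while detached” failure.

```python
# Toy sketch (my own invention, not from the book): fork a "mind" into
# replicas, let them evolve independently, then attempt a naive re-merge.
import copy


class MergeConflict(Exception):
    """Raised when a replica disagrees with the parent about an existing memory."""


def fork(state: dict) -> dict:
    # Each replica starts life as an exact copy of the parent state.
    return copy.deepcopy(state)


def merge(parent: dict, replica: dict) -> dict:
    # Naive merge rule: new memories are accepted, but any memory the
    # replica has rewritten differently is treated as corruption.
    merged = dict(parent)
    for key, value in replica.items():
        if key in merged and merged[key] != value:
            raise MergeConflict(
                f"replicas diverged on {key!r}: {merged[key]!r} vs {value!r}"
            )
        merged[key] = value
    return merged


if __name__ == "__main__":
    mind = {"name": "engram-0", "favorite_poem": "ode-7"}

    a, b = fork(mind), fork(mind)
    a["new_theory"] = "4d-art"       # A only learned something new
    b["favorite_poem"] = "ode-9"     # B quietly rewrote an old memory
                                     # (think clock skew or a dropped sync)

    mind = merge(mind, a)            # fine: A's changes are purely additive
    try:
        mind = merge(mind, b)        # fails: B disagrees with the merged self
    except MergeConflict as err:
        print("corruption on re-merge:", err)
```

Real distributed systems paper over exactly this with conflict-resolution rules (last-writer-wins, CRDTs, and so on), but it’s hard to imagine a “self” just discarding one copy’s experiences to keep the merge clean.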

1 Like

Only slightly related: