"I want to live forever in AI"
"I want to live forever in AI"
cross-posted from: https://lemmy.ml/post/14869314
"I want to live forever in AI"
"I want to live forever in AI"
cross-posted from: https://lemmy.ml/post/14869314
"I want to live forever in AI"
Even if it were possible to scan the contents of your brain and reproduce them in a digital form, there's no reason that scan would be anything more than bits of data on the digital system. You could have a database of your brain... but it wouldn't be conscious.
No one has any idea how to replicate the activity of the brain. As far as I know there aren't any practical proposals in this area. All we have are vague theories about what might be going on, and a limited grasp of neurochemistry. It will be a very long time before reproducing the functions of a conscious mind is anything more than fantasy.
Counterpoint, from a complex systems perspective:
We can't yet model all the details of neurochemistry, but we do know some essential features which we can model: action potentials in spiking neuron models, for example.
It's likely that the details don't actually matter much. Take traffic jams as an example. There are lots of details involved: driver psychology, the physical mechanics of the car, etc. But you only need a handful of very rough parameters to reproduce traffic jams in a computer.
That's the thing with "emergent" phenomena, they are less complicated than the sum of their parts, which means you can achieve the same dynamics using other parts.
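The traffic-jam example above really can be reproduced with a handful of rough parameters. Here's a minimal sketch of the Nagel-Schreckenberg cellular-automaton model (the parameter values are arbitrary, chosen just for illustration):

```cpp
#include <algorithm>
#include <cstddef>
#include <random>
#include <vector>

// Minimal Nagel-Schreckenberg model: a ring road of cells, each car
// an integer speed 0..vmax. Four crude rules per tick (accelerate,
// brake to the gap ahead, randomly slow down, move) are enough to
// produce spontaneous jams, with no driver psychology or car
// mechanics modeled at all.
struct Road {
    int length = 100;
    int vmax = 5;
    double p_slow = 0.3;           // chance of random braking per car
    std::vector<int> pos, vel;     // per-car cell index and speed
    std::mt19937 rng{42};

    void step() {
        std::vector<char> occupied(length, 0);
        for (int x : pos) occupied[x] = 1;
        std::bernoulli_distribution brake(p_slow);
        for (std::size_t i = 0; i < pos.size(); ++i) {
            int gap = 1;                        // distance to next car, capped
            while (gap <= vmax && !occupied[(pos[i] + gap) % length])
                ++gap;
            vel[i] = std::min(vel[i] + 1, vmax);       // 1. accelerate
            vel[i] = std::min(vel[i], gap - 1);        // 2. don't collide
            if (vel[i] > 0 && brake(rng)) --vel[i];    // 3. random slowdown
        }
        for (std::size_t i = 0; i < pos.size(); ++i)   // 4. move
            pos[i] = (pos[i] + vel[i]) % length;
    }
};
```

Run it with cars spaced evenly and jams (clusters of stopped cars) emerge on their own, which is the "emergence is cheaper than the sum of its parts" point in a nutshell.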
Even if you ignore all the neuromodulatory chemistry, much of the interesting processing happens at sub-threshold depolarizations, depending on millisecond-scale coincidence detection from synapses distributed through an enormous, and slow-conducting dendritic network. The simple electrical signal transmission model, where an input neuron causes reliable spiking in an output neuron, comes from skeletal muscle, which served as the model for synaptic transmission for decades, just because it was a lot easier to study than actual inter-neural synapses.
But even that doesn't matter if we can't map the inter-neuronal connections, and so far that's only been done for the roughly 300 neurons of the C. elegans ganglia (i.e., not even a 'real' brain), after a decade of work. Nowhere close to mapping the neuroscientists' favorite model, Aplysia, which has only 20,000 neurons. Maybe statistics will wash out some of those details by the time you get to humans' 10^11-neuron systems, but considering how bad current network models are at predicting even simple behaviors, I'm going to say more details matter than we will discover any time soon.
I heard a hypothesis that the first human made consciousness will be an AI algorithm designed to monitor and coordinate other AI algorithms which makes a lot of sense to me.
Our consciousness is just the monitoring system of all our body's subsystems. It is most certainly an emergent phenomenon of the interaction and management of different functions competing or coordinating for resources within the body.
To me it seems very likely that the first human made consciousness will not be designed to be conscious. It also seems likely that we won't be aware of the first consciousnesses because we won't be looking for it. Consciousness won't be the goal of the development that makes it possible.
I’d say the details matter, based on the PEAR laboratory’s findings that consciousness can affect the outcomes of chaotic systems.
Perhaps the reason evolution selected for enormous brains is that’s the minimum necessary complexity to get a system chaotic enough to be sensitive to and hence swayed by conscious will.
We don't even know what consciousness is, let alone if it's technically "real" (as in physical in any way.) It's perfectly possible an uploaded brain would be just as conscious as a real brain because there was no physical thing making us conscious, and rather it was just a result of our ability to think at all.
Similarly, I've heard people argue a machine couldn't feel emotions because it doesn't have the physical parts of the brain that allow that, so it could only ever simulate them. That argument has the same hole in that we don't actually know that we need those to feel emotions, or if the final result is all that matters. If we replaced the whole "this happens, release this hormone to cause these changes in behavior and physical function" with a simple statement that said "this happened, change behavior and function," maybe there isn't really enough of a difference to call one simulated and the other real. Just different ways of achieving the same result.
My point is, we treat all these things, consciousness, emotions, etc, like they're special things that can't be replicated, but we have no evidence to suggest this. It's basically the scientific equivalent of mysticism, like the insistence that free will must exist even though all evidence points to the contrary.
Also, some of what happens in the brain is just storytelling. Like, when the doctor hits your patellar tendon, just under your knee, with a reflex hammer. Your knee jerks, but the signals telling it to do that don't even make it to the brain. Instead the signal gets to your spinal cord and it "instructs" your knee muscles.
But, they've studied similar things and have found out that in many cases where the brain isn't involved in making a decision, the brain does make up a story that explains why you did something, to make it seem like it was a decision, not merely a reaction to stimulus.
let alone if it’s technically “real” (as in physical in any way.)
This right here might already be a flaw in your argument. Something doesn’t need to be physical to be real. In fact, there’s scientific evidence that physical reality itself is an illusion created through observation. That implies (although it cannot prove) that consciousness may be a higher construct that exists outside of physical reality itself.
If you’re interested in the philosophical questions this raises, there’s a great summary article that was published in Nature: https://www.nature.com/articles/436029a
Consciousness might not even be “attached” to the brain. We think with our brains but being conscious could be a separate function or even non-local.
I read that and the summary is, "Here are current physical models that don't explain everything. Therefore, because science doesn't have an answer it could be magic."
We know consciousness is attached to the brain because physical changes in the brain cause changes in consciousness. Physical damage can cause complete personality changes. We also have a complete spectrum of observed consciousness from the flatworm with 300 neurons, to the chimpanzee with 28 billion. Chimps have emotions, self reflection and everything but full language. We can step backwards from chimps to simpler animals and it's a continuous spectrum of consciousness. There isn't a hard divide, it's only less. Humans aren't magical.
Thank you for this. That was a fantastic survey of some non-materialistic perspectives on consciousness. I have no idea what future research might reveal, but it's refreshing to see that there are people who are both very interested in the questions and also committed to the scientific method.
I think we're going to learn how to mimic a transfer of consciousness before we learn how to actually do one. Basically we'll figure out how to boot up a new brain with all of your memories intact. But that's not actually a transfer, that's a clone. How many millions of people will we murder before we find out the Zombie Zuckerberg Corp was lying about it being a transfer?
You could have a database of your brain… but it wouldn’t be conscious.
Where is the proof of your statement?
Well, there's no proof; it's all speculative, and even the concept of scanning all the information in a human brain is fantasy, so there isn't going to be a real answer for a while.
But just as a conceptual argument, how do you figure that a one-time brain scan would be able to replicate active processes that occur over time? Or would you expect the brain scan to be done over the course of a year or something like that?
Why would bits not be conscious?
Consciousness and conscience are not the same thing, this naming is horrible
This just makes it more realistic
Hey, just be glad I changed it from asdf_test_3, okay?
If anyone's interested in a hard sci-fi show about uploading consciousness they should watch the animated series Pantheon. Not only does the technology feel realistic, but the way it's created and used by big tech companies is uncomfortably real.
The show got kinda screwed over on advertising and fell to obscurity because of streaming service fuck ups and region locking, and I can't help but wonder if it's at least partially because of its harsh criticisms of the tech industry.
Just FYI content warning for Pantheon there is a seriously disturbing gore/kill scene that is animated too well in the first season. Anyone who has seen the show knows what scene I am talking about, I found the scene pretty upsetting and I almost didn't finish the show. I am still a little upset that the scene is burned in my memory.
Sounds good. Did it come to a conclusion or get axed mid way?
The show got kinda screwed over on advertising and fell to obscurity because of streaming service fuck ups and region locking, and I can’t help but wonder if it’s at least partially because of its harsh criticisms of the tech industry.
Okay so I can't 100% confirm this, but the first season wasn't popular because it was on whatever the fuck AMC+ is. Amazon bought it because of the writer's strike to get something out.
Yes, I just finished watching Pantheon and absolutely loved it!
Totally agree that it deserved more attention. At least it got a proper ending with season 2.
Also, the voice acting talent they got was impressive. Paul Dano was fantastic as one of the leads.
I really thought you were going to mention "Upload" on Prime. Same creator as The Office.
That show is garbage
Checking in to see if this show was mentioned. Highly recommend! Well written
The game SOMA represents this case the best. Highly recommended!
Yes, I immediately thought about SOMA after reading the post. recommendations++
Soma is so fucking bleak and I love it
Did they ever allow for turning off head bob and blur? That game makes me motion sick to an insane degree.
I already know I will never play this game, could you elaborate for me?
Soma is a wonderful game that covers this type of thing. It does make you wonder what consciousness really is... Maybe the ability to perceive and store information, along with retrieving that information, is enough to provide an illusion of consistent self?
Or maybe it's some completely strange system, unknown to science. Who knows?
I think the definition of consciousness needs to not be solely about abilities or attributes. It needs to account for the active process of consciousness. Like, a hair dryer can burn things... but a fire is things burning. Without the active nature, it's simply not conscious.
Maybe consciousness is everywhere, and has nothing to do with mechanisms.
I don't think anything gave me existential doom quite as much as the ending of that game.
Provide the illusion to whom?
Self? Seemed pretty clear in their comment
The comic sans makes this even deeper
Who the fuck uses comic sans for programming? I use comic mono.
damn bro
oh god why is it real
At least it's not Comic Sans IN THE IDE (or vim/emacs for the brave).
Comic sans in vim is peak insanity
What if you do it in a ship of theseus type of way. Like, swapping each part of the brain with an electronic one slowly until there is no brain left.
Wonder if that will work.
If I remember right, the game The Talos Principle calls that the Talos principle
Sounds like the sort of thing The Talos Principle would call that.
The tv show Pantheon figures it will work, but it will be very disturbing.
Was looking for the Pantheon reference in this thread! Just finished that show and loved it. Of course it takes plenty of liberties for the sake of the storytelling, but still, at least it explores these interesting topics!
Anyone reading this thread, do yourself a favor and check out Pantheon!
Right? Like what if as cells die or degrade instead of being replaced by the body naturally they are replaced by nanites/cybernetics/tech magic. If the process of fully converting took place over the course of 10 years, then I don't see how the subject would even notice.
It's an interesting thing to ponder.
The subject also doesn’t notice if you end their consciousness either.
This prospect doesn't bother me in the least. I've already been replaced 5 times in my life so far. The soul is a spook. Let my clone smother me in my sleep and deal with the IRS instead.
Makes me wonder how many times I've been replaced. Also makes me wonder if I just died yesterday and today I'm actually a new person. I have no evidence that yesterday happened except for a memory of it, and let's face it, since it was a public holiday, that's a pretty foggy memory
I wonder about that. During the deepest part of sleep does your brain have enough activity to maintain a continuous stream of consciousness? If you go through two sleep cycles in a night does yesterday you die, and you from the first sleep cycle who only dreamed die, and you're a new consciousness in the morning?
yeah, went down this rabbit hole recently: what if I'm the .001% that lives until <max age variable for my genome>? Or what if 'me' is an amalgam of all the ones that die, and I get to live all those lives until the variable runs out. I feel like there's a great story behind each one of the five
Damn dude. Was each time a death? I think a someone’s following me around and snuffing me out. Mandela Effects keep happening. Also I’m getting elf ears? Reality is weird.
"The soul is a spook"
I'm sorry, I don't understand those words in that order, though. Are you saying the soul is an olde-timey anti-black racial slur, or that it's inherently scary?
A spook is a pretty niche concept from philosophy, I believe coined by Max Stirner
It basically means a social construct that is treated as if it were a real, factual thing instead of something made up.
I am bad at explaining stuff but I hope you get the gist of it.
spook
could also indicate ghost or intelligence operative. I don't assume they were going racist with it.
Spook = ghost (aka a soul unhoused a living body)
Spook is from the german "spuking" which means haunting. Its use in this context comes from the german philosopher Max Stirner who is infamous for the memes where X is declared to be a spook.
Understanding what exactly spooks are is somewhat challenging, and plenty of people get the wrong understanding of what is meant by spooks. But at least in the meme way of using the word, a spook is anything you think is a fairy tale, or nonsense that you don't care about.
A copy is fine. I can still seek vengeance on my enemies from beyond the grave.
It's definitely an improvement to just being plain old dead
I dunno. I’m starting to suspect nobody’s ever dead.
undefined
throws UserNotPaidException
would've made more sense if it were Rust
(or is the copy intentional here?)
Plot twist: consciousness is `Copy`
It's pinned and `!Unpin`, and only has private constructors.
Uploading is a matter of implementing Clone
```rust
#[derive(Clone, Copy)]
struct Consciousness { ... }

fn upload_brain(brain: Consciousness) -> Result<(), Error>
```
If we're gonna have a dystopian future, then damn it, it's gonna be memory safe.
The semantics in Rust would be completely out of wack. What does ownership mean?
I guess the point of the joke is that consciousness is a shallow value.
What's the difference between `void fn(Type& var)` and `void fn(Type* var)`?
Sends original data vs making a copy of data and sending it.
In meme context you'd be just making a copy of your consciousness and putting it in a machine. Whatever reason you're doing it for - escape illness, survive armageddon, nothing changes for you. A copy of you lives on though.
It's not like the post; the second one is a pointer.
I guess you're asking about C++. There, `Type*` can be null while `Type&` can't be. When compiled, `Type&` produces (mostly) the same machine code as `Type*`.
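A tiny illustration of that point, with a hypothetical `Type` made up for the demo: both the reference and the pointer let the callee modify the caller's object, only the pointer can be null, and a plain by-value parameter (what the meme's second line uses) only ever touches a copy.

```cpp
struct Type { int x = 0; };

// Reference: cannot be null; refers directly to the caller's object.
void by_ref(Type& var) { var.x += 1; }

// Pointer: may be null, so it must be checked before dereferencing.
void by_ptr(Type* var) { if (var) var->x += 1; }

// By value: the callee receives a copy; the caller's object is untouched.
void by_val(Type var) { var.x += 1; }
```

Calling `by_ref(t)` or `by_ptr(&t)` changes `t`; calling `by_val(t)` changes only the temporary copy inside the function.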
You can pass nullptr in the second example (that's not what OP wrote, though; his second line makes a copy).
Thanks, I was just curious. I knew what * did but I wasn't sure about &.
the plot of ::: spoiler spoiler SOMA ::: in a nutshell?
There's a cool computer game that makes this point as part of the story line... I'd recommend it, but I can't recommend it in this context without it being a spoiler!
Guy's probably talking about
Lost the coin flip.
There's also a book with a similar concept. It's not the focus until later in the book though. It's called
Related book recommendation!!
Kil'n People by David Brin - it's a futuristic Murder Mystery Novel about a society where people copy their consciousnesses to temporary clay clones to do mundane tasks for them. Got some really interesting discussions about what constitutes personhood!
Some of the concepts in this book really stuck with me, but I had no idea what the title was! Thanks!
"Some days you're the original, some days you're the copy" or something like that
I've had this thought and felt it was so profound I should write a short story about it. Now I see this meme and I feel dumb.
I saw a great comic about it once, one sec
Edit: more focused on teleportation, but a lot of the same idea. Here https://existentialcomics.com/comic/1
That's why I'm going for brain in a jar.
I know myself deeply enough to be totally fine with a copy. I’d be my own copy’s pet if it came to that. I trust me.
Yeah we'd work together well and the sex would be great.
It needs an empty catch block
I had to turn my phone sideways and go cross-eyed to spot the difference.
The best part is, unless that function name is misleading, it doesn't matter how the data is passed; a copy is being sent out over TCP/IP to another device regardless.
I don't get it
The joke is that there are some people who think that by uploading themselves into a machine "to live forever," their consciousness will also be transferred, like when you travel by bus from one city to another. In reality, you "upload yourself," but that self is not you, it's a copy of you. So, once the copy is done, you will still be in your original body, and the copy will "think" it is you, but it's not you. It's a copy of you! So, you continue to live in your body until you die, and, well, for you - that's it. You're dead. You're not living. You're finished. Everything is black. Void. Null. Done - unless you believe in the afterlife, so you'll be in heaven, hell, purgatory or whatever, but the point is, you're no longer on Earth "living forever." That's just some other entity who thinks it is you, but it's not you (again, because you're dead.)
This is represented by the parameters being passed by value (a copy) instead of by reference (same data) in the poster's image.
It wouldn't be you, it would just be another person with the same memories that you had up until the point the copy was made.
When you transfer a file, for example, all you are really doing is sending a message telling the other machine what bits the file is made up of, and then that other machine creates a file that is just like the original - a copy, while the original still remains in the first machine. Nothing is ever actually transferred.
If we apply this logic to consciousness, then to "transfer" your brain to a machine you will have to make a copy, which exist simultaneously with the original you. At that point in time, there will be two different instances of "you"; and in fact, from that point forward, the two instances will begin to create different memories and experience different things, thereby becoming two different identities.
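That argument fits in a few lines of C++ (the `Mind` and `upload` names here are made up for illustration): the "upload" is just copy construction, and afterwards the two instances accumulate different memories independently.

```cpp
#include <string>
#include <vector>

// Hypothetical sketch: "uploading" a mind is just copy construction.
struct Mind {
    std::vector<std::string> memories;
    void experience(const std::string& event) { memories.push_back(event); }
};

// Returns a new object initialized from the original's data;
// nothing is removed from, or happens to, the original.
Mind upload(const Mind& original) { return original; }
```

From the moment `upload` returns, any new experience goes to one instance or the other, never both, so the two "yous" immediately start to diverge.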
The first line passes the argument by reference, i.e., the object itself.
The second line passes the object by value, i.e., a copy.
Also, in Rust that would be the opposite, which is funny but confusing.
Thanks
Sorry Dave, I'm afraid I can't do that
Are you sure the roon of today is a reference to yesterday's roon?
everyone watch this clip and tell me what you think
https://www.youtube.com/watch?v=szzVlQ653as
what if it's year 3000 right now and we're all playing a game?
Intellisense commands you to fix your method
What needs to happen for it to actually work.
```cpp
bool uploadConsciousness(Consciousness&& Conscience) {
```
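A sketch of how that signature would behave, assuming a hypothetical `Consciousness` type and `Server` destination: taking an rvalue reference and moving from it consumes the source instead of duplicating it, which is the closest C++ gets to a transfer rather than a copy.

```cpp
#include <string>
#include <utility>
#include <vector>

// Hypothetical sketch: an rvalue-reference parameter lets the function
// move the contents out, leaving the source hollow instead of cloned.
struct Consciousness {
    std::vector<std::string> memories;
};

struct Server {
    Consciousness hosted;
};

bool uploadConsciousness(Server& server, Consciousness&& mind) {
    server.hosted = std::move(mind);  // steal the buffers, no duplicate
    return true;
}
```

The caller has to opt in with `std::move`, and afterwards the moved-from object is in a valid but unspecified state, which is arguably the most honest model of the thought experiment in this thread.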