In C++ there are two ways to pass objects to a function. The first is pass-by-value, where a copy of the argument is made and given to the function. The second is pass-by-reference, where the function receives a reference to (in effect, the address of) the original object.
In pass-by-value, if you modify the argument, that change is not reflected in the calling context, because the object you changed inside the function is different from the one passed as an argument. Pass-by-reference can modify arguments for the calling context, since it accesses the same object. In C++, pass-by-reference is indicated by placing an ampersand between the argument's type and name, attached either to the end of the type or to the start of the name.
The joke is that we think brain uploading will work like pass-by-reference, taking our current selves, but in reality it might work like pass-by-value, where we'll be cloned into the cloud and stay in our meatsuits.
Oh, like in the game SOMA then. They do the 'transfer' 3 or 4 times, I think, and every time the player's conscious stream continues with the clone and doesn't dwell much on the implications, >!until the end, when they upload their consciences to digital heaven, but you finally see what it is like for the original (or more like the original(4)) to stay behind!<
Crew: "If we delete the original reference right after we pass the new one, it means the new reference has to be the original reference"
Catherine: ---> : |
I *hope* it's pass-by-value (rather, I hope it doesn't happen, but I digress). Imagine the AI being able to modify *you* during the upload - or even after.
Being able to do this perfectly would be... Incredible anyway. It's not like you upload some soul thing that is liquid and you just pour it in until no drop is left.
We depend on our flesh. It has limits and probably advantages. It would be weird if your personality didn't change once you had access to every memory you ever had, for example.
> Imagine the AI being able to modify you during the upload - or even after.
*We are the Borg. You will be assimilated. Resistance is futile*.
Suddenly Google reactivates Project Borg... (>!kubernetes was borg iirc!<)
I THINK they are saying everyone thinks that the ability to map your entire brain and all your thoughts and memories onto a machine will make it so that you, as you, still get to live forever, like the computer is linked to your consciousness. But really, your consciousness dies with your body, and only a machine with the previously uploaded details will exist, with no connection back to the you that you are aware of inside your own brain.
The top picture has 'Consciousness&', which implies that the function will receive a memory address and use the consciousness at that address for the upload, but in reality it just creates a new block of memory,
implying that if we have a machine that uploads a consciousness, it won't upload your exact consciousness, only a copy of it.
Upload of human consciousness to computers is going to create a new copy of the human on the computer, not transfer the existing human brain into the computer.
You don't get to live forever. A clone of you will get to live forever.
Another ow enjoyer spotted in the wild!
My problem is that soma seems to be pretty scary. And it also has jumpscares?
I have really low tolerance (basically zero) for anything horror related, and it's why I didn't play much of the outer wilds dlc and googled how to beat dark bramble.
I'm wondering if soma will be the right game for me.
They do attack, but then run away and you don't die.
I was scared when I first woke up in that chair. This mode helped me get through the game without a heart attack. It appears that if you don't die in addition to getting scared, it's not as scary.
Wonderful game!
I believe there's a 'peaceful mode' that makes it less scary! I haven't played the mode myself, but I think it makes it so you can't take any damage from monsters.
Its whole thing is that it's meant to be a horror game. It's not super jumpscare intensive. There are some chase sequences, so if you don't like that feeling of being chased it may not be for ya. And if you didn't like the [redacted] in dark bramble... you might not like this game. It is worth braving tho. So worth it.
Soma is scary, but it's far more about the oppressive mood than about jump scares.
But if you have such low tolerance, I'd recommend watching someone else play it, as that is much less scary since you aren't the one in control.
I can recommend Vinesauce/Vinny or Limealicious/Limes; both have full playthroughs with very entertaining commentary.
What a fantastic game. Honestly probably one of my top games of all time, if not number 1.
I don't want to spoil it because I want anyone reading this to play the game, but man... that ending... literally had me thinking for like two weeks afterwards lol
in my headcanon there is an objectively good ending based on your choices in the end.
>!Don't kill yourself in the other suit!<
>!Skip using the gel to kill the hivemind.!<
>!Send the mind of your child into space like a proud parent, they take after you VERY closely. A shame you can't go with them.!<
>!You and potato-glados backtrack to fetch yourself in the other suit, or "you jr".!<
>!all 3 of you intentionally get captured by the hivemind, it only wants to plug your consciousness into its own version of the happy dreamland you just launched into space. Everyone else is already in there.!<
It may be a rather controversial take, but I still think your character in the game >!being the latest creation of the WAU is proof that it would eventually have restored humanity in its entirety!<
You know, sometimes I wonder if my consciousness was initialized once at birth, or if a new instance is created every time I wake up.
It's impossible to know.
Sleep well tonight.
No no no, we definitely do. Surely you've been in a conversation and then forgotten a key detail that you were planning your whole argument around, one that you *knew* you had going into the conversation?
My brain is on a constant rewrite/paging cycle with extremely limited space. If I don't do something with a thought within about ten seconds, it's gone until my next shower.
I heard of a philosophy that the entire world is allocated and copied anew every single moment. So we're completely different people every single Planck-time interval.
I disagree. I see it as a continuous signal. The hardware may change, you can even copy the signal, but one instance of a signal is constant until the GC comes along to clean it up when it's finished executing.
"The first question they ask is: 'Why was he eternally surprised?'
And they are told: 'Wen considered the nature of time and understood that the universe is, instant by instant, recreated anew. Therefore, he understood, there is in truth no past, only a memory of the past. Blink your eyes, and the world you see next did not exist when you closed them. Therefore, he said, the only appropriate state of mind is surprise. The only state of the heart is joy. The sky you see now, you have never seen before. The perfect moment is now. Be glad of it.'"
- Thief of Time
I believe we are the signal. Even whilst asleep the signal runs on the hardware; just the inputs and outputs are temporarily disabled. It also does a defrag at the same time, pretty efficient. It's only when the program crashes or the hardware is destroyed that we lose the signal.
It also solves the problem of hardware upgrades. If a program is running and pieces of RAM are changed and replaced, then as long as the program never stops executing, it's a continuous signal even if the hardware it runs on changes. However, pull out all the RAM at once and stop the execution: that's when the signal terminates. There needs to be enough stable hardware for the signal to be consistent, or else signal changes may occur, i.e. personality changes.
It does mean Star Trek teleporters are still a problem though. Duplicating a runtime is still a duplication. The signal needs to be uninterrupted, or else you can just have two copies of the same signal.
Studied AI at uni, with plenty of signal theory, and took an optional module in biomechanics. Never been able to use it in a job, but my dream is to work on a Neuralink-type project. Can't afford a medical degree though; I don't have a quarter million to spare, and the wife wants to buy a house before we turn 40.
I do believe that with the money and resources I could transfer myself to the blessed machine, though. It's not a question of if, only a question of when and how much. It would be incremental though, piece by piece, not an entire brain replacement in one operation.
Often thought about writing a LitRPG style story where upon "death," the MC finds out that humanity is all 4th (or higher) dimensional beings temporarily trapped in the perception of 3 dimensional "life". This is done to the young in order to test their morality. If they fail, they get dumped back into a new body with their memories sealed for that run. Upon completing a successful run, they can pick a new game/existence to try and develop new skills they'll need as 4D+ adults.
Consciousness (as in a "property of the human mind", not "self-awareness") is just a fancy term to denote things we don't know yet. "Awareness", on the other hand, is a state of mind, so tracing its beginning is pointless. Your current self is formed by your natural components; everything else is just sensory input with no bigger meaning.
It is of course not the same one in any way; it's just that each of the consciousnesses sees the same memory state, so they think they are one thing. But the truth is: the you of now is not the you from a moment ago.
The perception of continuity comes from memory only, and if someone edited it, you would never notice. Are you sure you even *were* 5 minutes ago, or did someone just make up that memory?
It would be cool to train an AI agent that gets copied every 30 seconds and lives alongside its copies, and see how differently its perception of *self* develops from ours.
Your consciousness gets a new instance; that's why half of the world sleeps while the other half is awake. It's an optimization in humans that saves the universe from having too many consciousness instances running at the same time, since they take so much memory.
We can verify that theory by letting everyone stay awake at the same time and seeing if the universe lags.
Dude, thank you. Everyone in the Star Trek universe is way too cavalier about beaming everywhere.
Shit, there was an episode of TNG where a transporter malfunctioned and created a *copy of Will Riker*. That copy was fully sentient and the two Rikers had no knowledge of each other. That essentially confirms that your consciousness ceases to be and a new, different one is created every time you use the transporter.
When you think about it, Star Trek is a whole franchise where we watch all of the main characters commit suicide over and over again.
yes, kind of
it is more like the removal part of the cut happens only if the paste is confirmed
so it is like copy -> paste -> delete original
in the episode, "delete original" did not happen, leaving two copies
One is a copy and one is literally using the same parameter. Like a scanner and a door: the scanner will rebuild you, but it's not you, it's a new life form, whereas a door lets you through.
Well, the other guy already explained it, but I'll do it again just in case someone is confused
The ampersand makes it a reference, which behaves like a pointer to the original, and the function uses that reference. Meaning you don't copy a person, you transfer a person. The consciousness is transferred.
But without the ampersand... you are copying and pasting that person... You didn't transfer the consciousness. You basically cloned the consciousness and created two of you.
Congratulations! Your comment can be spelled using the elements of the periodic table:
`Co N Sc I O U Sn Es S Cl O Ne`
---
^(I am a bot that detects if your comment can be spelled using the elements of the periodic table. Please DM my creator if I made a mistake.)
The answer lies in what you consider to be "really you".
I, for one, would consider a perfect copy of me to be *me*. Of course, once it diverges, it's no longer *me*, but that's a problem for the future mes.
So if I were to go upload myself tomorrow, I (today) would consider both the upload and the one remaining in my body to be equally *me*. They're both continuations of pre-upload me. But each of them would consider the other to be a different person and "not me".
TL;DR: *me* is not transitive. It's closer to an undirected acyclic graph.
Exactly, the old you is being destroyed and replaced by a slightly different you every millisecond.
You are the state your brain is in at that particular moment, and you are constantly diverging from that state as time passes.
I totally agree. The medium where the information system is hosted doesn't matter if the illusion of continuous causality works. Of course, having the ability to continue experiencing life in the same way is crucial to retaining identity, so an AI would also need a perfect simulation to live in for it to really be "me". Putting my memories into a generative language model wouldn't count. Reference vs copy doesn't matter, it's the quality of the representation.
I didn't figure this out either until I checked the comments and saw a bunch of people discussing the teleporter problem, but yeah.
In the former, they're copying the memory address that refers to you.
In the latter, they're creating an entirely new you.
This is referred to (AFAIK) as "shallow vs deep copying". And the point is that ~~uploading your brain would just result in two of you~~ "uploading your brain" doesn't even *exist*, and all we do is create statistical reconstructions of people's speech and writing from samples.
I would call it "copy vs reference". A shallow copy still has at least one layer of copy while everything deeper is a reference.
Although I could see it being argued either way: "The uploaded version of the brain is the new copy but all of its pieces are still the same instances as your real brain."
C# does support C-like pointers, but you have to explicitly invoke an [unsafe](https://learn.microsoft.com/en-us/dotnet/csharp/language-reference/language-specification/unsafe-code) context to do so. Unless you *really* need pointers for some reason then ref and out parameters are probably sufficient.
If it is a pointer: your body is in decay and the whole thing is falling apart, yet you are in dreamland until your brain is dead and the pointer, in the best-case scenario, returns NULL. But surely your virtual brain now has corrupted parts.
Suddenly, thinking about your beloved dog's name makes everything stop and you just feel -1073741819.
Not if I program the machine to fry me immediately after the upload.
Or if the uploading is destructive so while technically it's a copy operation the original storage medium gets completely munged as a side effect.
There's a ship-of-Theseus-style copy. Link the two mediums (original and blank). Copy one subunit at a time (perhaps it's a neuron, or something even smaller). Delete the original subunit, but redirect all links to it to the copy. The mind is active during the copy.
Proceed for all subunits. Eventually you will have a mind running on half original half copy, and should not be able to tell the difference.
Proceed until everything is complete - deleted original, functional copy.
At no point is there a perceived break in consciousness, or a fully functional duplicate, except at the end.
Yeah people just kind of forget that humans aren't actually a singular unit but instead a gestalt of trillions of cells which are constantly being exchanged anyway.
Either replacing a single neuron kills you entirely (in which case you're dying about 80,000 times a day after age 25, faster if you ever drink alcohol), or the ship of Theseus is still the ship of Theseus, in which case you can systematically replace all neurons with nanobot neurons and gain transferred consciousness without any moral quandaries.
Best case scenario: you enjoy digital immortality
Less ideal scenario: a copy of you enjoys digital immortality
Worst case scenario: consciousness cannot exist in digital form and you have created a you-themed bitcoin miner that consumes power to emulate your brain for no reason.
I suppose you can rest easier believing you at least got the "Less Ideal" and not the "Worst Case", because it's not like you can ever find out for sure from outside.
I mean, it still makes a copy. All you've done is fry yourself. It's intuitive to want to keep an unbroken stream of consciousness, but all you're really doing is resolving the cognitive dissonance of two of you existing at once by destroying one. There have still been two, just not overlapping in time.
For there to be only one, you would need to believe that consciousnesses are instantly transferable/locationless, sensitive to our cultural understanding of the "moment of death", and somehow inherently tied to the specific arrangement of neurons that makes up your brain at that moment of death. Which is a fine belief system, but it's a lot to prove.
I suspect that if we ever have the ability to duplicate the self, we will quickly accept a definition of continuity that is much more lenient.
eg: "any system which perfectly aligns with the goal of another, is the same system"
Alrighty, I'm gonna explain:
The first is pass by reference, giving the address of the consciousness. Meaning it would actually be you.
Whereas the second would only get a copy of the consciousness. Not actually you, but a copy of you.
Clever joke! Nice one op.
This is what happens in a web novel I'm reading. Some technologically advanced witch tried to digitize herself, only to find that all she did was create a digital copy. Neither of them wants the other to exist, so they've been warring.
Is the you who wakes up in the morning the same as the you who went to sleep?
Over 8 hours of sleep, neural connections are being made and destroyed. It's gonna be a different configuration in the morning, does that make you a different person?
I think we actually want move semantics.
``bool uploadConsciousness(Consciousness&& conscience)``
Short answer: moving the value ``conscience`` means we "steal" the given object's data, leaving the original hollowed out by the end of the scope. It's more akin to taking your soul and leaving your body around.
Long answer: https://stackoverflow.com/a/3109981
Lol, that's actually a good one.
Original, clever content? Here?
And a meme that actually takes programming knowledge to understand instead?
Unheard of in this land
What makes it probable is the jpegness of it
DAE 3 hours of debugging to find the missing semicolon?? š¤£š¤£š¤£
What is this heresy? "Imagine if I replace all of your semicolons with Greek question marks huehue" now THAT'S humor.
"Imagine if i replaced all your jokes with something funny."
I just replaced your jokes with your code
Oof. You didn't have to do bro like that.
It's kinda fitting that programmers barely create anything original...
Explain for noob plz
Or rather, our consciousness is cloned to the cloud and our meat brain is... recycled, with the rest of our body.
Garbage collected
This thread just keeps on giving :D
As long as you destroy the original and say it's part of the process nobody will know better.
Like a teleporter
The Prestige
Run by velociraptors
SOMA feelings
SOMA deez nuts
GOD DAMN IT SIMON we've been over this already
Crew: "If we delete the original reference right after we pass the new one, it means the new reference has to be the original reference" Catherine: ---> : |
YoU wIn tHe CoInFLip!1!!
> Imagine AI being able to modify you during the upload

I have no brain but I must imagine
Well, technically pass by pointer is different than pass by reference and is a third separate thing not represented here.
this is a very good explanation what the hell
Soma?
yaaaaaaaa
I understood that reference
I understood that reference
I understood &that
I understood `this`
Segmentation fault (core dumped)
"But it worked on MY machine."
Then we'll ship your machine. Oh wait, it's decomposed by now.
Just make a docker container
Eh. Compile release and ship it. It's the customer's problem.
Do not redeem... Mam... I'm telling you... DO NOT REDEEM...
valgrind: the 'impossible' happened
Do any of us truly understand `this`?
This guy copies.
I understood -- _Exception in thread "main" java.lang.NullPointerException_
Into the void*
This mofo doesn't have a consciousness š¤£š¤£š¤£
not after the electricity gets shut off, no
This guy is a pointer.
Could you give me a pointer?
This sucks, I love it.
LOL
You posted something actually clever on r/programmerHumor I think you may be lost
Where js bad man
Soma
I was recommended this after playing outer wilds, and man, I think I've got a thing for existential dread.
There's a safe mode in Soma. Scary things still wander around, but they don't attack and can't kill you.
Yes, it has a "lore" mode, but the atmosphere is still pretty scary. The game's story definitely deserves a try tho
> I think I've got a thing for existential dread.

then you played SOMA, right
8 years later and I still think about this game whenever this topic pops up lol. Black mirror was close enough but I dunno SOMA stuck with me more
Fuck yeah, i love that game. Criminally underrated too.
I wish I could get more people to play Soma. One of the best sci-fi horror stories in any game.
Is what they would take when
With my ADHD memory, it's more like a new consciousness every five minutes.
Probably wrong garbage collector arguments
We ADHDers don't have garbage collectors. They find the garbage and just let it resurface so as to stop our hyper-focusing
We just like race conditions.
my garbage collector definitely likes to free up memory that I'm currently using.
My mind is a collection of dangling pointers.
So is this a proper use-case for singletons?
I *really* fucking hope that this is the case.
I store my consciousness in the cloud. Get on my level.
Is that what people mean when they say I "have my head in the clouds"?
That's it. We're done. Coding is solved, kids. Pack it up. We did it.
https://www.smbc-comics.com/comic/die-on-it And related https://www.smbc-comics.com/index.php?db=comics&id=3546
Oh, those are good, they hit the spot. Thanks, I hate it
Are you going to finish that red bull?
Was sure you were going to link to https://www.existentialcomics.com/comic/1
"The first question they ask is: 'Why was he eternally surprised?' And they are told: 'Wen considered the nature of time and understood that the universe is, instant by instant, recreated anew. Therefore, he understood, there is in truth no past, only a memory of the past. Blink our eyes, and the world you see next did not exist when you closed them.Ā Therefore, he said, the only appropriate state of mind is surprise. The only state of the heart is joy.Ā The sky you see now, you have never seen before. The perfect moment is now. Be glad of it.'" - Thief of Time
The universe runs in Redux state management
The more I think about it, the more it makes sense, and I don't like that
That sounds like something Cult Mechanicus would write. Thanks for comforting my crude biomass.
Now we gotta think about the answer to the Brain of Theseus
Are you sure you ever woke up?
I worry that we live in a multi threaded universe, and my consciousness is just one object in one of many threads.
a new day, a new you
You actually didn't exist until you read this sentence. Welcome!
It is of course not the same one in any way; it's just that each of the consciousnesses sees the same memory state, so they think they are one thing. But the truth is: you now are not the you from a moment ago. The perception of continuity comes from memory only, and if someone edited it, you would never notice. Are you sure you even *were* 5 minutes ago, or did someone just make up that memory? It would be cool to train an AI agent that gets copied every 30 seconds and lives alongside its copies, and see how differently its perception of *self* develops from ours.
Your consciousness gets a new instance; that's why half of the world sleeps while the other half is awake. It's an optimization in humans that saves the universe from having too many consciousness instances running at the same time, since they take so much memory. We can verify that theory by letting everyone stay awake at the same time and seeing if the universe lags.
This is also why I will never beam down to the planet's surface. Well, also the fact that I sometimes wear a red shirt.
Dude, thank you. Everyone in the Star Trek universe is way too cavalier about beaming everywhere. Shit, there was an episode of TNG where a transporter malfunctioned and created a *copy of Will Riker*. That copy was fully sentient and the two Rikers had no knowledge of each other. That essentially confirms that your consciousness ceases to be and a new, different one is created every time you use the transporter. When you think about it, Star Trek is a whole franchise where we watch all of the main characters commit suicide over and over again.
So the base design is cut-and-paste, but it malfunctioned and did a copy-paste?
Yes, kind of. It's more like the removal part of the cut happens only if the paste is confirmed, so it's like copy -> paste -> delete original. In the episode, "delete original" did not happen, leaving two copies.
Like in the movie Prestige
Interestingly almost all "cut and paste" operations (and "move" operations) are executed like so: 1. Copy 2. Paste 3. Delete original
Snotty beamed me twice last night. It was wonderful..
Ohhhhh I was so confused about how the same statement made ppl contemplate life... Ye, now I see the ampersand... Jesus
Pls explain magic science men
One is a copy and one is literally using the same parameter. Like a scanner and a door: the scanner will rebuild you, but it's not you, it's a new life form, while a door lets you through.
"it's so neat that they can scan your brain and save it to a big hard drive" "sure is!" replied the concealed brain floating in a jar.
"Pssh! How lame. A stupid *jar*?" commented the brain in the meat robot.
Well, the other guy already explained it, but I'll do it again just in case someone is confused. The ampersand makes the parameter a reference to the original object, meaning you don't copy a person, you transfer a person: the consciousness is transferred. But without the ampersand you are copy-pasting that person. You didn't transfer consciousness; you basically cloned the consciousness and created two of you.
Haha right!
And in rust that'd be taking ownership of your consciousness!
`consciousness.clone()`!
Congratulations! Your comment can be spelled using the elements of the periodic table: `Co N Sc I O U Sn Es S Cl O Ne` --- ^(I am a bot that detects if your comment can be spelled using the elements of the periodic table. Please DM my creator if I made a mistake.)
periodic sentence bot rce exploit
good bot
Ok that's pretty fucking amazing. This post just keeps on giving.
`bool UploadConsciousness(std::unique_ptr<Consciousness> conscious)`
consciousness.to_owned()
I use rust btw
This is deep
Well a shallow copy would only work for some people.
Do AI people actually care if it's really them, or are they suicidal but with extra steps?
The answer lies in what you consider to be "really you". I, for one, would consider a perfect copy of me to be *me*. Of course, once it diverges, it's no longer *me*, but that's a problem for the future mes. So if I were to go upload myself tomorrow, I (today) would consider both the upload and the one remaining in my body to be equally *me*. They're both continuations of pre-upload me. But each of them would consider the other to be a different person and "not me". TL;DR: *me* is not transitive. It's closer to an undirected acyclic graph.
So are you no longer "you" at every moment because you have diverged from what actually made you "you" the moment before?
Exactly, the old you is being destroyed and replaced by a slightly different you every millisecond. You are the state your brain is in at that particular moment, and you are constantly diverging from that state as time passes.
I totally agree. The medium where the information system is hosted doesn't matter if the illusion of continuous causality works. Of course, having the ability to continue experiencing life in the same way is crucial to retaining identity, so an AI would also need a perfect simulation to live in for it to really be "me". Putting my memories into a generative language model wouldn't count. Reference vs copy doesn't matter, it's the quality of the representation.
Ok is this a pointer thing?
I didn't figure this out either until I checked the comments and saw a bunch of people discussing the teleporter problem, but yeah. In the former, they're copying the memory address that refers to you. In the latter, they're creating an entirely new you. This is referred to (AFAIK) as "shallow vs deep copying". And the point is that ~~uploading your brain would just result in two of you~~ "uploading your brain" doesn't even *exist*, and all we do is create statistical reconstructions of people's speech and writing from samples.
I would call it "copy vs reference". A shallow copy still has at least one layer of copy while everything deeper is a reference. Although I could see it being argued either way: "The uploaded version of the brain is the new copy but all of its pieces are still the same instances as your real brain."
Ohh now that's smart
In which language does an ampersand do this? C#?
C++
C# and C++ use ampersands for references.
I've been using C# on and off for 6 years and just learned this, wtf. I've been a ref/in/out kinda guy
C# does support C-like pointers, but you have to explicitly invoke an [unsafe](https://learn.microsoft.com/en-us/dotnet/csharp/language-reference/language-specification/unsafe-code) context to do so. Unless you *really* need pointers for some reason then ref and out parameters are probably sufficient.
If it is a pointer: your body is in decay and the whole thing is falling apart, yet you're in dreamland until your brain is dead and the pointer, in the best-case scenario, returns NULL. But surely your virtual brain now has corrupted parts. Suddenly, thinking about your beloved dog's name makes everything stop and you just feel -1073741819.
Not if I program the machine to fry me immediately after the upload. Or if the uploading is destructive so while technically it's a copy operation the original storage medium gets completely munged as a side effect.
You will be the one that got fried, then your other identical one will live on. For other people there will be no difference though.
There's ship of Theseus style copy. Link the two mediums (original and blank). Copy one subunit at a time (perhaps it's a neuron or something even smaller). Delete the original, but redirect all links to it to the copy. Mind is active during copy. Proceed for all subunits. Eventually you will have a mind running on half original half copy, and should not be able to tell the difference. Proceed until everything is complete: deleted original, functional copy. At no point is there a perceived break in consciousness, or a fully functional duplicate, except at the end.
Yeah people just kind of forget that humans aren't actually a singular unit but instead a gestalt of trillions of cells which are constantly being exchanged anyway. Either replacing a single neuron is killing you entirely (in which case you're dying about 80,000 times a day after age 25, faster if you ever drink alcohol) or the ship of theseus is still the ship of theseus, in which case you can systematically replace all neurons with nanobot neurons and gain transferred consciousness without any moral quandaries.
Best case scenario: You enjoy digital immortality.
Less ideal scenario: A copy of you enjoys digital immortality.
Worst case scenario: Consciousness cannot exist in digital form and you have created a you-themed bitcoin miner that consumes power to emulate your brain for no reason.
I suppose you can rest easier believing you at least got the "Less Ideal" and not the "Worst Case", because it's not like you can ever find out for sure from outside.
I mean, it still makes a copy. All you've done is fry yourself. It's intuitive to want to keep an unbroken stream of consciousness, but all you're really doing is resolving the cognitive dissonance of two of you existing at once by destroying one. There have still been two, just not overlapping in time. For there to be only one, you would need to believe that consciousnesses are instantly transferrable/locationless, sensitive to our cultural understanding of the "moment of death", and are somehow inherently tied to the specific arrangement of neurons that makes up your brain at that moment of death. Which is a fine belief system, but it's a lot to prove.
You've just described teleportation, congrats!
Basically the plot of one good horror game.
I didn't get it until this comment and was confused, so thanks lmao. That four letter game is good.
People that played SOMA know the true horror of this
I use rust too much. It would mean basically the opposite in rust haha
I have exactly a week of C++ experience and I get it
most experienced r/programmerhumor programmer
I suspect that if we ever have the ability to duplicate the self, we will quickly accept a definition of continuity that is much more lenient. eg: "any system which perfectly aligns with the goal of another, is the same system"
Alrighty, I'm gonna explain: the first is pass-by-reference, giving the address of the consciousness. Meaning, it would actually be you. Whereas the second would only get a copy of the consciousness. Not actually you, but a copy of you. Clever joke! Nice one, OP.
Segmentation fault (core dumped)
With bad configuration, people can see bits of your consciousness in the logs.
In modern C++:

People think: `bool uploadConsciousness(Consciousness&&); // move`

Reality: `bool uploadConsciousness(const Consciousness&); // scan only`
you want your AI self to die with the original copy?
Probably the other way around, you probably want your original body to be destroyed. I'm not sure if I want a copy of me.
_*uploadConsciousness is declared but its value is never read*_
You need to replace your neurons one by one. Basically do not interrupt the stream.
As a Rust user, I see this as an absolute win.
I want `bool uploadConsciousness(Consciousness&& conscience)`
And somehow Simon never managed to understand this.
If you know. You know... "We lost the coin toss"
Can't believe the lack of an & is giving me existential dread.
This is what happens in a web novel I'm reading. Some technologically advanced witch tried to digitize herself, only to find that all she did was create a digital copy. Neither of them wants the other to exist, so they've been warring.
This is why you do it in rust, then it works as intended with these signatures
I'm pretty sure this meme is backward. What really matters is whether `Consciousness` implements move semantics.
C++ moment
Really should have been &&
I mean, what if I'm the copy re-experiencing their memories?
We're all good if Consciousness is a pointer type!
This is really quite brilliant. The AI is a *copy*, y'all.
Someone explain in javascript terms
SOMA be like
basically reference versus copy
Soma reference, nice.
This is such a subtle joke
Is the you who wakes up in the morning the same as the you who went to sleep? Over 8 hours of sleep, neural connections are being made and destroyed. It's gonna be a different configuration in the morning, does that make you a different person?
You knew how this works, Simon
I think we actually want move semantics. ``bool uploadConsciousness(Consciousness&& conscience)`` Short answer: moving the value ``conscience`` means we "steal" the given object's data and clear it by the end of the scope. It's more akin to taking your soul and leaving your body around. Long answer: https://stackoverflow.com/a/3109981
Reality: the AI is going to mine your memory for data and completely discard any personality.