"I’d like to talk to her, but first I have to cover half the distance between where we are and where she is, then half of the distance that remains, then half of that distance, and so on. The series is infinite. "
I get it’s a joke, but it’s a bad joke. That’s a convergent series: the number of steps is infinite, but the total distance isn’t. Any first-year calculus student would know that.
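For reference, the sum of the halved distances (calling the initial separation *d*, a symbol I’m introducing here):

$$\sum_{n=1}^{\infty} \frac{d}{2^n} = d\left(\frac{1}{2} + \frac{1}{4} + \frac{1}{8} + \cdots\right) = d$$

So she’s reachable after covering exactly *d*, even though the bookkeeping has infinitely many steps.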
"it’s that in theory analog carries infinite information. "
But in reality it can’t. The universe isn’t continuous, it’s discrete. That’s why we have quantum mechanics: it’s the math for handling discontinuous transitions between states.
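A toy way to see the information point (the function, the 8-level choice, and the test signal are all mine, purely for illustration, not a claim about physics):

```python
import numpy as np

# An "analog" sample is a real number; storing it exactly would take unbounded
# precision. Quantizing to N discrete levels caps the information at log2(N)
# bits per sample, and any finer detail is simply lost.

def quantize(signal, levels=8, lo=-1.0, hi=1.0):
    """Map each sample to the nearest of `levels` evenly spaced values in [lo, hi]."""
    step = (hi - lo) / (levels - 1)
    return np.round((signal - lo) / step) * step + lo

t = np.linspace(0, 1, 1000)
analog = np.sin(2 * np.pi * 3 * t)      # stand-in for a continuous signal
digital = quantize(analog, levels=8)

print("bits per sample:", np.log2(8))                          # 3.0
print("max quantization error:", np.max(np.abs(analog - digital)))
```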
"How much of that fidelity can you lose before it’s not your consciousness?"
That can be tested with *C. elegans*: you can measure changes until a difference propagates (rough sketch of what I mean below).
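A minimal sketch of that idea, assuming the “changes” are something like recorded activity traces; the traces, the tolerance, and what the channels even represent are all hypothetical here, and this says nothing about whether such traces capture mind or consciousness:

```python
import numpy as np

# Compare a reference recording against a perturbed/simulated one and report
# the first time step where they diverge past a tolerance.

def first_divergence(reference, candidate, tol=0.05):
    """Index of the first time step where any channel differs by more than tol,
    or None if the traces stay within tolerance throughout."""
    diff = np.abs(reference - candidate).max(axis=1)   # worst channel per step
    over = np.nonzero(diff > tol)[0]
    return int(over[0]) if over.size else None

rng = np.random.default_rng(0)
ref = rng.normal(size=(200, 302))   # 200 steps x 302 channels (302 = neuron count in C. elegans)
cand = ref + np.linspace(0, 0.2, 200)[:, None] * rng.normal(size=(200, 302))

print(first_divergence(ref, cand))
```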
Measure differences in what? We can’t ask *C. elegans* about its state of mind, let alone its consciousness. There are several issues here: a philosophical issue about what you are modeling (e.g. mind, consciousness, or something else), a biological issue about which physical parameters and states you need to capture to produce that model, and the question of how you would propose to test the fidelity of that model against the original organism. The scope of these issues is well outside a reply chain on Lemmy.
"I’d like to talk to her, but first I have to cover half the distance between where we are and where she is, then half of the distance that remains, then half of that distance, and so on. The series is infinite. "
I get it’s a joke but that’s a bad joke. That’s a convergent series. It’s not infinite. Any 1st year calculus student would know that.
"it’s that in theory analog carries infinite information. "
But in reality it can’t. The universe isn’t continous, it’s discrete. That’s why we have quantum mechanics. It is the math to handle non contiguous transitions between states.
That can be tested with c elegans. You can measure changes until a difference is propagated.
Measure differences in what? We can’t ask *c. elegans * about it’s state of mind let alone consciousness. There are several issues here; a philosophical issue here about what you are modeling (e.g. mind, consciousness or something else), a biological issue with what physical parameters and states you need to capture to produce that model, and how you would propose to test the fidelity of that model against the original organism. The scope of these issues is well outside a reply chain in Lemmy.