Thanks to @[email protected] for the links!
Here’s a link to Caltech’s press release: https://www.caltech.edu/about/news/thinking-slowly-the-paradoxical-slowness-of-human-behavior
Here’s a link to the actual paper (paywall): https://www.cell.com/neuron/abstract/S0896-6273(24)00808-0
Here’s a link to a preprint: https://arxiv.org/abs/2408.10234
Some parts of the paper are available here: https://www.sciencedirect.com/science/article/abs/pii/S0896627324008080?via%3Dihub
It doesn’t look like these “bits” are binary digits, but rather “pieces of information” (which I find a bit misleading):
The authors do draw a distinction between sensory processing and cognition/decision-making, at least:
But our brains are not digital, so they cannot be measured in binary bits.
There is no other definition of bit that is valid in a scientific context. Bit literally means “binary digit”.
Information theory, using bits, is applied to the workings of the brain all the time.
How do you know there is no other definition of bit that is valid in a scientific context? Are you saying a word can’t have a different meaning in a different field of science?
Because actual neuroscientists understand and use information theory.
Actual neuroscientists define their terms in their papers. Like the one you refuse to read because you’ve already decided it’s wrong.
Actual neuroscientists do not create false definitions for well-defined terms. And they absolutely do not need to define basic, unambiguous terminology to be able to use it.
Please define ‘bit’ in neuroscientific terms.
Binary digit, or the minimum additional information needed to distinguish between two different equally likely states/messages/etc.
It’s the same usage as in information theory, because information theory applies to, and is directly used by, virtually every relevant field of science that touches information in any way.
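For what it’s worth, here’s a minimal sketch of that definition in Python (the function name and the coin/die examples are mine, not from the paper): the information needed to distinguish between N equally likely states is log2(N) bits.

```python
import math

def bits_to_distinguish(n_states: int) -> float:
    """Bits of information needed to single out one of n equally likely states."""
    return math.log2(n_states)

print(bits_to_distinguish(2))   # 1.0 bit   -- a fair coin flip
print(bits_to_distinguish(6))   # ~2.58 bits -- one roll of a fair die
print(bits_to_distinguish(26))  # ~4.70 bits -- one uniformly random letter
```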
All information can be stored in a digital form, and all information can be measured in base 2 units (of bits).
But it isn’t stored that way and it isn’t processed that way. The preprint appears to give an equation (beyond my ability to understand) which explains how they came up with it.
Your initial claim was that they couldn’t be measured that way. You’re right that they aren’t stored as bits, but it’s irrelevant to whether you can measure them using bits as the unit of information size.
Think of it like this: in the 1980s there were breathless articles about CD-ROM technology, and how, in the future, “the entire Encyclopaedia Britannica could be stored on one disc”. How was that possible to know? Encyclopedias were not digitally stored! You can’t measure them in bits!
It was possible because you could define a hypothetical analog-to-digital encoder, and then quantify how many bits coming off that encoder would be needed to store the entire corpus.
This is the same thing. You can ADC anything, and the spec on your ADC defines the bitrate you need to store the stream coming off it… in bits (per second).
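To make that concrete, here’s a rough back-of-the-envelope sketch in Python (the sample rate, bit depth, and channel count are just example values for a CD-quality audio ADC, not anything from the paper):

```python
# Bitrate of a hypothetical ADC:
# bits per second = sample rate (samples/s) * bit depth (bits/sample) * channels
sample_rate_hz = 44_100  # example: CD-quality sampling rate
bit_depth = 16           # example: bits per sample
channels = 2             # example: stereo

bitrate_bps = sample_rate_hz * bit_depth * channels
print(f"{bitrate_bps:,} bits per second")  # 1,411,200 bits/s for these example values
```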
As has been shown elsewhere in this thread by Aatube a couple of times, they are not defining ‘bit’ the way you are defining it, but still in a valid way.
Indeed not. So using language specific to binary systems (e.g. bits per second) is not appropriate in this context.
So ten concepts per second? Ten ideas per second? This sounds a little more reasonable. I guess you have to read the word “bit” like you’re British, where it just means “part.” Of course this is still miserably ill-defined.