• Neuronal firing is often understood as a fundamentally binary process, because a neuron either fires an action potential or it does not. This is often referred to as the “all-or-none” principle. Once a neuron’s membrane potential reaches a certain threshold, it fires an action potential; if that threshold is not reached, it doesn’t fire at all. There’s no such thing as a “partial” action potential; it’s a binary, all-or-none process.

        Frequency Modulation: even though an individual neuron’s action potential can be considered binary, neurons encode the intensity of a stimulus in the frequency of action potentials; a stronger stimulus makes the neuron fire more rapidly. Again, binary in nature, not analog.
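
        A rough, non-physiological sketch of both points, using a leaky integrate-and-fire style toy neuron in Python (all constants here are made up for illustration): crossing the threshold always produces the same full spike, and a stronger constant input just produces more of them per second.

        ```python
        def lif_spike_count(input_current, t_steps=1000, dt=1e-3,
                            tau=0.02, v_thresh=1.0, v_reset=0.0):
            """Spikes emitted in t_steps * dt seconds by a toy leaky
            integrate-and-fire neuron driven by a constant input current."""
            v = 0.0
            spikes = 0
            for _ in range(t_steps):
                # membrane potential leaks toward rest and integrates the input
                v += dt * (-v / tau + input_current)
                if v >= v_thresh:   # all-or-none: a full spike or nothing
                    spikes += 1
                    v = v_reset     # reset after every spike
            return spikes

        # Every spike is identical, but a stronger stimulus produces more of
        # them in the same time window (rate coding).
        for current in (60.0, 100.0, 200.0):
            print(current, lif_spike_count(current))
        ```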

        • @[email protected]OP

          Neuronal firing is often understood as a fundamentally binary process, because a neuron either fires an action potential or it does not. This is often referred to as the “all-or-none” principle.

          Isn’t this true of standard multi-bit neural networks too? This seems to be what a nonlinear activation function achieves: translating the input values into an all-or-nothing activation.

          The characteristic of a 1-bit model is not that its activations are recorded in a single bit, but that its weights are. There are no gradations of connection weights: they are just on or off. As far as I know, that’s different both from standard neural nets and from how the brain works.
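
          To make the distinction concrete, here is a rough NumPy sketch of the weight side (a generic binarized-weight layer, not any specific paper’s scheme; the layer sizes and the mean-magnitude rescaling are illustrative assumptions): the weights collapse to +1/-1, while the activations flowing through the layer stay full precision.

          ```python
          import numpy as np

          rng = np.random.default_rng(0)

          # Ordinary layer: both weights and activations are multi-bit floats.
          W = rng.normal(size=(4, 8))    # hypothetical 8-input, 4-output layer
          x = rng.normal(size=8)         # activations remain full precision

          # "1-bit" variant: only the weights are binarized to their sign,
          # here rescaled by the mean magnitude so outputs stay comparable.
          W_bin = np.sign(W)
          scale = np.abs(W).mean()

          y_full = W @ x                 # graded connection weights
          y_1bit = scale * (W_bin @ x)   # weights are just on/off (+1 / -1)

          print(y_full)
          print(y_1bit)
          ```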

        • @[email protected]

          So what you are saying is that they are discrete in time and pulse modulated, which can encode far more information than how NNs work on a processor.
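
          A back-of-the-envelope way to see that, with made-up numbers: in a 100 ms window split into 1 ms bins, a pure rate code only distinguishes “how many spikes”, while a timing-sensitive code distinguishes every arrangement of those spikes.

          ```python
          from math import comb

          T = 100   # hypothetical 1 ms time bins in a 100 ms window
          k = 10    # spikes occurring somewhere in that window

          # Rate code: only the count matters, so the window can take
          # T + 1 distinguishable values (0..T spikes).
          rate_code_states = T + 1

          # Timing code: with exactly k spikes, every placement of those
          # spikes across the T bins is a distinct pattern.
          timing_code_states = comb(T, k)

          print(rate_code_states)    # 101
          print(timing_code_states)  # 17310309456440, about 1.7e13
          ```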

    • @[email protected]

      We really don’t know jack shit, but we know more than enough to know that firing rate is hugely important.