Information theory in the brain

Claude Shannon’s classic 1948 paper introduced a general theory for measuring the transmission of information from a source, across a noisy channel, to a receiver (Figure 1). This theory became known as information theory. Shannon illustrated his theory with examples such as messages sent in Morse code over telegraph lines. But the same principles can be applied to neurons: a neuron transmits information along its axon to other neurons, using a neural code. Information theory has proved to be a powerful tool for quantifying the communication of information by neurons.
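To give a rough sense of what "quantifying" means here, the sketch below (not part of the original text; the stimulus and spike probabilities are illustrative assumptions) treats a neuron as a noisy channel: a binary stimulus is the source, a spike or no spike is the received message, and the mutual information between the two measures how many bits per trial the spike train carries about the stimulus.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup (illustrative values, not from the text):
# a binary stimulus S drives a neuron that spikes (R = 1) with a
# probability that depends on S -- a noisy channel in Shannon's sense.
p_stimulus = 0.5          # P(S = 1)
p_spike_given_s1 = 0.9    # P(R = 1 | S = 1)
p_spike_given_s0 = 0.1    # P(R = 1 | S = 0), spontaneous "noise" spikes

n_trials = 100_000
stimulus = rng.random(n_trials) < p_stimulus
spike_prob = np.where(stimulus, p_spike_given_s1, p_spike_given_s0)
response = rng.random(n_trials) < spike_prob

def entropy(p):
    """Shannon entropy (in bits) of a discrete distribution p."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Joint distribution of (S, R), estimated from trial counts.
joint = np.histogram2d(stimulus.astype(int), response.astype(int), bins=2)[0]
joint /= joint.sum()

# Mutual information I(S; R) = H(S) + H(R) - H(S, R).
mi = entropy(joint.sum(axis=1)) + entropy(joint.sum(axis=0)) - entropy(joint.ravel())
print(f"Estimated information transmitted: about {mi:.3f} bits per trial")
```

With these made-up numbers the estimate comes out near 0.53 bits per trial rather than the full 1 bit of the stimulus, because the channel's noise (missed spikes and spontaneous spikes) destroys part of the message.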