# AI (2) - Three obvious reasons why the brain cannot be doing backpropagation

November 10, 2021

## 1. Cortical neurons do not communicate real-valued activities

### What are real values?

Numbers such as 3.14, 2/5, or -6.9. In neural networks we use vectors of these numbers to represent the weights of the different layers. For example, the weights of the first layer form a vector $w_1$ of $n$ real values: the first value might be 0.92, the second 0.2, and so on up to the $n$-th value.

### Neural networks use these weights to communicate information

The weights scale a layer's inputs: each unit computes a weighted sum of its real-valued inputs and passes the result through an activation function $f(x)$, for example a sigmoid or a Rectified Linear Unit (ReLU). A neural network uses hundreds of these functions in each layer. This is how it learns from input: different inputs land on different positions of the function, and with the right activation function the network can abstract the information in the input. For this to work, the input has to be translated into real values, and real values have to flow between the layers (see the first sketch at the end of this post).

### How do neurons communicate?

Neurons communicate via spikes, i.e. electrical activity.

### What are action potentials?

An action potential (a spike) is an all-or-nothing electrical pulse: a neuron either fires or it does not. A downstream neuron therefore never receives a precise real value like 0.92; at best it receives a noisy train of binary events.

## 2. Neurons need to send two types of signals

- forward (activity)
- backward (error)

Backpropagation only passes error backwards, and neurons in ANNs do not send activity forward AND backward: the forward and backward signals are handled by two separate mechanisms (see the second sketch below). Even Recurrent Neural Networks, which feed activity back through time, are trained by unrolling into a feed-forward graph, so activity still only flows forward within the unrolled network.

## 3. Neurons do not have point-wise reciprocal connections with the same weight in both directions

Nodes in ANNs have connections to multiple other nodes, each with a different weight; however, the weight between a single node_A and a node_B is the same in both directions, because the backward pass from B to A reuses the forward weight from A to B. Real neurons also connect to many different neurons, but their reciprocal connections have different weights: e.g. the connection sending activity forward might have a larger weight than the one carrying signals back.

--> Backpropagation does exist in the brain - just not in a purely mathematical form, nor as a unique mechanism. Through parallel weight assignments, neurons can have reciprocal connections with different weights, allowing forward passing and backpropagation to run in parallel (see the third sketch below).
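First sketch: a minimal illustration of the contrast in reason 1. The layer sizes, the random seed, and the toy rate-coding scheme are illustrative assumptions of mine, not taken from any cortical model:

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    # Rectified Linear Unit: max(0, x), applied element-wise
    return np.maximum(0.0, x)

# --- What an ANN layer communicates: precise real values ---
x = rng.normal(size=3)            # real-valued input activities
W1 = rng.normal(size=(4, 3))      # w1: the first layer's weights (4 units x 3 inputs)
h = relu(W1 @ x)                  # real-valued activities handed to the next layer
print(h)                          # precise real numbers, e.g. 0.92, 0.2, ...

# --- What a spiking neuron communicates: all-or-nothing events ---
rate = 0.73                       # the value the neuron "wants" to convey,
                                  # encoded as a firing probability per time bin
spikes = rng.random(20) < rate    # 20 time bins of binary spikes
print(spikes.astype(int))         # e.g. [1 0 1 1 0 ...]
print(spikes.mean())              # only a noisy estimate of 0.73, never the exact value
```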
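Second sketch: the two signal types from reason 2, assuming a tiny one-hidden-layer network with a squared-error loss (all sizes and names are my own). Each hidden unit has to carry two entirely different signals: its activity travelling forward and its error travelling backward.

```python
import numpy as np

rng = np.random.default_rng(1)

# Tiny network: 3 inputs -> 4 hidden units -> 2 outputs, squared-error loss
W1, W2 = rng.normal(size=(4, 3)), rng.normal(size=(2, 4))
x, target = rng.normal(size=3), rng.normal(size=2)

# Signal 1, travelling FORWARD: real-valued activity
z1 = W1 @ x
h = np.maximum(0.0, z1)           # hidden activity (ReLU)
y = W2 @ h                        # output activity

# Signal 2, travelling BACKWARD: error, a completely different quantity
delta_out = y - target                         # error at the output
delta_hidden = (W2.T @ delta_out) * (z1 > 0)   # error at the hidden units

# The weight updates need BOTH signals at the same unit
grad_W2 = np.outer(delta_out, h)
grad_W1 = np.outer(delta_hidden, x)
```

Note how `delta_hidden` arrives through `W2.T`: the error reaches a hidden unit backwards through the very same weights its activity left through forwards, which is exactly the symmetry the third reason objects to.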
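Third sketch: the "parallel weight assignments" idea - reciprocal connections with a different weight in each direction - closely resembles what the literature calls feedback alignment (Lillicrap et al., 2016), though it is not named above. A toy comparison, where `B` is a hypothetical fixed random feedback matrix:

```python
import numpy as np

rng = np.random.default_rng(2)

W = rng.normal(size=(2, 4))       # forward weights from hidden layer to output
B = rng.normal(size=(4, 2))       # SEPARATE backward weights, fixed and random

delta_out = rng.normal(size=2)    # some error signal arriving at the output

# Textbook backpropagation: the backward pass reuses the forward weights,
# transposed - the "same weight in both directions" that real neurons lack.
delta_backprop = W.T @ delta_out

# Feedback alignment: error travels back through a different pathway with
# its own weights, so forward passing and error transport can run in parallel.
delta_fa = B @ delta_out
```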