The article discusses issues related to the development of an optimal neural network structure. One of the major problems in the synthesis of neural controllers is choosing the number of neurons and the connections between layers: a compromise must be reached between the representational capability of the network and the memory required to store it. The article presents a number of heuristics that achieve this compromise by limiting the number of connections for each neuron.
This is done by introducing two characteristics for each neuron: the coverage radius and the connection density. The coverage radius determines the number of neurons that could potentially have a connection to the given neuron, while the connection density gives the actual number of such connections. Together, these characteristics determine the number of synaptic connections a neuron can have. The article presents heuristics for choosing these characteristics depending on the network structure.
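To illustrate the idea, the following minimal sketch (in Python) shows how a coverage radius and a connection density could bound the number of synaptic connections per neuron and the memory needed to store the weights. The function and parameter names, the 1-D arrangement of neurons, and the numeric values are illustrative assumptions, not taken from the article.

    # Illustrative sketch only: the article does not specify this computation.

    def candidate_neighbours(coverage_radius: int) -> int:
        """Neurons that could potentially connect to a given neuron.

        Assumption: neurons are arranged along one dimension, so the radius
        covers `coverage_radius` neurons on each side.
        """
        return 2 * coverage_radius

    def actual_connections(coverage_radius: int, connection_density: float) -> int:
        """Actual number of synaptic connections, given a density in [0, 1]."""
        return round(candidate_neighbours(coverage_radius) * connection_density)

    def weight_memory_bytes(n_neurons: int, coverage_radius: int,
                            connection_density: float, bytes_per_weight: int = 4) -> int:
        """Memory required to store the weights of a layer under these limits."""
        per_neuron = actual_connections(coverage_radius, connection_density)
        return n_neurons * per_neuron * bytes_per_weight

    if __name__ == "__main__":
        # Example: 1000 neurons, radius 10, half of the candidate links realised.
        print(actual_connections(10, 0.5))          # 10 connections per neuron
        print(weight_memory_bytes(1000, 10, 0.5))   # 40 000 bytes, versus ~4 MB
                                                    # for a fully connected 1000x1000 layer

Under these assumptions the memory cost grows with the radius and the density rather than with the square of the layer size, which is the trade-off the heuristics aim to exploit.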
Keywords: neural network, neural controller, associative memory, machine learning, optimization