The article discusses issues related to developing an optimal neural network structure. One of the major problems in the synthesis of neural controllers is choosing an appropriate number of neurons and of connections between layers in the network. A compromise must be reached between the representational capabilities of the network and the memory required to store it. The article presents a number of heuristics that achieve such a compromise by limiting the number of connections for each neuron.
This is done by introducing two characteristics for each neuron: the coverage radius and the connection density. The coverage radius determines the number of neurons that could potentially be connected to a given neuron, and the connection density gives the actual number of such connections. Together, these characteristics bound the number of synaptic connections a neuron can have, as illustrated by the sketch after the keywords below. The article presents heuristics for choosing these characteristics depending on the network structure.
Keywords: neural network, neural controller, associative memory, machine learning, optimization
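To make the bound concrete, the following minimal sketch computes the maximum number of connections for one neuron placed on a hypothetical 2-D grid of neurons. The grid layout, the function name max_connections, and the parameter names are illustrative assumptions, not the article's notation.

import math

def max_connections(coverage_radius: int, connection_density: float) -> int:
    """Upper bound on the number of synaptic connections for one neuron.

    Neurons are assumed to lie on a 2-D grid; the coverage radius counts how
    many neighbouring neurons could potentially be connected, and the density
    (a value in (0, 1]) selects what fraction of them actually are.
    """
    # Neurons whose grid distance is within the radius, excluding the neuron itself.
    candidates = (2 * coverage_radius + 1) ** 2 - 1
    return math.ceil(candidates * connection_density)

# Example: radius 2 gives 24 candidate neighbours; density 0.25 keeps 6 links.
print(max_connections(coverage_radius=2, connection_density=0.25))  # -> 6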
This work proposes an algorithm for compressing voxel data, and for accessing the compressed data, during landscape modelling, which significantly decreases memory requirements. The algorithm is designed to ease the development of vast voxel landscapes in computer simulations and games. Experimental studies have shown a significant gain in the memory efficiency of the voxel model. The algorithm is also intended to interoperate with surface extraction tools; a sketch of column-wise run-length encoding follows the keywords below.
Keywords: compression, RLE, run-length encoding, voxel, landscape, virtual reality, main memory, data
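As an illustration of the general technique, the sketch below run-length encodes a single vertical voxel column (a layout commonly used for voxel landscapes) and performs random access without decompressing it. The column representation and the function names rle_encode and rle_get are assumptions, not the article's implementation.

from typing import List, Tuple

def rle_encode(column: List[int]) -> List[Tuple[int, int]]:
    """Collapse a column of voxel material IDs into (material, run_length) pairs."""
    runs: List[Tuple[int, int]] = []
    for voxel in column:
        if runs and runs[-1][0] == voxel:
            runs[-1] = (voxel, runs[-1][1] + 1)
        else:
            runs.append((voxel, 1))
    return runs

def rle_get(runs: List[Tuple[int, int]], z: int) -> int:
    """Random access into the compressed column without decompressing it."""
    for material, length in runs:
        if z < length:
            return material
        z -= length
    raise IndexError("z outside the column")

# A 256-voxel column of stone, dirt, grass and air is stored as just 4 runs.
column = [1] * 60 + [2] * 3 + [3] * 1 + [0] * 192
runs = rle_encode(column)
assert len(runs) == 4 and rle_get(runs, 61) == 2

Landscape columns tend to consist of long homogeneous runs of material capped by air, which is why run-length encoding yields large savings while still allowing per-voxel queries needed by surface extraction.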