ALGORITHMS FOR OPTIMIZATION OF LCR-NETWORK HYPERPARAMETERS

Authors

N. Odegov, M. Hadzhyiev, I. Perekrestov, S. Shcherba
DOI:

https://doi.org/10.31891/

Keywords:

artificial neural networks, long-and-close-range principle, k-means algorithm, hyperparameters, optimality criteria

Abstract

This paper continues the development of methods for solving problems involving ultra-large data sets (Big Data class problems). The problems are solved with respect to two inherently conflicting criteria: accuracy and speed.

In previous studies, the long-and-close-range (LCR) principle was introduced for structuring artificial neural networks. Under this principle, the transition matrices between network layers are treated as nearly static objects, motivated by an engineering (technocratic) argument: the farther a neuron in the next layer lies from a neuron in the previous layer, the less influence it receives from that neuron's output. Rather than optimizing hundreds of thousands or even millions of transition matrix elements, as in conventional feedforward networks, the proposed LCR networks require the optimization of only a few hyperparameters and activation function settings.
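The distance-based weighting idea can be illustrated with a minimal sketch. The positioning of neurons on a common axis, the inverse-power decay law, and the function name `lcr_transition_matrix` are all illustrative assumptions, not the paper's actual formulas:

```python
import numpy as np

def lcr_transition_matrix(n_prev, n_next, decay=2.0):
    """Hypothetical sketch of an LCR-style static transition matrix:
    the weight between neuron i of the previous layer and neuron j of
    the next layer decays with their conditional distance. The distance
    definition and decay law here are placeholders."""
    # Place neurons of both layers on a shared [0, 1] axis as a
    # stand-in for the paper's conditional inter-layer distance.
    prev_pos = np.linspace(0.0, 1.0, n_prev)
    next_pos = np.linspace(0.0, 1.0, n_next)
    dist = np.abs(prev_pos[:, None] - next_pos[None, :])
    # Influence falls off monotonically with distance.
    return 1.0 / (1.0 + dist) ** decay

W = lcr_transition_matrix(6, 4)
```

Because the matrix is fixed by geometry rather than learned, only the decay behavior (here the single `decay` exponent) needs tuning, which is the point of the LCR construction.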

A key characteristic of LCR networks is the parameterization of the conditional distance between consecutive layers. This work introduces a radial structure for LCR networks and proves that when both the previous and subsequent layers contain an even number of neurons, the metric balance and influence balance conditions are satisfied.

The conditional distances are modeled as a function of the number of neurons in the previous layer, N, and can generally follow any monotonically increasing function. In this study, a power-law function is adopted, with the exponent Q serving as the sole hyperparameter of the LCR network in this configuration. Thus, the model optimizes only one scalar parameter instead of tuning numerous matrix coefficients.
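A power-law choice for the conditional distance can be written down directly. The concrete form d(N) = N**q is an assumption consistent with the abstract's description (any monotonically increasing function of N is admissible, with the exponent playing the role of the paper's sole hyperparameter Q):

```python
def conditional_distance(n_prev, q):
    """Hypothetical power-law model of the conditional distance between
    consecutive layers as a function of the previous layer's size N.
    The exponent q is the single scalar hyperparameter (the paper's Q)."""
    return float(n_prev) ** q

d = conditional_distance(100, 0.5)
```

For any q > 0 this is monotonically increasing in N, so optimizing the network reduces to a one-dimensional search over q instead of tuning a full weight matrix.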

A comparative analysis of LCR networks against the classical k-means algorithm demonstrates that the proposed network achieves better performance in both classification accuracy and post-training processing speed.

Published

2025-12-19

How to Cite

ODEGOV, N., HADZHYIEV, M., PEREKRESTOV, I., & SHCHERBA, S. (2025). ALGORITHMS FOR OPTIMIZATION OF LCR-NETWORK HYPERPARAMETERS. Herald of Khmelnytskyi National University. Technical Sciences, 359(6.2), 386-391. https://doi.org/10.31891/