Training a neural network to identify objects by parameters in non-overlapping spaces

Authors

  • Dmytro Evgrafov State scientific and research institute of cybersecurity technologies and information protection, Kyiv, Ukraine https://orcid.org/0000-0001-9651-1558
  • Serhii Sholokhov Institute of special communications and information protection of National technical university of Ukraine “Igor Sikorsky Kyiv Polytechnic Institute”, Kyiv, Ukraine https://orcid.org/0000-0003-2222-8842

DOI:

https://doi.org/10.20535/2411-1031.2025.13.1.328972

Keywords:

neural network training, quality criteria for object distinction, errors of the first and second kind in decision theory, quality indicators

Abstract

Digital information processing technologies are now actively used in electronic communication systems. An important task in this context is to develop methods and algorithms for deciding whether an object O belongs to the m-th class O_m, m = 1, 2, …, M, where M ≥ 2 is the number of classes. This task can be solved with neural networks that process conditional estimates of the object's physical parameters x^k_{1/m}, x^k_{2/m}, …, x^k_{n/m}, …, x^k_{N/m}, where k = 1, 2, …, K, K is the maximum number of training steps, n is the index of the physical parameter that characterises the object and serves as an input to the neural network, and N ≥ 2 is the number of such parameters. The conditional estimates x^k_{1/m}, x^k_{2/m}, …, x^k_{n/m}, …, x^k_{N/m} are random, depend on the energy characteristics of the input influences and change dynamically over time, so the decision on whether the object O belongs to the m-th class relies on neural networks, which possess the properties of learning and self-learning. It is assumed that, from the energy standpoint, the input influences are strong enough for the object O to be assigned to one of the classes O_m. Because the expert can accurately determine the true class after receiving the k-th conditional vector of physical parameter estimates x^k_m, the neural network is trained by refining, for each m-th class, the lower and upper limits of the displacements (biases) of the first-layer perceptrons, Q_{1m min} and Q_{1m max}, each time the next estimates of the physical parameters arrive and the expert reports the true class of the object O. The article solves the inverse problem of finding Q_{m min} and Q_{m max} that ensure the specified quality indicators of neural network training in the minimum number of steps K. The article considers the implementation of a three-layer neural network trained by an experienced expert to identify an object by several parameters. The identification problem is solved for known distributions of the conditional estimates of the physical parameters, and also for infinite signal-to-noise ratios in the estimation of those parameters. Expressions are obtained for the perceptron displacements in the identification problem when the spaces of true values of the input parameters do not intersect.
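
For illustration, the core training idea described above can be sketched in a few lines of Python. This is a minimal sketch under stated assumptions, not the authors' algorithm: per-class lower and upper bounds (standing in for the first-layer perceptron displacements Q_{m min} and Q_{m max}) are refined from expert-labeled estimate vectors, and a new vector is assigned to the class whose bounds contain it, which is well defined only when the classes' parameter spaces do not intersect. The IntervalClassifier name, the simulated estimates, and the class/parameter counts are illustrative assumptions.

```python
# Minimal sketch (illustrative, not the authors' exact algorithm):
# expert-supervised refinement of per-class lower/upper bounds that play the
# role of first-layer perceptron displacements when class regions do not overlap.
import numpy as np

class IntervalClassifier:
    def __init__(self, n_classes, n_params):
        # q_min[m, n] / q_max[m, n]: current bound estimates for class m, parameter n
        self.q_min = np.full((n_classes, n_params), np.inf)
        self.q_max = np.full((n_classes, n_params), -np.inf)

    def train_step(self, x, m):
        # Refine the bounds of class m with an estimate vector x labeled by the expert.
        self.q_min[m] = np.minimum(self.q_min[m], x)
        self.q_max[m] = np.maximum(self.q_max[m], x)

    def classify(self, x):
        # Return the class whose (non-overlapping) parameter box contains x, or None.
        inside = np.all((x >= self.q_min) & (x <= self.q_max), axis=1)
        hits = np.flatnonzero(inside)
        return int(hits[0]) if hits.size else None

# Usage: M = 2 classes, N = 2 parameters, non-overlapping true value ranges.
rng = np.random.default_rng(0)
clf = IntervalClassifier(n_classes=2, n_params=2)
for k in range(200):                                  # K training steps
    m = int(rng.integers(2))                          # true class provided by the expert
    center = np.array([0.0, 0.0]) if m == 0 else np.array([5.0, 5.0])
    x = center + rng.uniform(-1.0, 1.0, size=2)       # conditional parameter estimates
    clf.train_step(x, m)
print(clf.classify(np.array([0.2, -0.3])))            # expected: 0
print(clf.classify(np.array([5.1, 4.8])))             # expected: 1
```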

Author Biographies

Dmytro Evgrafov, State scientific and research institute of cybersecurity technologies and information protection, Kyiv

candidate of technical sciences, senior researcher at the research department of countering technical intelligence

Serhii Sholokhov, Institute of special communications and information protection of National technical university of Ukraine “Igor Sikorsky Kyiv Polytechnic Institute”, Kyiv

candidate of technical sciences, associate professor, associate professor of the electronic communications academic department

References

A. M. Reznik, “Non-Iterative Learning for Neural Networks”, in Proc. Int. Joint Conf. Neural Netw., Washington, DC, USA, 1999, no. 548, pp. 1374-1379. doi: https://doi.org/10.1109/IJCNN.1999.831163.

What is Perceptron? A Beginners Guide for 2023. [Online]. Available: https://www.simplilearn.com/tutorials/deep-learning-tutorial/perceptron. Accessed on: Feb. 15, 2025.

What is machine learning? [Online]. Available: https://www.ibm.com/topics/machine-learning. Accessed on: Jan. 17, 2025.

What is deep learning and how does it work? [Online]. Available: https://www.techtarget.com/searchenterpriseai/definition/deep-learning-deep-neural-network. Accessed on: Feb. 12, 2025.

What Is Deep Learning? [Online]. Available: https://www.mathworks.com/discovery/deeplearning.html. Accessed on: Feb. 12, 2025.

O. M. Riznyk, O. A. Kalyna, O. S. Sychev, O. G. Sadova, O. K. Dekhtyarenko, and A. O. Galynska, “Multifunctional neurocomputer NeuroLand”, Math. Mach. & Sys., no. 1, pp. 36-45, 2003. [Online]. Available: http://www.immsp.kiev.ua/publications/2003_1/index.html. Accessed on: Jan. 18, 2025.

D. V. Evgrafov, Distribution of the absolute maximum of a random field in the theory of analysis of radio engineering systems. Monograph. Kyiv, Ukraine: Igor Sikorsky Kyiv Polytechnic Institute, Polytechnic Publishing House, 2021.

Y. Shchypskyi and O. Prozor, “Analysis of neural network technologies for the development of intelligent chatbots in social networks”, in Proc. XLIX VNTU Sci. & Tech. Conf., Vinnytsia, Ukraine, 2020, pp. 1-3. [Online]. Available: https://conferences.vntu.edu.ua/index.php/allfitki/all-fitki-2020/paper/view/8759/7556. Accessed on: Feb. 19, 2025.

Machine learning algorithms. Deep neural networks in problems of mechanics of continuous media: Study guide. Kyiv, Ukraine: Taras Shevchenko National University of Kyiv, 2024.

S. O. Subbotin, Neural networks: theory and practice. Textbook. Zhytomyr, Ukraine: O. O. Yevenok Publ., 2020.

S. Abdoli, P. Cardinal, and A. L. Koerich, “End-to-end environmental sound classification using a 1D convolutional neural network”, arXiv preprint arXiv:1904.08990, 2019. doi: https://doi.org/10.48550/arXiv.1904.08990.

O. O. Miroshnyk and A. V. Svyatobatko, “Modelling a neural network for the tasks of predicting physical parameters”, Proc. of the Tauride State Agrotechnology University, vol. 5, no. 13, pp. 34-40, 2013. [Online]. Available: https://nauka.tsatu.edu.ua/print-journals-tdatu/135/13_5/zmist.pdf. Accessed on: Feb. 9, 2025.

Published

2025-05-20

How to Cite

Evgrafov, D., & Sholokhov, S. (2025). Training a neural network to identify objects by parameters in non-overlapping spaces. Collection "Information Technology and Security", 13(1), 100–108. https://doi.org/10.20535/2411-1031.2025.13.1.328972

Issue

Vol. 13 No. 1 (2025)

Section

ARTIFICIAL INTELLIGENCE IN THE CYBERSECURITY FIELD