USING MACHINE LEARNING TO DETECT SOFTWARE VULNERABILITIES
DOI:
https://doi.org/10.31891/2307-5732-2025-353-29

Keywords:
Neural networks, Software code fluctuations, Models of neural networks, Development of neural networks

Abstract
The increasing complexity of modern software systems has led to a growing number of security vulnerabilities that threaten both individual users and large organizations. Conventional detection approaches, such as manual code review and rule-based static analysis, are limited by the sheer volume of code and the dynamic nature of security risks. As a result, machine learning, and neural networks in particular, have become popular for automating the detection of software vulnerabilities.
Neural networks can discover complex patterns in large data sets, which makes them well suited to identifying potential security weaknesses. Numerous models and training strategies have been proposed to improve their precision and speed in vulnerability detection, with the aim of increasing detection accuracy while reducing the human effort and time required for analysis. Nevertheless, applying neural network models to this task raises several challenges, including the need for large, accurately annotated datasets, the choice of an appropriate network architecture, and the trade-off between false positives and false negatives. Moreover, the generalizability of trained models across different codebases and programming languages remains an open concern.
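As a brief illustration of the supervised setting described above, the following minimal sketch trains a small feed-forward network on token-count features extracted from code snippets. The snippets, labels, hyperparameters, and the use of scikit-learn are illustrative assumptions for this summary, not the models or data evaluated in the paper.

# Minimal sketch of a supervised neural-network vulnerability classifier.
# The toy snippets, labels, and hyperparameters are illustrative only.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.neural_network import MLPClassifier

# Toy dataset: source-code snippets labeled 1 (vulnerable) or 0 (safe).
snippets = [
    "strcpy(buf, user_input);",                    # unbounded copy
    "gets(line);",                                 # unbounded read
    "strncpy(buf, user_input, sizeof(buf) - 1);",  # bounded copy
    "fgets(line, sizeof(line), stdin);",           # bounded read
]
labels = [1, 1, 0, 0]

# Token-level bag-of-words features extracted from the code text.
vectorizer = CountVectorizer(token_pattern=r"[A-Za-z_]\w*")
X = vectorizer.fit_transform(snippets)

# Small feed-forward network trained on the token features.
model = MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000, random_state=0)
model.fit(X, labels)

# Score an unseen snippet; the output is the estimated probability
# that the snippet belongs to the "vulnerable" class.
test = vectorizer.transform(["strcpy(dest, src);"])
print(model.predict_proba(test)[0][1])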
This paper examines neural network models and training strategies for detecting vulnerabilities in software. We review supervised, unsupervised, and hybrid approaches and assess their efficacy and weaknesses. In addition, the paper considers the feature selection techniques, data cleaning techniques, and performance metrics used in vulnerability detection tasks. The purpose of this review is to outline recent developments in the area and to identify promising directions for future research.
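To make the performance metrics mentioned above concrete, the short example below computes precision, recall, and the F1 score, the quantities that capture the false-positive/false-negative trade-off, for a hypothetical set of predictions. The labels are invented for demonstration and do not come from the paper.

# Illustrative computation of common detection metrics; the label
# vectors below are made up for demonstration purposes.
from sklearn.metrics import precision_score, recall_score, f1_score

# Hypothetical ground-truth and predicted labels (1 = vulnerable).
y_true = [1, 1, 1, 0, 0, 0, 0, 1]
y_pred = [1, 0, 1, 0, 1, 0, 0, 1]

precision = precision_score(y_true, y_pred)  # TP / (TP + FP): penalizes false alarms
recall = recall_score(y_true, y_pred)        # TP / (TP + FN): penalizes missed vulnerabilities
f1 = f1_score(y_true, y_pred)                # harmonic mean balancing the two

print(f"precision={precision:.2f} recall={recall:.2f} f1={f1:.2f}")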
License
Copyright (c) 2025 ОЛЕКСІЙ ТАЗЕТДІНОВ, ВАЛЕРІЙ ТАЗЕТДІНОВ (Author)

This work is licensed under a Creative Commons Attribution 4.0 International License.