This article presents the theoretical results of analytical research on the use of tethered balloons as carriers of barrier nets, which can serve as passive protection of particularly important civil and industrial facilities against unmanned aerial vehicles and other low-flying means of air attack. A diagram of an aerostatic barrier system theoretically capable of countering such means is presented, along with elements of a methodology for determining the probability of damage to aircraft from cable-type air barriers.
Keywords: aerostatic air barrier system, unmanned aerial vehicles, tethered balloons, aerial protection of ground objects
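As a purely illustrative aside (a generic geometric model, not the methodology developed in the article), the probability that a UAV crossing rows of vertical cables intersects at least one of them can be sketched as follows, assuming a uniformly distributed crossing position and independent barrier rows:

```python
# A minimal geometric sketch, NOT the article's methodology: probability that
# a UAV of wingspan `span_m`, crossing rows of vertical cables spaced
# `spacing_m` apart, hits at least one cable. Assumes a uniformly distributed
# crossing position and independent rows.

def single_row_hit_probability(span_m: float, spacing_m: float) -> float:
    """P(hit) for one row of vertical cables under a uniform-position model."""
    return min(1.0, span_m / spacing_m)

def multi_row_hit_probability(span_m: float, spacing_m: float, rows: int) -> float:
    """P(at least one hit) across `rows` independent barrier rows."""
    p_miss = 1.0 - single_row_hit_probability(span_m, spacing_m)
    return 1.0 - p_miss ** rows

# Example: a 1.5 m wingspan UAV against cables spaced 10 m apart, 3 rows deep.
print(multi_row_hit_probability(1.5, 10.0, 3))  # ~0.386
```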
This paper explores in detail the technological evolution and current state of question-and-answer (Q&A) systems. Using the example of an airline customer service task, a BERT-based model is developed that is capable of recognising user intentions and extracting named entities. The paper describes in detail the project's dataset preparation, data analysis methods, and data exploration techniques. The model architecture, the parameter settings applied during tuning, and the training process are described. The model developed in this project, named RNEEMAviCS-BERT, achieved an intent recognition accuracy of 98.2% and a named entity recognition accuracy of 83%. We have created a semantic analysis module for the question-answering system. The next stage of our work will be to integrate the dataset to complete the query-response and response generation components of the Q&A system.
Keywords: question-answering systems, ChatGPT, BERT, machine learning, neural networks, pre-trained models, intention recognition, named entity recognition, data analysis, model training
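As an illustration of the intent recognition component, here is a minimal sketch using the Hugging Face transformers library; the checkpoint name, label set, and fine-tuning details are placeholders, not the RNEEMAviCS-BERT configuration:

```python
# Minimal BERT intent-classification sketch (Hugging Face transformers).
# Checkpoint and airline-service labels are illustrative placeholders.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

INTENTS = ["book_flight", "cancel_flight", "baggage_info"]  # hypothetical labels

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=len(INTENTS)
)  # would be fine-tuned on a labeled airline customer-service dataset first

def predict_intent(utterance: str) -> str:
    inputs = tokenizer(utterance, return_tensors="pt", truncation=True)
    with torch.no_grad():
        logits = model(**inputs).logits
    return INTENTS[int(logits.argmax(dim=-1))]

print(predict_intent("I need to change my baggage allowance"))
```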
The article considers a method for the automated formation of a training data set for machine learning algorithms that classify electronic documents. It differs from known methods in that the training sets are formed through a synthesis of clustering and data augmentation techniques based on calculating the distances between objects in multidimensional spaces.
Keywords: supervised learning, clustering, pattern recognition, machine learning algorithm, electronic document, vectorization, formalized documents
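The core of the approach can be illustrated with a short sketch: documents are vectorized, clustered, and the cluster labels are reused as training labels, with distances to cluster centers available for filtering or augmentation. The corpus, vectorizer, and cluster count below are placeholders:

```python
# Automated training-set formation sketch: vectorize documents, cluster them,
# and reuse cluster labels as (pseudo-)class labels for a supervised model.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

documents = ["order on personnel ...", "supply contract ...", "internal memo ..."]

vectors = TfidfVectorizer().fit_transform(documents)  # multidimensional space
kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(vectors)

# Each (document, cluster label) pair becomes a training example; distances
# to cluster centers can filter ambiguous documents or guide augmentation
# near class boundaries.
training_set = list(zip(documents, kmeans.labels_))
distances = kmeans.transform(vectors)  # distance of each object to each center
```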
In this article, an algorithm for processing hydroacoustic signals in the frequency domain using wavelets is considered. Arguments are given for the structural similarity of hydroacoustic signals to vibration signals. The structure of hydroacoustic signals is described, and the advantages of wavelet analysis over Fourier-transform analysis are emphasized. The algorithm can be applied to estimate the spectral density using the Fourier periodogram and to estimate the energy in different frequency ranges. The hard-threshold method for smoothing coefficients is presented, along with the advantages of this approach over a soft threshold. A step-by-step algorithm for filtering the hydroacoustic signal is described. One application of the algorithm is estimating the parameters of a vibration signal using a parallel implementation of the algorithm.
Keywords: frequency domain, spectral density, wavelets, vibration signal processing
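As an illustration of hard-threshold wavelet smoothing, here is a minimal sketch assuming the PyWavelets library; the wavelet, decomposition level, and universal threshold are common defaults, not necessarily the article's choices:

```python
# Minimal hard-threshold wavelet denoising sketch (PyWavelets). Wavelet,
# level, and the universal threshold are common defaults, not the article's.
import numpy as np
import pywt

def hard_threshold_denoise(signal, wavelet="db4", level=4):
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745     # noise scale estimate
    thr = sigma * np.sqrt(2.0 * np.log(len(signal)))   # universal threshold
    # Hard threshold: coefficients below `thr` are zeroed, the rest kept as-is
    # (unlike the soft threshold, which also shrinks the survivors).
    denoised = [coeffs[0]] + [pywt.threshold(c, thr, mode="hard") for c in coeffs[1:]]
    return pywt.waverec(denoised, wavelet)[: len(signal)]
```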
The effect of a thickener and a setting retarder on the technological properties of the GSHS START mix is investigated. One-factor plans of a two-factor model were developed, with minimum (0.1%; 0.005%) and maximum (0.2%; 0.05%) dosage levels of the pore-forming and water-retaining additives, respectively. Regression equations for the output parameters in the form of second-degree polynomials are obtained by regression and correlation analysis of the experimental data. The values of the partial correlation coefficients are analyzed. With an increase in the dose of the water-retaining and pore-forming additives from 0.1% to 0.2% and from 0.005% to 0.05% of the binder, respectively, for all possible combinations of thickener and setting-retarder dosage, setting time increases by 10 to 72% and sliding by 33 to 80%. The mixture least sensitive to an increase in the water-retaining and pore-forming additives was the one in which the amount of thickener is 0.2% (the upper level) and the amount of retarder is 0.04% (the lower level).
Keywords: technological properties, organizational and technological solutions, dry building mixes, functional additives, thickener, setting retarder, two-factor experiment, coefficient of determination, regression analysis, correlation analysis
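A minimal sketch of fitting such a second-degree polynomial for a two-factor experiment with scikit-learn; the dosage grid and response values below are placeholders, not the article's experimental data:

```python
# Second-degree polynomial regression for a two-factor experiment.
# Dosages (x1, x2) and responses y are illustrative placeholders.
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression

X = np.array([[0.10, 0.005], [0.10, 0.0275], [0.10, 0.05],
              [0.15, 0.005], [0.15, 0.0275], [0.15, 0.05],
              [0.20, 0.005], [0.20, 0.0275], [0.20, 0.05]])
y = np.array([120.0, 136.0, 150.0, 128.0, 147.0, 170.0, 140.0, 168.0, 205.0])

features = PolynomialFeatures(degree=2).fit_transform(X)
model = LinearRegression(fit_intercept=False).fit(features, y)
# model.coef_ corresponds to the terms [1, x1, x2, x1^2, x1*x2, x2^2]
print(model.coef_)
```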
Currently, tracking the movements of various objects (in particular, people) occupies a central place in video surveillance and video analytics systems. Such tracking localizes a person's position in each frame across the entire video stream and underlies many intelligent computer vision systems. The purpose of this article is to develop a new algorithm for tracking human movements in a video stream with the ability to extract motion trajectories. The main stages of the algorithm are: splitting the video into frames at one-second intervals; selecting the person under study in the video stream; digitally processing the frames by recognizing the clothing of the person under study and computing its color histogram; and predicting the localization of, and recognizing, the person under study in all subsequent frames of the video stream using the developed methods of forecasting the object's direction of movement. The output of the proposed algorithm is used to form and display an overall picture of a particular person's movement within the entire video stream. The information and materials in this article may be of interest to specialists and experts whose work focuses on data processing in the analysis of video stream fragments.
Keywords: surveillance cameras, U2-Net neural network, rembg library, pattern recognition, clothing recognition, delta E, tracing, direction prediction, object detection, tracking, mathematical statistics, predicted area, RGB pixels
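The clothing colour-matching step the abstract describes can be illustrated with OpenCV histograms; the file names and the use of HSV correlation here are assumptions, with Delta E as a possible refinement:

```python
# Minimal clothing-colour matching sketch (OpenCV). File names are
# placeholders; HSV histogram correlation stands in for the full pipeline.
import cv2

def clothing_histogram(bgr_image):
    hsv = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2HSV)
    hist = cv2.calcHist([hsv], [0, 1], None, [50, 60], [0, 180, 0, 256])
    return cv2.normalize(hist, hist).flatten()

reference = clothing_histogram(cv2.imread("person_crop_frame0.png"))
candidate = clothing_histogram(cv2.imread("person_crop_frame1.png"))

# Correlation near 1.0 suggests the same clothing; a perceptual metric such
# as Delta E could refine the decision, as the keywords suggest.
score = cv2.compareHist(reference, candidate, cv2.HISTCMP_CORREL)
print(score)
```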
In this article, using the example of the Silinka River in the city of Komsomolsk-on-Amur, the influence of various factors on the formation and transport of substances in the river, including suspended sediments and dissolved substances such as the gross form of zinc, is estimated. The paper uses a multiple regression model to identify the influence of external factors on the level of zinc contamination of bottom sediments and presents numerical modeling results that allow changes in the "water – bottom sediments" system under the influence of various factors to be assessed. The work is important for understanding ecological processes in rivers and can be used to develop methods for managing and protecting water resources.
Keywords: multiple regression, urban area, ecological status, mass transfer processes, water resources, bottom sediments, modeling, Silinka River, Komsomolsk-on-Amur city, ecological processes, numerical modeling, water resources management
The article provides an overview of the analysis and diagnosis of product surface defects evaluated using digital image processing. The search for scientific publications was carried out mainly in the Scopus and RSCI scientometric databases for the period 2019-2023. The purpose of this article is to determine the best methods for assessing the destruction of materials from digital images. The main methods of processing and analyzing digital images are considered. The promise of unifying segmentation modes across digital image acquisition sources and of combining images from different recording sources to obtain objective information on the nature of material destruction is shown. To reduce the time needed to assess the degree of destruction of materials, it is proposed to sequentially apply segmentation and filtering to digital images of defects in metal products, followed by neural network classification.
Keywords: defect, control, digital image, neural network
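The staged pipeline proposed in the review (filtering, then segmentation, then neural-network evaluation) might look like the following OpenCV sketch; the file name, Otsu thresholding, and the classifier stub are assumptions:

```python
# Minimal defect-image pipeline sketch: filtering -> segmentation -> crops
# for a neural-network classifier (stubbed out). File name is a placeholder.
import cv2

image = cv2.imread("metal_surface.png", cv2.IMREAD_GRAYSCALE)
smoothed = cv2.GaussianBlur(image, (5, 5), 0)                     # filtering
_, mask = cv2.threshold(smoothed, 0, 255,
                        cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)  # segmentation
contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)

for contour in contours:
    x, y, w, h = cv2.boundingRect(contour)
    crop = image[y:y + h, x:x + w]
    # classify_defect(crop)  # neural-network stage, e.g. a trained CNN
```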
A new approach to increasing the efficiency of extremal control systems by improving the method of searching for the extremum of the objective function is presented. In multidimensional nonlinear optimization, instead of a traditional line search along a once-selected direction, an exploratory search is used whose direction at each step adapts to the topology of the objective function. This makes it possible to localize an extremum as quickly as possible and significantly reduce the time needed to determine it. An algorithm for interpolation search for the extremum within the found interval is proposed. The objective function is modeled by a cubic spline segment based on information about its gradient vector at the boundary points of the interval, as a result of which the number of interpolation search steps is significantly reduced. The possibility of simplified nonsmooth interpolation using first-order splines in the extremum region is considered. The results of a numerical experiment confirm the high efficiency of the new method in solving various problems.
Keywords: extremal control systems, nonlinear optimization, acceleration of extremum search, quasi-Newton method, polynomial interpolation, non-smooth interpolation
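A closed-form cubic-interpolation step of the kind used in line-search theory, built from function values and derivatives at the interval endpoints, illustrates the idea; this is a standard textbook formula, not necessarily the authors' exact spline construction:

```python
# Standard cubic-interpolation minimizer over [a, b] from endpoint values and
# derivatives (a textbook line-search step, not the authors' exact method).
import math

def cubic_minimizer(a, fa, dfa, b, fb, dfb):
    d1 = dfa + dfb - 3.0 * (fa - fb) / (a - b)
    radicand = d1 * d1 - dfa * dfb
    if radicand < 0.0:                 # cubic model has no interior minimizer
        return 0.5 * (a + b)           # fall back to bisection
    d2 = math.copysign(math.sqrt(radicand), b - a)
    x = b - (b - a) * (dfb + d2 - d1) / (dfb - dfa + 2.0 * d2)
    return min(max(x, min(a, b)), max(a, b))  # keep the step inside [a, b]

# Example: f(x) = (x - 1)^2 on [0, 3]; the cubic model recovers x = 1 exactly.
print(cubic_minimizer(0.0, 1.0, -2.0, 3.0, 4.0, 4.0))
```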
The problem of simplifying communicative interaction in the chain from a natural-language message to a BPMN model is considered. For this purpose, a number of authors have proposed a special notation called a mechanism. The procedure for constructing a mechanism from a given BPMN model is considered. It is shown that a mechanism can be built only for BPMN models that satisfy certain conditions: the model must contain at least one artifact associated with one of the gateway actions; gateways must not offer more than two choices; the model must not end with a gateway; and the model must not contain an AND-OR gateway. The procedure for constructing a BPMN model from a given mechanism is also considered. Such a transformation is shown to be possible if the following conditions are met: a one-to-one correspondence between the elements and functions of the mechanism, and the use of a single tool and a single lane in the mechanism. For models that do not satisfy these conditions, the use of the mechanism is problematic: it turns out to be either too cumbersome or too simple, and so does not simplify communicative interaction. It is concluded that additional research is needed either to improve the mechanism or to adopt a different notation free of its disadvantages.
Keywords: BPMN, communication, business model, modeling, mechanisms, natural language, translation into BPMN
The development and implementation of decision support systems (DSS) based on modern methods of data processing, storage, and analysis is a pressing task. As part of this work, an algorithm for optimizing the business processes of an IT company and a model of DSS operation were developed. Implementing the proposed methods will improve the efficiency of IT companies.
Keywords: decision support system, business process, optimization, algorithm, IT company, data analysis, software, program code
The paper presents a new approach to assessing the level of heavy-metal contamination of the soil-like fraction from landfills using Monte Carlo simulation, using the example of landfills located within the borders of Volgograd. It was found that, with a probability of 36.2%, the contamination level of the soil-like fraction from the landfill located in the Voroshilovsky district corresponds to moderately hazardous, and with a probability of 63.8%, to hazardous. It is economically justified to isolate the soil-like fraction with a low level of pollution, detoxify it, and use it further in territory reclamation. For the soil-like fraction from the landfill located in the Traktorozavodsky district, the pollution level was determined as extremely hazardous and hazardous with probabilities of 87.1% and 3.1%, respectively. It is shown that no useful and usable part can be isolated from this soil-like fraction; it must be neutralized and placed at waste disposal facilities. The presented approach is a useful instrument for assessing the pollution level of a soil-like fraction, which can increase the accuracy of the estimate and the effectiveness of managing the soil-like fraction during landfill development.
Keywords: landfill, soil-like fraction, heavy metals, pollution level, Monte Carlo method, modeling
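A minimal Monte Carlo sketch of this kind of estimate: concentrations are sampled from assumed distributions, a total pollution index is computed per draw, and the share of draws in each hazard class approximates the class probabilities. The distributions, background values, and class boundaries below are illustrative placeholders:

```python
# Monte Carlo pollution-level sketch. Distributions, background values, and
# hazard-class boundaries are placeholders, not the article's parameters.
import numpy as np

rng = np.random.default_rng(0)
N = 100_000

# Hypothetical lognormal concentration models for two metals, mg/kg
zn = rng.lognormal(mean=5.0, sigma=0.4, size=N)
pb = rng.lognormal(mean=3.5, sigma=0.5, size=N)
background = {"zn": 50.0, "pb": 15.0}

# Saet-type total pollution index: sum of concentration ratios minus (n - 1)
zc = zn / background["zn"] + pb / background["pb"] - (2 - 1)

p_moderate = np.mean((zc >= 16) & (zc < 32))
p_hazardous = np.mean((zc >= 32) & (zc < 128))
print(f"moderately hazardous: {p_moderate:.3f}, hazardous: {p_hazardous:.3f}")
```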
The article discusses a method for detecting defects in extended products. To find defects, the product is scanned along its entire length, producing a two-dimensional data stream that must be analyzed. Detecting a defect is a task of detecting a "useful" signal against a background of "noise". The most reliable method is to use a set of statistical criteria. To compare mean values, Student's t-test and two Wilcoxon–Mann–Whitney tests were used; to compare scatter, the Fisher test and the Ansari–Bradley test were used. The effectiveness of the algorithm was confirmed using a computer model simulating a two-dimensional homogeneous data stream.
Keywords: defects, extended products, computer model, simulation, statistical criterion
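The battery of criteria the article lists is available in SciPy (with the classical F-ratio test computed by hand); the window-versus-background setup and simulated data below are placeholders:

```python
# Window-vs-background defect check with a set of statistical criteria:
# t-test and Wilcoxon-Mann-Whitney for the mean, F-ratio and Ansari-Bradley
# for scatter. Data are simulated placeholders.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
background = rng.normal(0.0, 1.0, 500)
window = rng.normal(0.8, 1.5, 50)   # simulated defect: shifted mean, wider scatter

t_p = stats.ttest_ind(window, background, equal_var=False).pvalue
u_p = stats.mannwhitneyu(window, background).pvalue
ab_p = stats.ansari(window, background).pvalue

# Two-sided classical F-test for equality of variances (no direct SciPy wrapper)
f = np.var(window, ddof=1) / np.var(background, ddof=1)
dfn, dfd = len(window) - 1, len(background) - 1
f_p = 2.0 * min(stats.f.sf(f, dfn, dfd), stats.f.cdf(f, dfn, dfd))

defect_detected = min(t_p, u_p, ab_p, f_p) < 0.01   # union of criteria
print(defect_detected)
```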
This work addresses the problem of increasing the effectiveness of educational activities by predicting student performance based on external and internal factors. To solve this problem, a model for predicting student performance was built using the Python programming language. The initial data for building the decision tree model were taken from the UCI Machine Learning Repository and pre-processed using the Deductor Studio Academic analytical platform. The results of the model are presented, and a study was conducted to evaluate the effectiveness of predicting student performance.
Keywords: forecasting, decision tree, student performance, influence of factors, effectiveness assessment
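A minimal sketch of such a decision-tree model, assuming the UCI student performance data has been exported to a numeric CSV with a binary `passed` target (the file path and column name are placeholders):

```python
# Decision-tree sketch for student-performance prediction. The CSV path,
# `passed` target column, and hyperparameters are placeholders.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score

data = pd.read_csv("student-performance.csv")   # preprocessed numeric export
X = data.drop(columns=["passed"])
y = data["passed"]

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=42
)
tree = DecisionTreeClassifier(max_depth=5, random_state=42).fit(X_train, y_train)
print("accuracy:", accuracy_score(y_test, tree.predict(X_test)))
```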
The purpose of this study is to present scenarios for the development of the forestry complex of the Republic of Karelia and the Murmansk region. Using factor analysis and cluster analysis, 27 central forest districts of the study region were divided into 9 clusters according to 20 indicators. The selected indicators took into account the characteristics of wood resources, natural-production conditions, and road infrastructure. Based on the cluster profiles, as well as on topographic, climatic, soil, and vegetation maps, scenarios for the development of the study region's forestry complex were drawn up for the resulting clusters. The results of the study showed that, moving from south to north, wood resources gradually become poorer. The efforts of the state and business should be aimed at resolving road infrastructure issues and at bringing deciduous, small-dimension, and energy wood into production circulation. Given the natural and production conditions, which are largely determined by moist forest soils and the extreme vulnerability of northern ecosystems, special attention must be paid during logging to minimizing the negative impact of logging operations on the soil cover.
Keywords: zoning, forest industry, factor analysis, cluster analysis, k-means cluster analysis, logging, forest management
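The zoning step can be sketched as factor analysis followed by k-means; the 27x20 indicator matrix below is a random placeholder for the real forestry indicators:

```python
# Factor analysis + k-means zoning sketch: 27 forest districts described by
# 20 indicators are reduced to factor scores and grouped into 9 clusters.
# The indicator matrix is a random placeholder.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import FactorAnalysis
from sklearn.cluster import KMeans

indicators = np.random.default_rng(0).normal(size=(27, 20))

scaled = StandardScaler().fit_transform(indicators)
scores = FactorAnalysis(n_components=5, random_state=0).fit_transform(scaled)
labels = KMeans(n_clusters=9, n_init=10, random_state=0).fit_predict(scores)
# `labels` assigns each district to one of the 9 clusters behind the scenarios
```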
The article presents ways to improve the accuracy of the classification of normative and reference information using hierarchical clustering algorithms.
Keywords: machine learning, artificial neural network, convolutional neural network, normative reference information, hierarchical clustering, DIANA
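Since DIANA (divisive analysis) has no implementation in SciPy or scikit-learn, agglomerative clustering can stand in to illustrate hierarchical grouping of reference-information records; the records and cluster count are placeholders:

```python
# Hierarchical clustering sketch for reference-information records.
# Agglomerative clustering stands in for DIANA; data are placeholders.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import AgglomerativeClustering

records = ["steel pipe 57x3.5 GOST 8732", "pipe steel 57x3,5", "valve DN50"]

vectors = TfidfVectorizer().fit_transform(records).toarray()
labels = AgglomerativeClustering(n_clusters=2).fit_predict(vectors)
# Records sharing a label are candidates for one reference-information class
print(labels)
```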
Roads have a huge impact on modern life. One of the key characteristics of a roadway is its quality. There are many systems for assessing the quality of the road surface. Such technologies work better with high-resolution images (HRI), because features are easier to identify in them. There are many ways to improve the resolution of photos, including neural networks. However, each neural network has its own characteristics; for example, some find it problematic to work with photos of a large initial size. To understand how effective a particular neural network is, a comparative analysis is needed. In this study, the average time to obtain the HRI is taken as the main indicator of effectiveness. EDSR, ESPCN, ESRGAN, FSRCNN, and LapSRN were selected as the neural networks, each of which increases the width and height of an image by 4 times (the number of pixels increases 16-fold). The source material is 5 photos in 5 different sizes (141x141, 200x200, 245x245, 283x283, 316x316) in png, jpg, and bmp formats. ESPCN demonstrates the best performance according to the proposed methodology, and the FSRCNN neural network also shows good results. They are therefore preferable for the task of improving image resolution.
Keywords: comparison, dependence, effectiveness, neural network, neuronet, resolution improvement, image, photo, format, size, road surface
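Four of the five networks compared (EDSR, ESPCN, FSRCNN, LapSRN) are available through the OpenCV dnn_superres module in opencv-contrib-python; a minimal timing sketch, with the model file path as a placeholder:

```python
# Super-resolution timing sketch (opencv-contrib-python, dnn_superres).
# The ESPCN_x4.pb weights file must be downloaded separately; ESRGAN needs
# a different pipeline and is not shown.
import time
import cv2

sr = cv2.dnn_superres.DnnSuperResImpl_create()
sr.readModel("ESPCN_x4.pb")   # placeholder path to pre-trained weights
sr.setModel("espcn", 4)       # 4x width and height: 16x more pixels

image = cv2.imread("road_141x141.png")
start = time.perf_counter()
upscaled = sr.upsample(image)
print("seconds:", time.perf_counter() - start)
```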
The article describes the results of developing an information system for processing the results of sports competitions on the 1C:Enterprise platform, providing informational support for the Don Family League sports social project, which involves entire families in sports and physical education. An object model of the configuration data is presented, which made it possible to structure the subject area and identify the main application objects, their attributes, and the relationships between them; this model was later used for algorithmization of the solution and software implementation of the complex's tools on the 1C:Enterprise platform. The software package was tested as part of the information support for the Don Family League project from July 2022 to July 2023 and showed high efficiency.
Keywords: All-Russian physical culture and sports complex «Ready for Work and Defense», the Don Family League project, processing of results of sports competitions, rating of individual standings, rating of family standings, rating of team standings
The article considers a methodology for forming and determining the parameters of machine learning algorithms that classify electronic documents according to the importance of their information for officials of an organization. It differs from known approaches in that the structure and number of machine learning algorithms are formed dynamically, based on the automated determination of the organization's structural divisions and of the sets of keywords reflecting their tasks and functions, obtained through automated analysis of the organization's regulations and the regulations on its structural units, drawing on the theory of pattern recognition.
Keywords: lemmatization, pattern recognition, machine learning algorithm, electronic document, vectorization, formalized documents
In modern society, problems related to the ethics of artificial intelligence (AI) are increasingly emerging. AI is used everywhere, and the lack of ethical standards and of a code of ethics necessitates their creation to ensure the safety and comfort of users. The purpose of this work is to analyze approaches to the ethics of artificial intelligence and to identify parameters for evaluating these approaches, with the aim of creating systems that comply with ethical standards and meet the needs of users. Approaches to the ethics of artificial intelligence are considered, parameters for their evaluation are identified, and the main characteristics of each parameter are described. The parameters described in this paper will help achieve better results when creating standards for the development of safer and more user-friendly systems.
Keywords: code, parameters, indicators, characteristics, ethics, artificial intelligence
Road surface quality assessment is a widespread task worldwide. Many systems exist to solve it, mainly operating on images of the roadway. They work on the basis of both traditional methods (without machine learning) and machine learning algorithms. There are many ways to increase the effectiveness of such systems, including improving image quality. However, each approach has its own characteristics; for example, some produce an improved version of the original photo faster. The analyzed methods for improving image quality are noise reduction, histogram equalization, sharpening, and smoothing. The main indicator of effectiveness in this study is the average time to obtain an improved image. The source material is 10 different photos of the road surface in 5 sizes (447x447, 632x632, 775x775, 894x894, 1000x1000) in png, jpg, and bmp formats. The best performance according to the methodology proposed in the study was demonstrated by the histogram equalization approach; the sharpening method shows a comparable result.
Keywords: comparison, analysis, dependence, effectiveness, approach, quality improvement, image, photo, format, size, road surface
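The four approaches can be timed with OpenCV along the lines of the following sketch; the image path and parameter choices are placeholders:

```python
# Timing sketch for the four enhancement approaches (OpenCV). The image
# path, kernel, and parameters are illustrative placeholders.
import time
import cv2
import numpy as np

image = cv2.imread("road_447x447.png", cv2.IMREAD_GRAYSCALE)
sharpen_kernel = np.array([[0, -1, 0], [-1, 5, -1], [0, -1, 0]], dtype=np.float32)

methods = {
    "noise reduction": lambda img: cv2.fastNlMeansDenoising(img),
    "histogram equalization": lambda img: cv2.equalizeHist(img),
    "sharpening": lambda img: cv2.filter2D(img, -1, sharpen_kernel),
    "smoothing": lambda img: cv2.GaussianBlur(img, (5, 5), 0),
}

for name, method in methods.items():
    start = time.perf_counter()
    method(image)
    print(name, time.perf_counter() - start)
```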
This paper considers the conditions and factors affecting the security of information systems functioning under network reconnaissance. The developed model is based on techniques that dynamically change the domain names, network addresses, and ports of the information system's network devices and of the false network information objects operating within it. The research problem is formalized. The theoretical basis of the developed model is the theory of probability and the theory of random processes. The modeled target system is represented as a semi-Markov process defined by a directed graph. The results of calculating the probabilistic-temporal characteristics of the target system as a function of network reconnaissance actions are presented; these allow the adjustment mode of the developed protection measures to be determined and the security of the target system to be evaluated under different operating conditions.
Keywords: departmental information system, network reconnaissance, structural and functional characterization, false network information object
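A discrete-time Markov approximation can illustrate how such probabilistic-temporal characteristics are computed; the states and transition probabilities below are illustrative placeholders, not the article's semi-Markov parameters:

```python
# Discrete-time Markov sketch of attacker progress against a system with
# address mutation and decoys. All probabilities are placeholders.
import numpy as np

# States: 0 = attacker probing, 1 = attacker on a decoy (false object),
#         2 = true target identified (absorbing)
P = np.array([
    [0.70, 0.25, 0.05],   # mutation usually resets reconnaissance progress
    [0.60, 0.35, 0.05],   # decoys absorb and redirect probing
    [0.00, 0.00, 1.00],
])

state = np.array([1.0, 0.0, 0.0])
for _ in range(50):
    state = state @ P
print("P(target identified within 50 steps):", state[2])
```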
To date, a huge amount of heterogeneous information passes through electronic computing systems. There is a critical need to analyze an endless stream of data with limited means, and this, in turn, requires structuring the information. One of the steps in solving the data-ordering problem is deduplication. This article discusses a method of removing duplicates using databases and analyzes the results of testing various types of database management systems with different parameter sets.
Keywords: deduplication, database, field, row, text data, artificial neural network, sets, query, software, unstructured data
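A minimal sketch of database-backed deduplication, keying rows by a content hash and removing duplicates with one SQL statement (SQLite syntax; the table and column names are placeholders):

```python
# Hash-based deduplication in a database (SQLite). Names are placeholders.
import hashlib
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE docs (id INTEGER PRIMARY KEY, body TEXT, hash TEXT)")

for body in ["alpha", "beta", "alpha"]:
    digest = hashlib.sha256(body.encode()).hexdigest()
    conn.execute("INSERT INTO docs (body, hash) VALUES (?, ?)", (body, digest))

# Keep the first row of each hash group, delete the rest
conn.execute("DELETE FROM docs WHERE id NOT IN "
             "(SELECT MIN(id) FROM docs GROUP BY hash)")
print(conn.execute("SELECT COUNT(*) FROM docs").fetchone()[0])  # 2 rows remain
```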
Social and pension provision are key processes in the activities of any state, and forecasting their expenses is among the most important problems in the economy. The task of evaluating the effectiveness of the pension fund has been addressed by various methods, including regression analysis. This task is particularly difficult because of the large number of factors determining the activity of the pension fund, such as the number of recipients of old-age pensions, the number of policyholders, self-employed policyholders, recipients of benefits, insured persons, and working pensioners. The main approach of the study is a model competition: variants that violated the meaningful interpretation of the variables or did not fully reflect the behavior of the modeled process were excluded from the resulting set of alternative models, and the final variant was selected using a multi-criteria selection method. It was found that the use of relative variables is important for qualitative modeling of the processes under study. The resulting model shows that an increase in the ratio of the number of employers and the self-employed to the number of insured persons leads to a decrease in the cost of financing social and pension provision. The model can be effectively used for short-term forecasting of the total annual financing of the pension fund department under changing social and macroeconomic factors.
Keywords: pension fund, regression model, model competition, adequacy criteria, forecasting
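The model-competition idea can be sketched as an exhaustive search over regressor subsets, discarding variants whose coefficient signs contradict their meaning and ranking the rest; the data and expected signs below are placeholders:

```python
# Model-competition sketch: fit all regressor subsets, drop variants whose
# coefficient signs violate the expected economic meaning, rank survivors
# by adjusted R^2. Data and expected signs are placeholders.
import itertools
import numpy as np

rng = np.random.default_rng(0)
n = 40
X_all = rng.normal(size=(n, 3))            # candidate relative factors
y = 2.0 - 1.5 * X_all[:, 0] + 0.5 * X_all[:, 2] + rng.normal(0.0, 0.3, n)
expected_signs = [-1, +1, +1]              # meaningful direction per factor

best = None
subsets = itertools.chain.from_iterable(
    itertools.combinations(range(3), k) for k in (1, 2, 3))
for subset in subsets:
    X = np.column_stack([np.ones(n), X_all[:, subset]])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    if any(np.sign(beta[i + 1]) != expected_signs[j]
           for i, j in enumerate(subset)):
        continue                            # violates the variables' meaning
    resid = y - X @ beta
    r2 = 1.0 - resid @ resid / ((y - y.mean()) @ (y - y.mean()))
    adj_r2 = 1.0 - (1.0 - r2) * (n - 1) / (n - X.shape[1])
    if best is None or adj_r2 > best[0]:
        best = (adj_r2, subset)
print("best subset:", best[1], "adjusted R^2:", round(best[0], 3))
```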
Currently, digitalization as a technological tool is penetrating the humanitarian sphere of knowledge, linking the technocratic and humanitarian domains. An example is legal informatics, in which the conceptual apparatus of areas of knowledge that at first glance are quite different is brought together. The desire to abstract (formalize) any knowledge is the most important task in the "convergence" of computer technologies and mathematical methods into a humanitarian sphere that is non-traditional for them. The paper discusses the problems generated by a superficial idea of artificial intelligence. A typical example is the attempt of some authors in jurisprudence to give computer technologies, often referred to by humanities scholars as artificial intelligence, an almost sacred meaning and to endow them with legal personality.
Keywords: artificial intelligence, deep learning, machine learning, hybrid intelligence, adaptive behavior, digital economy, digital law, legal personality of artificial intelligence