The article analyzes modern approaches and technologies for the recycling and reuse of building materials. Waste types such as reinforced concrete, brick, glass, wood, and plastic are considered. Special attention is paid to innovative processing methods for obtaining secondary raw materials (recycling). It is shown that the most promising technologies are crushing concrete, brick and reinforced concrete for reuse in construction, melting scrap metal, glass and plastic, and processing wood waste into boards and fuel. The level of adoption of innovative solutions in Russia is shown to be insufficient. A set of measures is proposed to improve the efficiency of construction waste management: the introduction of advanced technologies, the construction of waste recycling plants, the adoption of a targeted state program, and the improvement of environmental culture.
Keywords: waste, construction debris, Russia, secondary raw materials, reuse, environment, recycling, landfill disposal
This paper analyses intrusion detection techniques and provides recommendations for preventing intrusions in peer-to-peer wireless networks. Such networks are particularly vulnerable to attack because of their openness, dynamically changing topology, reliance on cooperative algorithms, lack of centralised monitoring, absence of a central control point, and lack of a clear line of defence. Intrusion detection techniques exist for wired networks, but they are not directly applicable in a wireless environment. The paper also presents a new intrusion defence method based on intrusion detection in peer-to-peer wireless networks.
Keywords: security, vulnerability, information protection, attack, intrusion, wireless network, mobile network, detection system, IDS, MANET, DoS, DDoS
The article explores the problem of creating aircraft flight models in the Simulink environment. The reference frames in which the transformations are carried out are considered, and the equations of motion used in the simplest converters are given. The initial conditions for the equations are specified: body velocity, angular pitch orientation, the angle between the velocity vector and the body, body rotation rate, initial position, body mass and inertia, the gravity source, the acceleration due to gravity, the empty and total mass of the body, airflow speed, the inertia of an empty and a full body, the flight trajectory, etc. An analysis of converters of aerodynamic forces and moments into a motion trajectory, available in the aerospace package of the Simulink environment, was carried out. Recommendations are given on their use for various modeling purposes. The results of modeling a simple converter with three degrees of freedom are presented.
Keywords: modeling, MatLab, Simulink, equations of motion, aerodynamic torque, flight path, coordinate transformations, reference system, degrees of freedom
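The three-degrees-of-freedom converter discussed above can be illustrated, in a much reduced form, by numerically integrating the translational point-mass equations of motion in the Earth frame. The sketch below is in Python rather than Simulink, includes gravity only (no aerodynamic forces), and uses illustrative parameter values throughout; it is not the article's model.

```python
import math

def simulate_3dof(v0, gamma0, dt=0.01, t_end=2.0, g=9.81):
    """Translational point-mass equations in the Earth frame (gravity only,
    no aerodynamic forces), integrated with the explicit Euler method.
    A toy stand-in for a three-degrees-of-freedom converter block;
    all parameter values are illustrative."""
    x, z = 0.0, 0.0
    vx = v0 * math.cos(gamma0)   # horizontal speed component
    vz = v0 * math.sin(gamma0)   # vertical speed component
    traj = [(x, z)]
    for _ in range(round(t_end / dt)):
        x += vx * dt
        z += vz * dt
        vz -= g * dt             # gravity reduces the vertical speed
        traj.append((x, z))
    return traj

# Launch at 50 m/s, 30 degrees above the horizon, simulate 2 s of flight.
traj = simulate_3dof(v0=50.0, gamma0=math.radians(30.0))
```

In the Simulink converters the same role is played by integrator blocks wired to the force and moment inputs; here the Euler step makes that integration explicit.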
The article discusses the identification and ranking of threats at an important government facility. A classification of threats by type is described. The article discusses the concepts of a "design-basis threat" and "the most dangerous variant of the intruder's actions" and the differences between them. An example of ranking threats and reducing their number to lower the dimensionality of the vulnerability analysis problem is considered, using an important government facility as the example object. Experts are tasked with ranking threats by the level of potential losses to the facility should they be realized. In practice, the realization of an accepted design-basis threat may lead to lower potential losses than the most dangerous variant of the intruder's actions and, as a result, to lower requirements for the effectiveness of the physical protection system.
Keywords: comprehensive security, methods, models, security forces, intruders, important government facilities, clashes between security forces and intruders, ranking of threats
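The expert ranking step described above can be sketched as a simple mean-rank (Borda-style) aggregation. The threat names, the expert panel, and the aggregation rule below are all illustrative assumptions, not the article's actual method or data.

```python
# Hypothetical expert rankings of threats: rank 1 = highest potential losses.
ranks = {
    "expert A": {"sabotage": 1, "armed attack": 2, "insider theft": 3},
    "expert B": {"armed attack": 1, "sabotage": 2, "insider theft": 3},
    "expert C": {"sabotage": 1, "insider theft": 2, "armed attack": 3},
}

# Aggregate: order threats by their mean rank across all experts.
threats = sorted(
    {t for r in ranks.values() for t in r},
    key=lambda t: sum(r[t] for r in ranks.values()) / len(ranks),
)
```

The resulting ordered list can then be truncated to the top-ranked threats, which is one way to reduce the dimensionality of the vulnerability analysis problem.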
The article proposes a general formalized model of the task of processing job descriptions and extracting potential key skills from them, in order to determine the relevance of training areas and possible areas of employment for graduates. The formalized model is used in the software implementation of a job clustering module, based on the obtained sets of key skills, within a comprehensive toolkit for remote career guidance.
Keywords: vacancies, demand for training areas, career guidance, digitalization of career guidance, formalized model, clustering, professions, key skills
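The clustering of vacancies by extracted key-skill sets can be sketched as a greedy grouping by set similarity. The skill lists, the Jaccard measure, and the similarity threshold below are illustrative assumptions, not the article's formalized model.

```python
def jaccard(a, b):
    """Similarity of two skill sets: |intersection| / |union|."""
    return len(a & b) / len(a | b)

def cluster_vacancies(vacancies, threshold=0.5):
    """Greedy clustering: a vacancy joins the first cluster whose first
    member (acting as the representative) is similar enough, else it
    starts a new cluster."""
    clusters = []
    for name, skills in vacancies:
        for cluster in clusters:
            rep = cluster[0][1]
            if jaccard(skills, rep) >= threshold:
                cluster.append((name, skills))
                break
        else:
            clusters.append([(name, skills)])
    return clusters

# Hypothetical vacancies with already-extracted key-skill sets.
vacancies = [
    ("backend dev", {"python", "sql", "docker"}),
    ("data engineer", {"python", "sql", "spark"}),
    ("frontend dev", {"javascript", "css", "html"}),
]
clusters = cluster_vacancies(vacancies)
```

The resulting clusters group vacancies with overlapping skill demands, which is the kind of output a career-guidance module could map back onto training areas.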
The article discusses the use of simplex methods of experiment planning to determine the optimal composition of composite building materials. Composite building materials are multicomponent systems, so their properties depend on a large number of factors that are diverse in nature and influence. Orthogonal plans cannot adequately describe experimental data over a wide range of varying factors; the article therefore proposes using Scheffe simplex-lattice plans. A complete third-order model was developed to determine the optimal composition of a filled cement composite, and the corresponding regression equation was written. The adequacy of the equation was confirmed at the control points of the plan using Student's criterion. The proposed planning method can be used to optimize the composition of multicomponent systems.
Keywords: multicomponent system, optimization methods, composite building material, experiment planning, simplex plans, Scheffe's polynomial, regression equation
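The complete third-order Scheffe model for a three-component mixture can be fitted directly on the {3,3} simplex-lattice, which has exactly as many points as the model has coefficients. The sketch below uses synthetic responses generated from known coefficients rather than the article's cement-composite data, so the fit recovers them exactly.

```python
import numpy as np

def scheffe_cubic_row(x1, x2, x3):
    """Terms of the complete third-order (cubic) Scheffe polynomial
    for a three-component mixture."""
    return [
        x1, x2, x3,
        x1 * x2, x1 * x3, x2 * x3,
        x1 * x2 * (x1 - x2), x1 * x3 * (x1 - x3), x2 * x3 * (x2 - x3),
        x1 * x2 * x3,
    ]

# {3,3} simplex-lattice: all mixtures with component fractions i/3, i+j+k = 3.
points = [(i / 3, j / 3, (3 - i - j) / 3)
          for i in range(4) for j in range(4 - i)]
X = np.array([scheffe_cubic_row(*p) for p in points])

# Synthetic responses from known coefficients (illustrative, not measured
# data): the design is saturated, so the system is square and exact.
beta_true = np.arange(1.0, 11.0)
y = X @ beta_true
beta_fit = np.linalg.solve(X, y)   # 10 lattice points, 10 coefficients
```

With real experimental responses the same design matrix would be used, and adequacy would then be checked at additional control points, as the article describes.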
motion of the body are obtained, which can only be solved numerically. The equilibrium equations are solved and the basic stationary modes of body motion are found. The main result is the shallowest gliding mode of the body. A set of programs was written in the MATLAB computer mathematics system that searches for numerical solutions and outputs trajectories. The motion in the modes under consideration is simulated, and the stability of the main modes is examined by numerical calculations.
Keywords: body, gliding mode, stability, geometric dimensions
The paper examines current issues of modeling and forecasting market parameters for transport companies providing services for the transportation of industrial enterprises' goods, such as the cost, time, speed and volume of delivery of finished products to consumers, and also assesses the potential capability of transport companies to provide the required quantity and quality of transport and logistics services. The aim of the study is to determine the region of reliable forecasts of transportation indicators for each interval value of the delivery leg, taking into account the company's market share. The time parameters of cargo transportation were modeled on the basis of road transportation conditions and the time of year. When implementing the modeling procedures, the required statistical basis for travel time and route distance parameters was formed from the data of specialized applications for analyzing the transport and logistics performance of freight vehicles. A family of forecast curves was obtained for various variants of forecast models of speed and travel time, as well as for interval values of delivery legs, for the initial set of transport and logistics companies.
Keywords: statistical forecasting, transportation efficiency, benchmark models, tariffs for cargo transportation, piecewise linear approximation, areas of reliable forecasts, cargo transportation parameters, benchmark analysis, transport company market
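Interval-based forecasting of delivery time by delivery leg can be sketched as a piecewise model: a separate mean speed per distance interval. The intervals and speeds below are illustrative assumptions, not the article's fitted curves.

```python
# Hypothetical mean speeds per delivery-leg (distance) interval, km and km/h.
SPEED_BY_LEG = [
    ((0, 100), 45.0),     # short urban/regional legs
    ((100, 500), 60.0),   # medium intercity legs
    ((500, 3000), 70.0),  # long-haul legs
]

def forecast_time(distance_km):
    """Piecewise forecast of delivery time (hours) for a given delivery leg."""
    for (lo, hi), speed in SPEED_BY_LEG:
        if lo <= distance_km < hi:
            return distance_km / speed
    raise ValueError("distance outside the modelled range")

t_hours = forecast_time(250)
```

Each interval would in practice carry its own confidence bounds, which is how the "region of reliable forecasts" per leg interval can be delimited.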
The work introduces the concept of "post-interpretation" of images and proposes a model of a post-recognition interpreter for its algorithmic implementation. The recognition results of the initial images entering the recognition system are treated as post-images, and an artificial neural network is used as the post-recognizer. To assess the effectiveness of the model, an "expediency criterion" is proposed, and numerical examples illustrate its use in systems for recognizing and interpreting images under high risk. Preliminary experimental results are presented for a model that recognizes speech commands as part of an interactive operator's manual for performing various tasks, together with an assessment of its effectiveness.
Keywords: intelligent data processing system, image interpretation, recognition reliability, decision-making criterion, artificial neural network
The development of business analytics, decision-making and resource planning systems is one of the most important components of almost any enterprise. In these matters, enterprises and production facilities of the penitentiary system are no exception. The paper examines the problem of the relationship between existing databases and statistical reporting forms of the production, economic and labor sectors of the penitentiary system. It has been established that indirectly interrelated parameters are quite difficult to compare due to different data recording systems, as well as approved statistical forms. One of the first steps in solving this problem could be the introduction of a generalized data indexing system. The paper discusses data indexing systems, the construction of their hierarchical structures, as well as the possibility of practical application using SQL. Examples of implementation using ORM technology and the Python language are considered.
Keywords: databases, indexing, ORM, SQL, Python, manufacturing sector, economic indicators, penitentiary system
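The generalized hierarchical indexing idea can be sketched with a materialized-path index code, so that indicators from different reporting forms can be matched by a shared prefix. For self-containment the sketch uses the standard-library sqlite3 module with raw SQL rather than an ORM; the index codes and indicator names are illustrative assumptions.

```python
import sqlite3

# A hierarchical index as a materialized path: "1" -> "1.2" -> "1.2.1".
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE indicator (
        idx  TEXT PRIMARY KEY,   -- hierarchical index code
        name TEXT NOT NULL
    )
""")
rows = [
    ("1",     "production sector"),
    ("1.1",   "output volume"),
    ("1.2",   "labour"),
    ("1.2.1", "employed convicts"),
    ("2",     "economic sector"),
]
conn.executemany("INSERT INTO indicator VALUES (?, ?)", rows)

def subtree(prefix):
    """All indicators under a given index node, via prefix match on the path."""
    cur = conn.execute(
        "SELECT idx, name FROM indicator WHERE idx = ? OR idx LIKE ? ORDER BY idx",
        (prefix, prefix + ".%"),
    )
    return cur.fetchall()

production = subtree("1")
```

With an ORM such as those usable from Python, the same table would be a mapped class and `subtree` a query method; the prefix-match logic is unchanged.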
The traditional cycle of manufacturing high-tech electronic products was examined, and a methodology for its optimization was developed. The interrelation between the stages of design documentation development and the electronic structure of the product was established. The methodology significantly reduced the manufacturing time of products by parallel development of the electronic structure alongside the product. The rationale for applying the methodology in design was provided.
Keywords: electronic structure of the product, eBOM, 1C:PLM, procurement of components
The article examines the experience of using correlation analysis to assess the nature of the relationships between various parameters of the organization and activities of fire and rescue garrisons in Russia. The analysis of absolute and relative parameters is carried out. The influence of the size of the service areas of fire departments on other response parameters, as well as the negative impact of population density and the number of fires on the speed of fire trucks is shown. The paper proposes a method for calculating the built-up area of settlements and a method for calculating the coefficient of non-straightness of the street and road network along arbitrary routes.
Keywords: fire service, analysis, fire statistics, large city, urbanism
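The kind of correlation assessment described above reduces to computing Pearson coefficients between pairs of garrison parameters. The figures below are hypothetical stand-ins for service area and truck speed, chosen only to illustrate the negative relationship the article reports; they are not the article's data.

```python
import numpy as np

# Hypothetical data: fire-station service area (km^2) vs average travel
# speed of fire trucks (km/h).
area  = np.array([10, 20, 35, 50, 70, 90])
speed = np.array([42, 40, 37, 35, 31, 28])

# Pearson correlation coefficient between the two parameters.
r = np.corrcoef(area, speed)[0, 1]
```

A strongly negative `r` here mirrors the article's finding that larger response parameters (and denser areas) go together with lower truck speeds; in practice the same computation is repeated over the full matrix of absolute and relative parameters.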
Roads occupy an important place in the life of almost every person. The quality of the surface is the most significant characteristic of the roadway. Many systems exist to evaluate it, including those that analyze the road surface from video streams. The video is split into frames, and the resulting images are used to assess road quality directly. Frame splitting in such systems relies on specialized software tools, and a detailed analysis is needed to understand how effective a particular tool is. In this article, OpenCV, MoviePy and FFMpeg are selected as the software tools for analysis. The research material is a two-minute MP4 video of the road surface with a frame rate of 29.97 frames/s. The average time to retrieve one frame from the video is used as the efficiency indicator. For each of the three tools, 5 experiments were conducted in which the frame size in pixels was successively doubled: 40,000, 80,000, 160,000, 320,000, 640,000. Each program shows a linear, O(n), dependence of the average frame-retrieval time on resolution; however, FFMpeg has both the lowest absolute times and the lowest growth rate, and is therefore the most effective of the three tools (compared with OpenCV and MoviePy).
Keywords: comparison, analysis, effectiveness, software tool, library, program, video splitting, frame size, resolution, road surface
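The efficiency metric used in the comparison, average wall-clock time per retrieved frame, can be sketched as a small benchmarking harness. The stub decoder below is an assumption standing in for the real library calls (with OpenCV this would be reading from a cv2.VideoCapture; the article's actual harness is not reproduced here).

```python
import time

def average_frame_time(get_frame, n_frames):
    """Mean wall-clock time to retrieve one frame over n_frames retrievals,
    the efficiency indicator used in the comparison."""
    start = time.perf_counter()
    for i in range(n_frames):
        get_frame(i)
    return (time.perf_counter() - start) / n_frames

# Stub "decoder" used so the sketch is self-contained; a real benchmark
# would pass the frame-reading call of OpenCV, MoviePy or an FFMpeg wrapper.
def stub_decoder(index):
    return bytearray(40000)  # pretend frame of 40,000 pixels

t_avg = average_frame_time(stub_decoder, 1000)
```

Running the same harness over the five frame sizes for each tool yields the time-vs-resolution curves whose slopes the article compares.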
Biometric authentication methods are among the most reliable methods of identity verification. There are two types: static and dynamic. Static methods include fingerprint scanning, 3D facial recognition, vein patterns, retina scanning, etc.; dynamic methods include voice verification, keystroke dynamics and signature recognition. As of today, static methods have the lowest type I and II error rates, because their principle of operation is based on capturing a person's biometric characteristics, which do not change throughout their lifetime. Unfortunately, the same property that yields such low error rates is also a drawback when deploying these methods widely across internet services: if biometric data is compromised, the user can no longer safely use the method anywhere. Dynamic biometric authentication methods are based on a person's behavioral characteristics, allowing the user to control the information entered for authentication. However, behavioral characteristics are more prone to change than static ones, resulting in significantly different type I and II error rates. The aim of this work is to analyze one of the dynamic methods of biometric authentication, which can be used in most internal and external information systems as a tool for authorization or for confirming user intentions. Biometric user authentication based on a handwritten signature relies on comparing unique biometric features extracted from the signature image. These features fall into two categories: static and dynamic. Static features are extracted from the signature image and are based on characteristics such as point coordinates and the total length and width of the signature. Dynamic features are based on the dependence of the signature point coordinates on time. The more unique features are identified, and the more accurately each is weighted, the better the type I and II error rates will be.
This work focuses on algorithms that extract unique features from the static characteristics of a signature, since most of a signature's distinctive traits can be identified from the way its individual segments are written.
Keywords: static algorithms, metrics, signature length, scaling, signature angle
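The static features named above, point coordinates, total length and width of the signature, can be computed directly from the extracted point list. The toy point set below is an illustrative stand-in for real digitized signature coordinates.

```python
import math

def static_features(points):
    """Static features from a signature's point coordinates: total stroke
    length and bounding-box width/height (an illustrative subset of the
    features such algorithms extract)."""
    length = sum(
        math.dist(points[i], points[i + 1]) for i in range(len(points) - 1)
    )
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    return {
        "length": length,
        "width":  max(xs) - min(xs),
        "height": max(ys) - min(ys),
    }

# A toy "signature": four points tracing a right-angled stroke.
feats = static_features([(0, 0), (3, 0), (3, 4), (0, 4)])
```

In a verification pipeline these values would be normalized (e.g. by scaling, as the keywords suggest) before being weighted and compared against a reference sample.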
Currently, to access information contained in autonomous and external information systems, a user must pass an authorization process using modern methods of identity verification such as password protection, one-time codes, electronic-signature-based protection, etc. These methods have always worked well and continue to provide secure access; however, biometric authentication methods are more reliable when access to confidential information must be limited to a single user. Today, there are two types of biometric authentication methods: static and dynamic. Static methods are based on a person's biological characteristics, which remain with them throughout their life, while dynamic methods are based on a person's behavioral characteristics. Static methods are considered among the most accurate, because most biometric parameters do not change over a lifetime. However, such a method should only be used if the chance of data compromise is very low, because in the event of a leak the user will not be able to continue using methods of this type anywhere else. Dynamic methods, owing to their behavioral nature, do not have sufficiently satisfactory type I and II error rates, as these depend directly on the user's psychological and physical state. However, unlike static methods, the user can control the information that will serve as the secret key for future authorization, so in case of a leak the user can always change the contents of the key for current and future services. This work examines one of these dynamic methods of biometric authentication: verification by handwritten signature. This method is considered more attractive than its counterparts because, given satisfactory type I and II error rates, it can be applied in most existing services as a tool for authentication and for confirming user intentions when signing various types of documents.
The article discusses the main algorithms for verifying handwritten signatures by identifying unique dynamic features, dependent on the temporal and coordinate values of the analyzed samples of handwritten signatures.
Keywords: dynamic algorithms, feature extraction, signature writing time, proximity of point coordinate functions, Fourier transform
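One way to compare the temporal coordinate functions of signature samples, in the spirit of the Fourier-transform feature named in the keywords, is to compare their normalised magnitude spectra. The toy x(t) trajectories below are illustrative assumptions, not real signature recordings, and this is a sketch of one possible dynamic feature, not the article's exact algorithm.

```python
import numpy as np

def spectrum(signal):
    """Normalised magnitude spectrum of a coordinate-vs-time signal;
    the DC term is dropped to ignore the absolute pen position."""
    mags = np.abs(np.fft.rfft(np.asarray(signal, dtype=float)))[1:]
    return mags / np.linalg.norm(mags)

def spectral_distance(a, b):
    """Euclidean distance between the normalised spectra of two samples."""
    return float(np.linalg.norm(spectrum(a) - spectrum(b)))

# Toy x(t) trajectories of equal length:
t = np.linspace(0.0, 2.0 * np.pi, 64)
genuine = np.sin(t)                          # reference sample
repeat  = np.sin(t) + 0.01 * np.cos(3 * t)   # a close repetition
forgery = np.sign(np.sin(t))                 # same rhythm, different shape

d_ok  = spectral_distance(genuine, repeat)
d_bad = spectral_distance(genuine, forgery)
```

A verification decision would then threshold such distances, with the threshold tuned against the desired type I and II error trade-off.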
The quality of training of incompletely connected neural networks based on decision roots is discussed. Using limited data on patients with clinically diagnosed Alzheimer's disease and conditionally healthy patients as an example, a decision root and the corresponding neural network structure are found by preprocessing the data. Training results for an incompletely connected artificial neural network of this type are demonstrated for the first time; they yielded a network with a level of accuracy acceptable for the practical use of the obtained network in supporting medical decision making, in the considered example the diagnosis of Alzheimer's disease.
Keywords: neural networks, complex assessment mechanisms, decision roots, criteria trees, convolution matrices, data preprocessing
The paper presents a brief overview of publications describing the experience of using mathematical modeling methods to solve various problems. A multivariate piecewise linear regression model of a steel company was built using the continuous form of the maximum consistency method. To assess the adequacy of the model, the following criteria were used: the average relative approximation error, the continuous criterion of consistency of behavior, and the sum of absolute approximation errors. It is concluded that the resulting model has sufficient accuracy and can be used for forecasting.
Keywords: mathematical modeling, piecewise linear regression, least modulus method, continuous form of maximum consistency method, steel company
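Two of the adequacy criteria named above, the average relative approximation error and the sum of absolute approximation errors, are simple to compute once a model's predictions are available. The fitted and observed values below are illustrative numbers, not the steel company's data.

```python
def average_relative_error(y_true, y_pred):
    """Average relative approximation error, in percent."""
    n = len(y_true)
    return 100.0 / n * sum(abs(t - p) / abs(t) for t, p in zip(y_true, y_pred))

def sum_abs_errors(y_true, y_pred):
    """Sum of absolute approximation errors (sum of error moduli)."""
    return sum(abs(t - p) for t, p in zip(y_true, y_pred))

# Illustrative observed values and model predictions:
y_true = [100.0, 120.0, 150.0, 160.0]
y_pred = [ 98.0, 123.0, 148.0, 161.0]

are = average_relative_error(y_true, y_pred)
sae = sum_abs_errors(y_true, y_pred)
```

The third criterion, the continuous criterion of consistency of behavior, additionally checks that predicted and observed values move in the same direction between observations.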
Fifth-generation networks are of great interest for various studies. One of the most important technologies for the efficient use of resources in fifth-generation networks is Network Slicing. The main purpose of this work is to model the probabilistic characteristics of the blocking of requests for access to the radio resources of a wireless network. The main task is to analyze one implementation variant of a two-service model of a wireless radio access scheme with two slices and BG traffic. The dependence of the blocking probability on the arrival intensity of requests of various types was examined; it turned out that the blocking probability of a type i request has the form of an exponential function. According to the results of the analysis, request blocking occurs predictably, given the nature of the incoming traffic, and so far no significant drawbacks of the considered model have been found. The developed model is of interest for future, deeper and longer-term research, for example using simulation modeling to choose optimal network parameters.
Keywords: queuing system, 5G, two-service queuing system, resource allocation, Network Slicing, elastic traffic, minimum guaranteed bitrate
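The article's two-slice, two-service model is not reproduced here, but the classical Erlang B formula gives the blocking probability for the simpler baseline of a single slice with a fixed number of resource units and Poisson arrivals, and illustrates the kind of blocking-vs-load behavior under discussion.

```python
def erlang_b(c, a):
    """Erlang B blocking probability for c resource units and an offered
    load of a erlangs, via the numerically stable recursion
    B(0) = 1,  B(n) = a*B(n-1) / (n + a*B(n-1))."""
    b = 1.0
    for n in range(1, c + 1):
        b = a * b / (n + a * b)
    return b

# Blocking probability for one slice with 10 resource units at 5 erlangs
# of offered load (illustrative parameter values).
p_block = erlang_b(10, 5.0)
```

Sweeping the offered load `a` reproduces the rapid, near-exponential growth of the blocking probability with arrival intensity that the article observes for each request type.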
The subject of this article is the development of a behavior pattern with AI elements for an opponent bot in the single-player game Steal Tower, the essence of which is to collect resources and build a tower faster than one's opponents. To create the illusion that the player is competing against real people, a stochastic simulation model based on the Monte Carlo method for Markov chains was developed. Based on its test results, balanced system parameters were determined and embedded in the bot's behavioral pattern, which is implemented using the Enum AIStates enumeration consisting of five states: Idle (inactivity), GoTo (movement), GoToWarehouse (return to the warehouse), Win (victory), and Loose (defeat). For each of them, functions for the optimal behavior of the bot were developed and are given in the article. For the GoTo state, for example, functions were created that weigh the benefits of different types of behavior: to steal or to collect, to walk to the warehouse or to the tower.
Keywords: game intelligence, behavioral pattern, live world emulation, bot behavior scenario, state structure, Markov chains, Monte Carlo method, simulation model, Unity environment, C# language
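The state machine and its Monte Carlo exploration can be sketched as follows. The original implementation is C# in Unity; this Python counterpart mirrors the five-state Enum, and the transition probabilities are illustrative assumptions, not the tuned values from the article.

```python
import random
from enum import Enum, auto

class AIStates(Enum):
    """Python counterpart of the bot's Enum AIStates."""
    Idle = auto()
    GoTo = auto()
    GoToWarehouse = auto()
    Win = auto()
    Loose = auto()

# Illustrative transition probabilities between states.
TRANSITIONS = {
    AIStates.Idle: [(AIStates.GoTo, 0.8), (AIStates.Idle, 0.2)],
    AIStates.GoTo: [(AIStates.GoToWarehouse, 0.6), (AIStates.GoTo, 0.3),
                    (AIStates.Idle, 0.1)],
    AIStates.GoToWarehouse: [(AIStates.Idle, 0.5), (AIStates.Win, 0.25),
                             (AIStates.Loose, 0.25)],
    AIStates.Win: [(AIStates.Win, 1.0)],     # terminal
    AIStates.Loose: [(AIStates.Loose, 1.0)], # terminal
}

def simulate(steps, seed=0):
    """Monte Carlo walk over the Markov chain until a terminal state."""
    rng = random.Random(seed)
    state, path = AIStates.Idle, [AIStates.Idle]
    for _ in range(steps):
        nxt, = rng.choices(*zip(*TRANSITIONS[state]))  # weighted pick
        state = nxt
        path.append(state)
        if state in (AIStates.Win, AIStates.Loose):
            break
    return path

path = simulate(100)
```

Running many such walks and adjusting the transition weights until win/loss rates look plausible is, in outline, how balanced parameters for the bot's behavioral pattern can be selected.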
Stepper motors are often used in automated laser cutting systems. The control circuit of a stepper motor requires a special electronic device, a driver, which receives logic signals as input and changes the current in the motor windings to provide the required motion parameters. This study evaluated three stepper motor drivers (PLDS880, OSM-42RA, OSM-88RA) to determine the feasibility of their use. To control the system, software code was written and connected to the controller via a link board. With each driver, in different modes, optimal parameters were selected (initial speed, final speed and acceleration), i.e., those providing carriage movement without stalling over ten passes with minimum travel time. The results of the experiments are presented as tables.
Keywords: laser, laser cutting, automation, technological process, stepper motor, performance, driver, controller, control circuit, optimal parameters
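The three tuned parameters, initial speed, final speed and acceleration, define a trapezoidal speed ramp for each carriage move. The sketch below generates such a per-step profile; the numeric values are illustrative, not the experimentally selected driver settings.

```python
def speed_profile(v0, v1, a, n_steps):
    """Per-step speed (steps/s) of a trapezoidal ramp: accelerate from the
    initial speed v0 at rate a per step up to the final speed v1, then
    decelerate symmetrically so the move ends back at v0."""
    speeds = []
    for i in range(n_steps):
        up = v0 + a * i                     # acceleration-phase limit
        down = v0 + a * (n_steps - 1 - i)   # deceleration-phase limit
        speeds.append(min(v1, up, down))
    return speeds

# Illustrative move: start at 200 steps/s, top speed 1000 steps/s,
# ramp of 100 steps/s per step, 20 steps total.
profile = speed_profile(v0=200, v1=1000, a=100, n_steps=20)
```

Choosing v0, v1 and a too aggressively makes the motor stall mid-move, which is why the experiments searched for the fastest stall-free combination over ten passes.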
The validity of analytical models of the process of one-parameter selective assembly of two elements was evaluated by comparison with simulation results. A series of machine experiments, including one-factor and two-factor experiments, was carried out. At the accepted levels of variation of the factors, the confidence interval for the probability of yield of good products from the initial population of elements that did not pass sorting was determined using Student's criterion at the given significance level and number of degrees of freedom. Comparison of the simulation results showed that the specified process indicator determined by the analytical model was reached within the confidence intervals for all experiments, with relatively small deviations from their centres.
Keywords: selective assembly, analytical model, simulation model, measurement error, simulation results
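The simulation side of such a comparison can be sketched as a Monte Carlo experiment. All numeric values, the pairwise mating simplification, and the use of a normal approximation in place of the article's Student's criterion are illustrative assumptions.

```python
import math
import random

def simulate_yield(n_pairs, seed=1):
    """Monte Carlo sketch of one-parameter selective assembly of two
    elements (e.g. shaft and hole): both dimensions ~ N(0, sigma) about
    the nominal, parts are sorted into 3 equal tolerance groups and mated
    within a group; a pair is 'good' if the clearance is within the fit
    tolerance."""
    rng = random.Random(seed)
    sigma, half_field, fit_tol, n_groups = 1.0, 3.0, 1.0, 3
    width = 2.0 * half_field / n_groups

    def group(x):
        return min(n_groups - 1, int((x + half_field) / width))

    good = mated = 0
    for _ in range(n_pairs):
        shaft = rng.gauss(0.0, sigma)
        hole = rng.gauss(0.0, sigma)
        if abs(shaft) > half_field or abs(hole) > half_field:
            continue  # out-of-tolerance parts are rejected before sorting
        if group(shaft) != group(hole):
            continue  # parts from different sorting groups are not mated
        mated += 1
        if abs(hole - shaft) <= fit_tol:
            good += 1
    p = good / mated
    # 95% interval; a normal approximation stands in for Student's
    # criterion (adequate at this sample size).
    half = 1.96 * math.sqrt(p * (1.0 - p) / mated)
    return p, (p - half, p + half)

p_good, (ci_lo, ci_hi) = simulate_yield(20000)
```

Validating an analytical model then amounts to checking that its predicted yield probability falls inside such simulated confidence intervals across the experiment series.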
This article presents a newly developed calculation methodology, which includes the provisions of standard calculations and takes into account the peculiarities of eccentrically compressed reinforced concrete structures operating at large eccentricities of load application. Adjustments were made to the calculation methodology to account for the following: the standard methodology uses the maximum tensile strength of the reinforcement, whereas proposals are developed here to determine the actual resistance of the tensile reinforcement, which in fact will be significantly lower than this limit. Proposals are given that take into account the limiting deformations of the concrete, which in turn is a key quantity for determining the resistance of the tensile reinforcement in the cross section. The article also presents the results of experimental studies of a flexible reinforced concrete column operating with a load eccentricity of e0 = 0.32h. Theoretical calculations and experimental studies were analyzed and appropriate conclusions drawn. A formula was developed to determine the actual resistance of the tensile steel reinforcement at the moment preceding failure, and the calculation algorithm was compiled. When comparing theoretical and experimental strength, the difference did not exceed 5%.
Keywords: steel, heavy concrete, reinforced concrete, testing, stand