The paper investigates the use of architectural combinatorics to address the design problems of multifunctional residential complexes under digital transformation. The main combinatorial methods, including conceptual and formal approaches, are considered. The paper traces the evolution of the method from constructivism onward and describes the role of modern digital technologies such as BIM, parametric modeling, machine learning, and artificial intelligence in implementing combinatorial approaches. Attention is given to sustainable architecture and the optimization of spatial solutions. Successful and problematic project examples are analyzed, as are the limitations of these technologies and the ethical and social aspects of their use. The conclusions substantiate the significance of the method in the context of contemporary challenges.
Keywords: architectural combinatorics, combinatorial methods, multifunctional residential complex, sustainable development, sustainable architecture, adaptive architecture, digital technologies, BIM, parametric modeling, machine learning, artificial intelligence
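As a toy illustration of the combinatorial enumeration that such digital tools automate, the sketch below enumerates mixes of apartment modules that fit a given floor plate; the module names and areas are invented for the example and are not taken from the paper.

```python
from itertools import combinations_with_replacement

# Hypothetical module catalogue: name -> floor area in m^2 (illustrative values).
MODULES = {"studio": 28, "one_bed": 42, "two_bed": 60, "retail": 80}

def floor_variants(target_area, tolerance, max_modules=6):
    """Enumerate module mixes whose total area falls within the target band."""
    names = sorted(MODULES)
    variants = []
    for k in range(1, max_modules + 1):
        for combo in combinations_with_replacement(names, k):
            total = sum(MODULES[m] for m in combo)
            if abs(total - target_area) <= tolerance:
                variants.append((combo, total))
    return variants

# Example: module mixes that fill a 170 m^2 floor plate within +/- 5 m^2.
options = floor_variants(170, 5)
```

In practice a parametric or BIM tool would apply many more constraints (adjacency, daylight, circulation), but the core step is this same constrained enumeration of a combinatorial space.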
The article is devoted to the development of a tool for the automated generation of timing constraints in circuit design based on field-programmable gate arrays (FPGAs). The paper analyzes current interface tools for generating design constraints. A data structure for the constraint-generation tool and algorithms for reading and writing files in the Synopsys Design Constraints format have been developed. Based on these structures and algorithms, a software module was implemented and subsequently integrated into the FPGA design flow X-CAD.
Keywords: computer-aided design, field programmable gate array, automation, design constraints, development, design route, interface, algorithm, tool, static timing analysis
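The read/write algorithms for Synopsys Design Constraints files can be sketched as a round-trip over a small subset of the format. The sketch below handles only `create_clock` lines; the actual X-CAD module covers the full SDC grammar, and the field layout here is an assumption for illustration.

```python
import re

# Matches: create_clock -name <name> -period <ns> [get_ports <port>]
CLOCK_RE = re.compile(
    r"create_clock\s+-name\s+(\S+)\s+-period\s+([\d.]+)\s+\[get_ports\s+([^\]\s]+)\]"
)

def read_sdc(text):
    """Parse create_clock lines into dictionaries (name, period in ns, port)."""
    return [
        {"name": m.group(1), "period": float(m.group(2)), "port": m.group(3)}
        for m in CLOCK_RE.finditer(text)
    ]

def write_sdc(clocks):
    """Serialize clock records back to SDC text."""
    return "\n".join(
        "create_clock -name {name} -period {period} [get_ports {port}]".format(**c)
        for c in clocks
    )

sdc = "create_clock -name sys_clk -period 10.0 [get_ports clk_i]"
clocks = read_sdc(sdc)
assert write_sdc(clocks) == sdc  # lossless round-trip for this subset
```

A lossless round-trip like this is what lets a constraint editor regenerate files that downstream static timing analysis can consume unchanged.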
The article presents an analysis of the application of the Socratic method for selecting machine learning models in corporate information systems. The study aims to explore the potential of utilizing the modular architecture of Socratic Models for integrating pretrained models without the need for additional training. The methodology relies on linguistic interactions between modules, enabling the combination of data from various domains, including text, images, and audio, to address multimodal tasks. The results demonstrate that the proposed approach holds significant potential for optimizing model selection, accelerating decision-making processes, and reducing the costs associated with implementing artificial intelligence in corporate environments.
Keywords: Socratic method, machine learning, corporate information systems, multimodal data, linguistic interaction, business process optimization, artificial intelligence
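The modular, language-mediated composition described above can be sketched as a pipeline in which a vision module's textual output becomes part of a language module's prompt. Both module functions below are trivial stand-ins, not real model APIs; in the Socratic Models setting they would be pretrained networks combined without additional training.

```python
def vision_module(image_id):
    # Stand-in for a pretrained image-to-text captioner.
    captions = {"invoice_042": "a scanned invoice with a table of line items"}
    return captions.get(image_id, "an unrecognized document")

def language_module(prompt):
    # Stand-in for a pretrained LLM; a trivial rule replaces it here.
    if "invoice" in prompt:
        return "route to accounts-payable model"
    return "route to general document model"

def socratic_pipeline(image_id):
    """Compose modules through natural language alone, no joint retraining."""
    caption = vision_module(image_id)
    prompt = f"The image shows {caption}. Which corporate ML model should handle it?"
    return language_module(prompt)

decision = socratic_pipeline("invoice_042")
```

The design point is that the only interface between modules is text, so swapping in a different captioner or LLM changes nothing else in the pipeline.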
The article examines the modular structure of interactions between various models based on the Socratic dialogue. The research aims to explore the possibilities of synthesizing neural networks and system analysis using Socratic methods for managing corporate IT projects. The application of these methods enables the integration of knowledge stored in pre-trained models without additional training, facilitating the resolution of complex management tasks. The research methodology is based on analyzing the capabilities of multimodal models, their integration through linguistic interactions, and system analysis of key aspects of IT project management. The results include the development of a structured framework for selecting suitable models and generating recommendations, thereby improving the efficiency of project management in corporate environments. The scientific significance of the study lies in the integration of modern artificial intelligence approaches to implement system analysis using multi-agent solutions.
Keywords: neural networks, system analysis, Socratic method, corporate IT projects, multimodal models, project management
The article is devoted to automating the management of road construction works at a manufacturing enterprise. Among Russia's transport links, highways rank first in total length. Building new roads and repairing and bringing existing roads up to regulatory requirements is a complex process that can be characterized as a project. The article formalizes the process of project-oriented management of road construction works and defines the project constraints. The major milestones of project-oriented management are identified, including the initialization and implementation stages. The categories of system users and their functions are defined, and a class diagram of the information system for managing road construction works is provided. An algorithm for an automated, project-oriented management system for road construction works is developed and described in detail. The calculation of the percentage of project readiness is formalized on the basis of significance coefficients. Examples of implementing the algorithm stages and generating analytical reports in the information system are given, and the generated reports are described in detail. The economic efficiency of the proposed automation system is substantiated.
Keywords: road construction works, project-oriented management, highway, automation, reporting, significance coefficient, project, project resources, performance indicator, construction, repair
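The readiness calculation can be illustrated as a weighted share of completed work, where each project stage carries a significance coefficient. The stage names and weights below are hypothetical, not values from the article.

```python
def project_readiness(stages):
    """stages: list of (significance_coefficient, fraction_complete in 0..1)."""
    total_weight = sum(w for w, _ in stages)
    if total_weight == 0:
        return 0.0
    return round(100.0 * sum(w * done for w, done in stages) / total_weight, 2)

stages = [
    (0.2, 1.0),   # earthworks: finished
    (0.5, 0.5),   # base and asphalt layers: half done
    (0.3, 0.0),   # road furniture and marking: not started
]
readiness = project_readiness(stages)  # -> 45.0
```

Normalizing by the total weight keeps the figure meaningful even if the coefficients do not sum to one.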
The article presents the results of a numerical experiment comparing the accuracy of neural network object recognition in images under various types of dataset augmentation. It motivates the need for adaptive augmentation approaches that minimize the use of image transformations which may reduce recognition accuracy. The author considers the common random and automatic augmentation approaches, as well as a newly developed method of adaptive dataset expansion based on a reinforcement learning algorithm. The operation of each approach and its advantages and disadvantages are described. The developed method, built on the Deep Q-Network algorithm, is described both at the algorithmic level and at the level of the main module of the software package. Particular attention is paid to reinforcement learning and to the use of a neural network that approximates and updates the Q-function during training, which underlies the developed method. The experimental results show the advantage of reinforcement-learning-based dataset expansion on the example of the SqueezeNet v1.1 classification model. Recognition accuracy was compared across expansion methods using the same neural network classifier parameters, with and without pre-trained weights. The resulting accuracy gain over the other methods ranges from 2.91% to 6.635%.
Keywords: dataset, extension, neural network models, classification, image transformation, data replacement
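The selection loop of such a method can be illustrated in simplified form. The paper's method uses a Deep Q-Network; the sketch below replaces the network with a plain value table (a bandit-style Q-update) to show the core idea: the agent picks an augmentation, observes the change in validation accuracy as reward, and updates its estimates. The rewards here are simulated numbers, not measurements.

```python
import random

ACTIONS = ["rotate", "flip", "color_jitter", "crop"]
q = {a: 0.0 for a in ACTIONS}
alpha, epsilon = 0.3, 0.2

def simulated_reward(action):
    # Stand-in for "validation accuracy delta after training with this augmentation".
    base = {"rotate": 0.4, "flip": 0.6, "color_jitter": -0.2, "crop": 0.3}
    return base[action] + random.uniform(-0.05, 0.05)

random.seed(0)
for step in range(500):
    if random.random() < epsilon:            # explore a random augmentation
        action = random.choice(ACTIONS)
    else:                                    # exploit the best-known augmentation
        action = max(q, key=q.get)
    reward = simulated_reward(action)
    q[action] += alpha * (reward - q[action])  # incremental Q-value update

best = max(q, key=q.get)  # converges to "flip" under these simulated rewards
```

In the full method, a neural network approximates the Q-function over image states, so the chosen transformation can depend on the input rather than being globally fixed.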
The transition from scheduled maintenance and repair of equipment to maintenance based on its actual technical state requires new methods of data analysis based on machine learning. Modern data collection systems, such as robotic unmanned complexes, generate large volumes of graphic data in various spectra. The growth in data volume raises the task of automating their processing and analysis to identify defects in high-voltage equipment. This article analyzes the features of applying computer vision algorithms to infrared images of high-voltage equipment at power plants and substations and presents a method for their analysis that can be used to build intelligent decision support systems for technical diagnostics of equipment. The proposed method combines deterministic algorithms and machine learning: classical computer vision algorithms are applied for preliminary data processing to extract significant features, and unsupervised machine learning models are applied to recognize graphic images of equipment in a feature space optimized for information content. Image segmentation using a density-based spatial clustering algorithm that accounts for outliers allows detecting and grouping image fragments with statistically close distributions of line orientations; such fragments characterize particular structural elements of the equipment. The article describes an algorithm implementing the proposed method on the example of detecting defects in current transformers and presents a visualization of its intermediate steps.
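The density-based clustering step of the computer vision method above can be illustrated with a minimal from-scratch version of that algorithm family (DBSCAN-style clustering with explicit outlier handling). The 2D points below are synthetic stand-ins for the line-orientation statistics of image fragments.

```python
import math

def dbscan(points, eps, min_pts):
    """Return a label per point: 0, 1, ... for clusters, -1 for noise."""
    labels = [None] * len(points)
    def neighbors(i):
        return [j for j, q in enumerate(points) if math.dist(points[i], q) <= eps]
    cluster = -1
    for i in range(len(points)):
        if labels[i] is not None:
            continue
        seeds = neighbors(i)
        if len(seeds) < min_pts:
            labels[i] = -1                 # too sparse: provisionally noise
            continue
        cluster += 1
        labels[i] = cluster
        queue = [j for j in seeds if j != i]
        while queue:
            j = queue.pop()
            if labels[j] == -1:
                labels[j] = cluster        # noise reachable from a core point -> border
            if labels[j] is not None:
                continue
            labels[j] = cluster
            j_neighbors = neighbors(j)
            if len(j_neighbors) >= min_pts:
                queue.extend(j_neighbors)  # expand only around core points
    return labels

# Two dense groups of "orientation statistics" plus one far-away outlier.
pts = [(0, 0), (0.1, 0), (0, 0.1), (5, 5), (5.1, 5), (5, 5.1), (20, 20)]
labels = dbscan(pts, eps=0.5, min_pts=2)
```

Fragments falling into the same cluster would correspond to one structural element of the equipment, while the noise label flags outliers such as reflections or background.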
The load on data centers grows many times over each year, driven by the growing number of users of the Internet, who access various resources and sources through search engines and services. Installing equipment that processes telecommunications traffic faster requires significant financial costs and can significantly increase data center downtime due to possible problems during routine maintenance. It is therefore more expedient to focus resources on improving the software rather than the hardware. The article presents an algorithm that can reduce the load on telecommunications equipment by restricting search to a specific subject area and by exploiting the features of natural language and the way words, sentences, and texts are formed in it. It is proposed to analyze the query by building a prefix tree and clustering, and by calculating the probability of occurrence of the desired word based on the three-sigma rule and Zipf's law.
Keywords: three-sigma rule, Zipf's law, clusters, language analysis, morphemes, prefix tree, probability distribution
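The prefix-tree part of the query analysis can be sketched as follows: a trie stores domain vocabulary with usage counts, so likely completions of a partial word can be ranked by frequency, with the Zipf-like head of the distribution returned first. The vocabulary and counts below are invented for the example.

```python
class TrieNode:
    def __init__(self):
        self.children = {}
        self.count = 0              # > 0 marks the end of a word; value = frequency

class Trie:
    def __init__(self):
        self.root = TrieNode()

    def insert(self, word, count):
        node = self.root
        for ch in word:
            node = node.children.setdefault(ch, TrieNode())
        node.count += count

    def completions(self, prefix):
        """Words under prefix, most frequent first."""
        node = self.root
        for ch in prefix:
            if ch not in node.children:
                return []
            node = node.children[ch]
        found = []
        def walk(n, suffix):
            if n.count:
                found.append((prefix + suffix, n.count))
            for ch, child in n.children.items():
                walk(child, suffix + ch)
        walk(node, "")
        return [w for w, _ in sorted(found, key=lambda x: -x[1])]

# Frequencies roughly Zipfian: the rank-1 word dominates.
trie = Trie()
for word, freq in [("router", 120), ("route", 60), ("routing", 40), ("rocket", 5)]:
    trie.insert(word, freq)
suggestions = trie.completions("rout")
```

Restricting the trie to a subject-area vocabulary is what lets the server resolve most queries without forwarding them to heavier search infrastructure.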
In this work, we present the development and analysis of a feature model for dynamic handwritten signature recognition aimed at improving its effectiveness. The feature model is based on the extraction of both global features (signature length, average angle between signature vectors, range of dynamic characteristics, proportionality coefficient, average input speed) and local features (pen coordinates, pressure, azimuth, and tilt angle). We used the method of potentials to generate a signature template that accounts for variations in writing style. Experimental evaluation was conducted on the MCYT_Signature_100 signature database, which contains 2500 genuine and 2500 forged samples. We determined optimal compactness values for each feature, enabling us to accommodate signature writing variability and enhance recognition accuracy. The obtained results confirm the effectiveness of the proposed feature model and its potential for biometric authentication systems, making it of practical interest to information security specialists.
Keywords: dynamic handwritten signature, signature recognition, biometric authentication, feature model, potential method, MCYT_Signature_100, FRR, FAR
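Several of the global features listed above can be computed directly from the sampled pen trajectory. The sketch below derives signature length, the mean angle of the signature vectors, a proportionality coefficient (width to height), and the average input speed from a toy sequence of (x, y) samples; a real system would read these from the tablet along with pressure, azimuth, and tilt.

```python
import math

def global_features(points, duration_s):
    """Compute a few global features from a sequence of pen samples (x, y)."""
    segs = list(zip(points, points[1:]))
    length = sum(math.dist(p, q) for p, q in segs)
    angles = [math.atan2(q[1] - p[1], q[0] - p[0]) for p, q in segs]
    xs, ys = zip(*points)
    width, height = max(xs) - min(xs), max(ys) - min(ys)
    return {
        "length": length,
        "mean_vector_angle": sum(angles) / len(angles),
        "proportionality": width / height if height else float("inf"),
        "mean_speed": length / duration_s,
    }

# A 3-4-5 right-angle stroke written in two seconds.
feats = global_features([(0, 0), (3, 0), (3, 4)], duration_s=2.0)
```

Features like these are scale- and speed-sensitive by design, which is precisely what makes them useful for separating genuine signatures from forgeries.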
The article describes a methodology for constructing a regression model of the occupancy of paid parking zones that takes into account the uneven distribution of sessions during the day and the behavioral characteristics of two groups of clients; the regression model consists of two equations, one reflecting the characteristics of each group. In addition, the article describes the process of creating a data model; collecting, processing, and analyzing the data; and the distribution of occupancy over the day. A methodology is also given for modeling a phenomenon whose distribution is bell-shaped and depends on the time of day. The results can be used by commercial parking operators and city administrations, and by researchers modeling similar indicators that follow the normal distribution characteristic of many natural processes (customer flow in bank branches, deposits and withdrawals over the life of replenishable deposits, etc.).
Keywords: paid parking, occupancy, regression model, customer behavior, behavioral segmentation, model robustness, model, forecast, parking management, distribution
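The two-group model form can be illustrated as a sum of two bell-shaped curves over the day, one per client group. All parameters below are hypothetical placeholders, not fitted values from the article.

```python
import math

def bell(t, peak, mu, sigma):
    """Gaussian-shaped occupancy contribution at hour t."""
    return peak * math.exp(-((t - mu) ** 2) / (2 * sigma ** 2))

def occupancy(t):
    commuters = bell(t, peak=60, mu=9.0, sigma=1.5)    # morning-peaked group
    visitors = bell(t, peak=35, mu=14.0, sigma=3.0)    # midday-peaked, flatter group
    return commuters + visitors

# Hourly occupancy profile for the working day.
profile = {h: round(occupancy(h), 1) for h in range(7, 20)}
```

In the fitted model the peak heights, peak times, and spreads would be estimated from session data for each behavioral segment separately, which is what makes the two-equation form more robust than a single curve.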
Within the framework of the conducted research, the task of controlling a parallel robot was considered. This paper presents a model of a planar 3-RPR parallel robot in the Matlab package, developed for conducting computational experiments. Two types of motion trajectories were simulated in order to determine the optimal structure of the position controllers of the actuated joints used in the robot control system. Six controller structures were compared: three classical ones (PD, PID, PDD) and their three fractional-order analogues (FOPD, FOPID, FOPDD). The FOMCON toolbox was used to model the fractional-order controllers. The best results for the 3-RPR robot were shown by a control system with a FOPID controller, which indicates the expediency of using fractional-order controllers to control parallel robots.
Keywords: parallel robot, inverse kinematics problem, 3-RPR robot, computational experiment, trajectory tracking, control system accuracy, fractional-order controller, parametric optimization of the controller, comparative modeling, FOMCON tool
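FOMCON is a Matlab toolbox; as a language-neutral illustration of what a fractional-order controller computes, the sketch below approximates the fractional operator in the FOPID law u = Kp·e + Ki·D^(-λ)e + Kd·D^(μ)e with the Grünwald-Letnikov sum over the sampled error history. The gains and orders are illustrative, not the tuned values from the experiments.

```python
def gl_fractional(history, alpha, h):
    """Gruenwald-Letnikov estimate of D^alpha at the newest sample.

    history[-1] is the current value; h is the sampling step.
    """
    total, coeff = 0.0, 1.0
    for j in range(len(history)):
        total += coeff * history[-1 - j]
        coeff *= -(alpha - j) / (j + 1)   # next coefficient (-1)^(j+1) * C(alpha, j+1)
    return total / h ** alpha

def fopid(errors, h, kp=2.0, ki=1.0, kd=0.5, lam=0.9, mu=0.7):
    e = errors[-1]
    integral = gl_fractional(errors, -lam, h)   # D^(-lam): fractional integral
    derivative = gl_fractional(errors, mu, h)   # D^(mu): fractional derivative
    return kp * e + ki * integral + kd * derivative

# Constant step error: the fractional integral term accumulates over the history.
u = fopid([1.0] * 50, h=0.01)
```

For α = 1 the operator reduces to an ordinary first difference, and for α = -1 to a rectangular-rule integral, so classical PID is recovered as the integer-order special case.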
A review of various approaches used to model the contact interaction between grinding wheel grains and the surface layer of the workpiece during grinding is presented. The influence of material properties, grinding parameters, and grain morphology on the contact process is also examined.
Keywords: grinding, grain, contact zone, modeling, grinding wheel, indenter, micro cutting, cutting depth
The article presents a method for protecting images transmitted in instant messengers using time-based one-time passwords (TOTP). An additional layer of protection is proposed, combining image masking with orthogonal matrices and two-factor authentication based on TOTP. A prototype Python application was developed and tested, using the gRPC remote procedure call framework to ensure secure data exchange between the client and the server. Results are presented on the effectiveness of the proposed method in preventing unauthorized access to confidential images.
Keywords: information security, messenger, messaging, communications, instant messaging systems, one-time password
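The TOTP second factor described above follows RFC 6238 and can be sketched with the standard library alone; the orthogonal-matrix masking step is omitted, and the shared secret and verification window below are illustrative, not the prototype's values.

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, for_time=None, step=30, digits=6):
    """Time-based one-time password (HMAC-SHA1, RFC 6238 defaults)."""
    key = base64.b32decode(secret_b32)
    counter = int((for_time if for_time is not None else time.time()) // step)
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                               # dynamic truncation
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

def verify(secret_b32, submitted, window=1, **kw):
    """Accept codes from the current time step and +/- `window` neighbours."""
    now = time.time()
    step = kw.get("step", 30)
    return any(
        hmac.compare_digest(totp(secret_b32, now + k * step, **kw), submitted)
        for k in range(-window, window + 1)
    )

secret = base64.b32encode(b"server-shared-secret").decode()
code = totp(secret)
assert verify(secret, code)
```

Accepting a small window of adjacent time steps tolerates clock drift between client and server without materially weakening the factor.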
The article examines the transition of universities from data warehouses to data lakes, revealing their potential for processing big data. The introduction highlights the main differences between warehouses and lakes, focusing on their differing data management philosophies. Data warehouses are typically used for structured data with a relational architecture, while data lakes store data in raw form, supporting flexibility and scalability. The section "Data Sources Used by the University" describes how universities manage data collected from various departments, including ERP systems and cloud databases. The discussion of data lakes and data warehouses highlights their key differences in data processing and management methods, along with their advantages and disadvantages. The article examines in detail the problems and challenges of the transition to data lakes, including security, scale, and implementation costs. Architectural models of data lakes such as "Raw Data Lake" and "Data Lakehouse" are presented, describing different approaches to managing the data lifecycle and business goals. Big data processing methods in lakes cover the Apache Hadoop platform and current storage formats. Processing technologies are described, including Apache Spark and machine learning tools. Practical examples of data processing and machine learning coordinated through Spark are proposed. In conclusion, the relevance of the transition to data lakes for universities is emphasized, security and management challenges are highlighted, and the use of cloud technologies is recommended to reduce costs and increase productivity in data management.
Keywords: data warehouse, data lake, big data, cloud storage, unstructured data, semi-structured data