This paper provides a survey of metrics used to assess the quality of images generated by generative models. Specialized metrics are required to objectively evaluate image quality. A comparative analysis showed that a combination of different metrics is necessary for a comprehensive evaluation of generation quality. Perceptual metrics are effective for assessing image quality from the perspective of machine systems, while metrics evaluating structure and details are useful for analyzing human perception. Text-based metrics allow for the assessment of image-text alignment but cannot replace metrics focused on visual or structural evaluation. The results of this study will be beneficial for specialists in machine learning and computer vision, as well as contribute to the improvement of generative algorithms and the expansion of diffusion model applications.
Keywords: deep learning, metric, generative model, image quality, image
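Surveys of generation-quality metrics typically include classical full-reference measures alongside perceptual and text-based ones. As a purely illustrative sketch (not code from the paper), the simplest such measure, PSNR, can be computed like this; the image contents here are arbitrary test arrays:

```python
import numpy as np

def psnr(ref, test, max_val=255.0):
    """Peak signal-to-noise ratio, a classical full-reference quality metric."""
    mse = np.mean((ref.astype(float) - test.astype(float)) ** 2)
    return float("inf") if mse == 0 else 10 * np.log10(max_val**2 / mse)

ref = np.zeros((8, 8), dtype=np.uint8)
noisy = ref + 10                       # uniform error of 10 grey levels
print(round(psnr(ref, noisy), 2))      # ≈ 28.13 dB
```

A combined evaluation, as the survey argues, would pair such pixel-level scores with perceptual and text-alignment metrics rather than rely on any one of them.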
The relevance of the problem considered in the article, the automation of the reagent dosing system in galvanic (electroplating) production, is underscored by the fact that ecology is one of the most pressing problems of our time. These reagents are used across vast areas of industry while causing harm to the environment, so innovations that minimize these harmful effects must be considered. The article presents data from an experiment on applying a SCADA system to automate dosing and to optimize the control system for the cleaning, filtration, and reagent-selection processes. The aim of the work is to present the results of the conducted research. The results may be useful both for processing enterprises and for subsequent research.
Keywords: automation, galvanic production, optimization, SCADA systems, process control
Evaluation of the strength characteristics of concrete is an important criterion for the quality of building structures when examining the engineering and technical condition of buildings with a monolithic reinforced concrete frame. The main controlled indicator when assessing strength is the class of concrete by compressive strength, determined in accordance with GOST 18105-2018 by statistical processing of test results by destructive or non-destructive methods. The article assesses the methods used to control the strength of concrete in buildings under construction or in operation, provides the main requirements for the tests carried out, gives examples of the necessary equipment, and presents the most rational algorithm for assessing strength. The research materials will be useful for specialists in the field of construction and researchers dealing with issues of the quality of building materials.
Keywords: non-destructive testing, concrete compressive strength, concrete testing, concrete class, calibration dependence, monolithic structures
The article is devoted to the application of modern methods of generative image compression using variational autoencoders and neural network architectures. Special attention is paid to the analysis of existing approaches to image generation and restoration, as well as a comparative assessment of compression quality in terms of visual perception and metric indicators. The aim of the study is to systematize deep image compression methods and identify the most effective solutions based on the variational Bayesian approach. The paper considers various architectures, including conditional autoencoders and hypernetwork models, as well as methods for evaluating the quality of the data obtained. The main research methods used were the analysis of scientific literature, a comparative experiment on the architectures of generative models and a computational estimation of compression based on metrics. The results of the study showed that the use of variational autoencoders in combination with recurrent and convolutional layers makes it possible to achieve high-quality image recovery with a significant reduction in data volume. The conclusion is made about the prospects of using conditional variational autoencoders in image compression tasks, especially in the presence of additional information (for example, metadata). The presented approaches can be useful for developing efficient systems for storing and transmitting visual data.
Keywords: variational autoencoders, generative models, image compression, deep learning, neural network architectures, data recovery, conditional models
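Two ingredients central to the variational Bayesian approach described above are the reparameterization trick and the closed-form KL "rate" term. A minimal numerical sketch follows; the latent dimension and values are illustrative, not from the paper:

```python
import numpy as np

rng = np.random.default_rng(42)

def reparameterize(mu, logvar):
    # z = mu + sigma * eps: keeps sampling differentiable w.r.t. mu, logvar
    eps = rng.standard_normal(mu.shape)
    return mu + np.exp(0.5 * logvar) * eps

def kl_to_standard_normal(mu, logvar):
    # KL(q(z|x) || N(0, I)) in closed form; the "rate" term of the VAE loss
    return -0.5 * np.sum(1 + logvar - mu**2 - np.exp(logvar))

mu, logvar = np.zeros(4), np.zeros(4)  # encoder output for one input
z = reparameterize(mu, logvar)
print(kl_to_standard_normal(mu, logvar))  # 0.0 for a standard normal posterior
```

In compression terms, the KL term bounds the number of bits needed to transmit the latent code, which is why it appears in rate-distortion trade-offs.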
The article describes the features of using a two-layer membrane with injection control fittings in the installation of underground waterproofing. The circumstances preventing mass adoption of this technology have been identified; most of them relate to the increased cost of work at the initial stage. Nevertheless, the technology is justified because it makes it possible to localize both the location and the time of a leak, and it offers improved maintainability and durability.
Keywords: waterproofing, modern waterproofing technologies, double-layer membrane, injection control fittings
The article provides a justification for the concept of a folding system for a prefabricated residential module based on wooden structures. An analysis of foreign analogues of prefabricated transformable wooden buildings and an assessment of the possibility of their use in northern climatic conditions have been performed. A transformation system for a prefabricated wooden module for use in northern and Arctic conditions is proposed and substantiated.
Keywords: low-rise housing construction, transformation, transformation of low-rise residential buildings, prefabricated transformable buildings, pre-manufactured at the factory, high degree of factory readiness
Currently, one of the main factors shaping architecture is the functional purpose of a building, since it determines the essence of the architectural object. The purpose of this work is to study the influence of building functions on the historical architecture of Europe and their impact on the development of modern architecture. The article sets the following objectives: to study the classification of the functional purposes of buildings; to conduct a retrospective analysis of the development and formation of architectural styles in Europe; and, drawing on world design experience and the conducted research, to identify the influence of a building's function on its planning and volumetric-spatial solutions in the course of architectural development. The research method is an analysis of the historical architecture of Europe from the origins of architecture to the present day, carried out on the basis of world design experience across different eras. The study identified four main trends in the development of the functions of modern architecture: integration with nature, the creation of adaptive spaces, multifunctionality, and the development of new functions. It is concluded that the building function played a key role throughout the entire period of architecture's formation, which led to the enormous variety of building types seen today and made a significant contribution to the development of the architecture of the XXI century.
Keywords: architecture, historical architecture, architectural style, functional purpose, European architecture, building type, retrospective analysis, function, influence, development
This paper examines the application of Bidirectional Long Short-Term Memory (Bi-LSTM) networks in neural source code generation. The research analyses how Bi-LSTMs process sequential data bidirectionally, capturing contextual information from both past and future tokens to generate syntactically correct and semantically coherent code. A comprehensive analysis of model architectures is presented, including embedding mechanisms, network configurations, and output layers. The study details data preparation processes, focusing on tokenization techniques that balance vocabulary size with domain-specific terminology handling. Training methodologies, optimization algorithms, and evaluation metrics are discussed with comparative results across multiple programming languages. Despite promising outcomes, challenges remain in functional correctness and complex code structure generation. Future research directions include attention mechanisms, innovative architectures, and advanced training procedures.
Keywords: code generation, deep learning, recurrent neural networks, transformers, tokenisation
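The bidirectional processing described above, combining context from past and future tokens, can be illustrated with a toy recurrent pass run in both directions and concatenated per token. This is a sketch only; the vocabulary size, dimensions, and random weights are arbitrary, not the paper's model:

```python
import numpy as np

rng = np.random.default_rng(0)
VOCAB, EMB, HID = 50, 8, 16

E = rng.normal(scale=0.1, size=(VOCAB, EMB))   # embedding table
Wx = rng.normal(scale=0.1, size=(EMB, HID))
Wh = rng.normal(scale=0.1, size=(HID, HID))

def rnn_pass(embedded):
    """Simple tanh RNN: returns the hidden state at every position."""
    h = np.zeros(HID)
    states = []
    for x in embedded:
        h = np.tanh(x @ Wx + h @ Wh)
        states.append(h)
    return np.stack(states)

tokens = np.array([3, 17, 42, 7, 1])           # a toy token sequence
emb = E[tokens]
fwd = rnn_pass(emb)                            # left-to-right context
bwd = rnn_pass(emb[::-1])[::-1]                # right-to-left, realigned
context = np.concatenate([fwd, bwd], axis=1)   # (seq_len, 2 * HID)
print(context.shape)
```

Each row of `context` now carries information from both directions, which is the property Bi-LSTMs exploit (with gated cells instead of this plain tanh recurrence).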
The electronic document management system is one element of an organization's information infrastructure aimed at building a comprehensive system for protecting confidential information. The market for electronic document management systems is growing continuously thanks to their advantages, which underlines the relevance of ensuring information security in such systems. The article analyzes the current types and channels of information leakage in electronic document management systems.
Keywords: document management, confidentiality of information, electronic document management, information leaks, information security, information security issues, information security system
This paper provides an analysis of the main problems encountered during the installation of bitumen-polymer roofing materials. Special attention is paid to typical defects and errors related to insufficient qualifications, as well as problems related to violations of installation technology.
Keywords: bitumen-polymer roofing, installation of a surfaced roof, waterproofing defect, quality management, recommendations for improvement
The article examines the problem of architectural and spatial formation of the living environment of multi-storey residential quarters based on their coloristic solution. The analysis of the coloristic organization of the living environment space in specific housing complexes of St. Petersburg is carried out. The features of coloristic solutions of facades in the formation of a large-scale architectural space are revealed using examples of the architecture of St. Petersburg. Negative and positive examples of the use of color graphics of residential buildings in accordance with the psychological perception of scale are given.
Keywords: scale of living environment, coloristics, supergraphics, architectural space, scale of facade, psychophysiological perception of color
The article presents the process of verifying the functioning of a secure data transmission network based on broadband wireless access equipment with a seven-element antenna array (ABSD 7) and the same equipment with a single antenna device (ABSD 1). The conditions of the experiment and the composition and completeness of the equipment are described. The results of checks in various modes of operation are presented. It is concluded that standard on-board communication equipment can be used as a repeater when the appropriate program mode is installed.
Keywords: data transmission, secure network, data transmission channel, repeater, base station, on-board equipment
The article examines the problem of wide-area network (WAN) optimization and the software and hardware solutions currently available. The purpose of the study is to determine the technological basis for developing a prototype of a domestic WAN optimizer. A review of the subject area showed that no freely available domestic solutions exist in this field. The resulting solution can be adapted to a customer company's specific requirements by adding the necessary modifications to the prototype.
Keywords: global network, data deduplication, WAN optimizer, bandwidth
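Data deduplication, named in the keywords as a core WAN-optimization technique, can be sketched with fixed-size chunking and content hashing. This is a deliberate simplification: production optimizers typically use content-defined (rolling-hash) chunking rather than fixed offsets:

```python
import hashlib

def dedup(data: bytes, chunk_size: int = 64):
    """Split a byte stream into fixed-size chunks, storing one copy per hash."""
    store = {}    # hash -> chunk: the deduplicated store sent once
    recipe = []   # sequence of hashes needed to rebuild the stream
    for i in range(0, len(data), chunk_size):
        chunk = data[i:i + chunk_size]
        h = hashlib.sha256(chunk).hexdigest()
        store.setdefault(h, chunk)
        recipe.append(h)
    return store, recipe

def rebuild(store, recipe):
    return b"".join(store[h] for h in recipe)

payload = b"A" * 64 * 10 + b"B" * 64 * 10   # highly redundant traffic
store, recipe = dedup(payload)
print(len(store), len(recipe))              # 2 unique chunks for 20 sent
```

The bandwidth saving comes from transmitting each unique chunk once and thereafter only its short hash.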
The objective of the study is to analyze methods of describing a computer incident in the field of information security when identifying illegal events and testing cyber-physical systems, in order to improve the quality of work with documentation when protecting cyber-physical systems. To achieve this goal, a format for describing incidents needs to be developed. For this purpose, regulatory documents were analyzed, types of computer incidents and their classification were identified, incident criteria were defined, and the degrees of criticality of their consequences were determined. A document for describing an incident was developed. These studies are carried out in conjunction with work on developing methods for monitoring and testing the security of cyber-physical systems for the automatic detection of illegal and (or) abnormal operation in a cyber-physical system. Based on the research results, an algorithm of actions and methods for identifying and preventing the consequences of computer incidents will be formed, making it possible to increase the security of cyber-physical systems.
Keywords: information security event, computer incident, information system, incident description, documentation generation, incident card, cybersecurity, cyber-physical system
This paper considers the problem of task scheduling in manufacturing systems with multiple machines operating in parallel. Four approaches to solving this problem are proposed: pure Monte Carlo Tree Search (MCTS); a hybrid MCDDQ agent combining reinforcement learning based on Double Deep Q-Network (DDQN) with MCTS; an improved MCDDQ-SA agent integrating the Simulated Annealing (SA) algorithm to improve solution quality; and a greedy algorithm. A model of the environment is developed that takes into account machine speeds and task durations. A comparative study of the effectiveness of the methods is conducted using the makespan (maximum completion time) and idle time metrics. The results demonstrate that MCDDQ-SA provides the best balance between scheduling quality and computational efficiency due to adaptive exploration of the solution space. Analytical tools for evaluating the dynamics of the algorithms are presented, which underscores their applicability to real manufacturing systems. The paper offers new perspectives for the application of hybrid methods in resource management problems.
Keywords: machine learning, Q-learning, deep neural networks, MCTS, DDQN, simulated annealing, scheduling, greedy algorithm
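The greedy baseline from the comparison above can be sketched as follows, assuming the simple model that a machine's speed divides the work assigned to it (an illustrative environment, not the paper's exact one):

```python
def greedy_schedule(durations, speeds):
    """Assign each task to the machine that would finish it earliest."""
    loads = [0.0] * len(speeds)   # work units assigned to each machine
    for d in durations:
        finish = [(loads[m] + d) / speeds[m] for m in range(len(speeds))]
        m = finish.index(min(finish))   # earliest-finish machine wins
        loads[m] += d
    # makespan = completion time of the most loaded machine
    makespan = max(loads[m] / speeds[m] for m in range(len(speeds)))
    return makespan, loads

makespan, loads = greedy_schedule([4, 4, 4], [1, 2])
print(makespan)  # 4.0
```

Such a greedy rule is fast but myopic, which is exactly the gap the hybrid MCTS and reinforcement-learning agents in the paper aim to close.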
The article examines the network model of district management within a subsidized region from the point of view of the upper level, taking into account the optimal response of the districts: the Stackelberg equilibrium is found. The results obtained are compared with the corresponding results for horizontal and integrated coalitions of districts and the region. The author examines the coalition preferences of all participants in the system. It is proven that cooperation is more profitable for weak and medium-sized districts, hierarchy is more profitable for a strong district, and independence is more profitable for the region as the upper level; thus joining a coalition is unprofitable for the strong elements of the system and profitable for the weak ones.
Keywords: network model, Nash equilibrium, Stackelberg equilibrium, resource allocation, Lagrange multiplier method, cooperation, horizontal coalition, maximal coalition, complex coalition, independent behavior
This study addresses the technical bottlenecks of generative AI in architectural style control by proposing a node-based workflow construction method built on ComfyUI (a graphical user interface for the Stable Diffusion model that simplifies the management of image generation parameters), aiming to achieve precise and controllable generation of functionalist architectural renderings. By deconstructing the technical characteristics of the Stable Diffusion model (a generative AI model based on diffusion processes that transforms noise into images through iterative denoising), neural network components such as ControlNet (a neural network architecture for precise control of image generation via additional input data) edge constraints and LoRA (Low-Rank Adaptation, a method for fine-tuning neural networks with low-rank matrices, enabling modification of large models at minimal computational cost) module enhancements are encapsulated into visual nodes, establishing a three-phase generation path of "case analysis - parameter encoding - dynamic adjustment". Experiments on 10 classical functionalist architectural cases employed orthogonal experimental methods to validate node combination strategies, revealing that the optimal workflow, incorporating MLSD straight-line detection (M-LSD, a lightweight line segment detection algorithm used as a ControlNet preprocessor) and LoRA prefabricated-component reinforcement, significantly improves architectural style transfer effectiveness. The research demonstrates: 1) the node-based system overcomes the "black box" limitations of traditional AI tools by exposing latent space parameters (the latent space being the multi-dimensional space in which a neural network encodes the semantic features of data), enabling architects to configure professional elements directionally; 2) workflow templates support rapid recombination within 4 nodes, enhancing cross-project adaptability while further compressing single-image generation time; 3) strict architectural typology matching (e.g., residential-to-residential, office-to-office) is critical for successful style transfer, as typological deviations cause structural logic error rates to surge. This research holds significant implications for architectural design: the ComfyUI workflows it develops transform how architects visualize and communicate ideas, thereby improving project outcomes. It demonstrates practical applications of this technology, showing its potential to accelerate design processes and expand architects' creative possibilities.
Keywords: comfyui, functionalist architecture, style transfer, node-based workflow, artificial intelligence, architectural design, generative design
The article is devoted to the study of the problem of estimating unknown parameters of linear regression models using the least absolute deviations method. Two well-known approaches to identifying regression models are considered: the first is based on solving a linear programming problem; the second, known as the iterative least-squares method, allows one to obtain an approximate solution to the problem. To test this method, a special program was developed using the Gretl software package. A dataset of house prices and factors influencing them, consisting of 20640 observations, was used for computational experiments. The best results were obtained using the quantreg function built into Gretl, which implements the Frisch-Newton algorithm; the second result was obtained using an iterative method; and the third result was achieved by solving a linear program using the LPSolve software package.
Keywords: regression analysis, least absolute deviations method, linear programming, iterative least squares method, variational weighted quadratic approximation method
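The iterative least-squares approach mentioned above is commonly realized as iteratively reweighted least squares, where L1 residuals are approximated by weighted L2 ones. A minimal sketch on synthetic data (not the paper's housing dataset) shows the method's robustness to an outlier:

```python
import numpy as np

def lad_irls(X, y, iters=100, eps=1e-6):
    """Least absolute deviations via iteratively reweighted least squares."""
    beta = np.linalg.lstsq(X, y, rcond=None)[0]      # OLS warm start
    for _ in range(iters):
        w = 1.0 / np.maximum(np.abs(y - X @ beta), eps)  # L1 -> weighted L2
        W = np.diag(w)
        beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)
    return beta

x = np.arange(10.0)
y = 2.0 * x
y[9] = 100.0                                  # one gross outlier
X = np.column_stack([np.ones_like(x), x])
beta = lad_irls(X, y)
print(beta)  # intercept ~0, slope ~2: the outlier is largely ignored
```

Ordinary least squares on the same data is pulled far off the true line, which illustrates why LAD estimation is attractive for noisy observational datasets.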
The article presents aspects of designing an artificial intelligence module for analyzing video streams from surveillance cameras in order to classify objects and interpret their actions, as part of the task of collecting statistical information and recording abnormal activity of monitored objects. A sequence diagram of the user's active-monitoring process via a Telegram bot is provided, together with a conceptual diagram of the interaction between the information and analytical system of a pedigree dog kennel on the 1C:Enterprise platform and external services.
Keywords: computer vision, machine learning, neural networks, artificial intelligence, action recognition, object classification, YOLO, LSTM model, behavioral patterns, keyword search, 1C:Enterprise, Telegram bot
A set of techniques is presented for obtaining retrospective, statistical, and expert information, for data integration, for assessing competence deficits, and for knowledge management to compensate for competence deficits in organisational systems. For the practical implementation of an integrated approach to improving the management of organisational systems, a model and an algorithm for obtaining data by applying this set of techniques have been developed. In the future, the proposed methodological solutions are expected to significantly improve the efficiency of organisational systems management through the rational application of automated management systems with trusted artificial intelligence components.
Keywords: algorithm, critical events, integration, information resources, recommendations, systematisation, efficiency
This article presents the technical implementation of a convolutional neural network-based face recognition system able to operate under variable conditions such as occlusion, angle changes, and camera rotation. Various face identification algorithms were analysed with the aim of developing a model that can identify faces at different angles. The system was experimentally verified on various datasets and compared in terms of accuracy, processing speed, and robustness to environmental disturbance. Results indicate that the optimized convolutional neural network architecture achieves over 90% accuracy under pristine conditions and maintains decent performance under partial occlusion.
Keywords: face detection, convolutional neural networks, model, feature extraction, deep learning, face recognition, image
This study demonstrates the potential of convolutional neural networks with softmax activation for classifying mantis, honey badger, and weasel samples. The model achieved high predictive accuracy with few misclassifications, and data augmentation helped reduce the impact of environmental variance. The research shows how deep learning networks can be used to automate taxonomic classification, which in turn supports image-based species identification and large-scale conservation monitoring.
Keywords: deep learning, machine learning, convolutional neural networks, dataset, softmax function, image classification, wildlife, data augmentations
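The softmax output layer mentioned above turns raw class scores into a probability distribution. A minimal, numerically stable sketch follows; the class names match the study, but the scores are invented for illustration:

```python
import numpy as np

def softmax(logits):
    """Numerically stable softmax: shift by the max before exponentiating."""
    z = logits - np.max(logits)
    e = np.exp(z)
    return e / e.sum()

classes = ["mantis", "honey badger", "weasel"]
logits = np.array([1.2, 0.3, 3.1])     # hypothetical final-layer scores
probs = softmax(logits)
print(classes[int(np.argmax(probs))])  # prints "weasel"
```

Subtracting the maximum logit changes nothing mathematically but prevents overflow for large scores, which is why it is the standard implementation.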
The article proposes an approach to creating an intelligent industrial emissions monitoring system based on the YOLO architecture and digital simulation. The work is relevant for improving the effectiveness of environmental control at industrial facilities such as oil refineries. The system automatically detects and classifies smoke against complex backgrounds (glare, fog, sky), combining real video data with synthetic images from a digital model of the site. Simulation settings and augmentation were tuned for different weather and lighting conditions. Experiments showed that adding 30% synthetic images to the training set increases classification accuracy, especially for faint emissions. Recommendations on simulation parameters have been developed, and the precision metric for the pollution classes has been evaluated. The results confirm the effectiveness of the approach and its readiness for implementation in automated monitoring.
Keywords: machine vision, digital simulation, emission monitoring, neural network models, pollution classification
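The per-class precision metric evaluated above is TP / (TP + FP), computed independently for each pollution class. A minimal sketch with invented labels (not the paper's data):

```python
def precision_per_class(y_true, y_pred, classes):
    """Precision = TP / (TP + FP), computed separately for each class."""
    out = {}
    for c in classes:
        tp = sum(1 for t, p in zip(y_true, y_pred) if p == c and t == c)
        fp = sum(1 for t, p in zip(y_true, y_pred) if p == c and t != c)
        out[c] = tp / (tp + fp) if tp + fp else 0.0
    return out

# Hypothetical frame-level labels for three background/emission classes
y_true = ["smoke", "smoke", "fog", "smoke", "fog", "sky"]
y_pred = ["smoke", "fog",   "fog", "smoke", "smoke", "sky"]
print(precision_per_class(y_true, y_pred, ["smoke", "fog", "sky"]))
```

High per-class precision matters here because false smoke alarms against glare or fog are the dominant failure mode the synthetic training data is meant to suppress.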