https://ota-new.donntu.edu.ua/issue/feed Scientific Papers of Donetsk National Technical University. Series: “Computer Engineering and Automation” 2026-04-28T14:32:18+03:00 Iaroslav Dorohyi yaroslav.dorohyi@donntu.edu.ua Open Journal Systems <p>The all-Ukrainian scientific collection <strong>“Scientific Papers of Donetsk National Technical University. Series: Computer Engineering and Automation”</strong> is a specialized scientific publication of Ukraine that publishes the results of research in the technical sciences. The collection publishes articles by scientists, graduate students, and master's students of higher education institutions, as well as by practicing scientists and engineers of leading enterprises, presenting the results of theoretical and applied research and development in the following <strong>thematic sections</strong>:</p> <p>1. Automation of technological processes.</p> <p>2. Information technologies and telecommunications.</p> <p>3. Information and measurement systems, electronic and microprocessor devices.</p> https://ota-new.donntu.edu.ua/article/view/359331 ALGORITHM FOR ASSESSING THE RELIABILITY OF ARTIFICIAL INTELLIGENCE SYSTEM RESPONSES IN EDUCATIONAL CONTENT CREATION 2026-04-28T14:32:18+03:00 Nataliia Maslova nataliia.maslova@donntu.edu.ua Olena Lyubymenko olena.liubymenko@donntu.edu.ua <p>The paper analyzes the risks associated with the correctness and reliability of educational content created using artificial intelligence tools. Intelligent tools based on artificial intelligence help automate the development of interactive educational materials, increase the personalization of learning, and optimize the analysis of learning outcomes. At the same time, the introduction of artificial intelligence technologies into the educational environment is accompanied by new digital risks, in particular the spread of disinformation and the formation of dependence on technological means. An algorithm for assessing the reliability of the responses of artificial intelligence systems used in the creation of educational content is proposed. The algorithm is based on modeling the process of checking the reliability of answers generated by artificial intelligence and provides for a step-by-step analysis of the generated results, calculation of accuracy indicators, and determination of their reliability level by comparison with control sources. An experimental evaluation of several artificial intelligence tools was carried out using test tasks on information security topics. The results showed that the accuracy of answers generated by ChatGPT reached approximately 90–95%, while other tools demonstrated lower reliability depending on the complexity of the task.
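The comparison-based check described above (accuracy indicators computed against control sources, then mapped to a reliability level) can be illustrated with a minimal sketch. The data, thresholds, and matching rule below are hypothetical; the paper's actual scoring procedure is not specified here:

```python
# Sketch of a reliability check for AI-generated answers: each answer is
# compared against a control source, and the aggregate accuracy indicator
# is mapped to a reliability level. Thresholds are illustrative only.

def assess_reliability(answers, control, high=0.9, medium=0.7):
    """Fraction of answers matching the control source, plus a level label."""
    matches = sum(1 for q, a in answers.items() if control.get(q) == a)
    accuracy = matches / len(answers)
    if accuracy >= high:
        level = "high"
    elif accuracy >= medium:
        level = "medium"
    else:
        level = "low"
    return accuracy, level

# Hypothetical test tasks on information-security topics.
generated = {"q1": "AES is symmetric", "q2": "RSA is symmetric", "q3": "TLS uses handshakes"}
reference = {"q1": "AES is symmetric", "q2": "RSA is asymmetric", "q3": "TLS uses handshakes"}
acc, lvl = assess_reliability(generated, reference)
```

In practice the matching step would be fuzzier than exact string equality, but the structure (per-task comparison, aggregate indicator, threshold-based level) follows the algorithm outlined in the abstract.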
The proposed algorithm is aimed at reducing the risks of spreading disinformation and contributes to improving the quality of educational materials created using intelligent systems.</p> 2026-04-28T00:00:00+03:00 Copyright (c) 2026 https://ota-new.donntu.edu.ua/article/view/359323 RESEARCH OF THE FUNCTIONING OF THE EXISTING SPACE SITUATION CONTROL AND ANALYSIS SYSTEM 2026-04-28T14:07:06+03:00 Serhii Rahulin sirotenko.helvetica@gmail.com Oleksandr Sharabaiko sirotenko.helvetica@gmail.com <p>The presented article provides an in-depth systemic analysis of the architecture and functional capabilities of the national Space Situation Control and Analysis System (SSCAS) within the framework of implementing the tasks of the National Space Program of Ukraine. The relevance of the study is driven by the rapid militarization and commercialization of near-Earth space, which necessitates the state’s possession of independent and high-precision tools for monitoring the space environment to guarantee national security and protect state interests. The object of the study is the process of ballistic and navigational support for spacecraft (SC) control and the maintenance of space object (SO) catalogs. The work details the role of the SSCAS as a multi-component information system performing tasks of identification, tracking, and motion prediction in real-time. Particular attention is paid to the hierarchical structure of the system, identifying a specialized subsystem for the centralized management of all measurement complexes included in its composition. The scientific novelty of the study lies in the formalization of criteria for assessing the quality of SSCAS functioning through a three-dimensional vector of indicators: functional result (Pf), resource costs (Vr), and time factor (T). The author substantiates a specific approach to prioritizing these indicators within the context of national security.
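The prioritization argument above (accuracy Pf and time T dominate the cost component Vr for strategic tasks) can be illustrated as a toy lexicographic ranking of candidate system configurations. All indicator values below are invented for illustration:

```python
# Toy ranking of SSCAS configurations by the quality vector (Pf, Vr, T).
# For strategic tasks, Pf (identification accuracy) and T (time factor)
# dominate Vr (resource cost), modeled here as a lexicographic sort key.

def strategic_key(cfg):
    pf, vr, t = cfg["Pf"], cfg["Vr"], cfg["T"]
    # Higher Pf is better, lower T is better; Vr only breaks ties.
    return (-pf, t, vr)

configs = [
    {"name": "A", "Pf": 0.98, "Vr": 9.0, "T": 12.0},  # accurate, fast, expensive
    {"name": "B", "Pf": 0.98, "Vr": 3.0, "T": 30.0},  # accurate, slow, cheap
    {"name": "C", "Pf": 0.90, "Vr": 1.0, "T": 10.0},  # cheapest, least accurate
]
ranked = sorted(configs, key=strategic_key)
```

Under this ordering the expensive but accurate and fast configuration wins, matching the abstract's claim that the cost of a ballistic error outweighs resource savings.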
It is proven that when performing strategic tasks, the cost component (Vr) is secondary compared to operational efficiency (T) and the accuracy of object identification (Pf), as the cost of an error in ballistic calculations could lead to the loss of expensive space assets or threats to national security. The article provides a detailed analysis of a group of factors directly correlating with the system’s functioning quality: from geospatial configuration (territorial distribution of observation points) to the technical characteristics of communication channels. A separate point highlights the complexity of the system's interconnections with the external environment, including the impact of ionospheric interference and anthropogenic orbital debris on the accuracy of radio-technical measurements. Based on the analysis results, the critical necessity of establishing stringent requirements for the SSCAS during its integration with radio-technical spacecraft control systems is identified. It is established that further modernization of the system should be aimed at increasing the automation of identification processes and reducing the time lag between object detection and the issuance of target designations to the information consumer. The conclusions of the article can be utilized in developing prospective plans for the development of ground-based space infrastructure and improving the management algorithms for complex technical systems.</p> 2026-04-28T00:00:00+03:00 Copyright (c) 2026 https://ota-new.donntu.edu.ua/article/view/359304 ZERO-TRUST ARCHITECTURE FOR INDUSTRIAL IOT (IIOT): PROTECTING CRITICAL INFRASTRUCTURE IN IT/OT CONVERGENCE 2026-04-28T12:36:39+03:00 Valeria Slatvinska slatvinskaya_valeriya@ukr.net Viacheslav Bevza sirotenko.helvetica@gmail.com <p>The purpose of the article.
The current stage of industrial systems development is characterised by an unprecedented integration of information technology (IT) and operational technology (OT), resulting in complex ecosystems of the Industrial Internet of Things (IIoT). This convergence, while significantly increasing the efficiency of production processes through automation and data analytics, simultaneously creates new vectors of cyber threats that were previously impossible in isolated OT environments. Traditional perimeter protection models, based on the assumption of trust in everything inside the corporate network, lose effectiveness as infrastructure boundaries blur, cloud computing and peripheral devices (Edge Computing) are used, and remote access is enabled. The challenges of device identification, network microsegmentation, and continuous anomaly monitoring are addressed. Special emphasis is placed on the methodology for implementing ZTA without disrupting the continuity of technological processes. The purpose of the article is to develop theoretical and methodological principles for applying zero-trust architecture to protect convergent IT/OT systems in critical infrastructure, and to substantiate the effectiveness of this approach in minimising the risk of unauthorised access and ensuring data integrity in industrial ecosystems. Scientific novelty. The scientific novelty of the research lies in developing an adaptive model to implement the Zero Trust architecture in heterogeneous IIoT environments, which, unlike existing approaches, accounts for the strict latency constraints of industrial automation protocols and the specifics of the OT equipment life cycle. A method for dynamically calculating the trust level (Trust Score) for industrial controllers and sensors is proposed, based not only on static identification attributes but also on real-time behavioural analysis of the technological process. Results.
The work forms a holistic conceptual and methodological model for implementing Zero-Trust architecture for Industrial IoT in the context of IT/OT convergence, combining asset and data flow identification, micro-segmentation, continuous verification of subjects/devices, and context-adaptive access control. A set of critical control points (policy enforcement points) for typical IIoT chains “field devices – gateways – edge/SCADA – analytical services” is specified, and a consistent telemetry profile is proposed for assessing trust in nodes (device posture), taking into account OT constraints on latency and determinism. A practice-oriented procedure for “Zero-Trust-Inventory” for mixed-protocol environments (including industrial ones) has been developed, which allows formalizing access policies at the level of minimally necessary privileges and linking them to roles, functions, device state, and network context. Additionally, mechanisms for secure interaction between IT and OT domains through trust gateways have been substantiated, and an approach to phased migration from the perimeter model to Zero Trust without disrupting technological processes has been proposed. It has been shown that the most effective combination for IIoT is: (i) segmentation by technological contours, (ii) strong management of machine subject identities (certificates/attestation), (iii) constant behaviour monitoring, and (iv) automated response to policy deviations. The results obtained form the basis for creating a unified profile of Zero-Trust maturity requirements for critical IIoT systems. They are suitable for use when designing or modernising convergent IT/OT infrastructure. Conclusions. Zero-Trust architecture is a methodologically sound response to specific IIoT threats, which are exacerbated by IT/OT convergence and the growth of heterogeneous devices and interaction channels.
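The dynamic Trust Score idea discussed above can be sketched as a weighted combination of static identity attributes and behavioural telemetry, evaluated at a policy enforcement point. The signals, weights, and threshold below are assumptions for illustration, not the article's actual model:

```python
# Hypothetical Trust Score for an IIoT node: static posture attributes
# combined with behavioural indicators from process telemetry.
WEIGHTS = {
    "valid_certificate": 0.3,   # static identity attribute
    "firmware_current": 0.2,    # static posture attribute
    "traffic_in_profile": 0.3,  # behavioural: traffic matches learned baseline
    "process_in_bounds": 0.2,   # behavioural: process variables within limits
}

def trust_score(posture: dict) -> float:
    """Weighted sum of boolean posture/behaviour signals, in [0, 1]."""
    return sum(w for k, w in WEIGHTS.items() if posture.get(k))

def allow_access(posture: dict, threshold: float = 0.7) -> bool:
    # Policy enforcement point: deny by default unless trust is high enough.
    return trust_score(posture) >= threshold

healthy = {"valid_certificate": True, "firmware_current": True,
           "traffic_in_profile": True, "process_in_bounds": True}
drifting = {"valid_certificate": True, "firmware_current": True,
            "traffic_in_profile": False, "process_in_bounds": False}
```

A valid certificate alone is not enough here: a node whose behaviour drifts from its baseline drops below the threshold and is denied, which is the "continuous verification" property the abstract describes.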
Adequate protection of critical IIoT infrastructure is achieved not by declarative “zero trust”, but by the systematic implementation of managed policy enforcement points, micro-segmentation and continuous access context verification. The model, inventory procedure, and telemetry profile proposed in the article enable alignment of cybersecurity requirements with the technological limitations of OT environments (determinism, availability, limited node resources), minimising the risk of process downtime. The transition to Zero Trust should be implemented in stages, starting with critical areas and the riskiest inter-domain interactions, and then expanding policies to the entire device and service life cycle.</p> 2026-04-28T00:00:00+03:00 Copyright (c) 2026 https://ota-new.donntu.edu.ua/article/view/359309 METHODS FOR ENSURING QUANTUM-ADAPTIVE SECURITY OF HYBRID CRYPTOGRAPHIC PROTOCOLS IN NEXT-GENERATION NETWORKS 2026-04-28T12:47:32+03:00 T.М. Fesenko sirotenko.helvetica@gmail.com A.S. Yanko al9_yanko@ukr.net V.V. Magaletska sirotenko.helvetica@gmail.com M.O. Plakhtii sirotenko.helvetica@gmail.com <p>This article investigates methods for ensuring the quantum-adaptive security of hybrid cryptographic protocols in next-generation networks. 5G/6G and IoT networks necessitate the integration of classical and post-quantum algorithms. However, standard protocols combining ECDH with CRYSTALS-Kyber or CRYSTALS-Dilithium require formal security assessments. Current approaches primarily consider non-adaptive quantum adversaries, which limits their practical applicability in multi-session and dynamic environments. The paper proposes a model of a quantum-adaptive adversary. This model integrates the adversary's classical and quantum resources, an adaptive attack strategy, and a quantum-accessible oracle. It allows for the formalization of superposition queries and multi-step interactions with the protocol. 
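A common construction for the hybrid protocols discussed above (ECDH combined with CRYSTALS-Kyber) derives the session key by feeding the concatenation of both shared secrets through a Key Derivation Function, so the key remains secure while at least one component is unbroken. Below is a minimal sketch using RFC 5869 HKDF over SHA-256; the random byte strings are placeholders for the actual ECDH and Kyber shared secrets, and the labels are invented:

```python
import hashlib
import hmac
import os

def hkdf_sha256(ikm: bytes, salt: bytes, info: bytes, length: int = 32) -> bytes:
    """RFC 5869 HKDF (extract-then-expand) over SHA-256."""
    prk = hmac.new(salt, ikm, hashlib.sha256).digest()  # extract
    okm, block, counter = b"", b"", 1
    while len(okm) < length:                            # expand
        block = hmac.new(prk, block + info + bytes([counter]), hashlib.sha256).digest()
        okm += block
        counter += 1
    return okm[:length]

def hybrid_session_key(classical_ss: bytes, pq_ss: bytes, transcript: bytes) -> bytes:
    # Concatenate the classical (e.g. ECDH) and post-quantum (e.g. Kyber)
    # shared secrets; bind the handshake transcript via the info field so a
    # downgrade of negotiated parameters changes the derived key.
    return hkdf_sha256(classical_ss + pq_ss, salt=b"hybrid-hs-v1", info=transcript)

ecdh_ss = os.urandom(32)   # placeholder for an ECDH shared secret
kyber_ss = os.urandom(32)  # placeholder for a Kyber decapsulated secret
key = hybrid_session_key(ecdh_ss, kyber_ss, b"client-hello|server-hello")
```

Binding the transcript into the derivation is one simple way to realize the downgrade-resistant parameter fixation mentioned later in the abstract.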
A mathematical model of a hybrid handshake protocol is introduced, where the session key is formed by combining classical and post-quantum components via a Key Derivation Function. An upper bound for the adversary's advantage is derived, accounting for both the classical and post-quantum components of the protocol. To enhance resilience, three primary methods are proposed. The first is downgrade-resistant fixation of protocol parameters with cryptographic confirmation. The second is dynamic management of key parameters and cryptographic primitives based on an integrated risk function, which accounts for the adversary's quantum resources, network load, and attack activity. The third is compositional protocol verification considering multi-session and multi-level handshake phases, enabling the formalization of composability and the assessment of multi-level resilience. An integral metric of quantum-adaptive resilience is proposed, accounting for security, complexity, and adaptability. The results provide a scientific foundation for “harvest-now, decrypt-later” risk analysis.</p> 2026-04-28T00:00:00+03:00 Copyright (c) 2026 https://ota-new.donntu.edu.ua/article/view/359118 APPLICATION OF GENETIC ALGORITHMS FOR SOLVING MULTI-CRITERION CHOICE PROBLEMS IN FORMING THE COMPOSITION OF EXPERT GROUPS 2026-04-27T12:05:03+03:00 Iryna Vdovychenko vivin2015@nu.edu.ua Oksana Markova vivin2015@nu.edu.ua <p>The article is devoted to the solution of an urgent scientific and applied task – the development and substantiation of a combined method for forming expert groups, the operating algorithm of which integrates statistical approaches, mathematical modeling, expert assessment methods, and the apparatus of genetic algorithms. The paper substantiates the necessity of optimizing the expert selection process to minimize errors during the examination procedure. 
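The expert-group selection task just motivated can be sketched as a small genetic algorithm over candidate groups, with a fitness combining a competence index, an experience coefficient, and an assessment-consistency score per expert. The encoding, weights, and GA parameters below are assumptions, not the paper's actual model:

```python
import random

random.seed(7)

# Candidate experts: (competence index, experience coefficient, assessment
# consistency), all scaled to [0, 1]. Values are invented for illustration.
EXPERTS = [(random.random(), random.random(), random.random()) for _ in range(20)]
GROUP_SIZE = 5

def fitness(group):
    """Weighted sum of the three indicators over the group (weights assumed)."""
    return sum(0.5 * c + 0.3 * e + 0.2 * s for c, e, s in (EXPERTS[i] for i in group))

def crossover(a, b):
    # Union of both parents' members, trimmed back to the group size.
    pool = list(set(a) | set(b))
    return tuple(sorted(random.sample(pool, GROUP_SIZE)))

def mutate(group):
    # Swap one member for a random outsider.
    out = random.choice([i for i in range(len(EXPERTS)) if i not in group])
    g = list(group)
    g[random.randrange(GROUP_SIZE)] = out
    return tuple(sorted(g))

def evolve(generations=60, pop_size=30):
    pop = [tuple(sorted(random.sample(range(len(EXPERTS)), GROUP_SIZE)))
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]                  # selection
        children = [mutate(crossover(random.choice(survivors),
                                     random.choice(survivors)))
                    for _ in range(pop_size - len(survivors))]
        pop = survivors + children
    return max(pop, key=fitness)

best = evolve()
```

With an additive fitness this toy problem has a trivial greedy optimum; the GA machinery pays off once real-world constraints (role coverage, conflict-of-interest exclusions) make the search space non-separable, as in the multi-criteria setting the article addresses.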
A comprehensive scheme for forming the optimal composition of an expert group is proposed, based on a systematic combination of quantitative and qualitative indicators. Within the framework of the study, a complex optimization problem of multi-criteria selection of specialists for technical, social, and economic examinations is formalized. A scheme for combining statistical methods and a genetic algorithm in the formation of expert groups is presented. The mathematical model of the problem is based on maximizing a fitness function that considers a number of critical parameters: an individual expert competence index, a professional experience coefficient, and the degree of consistency of the candidate’s previous assessments. Particular attention is paid to the application of genetic algorithms for searching for optimal solutions in a large space of alternatives. The use of evolutionary mechanisms of selection, mutation, and crossover allows for the effective resolution of the multi-criteria selection problem, ensuring high precision in group formation. The results confirm that the synthesis of genetic algorithms with expert and mathematical methods significantly increases the reliability of forecasts and optimizes decision-making processes.</p> 2026-04-27T00:00:00+03:00 Copyright (c) 2026 https://ota-new.donntu.edu.ua/article/view/359123 FINANCIAL MARKET FORECASTING USING NEURAL NETWORK AND IMMUNE APPROACHES 2026-04-27T12:20:01+03:00 Mykola Korablyov mykola.korablyov@nure.ua Danylo Antonov mykola.korablyov@nure.ua <p>Accurate stock price forecasting is a key task for investment decision support in volatile financial markets. Existing recurrent neural network approaches do not fully capture long-range dependencies and cross-market relationships, which reduces forecast quality on the volatile markets of 2022–2025 [1; 2].
This paper proposes a hybrid financial market forecasting model combining three components: a Temporal Fusion Transformer (TFT) for multivariate time-series encoding with interpretable attention; a Dendritic Artificial Immune Network (daiNet) for automatic stock clustering and adaptive relationship graph construction; and a Graph Neural Network (GNN) for joint learning of temporal and relational features. TFT, unlike LSTM, provides interpretable attention over different time horizons and explicitly models important market events. The model was validated on daily data of 16 NASDAQ technology companies over the 2022–2025 period, covering the 2022 tech crash and the 2023–2024 AI boom. Clustering identified three stable market clusters centered on eBay, Microsoft, and Amazon, reflecting distinct correlation patterns confirmed by heatmap analysis. Forecast quality was evaluated using mean squared error (MSE); the full (TFT + daiNet + GNN) configuration achieved an MSE of 1.41% on the test interval. The predicted returns were also used to generate an investment decision: for each day in the test set, the stock with the highest predicted return for the next period was selected. Experiments were conducted on daily OHLCV data for a set of liquid equities with a 1–5 day forecasting horizon and a 30-day TFT input window. Analysis of TFT attention weights revealed concentration on 5-day and 20-day horizons, corresponding to weekly and monthly trading cycles and providing actionable insights for practitioners. 
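The decision rule described above (each day, pick the stock with the highest predicted next-period return) is a simple per-day argmax. A minimal sketch with invented tickers and predicted returns:

```python
# Daily selection of the stock with the highest predicted return.
# Tickers and predicted-return values are hypothetical placeholders.
TICKERS = ["MSFT", "AMZN", "EBAY"]

def pick_daily(predicted_returns):
    """For each day (a list of per-ticker predicted returns), pick the argmax."""
    picks = []
    for day in predicted_returns:
        best = max(range(len(TICKERS)), key=lambda i: day[i])
        picks.append(TICKERS[best])
    return picks

preds = [
    [0.004, 0.012, -0.003],   # day 1: AMZN has the highest predicted return
    [0.009, 0.001, 0.002],    # day 2: MSFT
    [-0.002, -0.001, 0.005],  # day 3: EBAY
]
picks = pick_daily(preds)
```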
The absence of negative correlations across all 16 companies confirms broad market synchronization under shared macroeconomic shocks.</p> 2026-04-27T00:00:00+03:00 Copyright (c) 2026 https://ota-new.donntu.edu.ua/article/view/359133 INTELLIGENT DECISION-SUPPORT SYSTEM FOR EVALUATION AND OPTIMIZATION OF WEB RESOURCES WITHIN URBAN INFORMATION INFRASTRUCTURE BASED ON FUZZY MCDM MODELS 2026-04-27T12:40:20+03:00 Artem Onyshchenko Artem.Onyshchenko@kname.edu.ua <p>The purpose of this study is to develop and substantiate a fuzzy hybrid decision-support model for evaluating and optimizing web resources operating within the urban information infrastructure. The study aims to formalize the assessment of technical, behavioral, semantic, and algorithmic factors influencing web resource performance while accounting for expert uncertainty and dynamic changes in digital environments. The research is based on a combined methodological framework integrating machine learning (ML) and natural language processing (NLP) techniques with fuzzy multi-criteria decision-making (MCDM) methods, specifically DEMATEL-DANP-VIKOR. Fuzzy expert evaluation is implemented using linguistic scales transformed into triangular fuzzy numbers, followed by defuzzification through the centroid method. The Fuzzy DEMATEL approach is applied to construct the interrelationship matrix and identify cause-effect dependencies among evaluation criteria. DANP is used to determine criterion weights, and VIKOR is employed to calculate integral efficiency indices and rank alternative web resources. An infological model of the information system is developed to represent structured functional modules and information flows, including data acquisition, fuzzy expert assessment, ML analytics, multi-criteria optimization, and recommendation generation subsystems.
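The fuzzy expert-evaluation step described above (linguistic scales mapped to triangular fuzzy numbers, then centroid defuzzification) can be sketched as follows. The five-level scale values are one common convention, assumed here for illustration rather than taken from the paper:

```python
# Linguistic scale mapped to triangular fuzzy numbers (l, m, u); a common
# five-level convention on [0, 1] is assumed for illustration.
SCALE = {
    "very low":  (0.00, 0.00, 0.25),
    "low":       (0.00, 0.25, 0.50),
    "medium":    (0.25, 0.50, 0.75),
    "high":      (0.50, 0.75, 1.00),
    "very high": (0.75, 1.00, 1.00),
}

def aggregate(opinions):
    """Average the experts' triangular fuzzy numbers component-wise."""
    tfns = [SCALE[o] for o in opinions]
    n = len(tfns)
    return tuple(sum(t[k] for t in tfns) / n for k in range(3))

def centroid(tfn):
    """Centroid defuzzification of a triangular fuzzy number: (l + m + u) / 3."""
    l, m, u = tfn
    return (l + m + u) / 3

# Three experts rate one criterion of a web resource.
crisp = centroid(aggregate(["high", "medium", "very high"]))
```

The resulting crisp score is what downstream DEMATEL/DANP/VIKOR-style computations would consume in place of raw, subjective linguistic judgments.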
The proposed framework enables the formal quantification of interdependencies among technical, content-related, behavioral, and semantic criteria influencing web resource effectiveness. The integration of fuzzy logic reduces subjectivity in expert assessments and allows uncertainty to be incorporated into the evaluation process. The model supports the computation of integral performance indicators and the identification of priority directions for optimization. The developed infological model ensures systemic consistency of analytical, computational, and managerial components within the decision-support environment. The scientific novelty consists in the integration of fuzzy expert evaluation with network-based MCDM methods (DEMATEL-DANP-VIKOR) and machine learning analytics within a unified formal decision-support framework. Unlike traditional optimization approaches, the proposed model simultaneously accounts for causal relationships among criteria, uncertainty of expert judgments, and adaptive data-driven analysis. The proposed model can be implemented within urban digital infrastructures to enhance the efficiency, accessibility, and transparency of municipal web services. The approach provides a foundation for developing adaptive intelligent platforms capable of maintaining stability and operational effectiveness under evolving technological and informational conditions.</p> 2026-04-27T00:00:00+03:00 Copyright (c) 2026 https://ota-new.donntu.edu.ua/article/view/359287 CURRENT ARCHITECTURES FOR DECISION-MAKING BY AUTONOMOUS AGENTS 2026-04-28T11:43:36+03:00 Yevhen Sobol sirotenko.helvetica@gmail.com Andrii Ponepaliak sirotenko.helvetica@gmail.com Yaroslav Dorogiy sirotenko.helvetica@gmail.com <p>Autonomous agents operating in highly dynamic and stochastic environments with a high degree of uncertainty require computationally efficient and reliable decision-making architectures.
Historically, the control of such systems has been based on classical paradigms, including reactive architectures, finite state machines, and behavior trees. However, these methods face the problem of an exponential combinatorial explosion of the state space in unstructured conditions and exhibit a critical degradation in performance due to their inability to adapt continuously. At the same time, the transition to modern, purely neural network-based control methods is accompanied by an inherent tendency of systems toward stochastic hallucinations, epistemic opacity of decision-making mechanisms, and a fundamental inability to provide deterministic mathematical guarantees of safe operation. This article investigates and justifies hybrid neurosymbolic architectures that synergistically combine the approximation capabilities of deep learning methods for processing multimodal sensory data with the mathematical rigor and semantic interpretability of classical symbolic logic methods. A comprehensive analysis was conducted of the structural integration of neural network modules for high-level feature extraction with graph-based world models and hierarchical symbolic planners. Particular attention is paid to solving the problem of semantic ambiguity through automated verification of the structure of knowledge graphs and the elimination of logical conflicts prior to the start of the physical execution stage.
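The pre-execution verification step just described can be illustrated with a toy conflict check over a knowledge graph of facts: before physical execution begins, the agent's assertions are scanned for direct logical contradictions. The relation names and mutual-exclusion rules below are invented for illustration:

```python
# Toy pre-execution check: scan a knowledge graph of (subject, relation, object)
# facts for pairs of relations declared mutually exclusive. Relations and
# rules are hypothetical stand-ins for a real symbolic verifier.
MUTUALLY_EXCLUSIVE = {("is_open", "is_closed"), ("holding", "hand_empty")}

def find_conflicts(facts):
    """Return (subject, rel_a, rel_b) triples that violate an exclusion rule."""
    by_subject = {}
    for subj, rel, obj in facts:
        by_subject.setdefault(subj, set()).add(rel)
    conflicts = []
    for subj, rels in by_subject.items():
        for a, b in MUTUALLY_EXCLUSIVE:
            if a in rels and b in rels:
                conflicts.append((subj, a, b))
    return conflicts

scene = [
    ("door_1", "is_open", "true"),
    ("door_1", "is_closed", "true"),   # contradicts the previous fact
    ("gripper", "hand_empty", "true"),
]
issues = find_conflicts(scene)
```

A real neurosymbolic stack would run far richer consistency checks (type constraints, temporal logic), but the principle is the same: refuse to hand a contradictory world model to the physical execution stage.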
The promise of using semantic scene decomposition for optimizing computational resources has been demonstrated.</p> 2026-04-28T00:00:00+03:00 Copyright (c) 2026 https://ota-new.donntu.edu.ua/article/view/359293 ONLINE RELIABILITY ESTIMATION OF SOURCES IN STREAMING ANALYSIS OF MULTIMODAL TIME SERIES WITH ISOTONIC REGRESSION CALIBRATION 2026-04-28T12:00:57+03:00 Illia Uzun uzun.i.s@op.edu.ua Mykhaylo Lobachev sirotenko.helvetica@gmail.com <p>Streaming intelligent decision support systems processing multimodal time series operate under causality constraints and must satisfy requirements of low latency, bounded computational budgets, and controllable responses to environmental change. A critical practical risk in such pipelines is the temporary degradation of individual sources (missing values, elevated noise, scale shifts), which can masquerade as concept drift and trigger unstable or excessive control actions. This paper considers online estimation of source reliability as a causal probabilistic assessment of being in a non-degraded state and shows that practical control requires a calibrated scale: the output value must be interpretable as the frequency of the “non-degraded” regime under relevant conditions. The proposed approach combines lightweight degradation proxy signals suitable for online computation with isotonic regression calibration, which provides a monotone mapping from scores to correct probabilities. Key experimental results demonstrate ROC-AUC of 0.86 ± 0.07 for the calibrated variant and calibration improvement from an ECE of 0.18 ± 0.07 (uncalibrated) to an ECE of 0.08 ± 0.04
(calibrated) at acceptable time costs: simple proxy scales have microsecond latencies, while the full online model maintains mean latency of approximately 150 µs, meeting the needs of streaming pipelines.</p> 2026-04-28T00:00:00+03:00 Copyright (c) 2026 https://ota-new.donntu.edu.ua/article/view/359113 HARDWARE-SOFTWARE MODULE FOR INTELLIGENT MICROCLIMATE CONTROL IN INDUSTRIAL FACILITIES 2026-04-27T11:37:58+03:00 V.V. Yevsieiev vladyslav.yevsieiev@nure.ua I.V. Holod vladyslav.yevsieiev@nure.ua <p>The article addresses the problem of automated and intelligent microclimate control in industrial premises, where the stability of the temperature and humidity regime directly affects the efficiency of technological processes, energy efficiency, and equipment reliability. It is shown that traditional control systems based on simplified linear models do not provide the required accuracy under conditions of multifactor disturbances and dynamic changes in environmental parameters. The purpose of the study is to develop a hardware–software module capable of providing adaptive, stable, and energy-efficient control of microclimate parameters based on the coordinated operation of sensor, computational, and executive subsystems. The research methodology involves the development of a structural architecture of the hardware–software module, integration of a sensor system, a controller with embedded HMI, and executive mechanisms, as well as the formation of algorithmic logic for regulating temperature, humidity, air exchange, and internal pressure. The study applies methods of analysis and synthesis, mathematical modeling, and experimental testing under real industrial conditions. The module operation was verified through continuous data acquisition, real-time parameter logging, and evaluation of the response of executive mechanisms to changes in external and internal factors.
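The hysteresis switching used by such regulation logic can be sketched as a deadband controller: an actuator turns on below the lower bound and off above the upper bound, holding its state inside the band to avoid relay chatter. The setpoint and band below are illustrative, not parameters of the described module:

```python
# Deadband (hysteresis) temperature control: heating engages below
# setpoint - band, disengages above setpoint + band, and holds its
# previous state inside the band. Numbers are illustrative.
class HysteresisHeater:
    def __init__(self, setpoint=21.0, band=0.5):
        self.setpoint = setpoint
        self.band = band
        self.heating = False

    def update(self, temperature: float) -> bool:
        if temperature <= self.setpoint - self.band:
            self.heating = True
        elif temperature >= self.setpoint + self.band:
            self.heating = False
        # inside the deadband: keep the previous state
        return self.heating

ctl = HysteresisHeater()
states = [ctl.update(t) for t in (20.3, 20.8, 21.2, 21.6, 21.0)]
```

Note how the heater stays on at 21.2 °C (inside the band) and stays off at 21.0 °C after overshooting: that memory is what suppresses the rapid switching a plain threshold controller would produce.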
The test results confirmed the module’s ability to maintain stable microclimate parameters within specified setpoints, ensuring smooth switching between heating, cooling, and ventilation modes. The recorded temperature dynamics demonstrate the absence of sharp fluctuations, effective operation of hysteresis mechanisms, and rapid system response to load changes. The practical applicability of the module is confirmed by its stable operation under daily variations in outdoor temperature, the presence of thermal disturbances, and variations in air exchange. The selected combination of equipment (KSP-08.L controller, data acquisition modules, and sensor devices) provided the required performance, accuracy, and flexibility. Prospects for further research include expanding the functionality of the hardware–software module through the integration of predictive models, in particular neural network structures of the NNARX type, which will increase the accuracy of microclimate dynamics assessment and optimize control logic in complex industrial scenarios. The obtained results form a basis for improving intelligent control systems in industrial cyber-physical complexes and for their implementation in various industrial sectors.</p> 2026-04-27T00:00:00+03:00 Copyright (c) 2026 https://ota-new.donntu.edu.ua/article/view/359117 ANALYSIS OF THE SINGULAR SPECTRUM OF WATER CONSUMPTION IN A DIFFUSION UNIT OF A SUGAR PRODUCTION PLANT FOR AUTOMATION APPLICATIONS 2026-04-27T11:50:05+03:00 Anton Horpynchenko antongorpinchenkodra@gmail.com <p>This paper investigates the application of Singular Spectrum Analysis for processing time series of water flow rate in a diffusion unit of sugar production. Stable water flow is an important factor affecting the efficiency of the diffusion process, since it influences hydrodynamic conditions, sucrose extraction efficiency, and energy consumption of the technological equipment. 
In industrial environments, measurement signals often contain noise, random disturbances, and short-term anomalies, which complicates their direct use in automatic control systems. The study applies Singular Spectrum Analysis as a non-parametric method for time series decomposition that allows separating trend, periodic components, and noise without assuming a predefined mathematical model of the process. A practical implementation procedure is proposed, including trajectory matrix construction, singular value decomposition, component grouping, and signal reconstruction using diagonal averaging. Experimental analysis of water flow rate data demonstrates that the leading components represent the main physical structure of the signal, including trend and seasonal variations, while higher-order components correspond to noise. Reconstruction based on the dominant components significantly reduces random disturbances while preserving the informative dynamics of the technological process. The quality of reconstruction is evaluated using statistical accuracy indicators such as Mean Absolute Error, Root Mean Square Error, and Mean Absolute Percentage Error.
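The four decomposition steps listed above (trajectory matrix construction, SVD, component grouping, diagonal averaging) can be sketched with NumPy on a synthetic noisy flow signal. The window length and the number of leading components kept are illustrative choices, not values from the paper:

```python
import numpy as np

def ssa_reconstruct(series, window, n_components):
    """Basic SSA: keep the leading singular components of the trajectory matrix."""
    n = len(series)
    k = n - window + 1
    # 1. Trajectory (Hankel) matrix: lagged copies of the series as columns.
    X = np.column_stack([series[i:i + window] for i in range(k)])
    # 2. Singular value decomposition.
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    # 3. Grouping: keep the first n_components elementary matrices.
    Xr = (U[:, :n_components] * s[:n_components]) @ Vt[:n_components]
    # 4. Diagonal averaging (Hankelization) back to a one-dimensional series.
    out = np.zeros(n)
    counts = np.zeros(n)
    for i in range(window):
        for j in range(k):
            out[i + j] += Xr[i, j]
            counts[i + j] += 1
    return out / counts

rng = np.random.default_rng(0)
t = np.arange(400)
clean = 50 + 0.01 * t + 3 * np.sin(2 * np.pi * t / 50)   # trend + periodic part
noisy = clean + rng.normal(0, 1.0, t.size)               # measurement noise
smooth = ssa_reconstruct(noisy, window=60, n_components=4)
```

On this synthetic signal the four leading components capture the mean/trend pair and the sinusoid's two components, so the reconstruction tracks the clean signal much more closely than the raw measurements do, mirroring the denoising effect reported in the abstract.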
The obtained results confirm that Singular Spectrum Analysis can effectively improve the reliability of measurement signals and may be applied in industrial automation systems for control, diagnostics, and forecasting of diffusion unit operating modes.</p> 2026-04-27T00:00:00+03:00 Copyright (c) 2026 https://ota-new.donntu.edu.ua/article/view/359299 HIGHLY EFFICIENT FORMALIZED COMPUTER MODELS FOR REPRODUCING TRANSCENDENTAL FUNCTIONS IN NON-TRADITIONAL TASK SETTINGS 2026-04-28T12:13:29+03:00 Volodymyr Lukashenko sirotenko.helvetica@gmail.com Artemii Bernatskyi sirotenko.helvetica@gmail.com Yurii Yurchenko yuriyyurchenko14@gmail.com Oleksandr Siora sirotenko.helvetica@gmail.com Valentyna Lukashenko sirotenko.helvetica@gmail.com Dmytro Harder sirotenko.helvetica@gmail.com <p>The work is devoted to the creation and research of highly efficient formalized models of special-purpose precision computers for solving non-traditional problems caused by the absence of analytical relationships between the values of transcendental functions and the corresponding values of ordered grid numbers in computer-integrated control systems based on tabular-algorithmic methods. Such specialized precision computing devices are used for controlling objects and high-speed processes in real time, where general-purpose microprocessors, even with special software tools, cannot be applied because of the high requirements for speed, reliability, dimensions, power consumption, readiness, and equipment cost. Particularly relevant is the task of hardware implementation of multi-digit special-purpose computers for high-precision reproduction of basic mathematical and transcendental functions under conditions of limited energy-time resources in a single crystal. In this regard, a promising direction is the application of formalized tabular-algorithmic methods that allow optimizing the structure of special-purpose computers without compromising the accuracy of function reproduction.
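The tabular-algorithmic idea above (a coarse lookup table refined by precomputed correction constants, instead of computing the function at runtime) can be illustrated for one transcendental function. The table size and the linear-correction scheme are illustrative assumptions, not the article's actual method:

```python
import math

# Tabular-algorithmic reproduction of sin(x) on [0, pi/2]: a coarse table of
# function values plus per-interval correction constants (here the interval
# slopes), precomputed once as a hardware ROM would be. Runtime evaluation
# then needs only a lookup, one multiply, and one add.
N = 64
STEP = (math.pi / 2) / N
TABLE = [math.sin(i * STEP) for i in range(N + 1)]
CORRECTION = [(TABLE[i + 1] - TABLE[i]) / STEP for i in range(N)]

def sin_tabular(x: float) -> float:
    """Approximate sin(x) on [0, pi/2] from the table plus a correction constant."""
    i = min(int(x / STEP), N - 1)
    dx = x - i * STEP
    return TABLE[i] + CORRECTION[i] * dx

# Worst-case error over a dense sweep of the interval.
worst = max(abs(sin_tabular(k * 0.001) - math.sin(k * 0.001))
            for k in range(int((math.pi / 2) / 0.001)))
```

With 64 intervals the worst-case error of this linear correction is bounded by max|f″|·h²/8 ≈ 7.5·10⁻⁵, which shows how table size trades directly against accuracy, the kind of structure/precision optimization the abstract describes.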
The aim of the work is to create a model of a precision special-purpose computer that provides high efficiency in reproducing, in the binary number system, values of transcendental functions corresponding to an ordered grid number, by using a formalized tabular logical-reversible method of converting the input code set into the output one using correction constants. The work verifies the effectiveness of formalized tabular-algorithmic models of precision special-purpose computers implemented by a formalized tabular logical-reversible method. The results obtained were compared with the classical tabular method of hardware implementation in terms of a set of key indicators, namely: power consumption, time required to reproduce functions, and hardware cost within a single crystal. An original formalized model of a precision computer for special purposes is proposed, which reproduces the value of a transcendental function from the corresponding ordered grid number with lower energy, time, and equipment costs, thereby increasing the efficiency of computer-integrated systems in the fields of air navigation, defense, and space.</p> 2026-04-28T00:00:00+03:00 Copyright (c) 2026