
Proceedings of the Southwest State University. Series: IT Management, Computer Science, Computer Engineering. Medical Equipment Engineering

Vol 15, No 4 (2025)

INFORMATION AND INTELLIGENT SYSTEMS

8-21
Abstract

The purpose of the research is to demonstrate the possibility of assessing the level of cognition through a reduction in the entropy of a code sequence. Complete chaos contains no order, but evolution, natural intelligence, and artificial intelligence are all able to counteract disorder, reducing it gradually.

Methods. Shannon entropy is canonical and described in every textbook, but as a practical tool it suffers from the enormous computational complexity of its estimates. In this century, however, alternative approaches have been actively developed that simplify the calculations considerably. In particular, entropy in the space of Hamming distances can be estimated with linear computational complexity, and the entropy of the correlation entanglement of code bits with quadratic complexity. The only problem is that Hamming entropy and the entropy of correlation entanglement of bits have their own scales, which do not coincide with the Shannon entropy scale.

Results. There can be many entropy metrics; one of them is the length of the cognitive path from the chaos of "white" noise to complete determinism and monotony. The article provides a software implementation for estimating this metric. It is shown that the length of the cognitive path can be computed both from the Hamming distances and from the correlation coefficients between the resulting code sequences.

Conclusion. The proposed cognitive path length metric apparently has its own entropy scale, which does not coincide with the Shannon entropy scale. It should be regarded as a special case, convenient for practical use, of a simplified estimate of a computationally complex quantity. At the very least, the program given in the article can be regarded as another system of cryptographic key quality tests with polynomial computational complexity.
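The abstract does not publish the authors' program, but the idea of a linear-complexity Hamming-distance entropy estimate can be sketched as follows. The choice of a fixed reference code and the histogram-based estimate are our illustrative assumptions, not the authors' exact algorithm:

```python
import random
import math

def hamming(a, b):
    # number of differing bits between two equal-length codes
    return sum(x != y for x, y in zip(a, b))

def hamming_entropy(codes, reference):
    # histogram of distances from each code to a fixed reference:
    # a single pass over the data, i.e. linear in the number of codes
    counts = {}
    for c in codes:
        d = hamming(c, reference)
        counts[d] = counts.get(d, 0) + 1
    n = len(codes)
    return -sum((k / n) * math.log2(k / n) for k in counts.values())

random.seed(1)
bits = 32
white_noise = [[random.randint(0, 1) for _ in range(bits)] for _ in range(1000)]
constant    = [[0] * bits for _ in range(1000)]
ref = [0] * bits
print(hamming_entropy(white_noise, ref))  # high for chaotic codes
print(hamming_entropy(constant, ref))     # near zero for deterministic codes
```

As in the abstract, the value lives on its own scale: it measures the spread of Hamming distances rather than Shannon entropy directly.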

22-34
Abstract

The purpose of the research is to develop the core architecture of an information system based on the open-source workflow engine Elsa Workflows, focused on the automation of business processes in small and medium-sized businesses. Special attention is paid to the creation of a flexible, scalable and cost-effective information system.

Methods. The paper uses system analysis methods to conduct a comparative analysis of existing solutions such as Camunda and ELMA BPM. The functional and non-functional requirements for the information system are formulated and substantiated. An algorithm for the functioning of the core has been developed, and a prototype architecture has been implemented using .NET, PostgreSQL, and React. Simulation modeling and experiments were carried out.

Results. A modular system architecture is proposed, including subsystems for task management, monitoring, notifications, integration, and analytics. Elsa Workflows is a lightweight, modular, freely distributed engine for the .NET platform. Its key features are support for both code-based and declarative definition of processes, visual editing, and built-in support for REST APIs and microservice architecture. Special attention is paid to the flexibility of process definition: processes can be implemented in C# (a code-first approach) or in declarative form via JSON or YAML. A scheme of the task life-cycle algorithm with error handling, post-processing, and information archiving has been implemented. The possibility of creating an effective workflow core on the .NET platform, characterized by a low cost of ownership and a high degree of adaptability, has been demonstrated.

Conclusion. Using the open-source Elsa Workflows engine makes it possible to create a modern information system core that combines flexibility, performance, and compliance with import substitution requirements. The proposed solution can serve as a basis for the digitalization of poorly automated industries and contribute to improving the operational efficiency of enterprises.

35-49
Abstract

The purpose of the research is the development of a universal methodology of positive prompt engineering for image generation by diffusion models, based on deep linguistic and semantic analysis of human-AI interaction and the identification of cross-model invariants.

Methods. Within the framework of this study, an interdisciplinary scientific approach was applied, combining methods of cognitive analysis and empirical verification.

Results. The results of the study confirmed the high efficiency of the proposed universal methodology of positive prompt engineering, which significantly improved the quality of image generation by diffusion models. Experimental data showed that prompts formed according to the developed structure and lexical optimization strategies provide better compliance with the specified characteristics and more stable results across different models, statistically significantly exceeding the quality of unstructured prompts (p < 0,01). The use of a multi-level system of components and implicit control methods made it possible to reduce the variability of unwanted artifacts, increase the accuracy of visual characteristics, and simplify the prompt-creation process, making it more predictable, reproducible, and universal across platforms. Overall, the implementation of this methodology improves human interaction with AI, increases the stability and quality of visual results, and facilitates the adaptation of prompts to different models and tasks.

Conclusion. The conducted research confirmed the effectiveness of the proposed universal methodology of positive prompt engineering for image generation by diffusion models. The introduction of a structured approach and lexical optimization strategies significantly improves the quality, stability, and predictability of results and reduces the number of unwanted artifacts. This approach promotes more manageable and universal human-AI interaction, making it easier to create high-quality images across models and conditions. In the future, the developed methodology can become the basis for improving the efficiency of automated visual content generation systems and expanding their practical capabilities.
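The multi-level component structure described above can be sketched in a few lines. The component categories (subject, attributes, style, quality) and their ordering are illustrative assumptions, not the exact structure developed in the study:

```python
# A minimal sketch of assembling a structured positive prompt from
# multi-level components; the category names and fixed ordering are
# illustrative assumptions, not the structure from the study itself.
def build_prompt(subject, attributes=(), style=(), quality=()):
    # a fixed component order makes prompts reproducible across runs and models
    parts = [subject, *attributes, *style, *quality]
    return ", ".join(p.strip() for p in parts if p and p.strip())

prompt = build_prompt(
    subject="a lighthouse on a rocky coast",
    attributes=["stormy sea", "dramatic clouds"],
    style=["oil painting"],
    quality=["highly detailed"],
)
print(prompt)
```

The point of such a structure is exactly the one the abstract makes: unstructured free-form prompts vary from run to run, while a fixed component order yields more stable, cross-model-portable results.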

50-66
Abstract

The purpose of research. Digitalization of all spheres of applied activity has become a prerequisite not only for the development but also for the existence of modern society, whose functioning depends on the conditions for making effective managerial decisions. In the era of cloud computing, Big Data, and social media, complex processing of management information is becoming a trend across industries. The system of training and educating the younger generation is an obligatory structural element that determines the viability of state and social structures. The higher education system is the basis for training specialists in demand on the market; the management system of higher education must therefore actively innovate and optimize, change traditional models and methods, and build an effective education management system that takes into account the specifics of the modern generation and the need for universities to solve educational problems within a single information space implemented using automated information systems. The purpose of the research is to analyze the digitalization of the management of a university's educational process based on an information system for automating that process. To achieve this goal, it is necessary to analyze the positive aspects of digitalization, identify the negative ones, and formulate proposals for their elimination.

Methods. Research methods include the study of the structure and specifics of existing digital technologies used in universities, surveys, observation, the descriptive method, and methods of synthesis and analysis.

Results. The main hypothesis of the study is the assumption that the digital transformation of educational process management is the integration of information technology and management solutions based on a single automation system. The results of the study confirm the effectiveness of using the "APEX-VUZ" system in a specialized university.

Conclusion. In the course of the conducted research, specific proposals were formulated for the modernization of the "APEX-VUZ" system in a specialized university.

MECHATRONICS, ROBOTICS

67-88
Abstract

The purpose of the research is to find ways to increase the efficiency of a nanosatellite constellation (network) under conditions of replenishment and retirement of spacecraft during in-orbit operation, based on a self-organizing mesh network in which routing is performed dynamically according to the connectivity of network elements.

Methods are based on decision-making techniques, systems analysis, and decentralized control principles, enabling a nanosatellite network to independently reconfigure itself to meet changing operating conditions and task requirements. Using the properties of self-organization and adaptive control methods (distribution and responsiveness to change), the nanosatellite constellation maintains a configuration of satellites capable of exchanging data and service information. A two-level network reconfiguration method has been developed, enabling proactive changes to the composition of nanosatellites based on historical assessments of the quality and strength of transmitted signals. Algorithms for route list generation and route analysis have been developed, which can be executed autonomously on each nanosatellite in the constellation.

Results. The developed reconfiguration method enables asynchronous addition and deletion of satellites from the network based on received or discovered information about their status and connections between satellites. It is shown that the decentralized approach has linear time complexity for the most critical algorithms for updating and constructing network routes.

Conclusion. The developed reconfiguration method and algorithms for managing a nanosatellite constellation form the basis for developing network software that allows each satellite to autonomously make decisions about modifying its status and route list.
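The abstract states that the most critical route-update algorithms have linear time complexity. A breadth-first next-hop computation over the connectivity known to one satellite illustrates how that is possible; the constellation, link table, and satellite names below are hypothetical, not the algorithms developed in the article:

```python
from collections import deque

def update_routes(sat_id, links):
    # links: {node: {neighbor, ...}} is the connectivity this satellite
    # has learned from received or discovered status information.
    # Breadth-first search from sat_id yields a next-hop route list in
    # time linear in the number of nodes plus links.
    next_hop = {}
    visited = {sat_id}
    queue = deque((n, n) for n in links.get(sat_id, ()))
    visited |= set(links.get(sat_id, ()))
    while queue:
        node, first = queue.popleft()
        next_hop[node] = first
        for n in links.get(node, ()):
            if n not in visited:
                visited.add(n)
                queue.append((n, first))
    return next_hop

# hypothetical 4-satellite constellation; S4 is reachable only through S2
links = {"S1": {"S2", "S3"}, "S2": {"S1", "S4"}, "S3": {"S1"}, "S4": {"S2"}}
routes = update_routes("S1", links)
print(routes)  # {'S2': 'S2', 'S3': 'S3', 'S4': 'S2'}
```

Because each satellite runs this computation autonomously over its own link table, adding or deleting a satellite only requires neighbors to update `links` and recompute, which matches the asynchronous reconfiguration described above.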

89-106
Abstract

The purpose of research is the development of a mathematical model describing human biomechanical and anthropometric parameters during walking under different conditions (slow and fast gait). The developed model is intended for integration into the control system of rehabilitation exoskeletons to enable dynamic correction of movement patterns through real-time processing of kinematic data. An additional objective is the parameterization of the model for adaptation to individual user characteristics.

Methods. Experimental studies involved video recording of the movements of key lower-limb points of subjects on a treadmill, with subsequent data processing in the Tracker software (Open Source Physics). A comparative analysis of trajectory approximation accuracy was conducted using polynomial regression and harmonic analysis (Fourier series) in the Curve Fitting Toolbox (MATLAB R2023a). Model validation was performed by calculating the mean square error (MSE), which did not exceed 1,51×10⁻² m², along with empirical methods.

Results. It was established that the trigonometric method (Fourier series) provides significantly higher accuracy in approximating periodic gait trajectories compared to the polynomial method, as confirmed by lower mean square error values. The polynomial model demonstrated unstable behavior at orders above 7, showing a tendency for significant deviations at the interval endpoints. For the harmonic model, the optimal number of components was 5–7 harmonics. Smooth approximated trajectories were obtained for all key points of the foot and the rotation angle of the metatarsophalangeal joint, with Fourier series expansion coefficients presented for coordinates along the X and Z axes.

Conclusion. An effective methodology for mathematical modeling of foot movement trajectories during walking based on Fourier series has been developed. This method is recognized as the most preferable for describing biomechanical walking patterns. The obtained models possess high application potential for creating control systems for rehabilitation equipment, enabling personalization based on patients' anthropometric characteristics.
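The harmonic approximation described above amounts to a least-squares fit of a truncated Fourier series. A minimal sketch, using a synthetic periodic trajectory in place of the article's measured gait data (the signal, noise level, and gait period below are assumptions):

```python
import numpy as np

def fit_fourier(t, y, period, n_harmonics):
    # least-squares fit of a truncated Fourier series
    # y(t) ~ a0 + sum_k [a_k cos(k w t) + b_k sin(k w t)], w = 2*pi/period
    w = 2 * np.pi / period
    cols = [np.ones_like(t)]
    for k in range(1, n_harmonics + 1):
        cols.append(np.cos(k * w * t))
        cols.append(np.sin(k * w * t))
    A = np.column_stack(cols)
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef, A @ coef  # expansion coefficients and fitted trajectory

# synthetic periodic "gait" trajectory with two harmonics plus noise
rng = np.random.default_rng(0)
t = np.linspace(0.0, 2.0, 200)          # two hypothetical 1 s gait cycles
y = 0.05 * np.sin(2 * np.pi * t) + 0.02 * np.cos(4 * np.pi * t)
y_noisy = y + rng.normal(0.0, 0.002, t.size)

coef, y_fit = fit_fourier(t, y_noisy, period=1.0, n_harmonics=5)
mse = float(np.mean((y_fit - y) ** 2))
print(mse)  # small relative to the 1,51×10⁻² m² bound reported above
```

Unlike a high-order polynomial, the trigonometric basis is periodic by construction, which is why it does not blow up at the interval endpoints, consistent with the instability above order 7 reported for the polynomial model.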

107-122
Abstract

Purpose of research. Modern unmanned aerial vehicles face challenges in maintaining reliable communications under radio interference, difficult terrain (mountains, forests), or active electronic countermeasures. One possible solution is the application of quantum cryptography principles in combination with recent communication protocols, such as LoRaWAN 2, which help ensure secure and stable data transmission in challenging environments. The purpose of the research is to evaluate the feasibility of using quantum cryptography and the LoRaWAN 2 protocol to establish and maintain continuous, stable communication with an unmanned aerial vehicle.

Methods. The research methods are based on concepts from statistical radio engineering theory and ultra-high-frequency radio wave propagation theory. Multicriteria analysis, parametric synthesis, and structural synthesis methods are used. The principles of information transmission from unmanned aerial vehicles are analyzed. A critical assessment of the performance of a UAV using quantum cryptography and the LoRaWAN 2 protocol is conducted.

Results. Analytical expressions and comparative characteristics are presented for assessing the potential of using quantum cryptography and the LoRaWAN 2 protocol with unmanned aerial vehicles. It is shown that with quantum cryptography the probability of data interception in a UAV communication channel is approximately 1%, significantly increasing security compared to RSA-2048. Quantum cryptography improves UAV security and key distribution at the cost of slight increases in weight and power consumption and a slight reduction in speed. In the coming years, with the development of compact systems, it promises to become the standard for commercial drones.

Conclusion. As promising areas of research and application in the field of using unmanned aerial vehicles, the use of quantum cryptography technology and the LoRaWAN 2 protocol should be considered, which contributes to improving the characteristics of UAVs: high range, energy efficiency, security and scalability.
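The abstract does not name a specific quantum key distribution scheme; as an illustration of how such schemes derive a shared key, here is a sketch of the basis-sifting step of BB84, the canonical QKD protocol (our choice of example, not necessarily the scheme evaluated in the article):

```python
import random

def bb84_sift(n_bits, seed=0):
    # simulate random basis choices of sender and receiver; a transmitted
    # bit survives sifting only when both parties happened to use the
    # same basis ("+" rectilinear or "x" diagonal), i.e. about half the time.
    # This models only sifting, not measurement or eavesdropping detection.
    rng = random.Random(seed)
    alice_bits  = [rng.randint(0, 1) for _ in range(n_bits)]
    alice_basis = [rng.choice("+x") for _ in range(n_bits)]
    bob_basis   = [rng.choice("+x") for _ in range(n_bits)]
    return [b for b, ab, bb in zip(alice_bits, alice_basis, bob_basis)
            if ab == bb]

key = bb84_sift(1024)
print(len(key))  # roughly half of the transmitted qubits survive sifting
```

The ~50% sifting overhead is one reason QKD costs channel capacity, which is consistent with the abstract's note that the security gain comes with some loss of speed.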

IMAGE RECOGNITION AND PROCESSING

123-136
Abstract

Purpose of research. The study aims to improve the reliability of biometric user identification based on dynamic signatures by developing neural network models operating in the feature space of multidimensional dynamic curves. The focus is on the structural and parametric synthesis of a classification neural network architecture using statistical, harmonic, and wavelet-transformed features extracted from the dynamic signature.

Methods. The proposed identification model performs parallel recognition of multidimensional curve fragments using various methods, including statistical, metric, and neural classifiers. The analysis is based on a set of dynamic signature parameters, such as pen coordinates, pressure, velocity, acceleration, and their derived features. Statistical metrics – mean values, standard deviations, coefficients of variation, entropy, and equivocation – are combined with Discrete Fourier Transform (DFT), Discrete Cosine Transform (DCT), and Discrete Wavelet Transform (DWT) to form an informative feature space. These features are then used to synthesize an MLP classifier whose architecture is adapted to the input data.

Results. Experimental results confirm that using secondary features significantly increases identification accuracy compared to traditional methods. A set of 3–5 key parameters along with their spectral derivatives allows for accuracy levels of 0,8 to 0,95 with a limited number of users, maintaining around 0,7 when scaling. The average improvement in identification accuracy was 25–35% over statistical methods and 5–15% over metric-based algorithms.

Conclusion. To ensure the required level of identification reliability, it is recommended to apply a multi-level approach involving separate processing of dynamic signature parameters followed by result integration. The most effective configurations were based on neural network models combined with metric and correlation methods operating in the space of spectral and statistical features.
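One slice of the combined statistical-plus-spectral feature space described above can be sketched for a single signature channel. The channel (pen pressure), sampling, and number of spectral components are illustrative assumptions, not the article's exact feature set:

```python
import numpy as np

def signature_features(channel, n_spectral=5):
    # channel: one dynamic-signature parameter sampled over time
    # (e.g. pen pressure); returns statistical features (mean, standard
    # deviation, coefficient of variation) plus low-frequency DFT
    # magnitudes, as one slice of the combined feature space
    x = np.asarray(channel, dtype=float)
    stats = [x.mean(), x.std(), x.std() / (abs(x.mean()) + 1e-12)]
    spectrum = np.abs(np.fft.rfft(x - x.mean()))
    return np.array(stats + list(spectrum[1:n_spectral + 1]))

t = np.linspace(0, 1, 128)
pressure = 0.5 + 0.1 * np.sin(2 * np.pi * 3 * t)  # hypothetical pressure trace
f = signature_features(pressure)
print(f.shape)  # (8,)
```

In the full model each parameter (coordinates, pressure, velocity, acceleration) would contribute such a vector, and their concatenation feeds the MLP classifier whose architecture is adapted to the input data.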

137-149
Abstract

Purpose of research. Integrative diagnostic methods of the foot, combining X-ray and computer plantography data, make it possible to obtain a holistic view of the morphology, the condition of the joints, and the nature of the foot's static contact with the support. The development of such methods is relevant for increasing the informativeness and accuracy of anatomical assessment of the foot. The purpose of the work is to develop a methodology for integrative anatomical assessment of the foot.

Methods. The study was performed on 50 patients aged 18-70 years who underwent computer plantography and radiography of the foot in direct projection. Radiopaque metal markers were used for spatial image registration. The plantograms were processed using previously developed software.

Results. A three-stage method of integrative foot examination has been developed, including performing plantography and radiography using metal markers on the foot, image processing and their layered combination during analysis. The technique ensures accurate alignment of images through the use of markers, as well as unification of data visualization and reproducibility of the study. A set of 50 integrative foot studies was obtained. As a result of the integrative approach, the accuracy of localization of anatomical structures increases and the possibilities of complex analysis expand, which is important for planning orthopedic treatment and monitoring its effectiveness.

Conclusion. The proposed technique is of interest for scientific research and clinical practice in view of obtaining a unified result of two different studies – plantography and radiography of the foot. It can be used for in-depth analysis of structural changes in the foot, evaluation of the effectiveness of therapeutic and orthopedic interventions, and the resulting dataset of integrative results can be used in educational programs and further research. The technique also opens up new perspectives for the development of artificial intelligence models in the analysis of multimodal medical data, which is especially important in the context of the development of personalized medicine.
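The layered combination of plantogram and radiograph via metal markers is, mathematically, the estimation of a spatial transform from corresponding marker coordinates. A minimal least-squares affine registration sketch, with hypothetical marker positions in place of the study's real data:

```python
import numpy as np

def fit_affine(src, dst):
    # least-squares affine transform (2x2 matrix + translation) mapping
    # marker coordinates on one image onto the same markers on the other;
    # requires at least 3 non-collinear marker pairs
    src = np.asarray(src, float)
    dst = np.asarray(dst, float)
    A = np.hstack([src, np.ones((len(src), 1))])      # rows [x, y, 1]
    params, *_ = np.linalg.lstsq(A, dst, rcond=None)  # 3x2 parameter matrix
    return params

def apply_affine(params, pts):
    pts = np.asarray(pts, float)
    return np.hstack([pts, np.ones((len(pts), 1))]) @ params

# hypothetical marker positions: the radiograph is scaled 2x and
# shifted by (5, 3) relative to the plantogram
plantogram = [[0, 0], [10, 0], [0, 20], [10, 20]]
radiograph = [[5, 3], [25, 3], [5, 43], [25, 43]]
T = fit_affine(plantogram, radiograph)
print(apply_affine(T, [[5, 10]]))  # maps (5, 10) onto (15, 23)
```

Once `T` is estimated from the radiopaque markers, any anatomical point located on the plantogram can be projected into radiograph coordinates and overlaid for layered analysis.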

150-161
Abstract

Purpose of research. The widespread adoption of dynamic signatures in various biometric technology applications, supported by clearly defined legal procedures in many countries, drives significant attention toward the reliability of the corresponding biometric authentication algorithms. While dynamic signatures are partially free from the drawbacks inherent in static signatures, the problem of authentication reliability remains critical due to the complex interplay of heterogeneous factors. The aim of this study is therefore to improve the reliability of user authentication based on the dynamic signature through experimental structural and parametric synthesis of problem-oriented neural networks and comparison with classical detection-discrimination algorithms for multidimensional signals.

Methods. The proposed method involves comprehensive identification of the user's dynamic signature in the sample space of multidimensional curves by means of parallel recognition of curve fragments using multiple detectors/classifiers, followed by integration and analysis of the results.

Results. Neural network algorithms for identifying the user's dynamic signature in the sample space of multidimensional curves were experimentally studied and compared with optimal detection-discrimination algorithms for multidimensional signals. The experiments demonstrated that 3–5 key parameters, including the two stylus coordinates on the tablet plane, screen pressure, and stylus velocity vectors, ensure acceptable identification reliability in the range of 0,8 to 0,95 for a small number of users, and maintain a reliability level of about 0,7 as the number of users grows without limit. The average gain in accuracy from using the developed models and algorithms amounted to 25–35% compared to statistical methods and 5–15% compared to metric methods.

Conclusion. To achieve the required reliability of user authentication, hardware-software identification models for dynamic signatures should be decomposed into groups with a limited number of users. There exists an optimal combination of algorithms that delivers maximum accuracy in result integration: Euclidean metric, correlation-based, and neural network classifiers.

SYSTEM ANALYSIS AND DECISION-MAKING

162-174
Abstract

The purpose of the research is to develop a medical decision support system for the selection of organ transplant recipients, which makes it possible to automate a virtual crossmatch and assess tissue compatibility from already known laboratory data, thus reducing the cognitive and emotional burden on doctors and the likelihood of errors.

Methods. In developing the described medical decision support system, methods of system analysis and information system software design were used, along with the LibreOffice Base database management system and the Python, SQL, and Basic programming languages.

Results. In the course of the study, a medical decision support system in transplantology was designed and developed, automating the selection of recipients for donor organ transplantation taking into account various compatibility factors and preparing reports in the required format. The core of the system is a database containing information about recipients and donors. The system not only automatically determines the compatibility parameters of recipients and donors but also ranks recipients by degree of compatibility. The system has been deployed in the Immunotyping Department of the Regional State Budgetary Healthcare Institution "Regional Clinical Hospital", Krasnoyarsk. Pilot operation showed that the system meets all functional requirements. Automation of routine operations allowed doctors to reduce decision-making time and the likelihood of errors.

Conclusion. A medical decision support system for selecting recipients for donor organ transplantation has been developed. It automates the ranking of recipients by degree of compatibility with the donor organ and other factors and prepares various reports, which lowers the cognitive load on the doctor, decreases the likelihood of errors in recipient selection, and reduces the overall volume of routine work.
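The ranking step described above can be sketched generically. The ABO compatibility table is standard transfusion/transplant practice, but the HLA-mismatch scoring, field names, and example data below are simplified illustrations, not the rules implemented in the described system:

```python
# Simplified sketch of ranking transplant recipients by compatibility.
ABO_COMPATIBLE = {          # donor blood group -> compatible recipient groups
    "O": {"O", "A", "B", "AB"},
    "A": {"A", "AB"},
    "B": {"B", "AB"},
    "AB": {"AB"},
}

def hla_mismatches(donor_hla, recipient_hla):
    # count donor HLA antigens absent in the recipient (lower is better)
    return sum(1 for a in donor_hla if a not in recipient_hla)

def rank_recipients(donor, recipients):
    # keep only ABO-compatible recipients, then sort by HLA mismatch count
    compatible = [r for r in recipients
                  if r["blood"] in ABO_COMPATIBLE[donor["blood"]]]
    return sorted(compatible,
                  key=lambda r: hla_mismatches(donor["hla"], r["hla"]))

donor = {"blood": "A", "hla": {"A1", "A2", "B7", "B8"}}
recipients = [
    {"id": "R1", "blood": "A",  "hla": {"A1", "B7"}},
    {"id": "R2", "blood": "O",  "hla": {"A1", "A2", "B7", "B8"}},
    {"id": "R3", "blood": "AB", "hla": {"A1", "A2", "B7"}},
]
print([r["id"] for r in rank_recipients(donor, recipients)])  # ['R3', 'R1']
```

Note how R2, the best HLA match, is excluded outright: blood-group incompatibility is a hard filter, while HLA mismatches only order the remaining candidates. A production system would add many further factors (age, weight, urgency, antibody screening).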

MODELING IN MEDICAL AND TECHNICAL SYSTEMS

175-191
Abstract

The purpose of the research is to substantiate the possibility of using tensor and cluster analysis to model processes associated with assessing the functional state of critical information infrastructure (CII) objects under destructive electromagnetic influences that cause functional damage to the microelectronic components of the information processing tools (routers, control devices, computing and monitoring equipment) of the critical information infrastructure.

Methods. A systematic approach was used to model the processes, together with methods of tensor and cluster analysis, within the framework of assessing the functional state of critical information infrastructure objects under destructive effects. The integration of these approaches makes it possible to optimize the parameters of networks and communication nodes and to increase their resistance to non-smooth destructive effects.

Results. A mathematical model has been developed to describe the state of the critical information infrastructure, based on the principles of tensor and cluster analysis and on the hierarchical procedure of "degradation of the critical information infrastructure by absorption" under destructive electromagnetic exposure. The proposed model is intended for the formation and implementation of decision rules aimed at ensuring the stable functioning of critical information systems.

Conclusion. The study proved the applicability of tensor and cluster analysis to modeling the processes involved in assessing the functional state of critical information infrastructure objects under destructive electromagnetic influences, together with the hierarchical procedure of "degradation of critical information infrastructure by absorption" and the formulation and implementation of decision rules during the investigated process. This will subsequently make it possible to apply the developed models and methods to improve the protection of object elements and of the critical information infrastructure as a whole from individual external destructive electromagnetic effects.

192-210
Abstract

The purpose of the research is to evaluate the effectiveness of a modified EfficientNetB3 architecture, based on transfer deep learning and early stopping, in medical decision support systems for differential diagnosis of the stages of Alzheimer's disease.

Methods. To conduct experimental studies, a training dataset was generated, normalized, and augmented. A modified EfficientNetB3 neural network architecture was implemented using transfer learning and early stopping methods in Python. The neural network model was trained.

Results. The classification performance of the trained neural network model was assessed using the Recall, Precision, Specificity, F1-score, and AUC-ROC metrics. Analysis of these metrics revealed that the results achieved by the modified EfficientNetB3 architecture are characterized by significant asymmetry, indicating the highly specialized nature of this model. On the one hand, the model proved to be an effective tool for diagnosing moderate dementia, demonstrating the highest possible AUC value. On the other hand, classification performance for the remaining classes was significantly lower (AUC values for the "No Dementia," "Very Mild Dementia," and "Mild Dementia" classes were 0,87, 0,86, and 0,95, respectively).

Conclusion. Based on the results of the analysis, it can be concluded that the primary practical value of this modification of the EfficientNetB3 architecture lies in its use in heterogeneous ensembles or cascaded diagnostic systems for verifying a specific stage of Alzheimer's disease – moderate dementia – in order to improve the overall system efficiency. This points to the potential for further research in the area of creating highly specialized architectures capable of solving specific subproblems with high accuracy, surpassing general-purpose but less focused approaches.
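The per-class metrics named above (Recall, Precision, Specificity, F1-score) all follow from a multi-class confusion matrix. A minimal sketch with a hypothetical 4-class matrix (the counts below are invented for illustration, not the article's experimental results):

```python
def per_class_metrics(cm):
    # cm[i][j]: number of samples of true class i predicted as class j
    n = len(cm)
    total = sum(sum(row) for row in cm)
    out = []
    for c in range(n):
        tp = cm[c][c]
        fn = sum(cm[c]) - tp
        fp = sum(cm[r][c] for r in range(n)) - tp
        tn = total - tp - fn - fp
        recall = tp / (tp + fn) if tp + fn else 0.0
        precision = tp / (tp + fp) if tp + fp else 0.0
        specificity = tn / (tn + fp) if tn + fp else 0.0
        f1 = (2 * precision * recall / (precision + recall)
              if precision + recall else 0.0)
        out.append({"recall": recall, "precision": precision,
                    "specificity": specificity, "f1": f1})
    return out

# hypothetical matrix: no / very mild / mild / moderate dementia,
# with moderate dementia (class 3) almost perfectly separated
cm = [[80, 15,  5,  0],
      [10, 75, 15,  0],
      [ 5, 10, 83,  2],
      [ 0,  0,  1, 99]]
m = per_class_metrics(cm)
print(round(m[3]["recall"], 2), round(m[3]["specificity"], 3))  # 0.99 0.993
```

The asymmetry visible in this toy matrix mirrors the reported behavior: near-perfect metrics for one class alongside weaker metrics for the neighboring classes, which is what motivates using such a model as a specialist verifier inside an ensemble.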

211-234
Abstract

Purpose of research. Domestic and foreign studies have shown that the electrical impedance of biological tissue can serve as a predictor of chest diseases. However, the method needs improvement, since it is limited both by its resolution and by the imperfection of the bioimpedance models required to form input vectors for machine learning systems.

Methods. The presented study proposes a hybrid model of an intelligent bioimpedance research system that uses both a machine learning model and the intelligence of a specialist who analyzes the image of an anatomical object obtained from electrical impedance mapping. To obtain the image, a multi-pole model of biomaterial impedance is used. To construct this model, the direct and inverse problems were solved. Solving the direct problem yielded equations that determine the potentials at the nodes of the multi-pole with known impedances in its links; solving the inverse problem determined the impedances of the multi-pole links from known potentials at its poles.

Results. A program was developed for constructing heat maps of the chest impedance distribution. It combines a convenient graphical interface, modules for mathematical data processing, and clear visualization of results. The program allows medical professionals to quickly form a picture of the impedance distribution, which can be useful for diagnostics and for assessing the patient's current condition. The flexibility in choosing the interpolation method and the ability to save results make the program a valuable tool in medical practice.

Conclusion. A comprehensive solution is proposed that combines advanced mathematical data processing methods and modern approaches to creating user interfaces, which provides medical specialists with a powerful tool for analyzing chest impedance data.
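Building a heat map from sparse electrode readings requires interpolation onto a regular grid. Inverse-distance weighting is one possible choice among the selectable interpolation methods mentioned above; the electrode layout and impedance values below are hypothetical:

```python
import numpy as np

def idw_grid(points, values, nx, ny, power=2.0):
    # inverse-distance-weighted interpolation of impedance readings from
    # electrode positions onto a regular nx-by-ny grid
    points = np.asarray(points, float)
    values = np.asarray(values, float)
    xs = np.linspace(points[:, 0].min(), points[:, 0].max(), nx)
    ys = np.linspace(points[:, 1].min(), points[:, 1].max(), ny)
    gx, gy = np.meshgrid(xs, ys)
    grid = np.stack([gx.ravel(), gy.ravel()], axis=1)
    d = np.linalg.norm(grid[:, None, :] - points[None, :, :], axis=2)
    w = 1.0 / np.maximum(d, 1e-9) ** power   # clamp avoids division by zero
    z = (w * values).sum(axis=1) / w.sum(axis=1)
    return z.reshape(ny, nx)

# hypothetical impedance readings (ohms) at four chest electrode sites
electrodes = [[0, 0], [1, 0], [0, 1], [1, 1]]
impedance = [420.0, 480.0, 450.0, 510.0]
heatmap = idw_grid(electrodes, impedance, nx=5, ny=5)
print(heatmap.shape)  # (5, 5)
```

IDW reproduces each electrode's reading exactly at its own position and yields a convex combination elsewhere, so the map never over- or undershoots the measured range; smoother alternatives (splines, kriging) trade that guarantee for continuity of derivatives.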

235-248
Abstract

The purpose of the research is to rationalize the diagnosis and prediction of traumatic brain injury outcomes using blood biomarkers.

Methods. In 125 examined patients of mature (45–59 years) and elderly (60–74 years) age with mild and moderate TBI, blood parameters were studied on the 12th day after injury. A general blood test was performed on an automatic GS480A analyzer (China), and a biochemical blood test on a THERMO FISHER SCIENTIFIC Konelab Prime 30 analyzer (Pharmа, Russia). Univariate regression analysis of the 13 studied blood parameters revealed diagnostic and prognostic significance for 11 variables. To assess the quality of the predictive multivariate regression model, ROC (Receiver Operating Characteristic) analysis was used, and the area under the curve (AUC) was used to assess the model's discrimination.

Results. In the uncorrected multivariate regression model, all 11 variables retained diagnostic and prognostic significance, with the highest beta coefficients for blood levels of potassium, leukocytes, glucose, lymphocytes, and the glucose-to-potassium ratio. The sex- and age-adjusted multivariate regression model included only 7 variables, and from the most significant of these a predictive model was built: y = 7,561 + 2,652x1 – 2,848x2 + 2,458x3 + 2,573x4. The model's prognostic value showed an AUC of 0,725 (p = 0,0012) with a sensitivity of 62,875% and a specificity of 71,896%.

Conclusion. The created model is of sufficient quality and can be used to diagnose and predict adverse outcomes of traumatic brain injury.
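The published regression equation can be applied directly once the predictor values are known. The abstract does not state which blood parameters map to x1..x4 or what decision threshold is used, so the inputs below are hypothetical placeholders:

```python
def outcome_score(x1, x2, x3, x4):
    # predictive model from the study:
    # y = 7.561 + 2.652*x1 - 2.848*x2 + 2.458*x3 + 2.573*x4
    # (the mapping of blood parameters to x1..x4 and the decision
    # threshold are not given in the abstract; inputs here are hypothetical)
    return 7.561 + 2.652 * x1 - 2.848 * x2 + 2.458 * x3 + 2.573 * x4

print(round(outcome_score(1.0, 0.5, 0.2, 0.1), 3))  # 9.538
```

In practice the computed score would be compared against the cut-off chosen from the ROC analysis (AUC 0,725) to flag patients at risk of an adverse outcome.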



Creative Commons License
This work is licensed under a Creative Commons Attribution 4.0 License.


ISSN 2223-1536 (Print)