Length of the cognitive path of iterative reduction of the entropy of binary code sequences from the initial chaos (ideal "white" noise) to the determinism of monotonicity
https://doi.org/10.21869/2223-1536-2025-15-4-8-21
Abstract
The purpose of the research is to demonstrate that the level of cognition can be assessed through the reduction of the entropy of a code sequence. Complete chaos contains no order, but evolution, natural intelligence, and artificial intelligence are able to counteract disorder, reducing it gradually.
Methods. Shannon entropy is canonical and described in every textbook, but as a practical tool it is hampered by the enormous computational complexity of its estimation. In this century, however, alternative approaches have been actively developed that simplify the calculations considerably. In particular, entropy in the space of Hamming distances can be computed with linear complexity, and the entropy of the correlation entanglement of code bits with quadratic complexity. The only problem is that the Hamming entropy and the correlation-entanglement entropy have scales of their own that do not coincide with the Shannon entropy scale.
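The linear-complexity claim for Hamming-based estimation can be illustrated with a minimal sketch (this is not the article's actual program; the reference code `ref`, the sample set, and the use of the empirical distance distribution are assumptions for illustration):

```python
# Illustrative sketch: estimating disorder of binary codes via Hamming
# distances. Each distance costs O(n) in the code length, so one pass
# over the sample is linear, unlike direct Shannon estimation over 2^n states.
from collections import Counter
import math

def hamming(a: str, b: str) -> int:
    # Hamming distance between two equal-length bit strings
    return sum(x != y for x, y in zip(a, b))

def hamming_entropy(codes: list[str], ref: str) -> float:
    # Shannon entropy of the empirical Hamming-distance distribution
    # (note: this lives on its own scale, not the Shannon bit scale)
    counts = Counter(hamming(c, ref) for c in codes)
    total = len(codes)
    return -sum((k / total) * math.log2(k / total) for k in counts.values())
```

A degenerate sample of identical codes yields zero distance entropy, while codes scattered over many distances yield a larger value, which matches the intended use as a chaos-versus-order indicator.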
Results. There can be many entropy metrics; one such metric is the length of the cognitive path from the chaos of "white" noise to complete determinism and monotonicity. The article provides a software implementation for assessing this metric. It is shown that the cognitive path length can be related both to the Hamming distances and to the correlation coefficients between the resulting code sequences.
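The idea of a path length from chaos to monotonicity can be sketched as counting iterations of an entropy-reducing transformation until the code becomes monotone. The article's actual reduction procedure is not reproduced here; as a stand-in assumption, this sketch uses one adjacent-swap (bubble-sort-style) sweep per iteration:

```python
# Illustrative sketch of a "cognitive path length": the number of
# ordering iterations needed to drive a binary code from its initial
# state to full monotonicity (an assumed stand-in reduction step).

def monotone(bits: list[int]) -> bool:
    # A code is fully ordered when it is non-decreasing, e.g. 0...01...1
    return all(a <= b for a, b in zip(bits, bits[1:]))

def cognitive_path_length(bits: list[int]) -> int:
    # Apply one adjacent-swap sweep per iteration and count the sweeps;
    # a more disordered code needs more iterations to reach monotonicity.
    bits = list(bits)
    steps = 0
    while not monotone(bits):
        for i in range(len(bits) - 1):
            if bits[i] > bits[i + 1]:
                bits[i], bits[i + 1] = bits[i + 1], bits[i]
        steps += 1
    return steps
```

An already monotone code has path length zero, while a maximally "inverted" code needs the most sweeps, so the count behaves as a polynomial-time disorder score on its own scale.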
Conclusion. The proposed cognitive path length metric apparently has its own entropy scale, which does not coincide with the Shannon entropy scale. All of this should be viewed as a special case, convenient for practical use, of a simplified evaluation of a computationally hard problem. At the very least, the program given in the article can be regarded as another suite of cryptographic key quality tests with polynomial computational complexity.
About the Author
A. I. Ivanov (Russia)
Alexander I. Ivanov, Doctor of Engineering Sciences, Professor, Scientific Consultant
Author ID: 744989
9 Sovetskaya Str., Penza 440000
References
1. Bassham L., Rukhin A., Soto J., et al. A Statistical Test Suite for Random and Pseudorandom Number Generators for Cryptographic Applications, Special Publication (NIST SP). Available at: https://tsapps.nist.gov/publication/get_pdf.cfm?pub_id=906762 (accessed 11.09.2025).
2. Zubkov A.M., Serov A.A. Testing the NIST Statistical Test Suite on artificial pseudorandom sequences. Matematicheskie voprosy kriptografii = Mathematical Issues of Cryptography. 2019;(10):89–96. (In Russ.)
3. Zubkov A.M. Entropy as a characteristic of the quality of random sequences. Matematicheskie voprosy kriptografii = Mathematical Issues of Cryptography. 2021;(12):31–48.
4. Ivanov A.I., Chernov P.A. Protocols of biometric and cryptographic handshake: protection of distributed artificial intelligence of the Internet of Things by neural network methods. Sistemy bezopasnosti = Security Systems. 2018;(6):54–59. (In Russ.)
5. Bogdanov D.S., Mironkin V.O. Data recovery for a neural network-based biometric authentication scheme. Matematicheskie voprosy kriptografii = Mathematical Issues of Cryptography. 2019;(10):61–74.
6. Rane S. Standardization of Biometric Template Protection. IEEE MultiMedia. 2014;21(4):94–99.
7. Ivanov A.I., Ivanov A.P., Yunin A.P. Elimination of methodological errors in estimating entropy in the space of Hamming distances. Zashchita informatsii. Insaid = Information Protection. Inside. 2023;(6):55–59. (In Russ.)
8. Feng Hao, Anderson R., Daugman J. Combining crypto with biometrics effectively. IEEE Transactions on Computers. 2006;55(9):1084–1088.
9. Nandakumar K., Jain A.K. Biometric Template Protection: Bridging the performance gap between theory and practice. IEEE Signal Processing Magazine. 2015;32:88–100. https://doi.org/10.1109/MSP.2015.2427849
10. Nikolenko S., Kudrin A., Arkhangelskaya E. Deep learning. Immersion in the world of neural networks. Saint Petersburg: Piter; 2018. 480 p. (In Russ.)
11. Meng Jian, Yao Luxing. DeepSeek in practice. Moscow: DMK Press; 2025. 206 p. (In Russ.)
12. Volchikhin V.I., Ivanov A.I., Ivanov A.P. Algorithms for fast calculation of Shannon entropy on small samples for long codes with significantly different digits. Vestnik Astrakhanskogo gosudarstvennogo tekhnicheskogo universiteta. Seriya: Upravlenie, vychislitel'naya tekhnika i informatika = Bulletin of the Astrakhan State Technical University. Series: Management, Computer Engineering and Computer Science. 2024;(4):27–34. (In Russ.) https://doi.org/10.24143/2072-9502-2024-4-27-34
13. Ivanov A.I. Correlation entropy as a metric of absolute chaos or its opposite in the form of absolute order. Sistemy bezopasnosti = Security Systems. 2025;(4):130–133. (In Russ.)
14. Ivanov A.I., Ivanov A.P., Gorbunov K.A. Neural network transformation of biometrics into an authentication code: addition of Hamming entropy by entropy of relational connections between bits. Nadezhnost' i kachestvo slozhnykh sistem = Reliability and Quality of Complex Systems. 2023;(1):91–98. (In Russ.)
15. Ivanov A.I., Godonov A.I., Malygina E.A., Papusha N.A., Ermakova A.I. Neural network analysis of small samples using a large number of statistical criteria to test the sequence of hypotheses about the value of mathematical expectations of correlation coefficients. Izvestiya vysshikh uchebnykh zavedenii. Povolzhskii region. Tekhnicheskie nauki = Proceedings of Higher Educational Institutions. The Volga Region. Technical Sciences. 2024;(3):37–46. (In Russ.) https://doi.org/10.21685/2072-3059-2024-3-4
16. Ivanov A.I. Heisenberg's quantum uncertainty for the "medium-group" velocity of mathematical molecules. Zashchita informatsii. Insaid = Information Protection. Inside. 2024;(6):58–92. (In Russ.)
17. Ivanov A.I. The appearance of a mutual correlation of the states of 256 Bernoulli coins when they are tossed in parallel by the handful. Zashchita informatsii. Insaid = Information Protection. Inside. 2025;(4):82–86. (In Russ.)
18. Ivanov A.I. Entropy as an estimate of the number of code modifications from initial chaos to maximum order: a fast algorithm for approximate estimation of the quality of random sequences. Zashchita informatsii. Insaid = Information Protection. Inside. 2024;(4):56–59. (In Russ.)
For citations:
Ivanov A.I. Length of the cognitive path of iterative reduction of the entropy of binary code sequences from the initial chaos (ideal "white" noise) to the determinism of monotonicity. Proceedings of the Southwest State University. Series: IT Management, Computer Science, Computer Engineering. Medical Equipment Engineering. 2025;15(4):8-21. (In Russ.) https://doi.org/10.21869/2223-1536-2025-15-4-8-21


