LIQUID NEURAL NETWORKS: PRINCIPLE OF WORK AND AREAS OF APPLICATION

Authors

  • R. Shevtsov, O.M. Beketov National University of Urban Economy in Kharkiv
  • V. Bredikhin, O.M. Beketov National University of Urban Economy in Kharkiv
  • I. Khoroshylova, Kharkiv National Automobile and Highway University

DOI:

https://doi.org/10.33042/2522-1809-2024-1-182-14-19

Keywords:

liquid neural networks, artificial intelligence, adaptive control, learning efficiency, application potential

Abstract

The article deals with the architecture of liquid neural networks (LNNs) and their potential in modern technologies. Thanks to the continuous development of algorithms and hardware, neural networks are becoming ever more powerful and efficient, which opens up new opportunities for their application. The authors describe the operating principle of liquid neural networks, covering both learning and inference, which makes it possible to exploit the natural dynamics of the system to solve a variety of tasks, including classification, prediction, and control. The concept of LNNs arose as an attempt to overcome some of the limitations and problems faced by traditional neural networks. The study considers the basic concepts and principles of LNNs and their application potential in various fields, from robotics to medicine and industry, and determines the main advantages and disadvantages of LNNs compared to traditional models. LNNs can process large data streams, such as video, audio, or readings from various types of sensors, allowing robots to perceive their environment and make decisions based on that data. In medical diagnostics and image processing, liquid neural networks can significantly improve the quality and efficiency of diagnostic procedures. LNNs can also underpin automatic control systems that monitor and regulate the parameters of production processes, adapt to changes in the environment, and optimise parameters to achieve maximum productivity and product quality. The field of LNNs still lacks standards and commonly accepted performance metrics; establishing them will allow researchers and engineers to understand and compare different LNN implementations. Although LNNs are relatively efficient in terms of power consumption, their hardware implementation may require new technologies and architectures to optimise performance. Finally, the study outlines the prospects for the further development of this technology.
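To make the operating principle described above more concrete, the sketch below implements a single layer of liquid time-constant (LTC) neurons, the continuous-time formulation most often associated with LNNs, integrated with an explicit Euler step. It is a minimal illustration, not the authors' implementation: the class name, layer sizes, weight initialisation, and the step size dt are all assumptions made for the example.

# Minimal sketch of a liquid time-constant (LTC) neuron layer.
# All names, sizes, and the Euler integration scheme are illustrative assumptions.
import numpy as np

class LTCCell:
    def __init__(self, n_inputs, n_neurons, seed=0):
        rng = np.random.default_rng(seed)
        self.W = rng.normal(0, 0.1, (n_neurons, n_neurons))  # recurrent weights
        self.U = rng.normal(0, 0.1, (n_neurons, n_inputs))   # input weights
        self.b = np.zeros(n_neurons)                         # bias
        self.tau = np.ones(n_neurons)                        # base time constants
        self.A = np.ones(n_neurons)                          # target (reversal) values
        self.x = np.zeros(n_neurons)                         # hidden state

    def step(self, inp, dt=0.05):
        # Synaptic gate: a positive nonlinearity of the current state and input.
        z = self.W @ self.x + self.U @ inp + self.b
        f = 1.0 / (1.0 + np.exp(-z))  # sigmoid keeps the gate positive
        # LTC dynamics: dx/dt = -(1/tau + f) * x + f * A.
        # The effective time constant depends on the input, which is what
        # makes the network "liquid" (input-adaptive) rather than fixed.
        dxdt = -(1.0 / self.tau + f) * self.x + f * self.A
        self.x = self.x + dt * dxdt  # explicit Euler integration step
        return self.x

# Usage: drive the cell with a sine-wave input and read out the hidden state.
cell = LTCCell(n_inputs=1, n_neurons=8)
for t in np.arange(0.0, 1.0, 0.05):
    state = cell.step(np.array([np.sin(2 * np.pi * t)]))
print(np.round(state, 3))

The key point the sketch illustrates is that the effective time constant of each neuron depends on the current input, so the temporal behaviour of the network adapts to the incoming data stream instead of being fixed after training.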

Author Biographies

R. Shevtsov, O.M. Beketov National University of Urban Economy in Kharkiv

Student at the Academic and Research Institute of Energy, Information and Transport Infrastructure

V. Bredikhin, O.M. Beketov National University of Urban Economy in Kharkiv

Candidate of Technical Sciences, Associate Professor, Associate Professor at the Department of Computer Science and Information Technology

I. Khoroshylova, Kharkiv National Automobile and Highway University

Candidate of Economic Sciences, Associate Professor, Associate Professor at the Department of Accounting and Taxation


Published

2024-04-05

How to Cite

Shevtsov, R., Bredikhin, V., & Khoroshylova, I. (2024). LIQUID NEURAL NETWORKS: PRINCIPLE OF WORK AND AREAS OF APPLICATION. Municipal Economy of Cities, 1(182), 14–19. https://doi.org/10.33042/2522-1809-2024-1-182-14-19
