LIQUID NEURAL NETWORKS: PRINCIPLE OF WORK AND AREAS OF APPLICATION
DOI: https://doi.org/10.33042/2522-1809-2024-1-182-14-19

Keywords: liquid neural networks, artificial intelligence, adaptive control, learning efficiency, application potential

Abstract
The article deals with the architecture of liquid neural networks (LNNs) and their potential in modern technologies. Thanks to continuous advances in algorithms and hardware, neural networks are becoming increasingly powerful and efficient, which opens up new opportunities for their application. The authors describe the principle of operation of liquid neural networks, covering both learning and inference, which exploits the natural dynamics of the system to solve various tasks, including classification, prediction, and control. We note that the concept of LNNs arose as an attempt to overcome some of the limitations and problems faced by traditional neural networks. The study considers the basic concepts and principles of LNNs and their application potential in fields ranging from robotics to medicine and industry, and determines the main advantages and disadvantages of LNNs compared to traditional models. LNNs can process large data streams, such as video, audio, or readings from various sensor types, allowing robots to perceive their environment and make decisions based on that data. In medical diagnostics and image processing, liquid neural networks can significantly improve the quality and efficiency of diagnostic procedures. LNNs can also underpin automatic control systems that monitor and regulate the parameters of production processes, or that adapt to changes in the environment and optimise parameters to achieve maximum productivity and product quality. The field of LNNs still lacks standards and objective performance metrics; establishing them would allow researchers and engineers to understand and compare different LNN implementations. Finally, although LNNs are relatively efficient in terms of power consumption, implementing them at the hardware level may require new technologies and architectures to optimise performance.
As a result, the study outlines the prospects for the further development of this technology.
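The "natural dynamics" mentioned above can be illustrated with a minimal sketch of a liquid time-constant cell, the continuous-time neuron model underlying many LNNs: the hidden state evolves as an ODE whose effective time constant depends on the current input and state. This is only an assumed, simplified illustration (the weights, dimensions, and `ltc_step` helper below are hypothetical), not the exact formulation used in the article.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions and randomly initialised weights (hypothetical values).
n_in, n_hid = 3, 8
W_in = rng.normal(0, 0.5, (n_hid, n_in))    # input weights
W_rec = rng.normal(0, 0.5, (n_hid, n_hid))  # recurrent weights
b = np.zeros(n_hid)
tau = np.ones(n_hid)                        # base time constants

def ltc_step(x, u, dt=0.1):
    """One explicit-Euler step of a liquid time-constant cell.

    The nonlinear drive f gates the state update, so the effective
    time constant varies with the input -- the 'liquid' adaptivity.
    """
    f = np.tanh(W_rec @ x + W_in @ u + b)
    # dx/dt = -x / tau + f * (A - x), with target value A = 1 for simplicity
    dxdt = -x / tau + f * (1.0 - x)
    return x + dt * dxdt

# Run a short input sequence: the state evolves continuously over time.
x = np.zeros(n_hid)
for t in range(50):
    u = np.array([np.sin(0.1 * t), 0.0, 1.0])
    x = ltc_step(x, u)
```

In practice, the output layer is trained on such hidden states, and the step size and time constants are tuned (or learned) for the sampling rate of the data stream.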
License
The authors who publish in this collection agree to the following terms:
• The authors retain the copyright to their work and grant the journal the right of first publication under the CC BY-NC-ND 4.0 licence (Attribution – NonCommercial – NoDerivatives 4.0 International), which allows others to freely distribute the published work with a mandatory reference to the authors of the original work and its first publication in this journal.
• Authors may enter into separate, non-exclusive agreements for distribution of the work in the form in which it was published in this journal (for example, posting it in an institutional repository or publishing it as part of a monograph), provided that the reference to the first publication of the work in this journal is maintained.
• The journal's policy permits and encourages posting manuscripts online (for example, in institutional repositories or on personal websites), both before submission and during editorial processing, as this fosters productive scientific discussion and has a positive effect on the speed and dynamics of citation of the published work (see The Effect of Open Access).