Journal of Machine and Computing


Exploring Long Short Term Memory Algorithms for Low Energy Data Aggregation



Received On : 23 June 2023

Revised On : 30 July 2023

Accepted On : 11 October 2023

Published On : 05 January 2024

Volume 04, Issue 01

Pages : 071-082


Abstract


Long short-term memory (LSTM) methods are employed for data aggregation in complex low-energy devices. They enable accurate and efficient aggregation of data in power-constrained settings, supporting the analysis and retrieval of information while minimizing energy waste. The LSTM rules analyze, organize, and consolidate large datasets within weakly connected structures. A recurrent neural network handles the data processing, particularly nonlinear interactions. The system's features are then examined and stored using memory blocks. Memory blocks retain long-range temporal dependencies within the data, enabling adaptive and precise information aggregation; they allow the system to store and reuse relevant features for fast retrieval. The proposed algorithm offers practical tuning capabilities, such as learning rate scheduling and dropout-based regularization, for efficient information aggregation. These mechanisms reduce overfitting while permitting precise adjustment of the settings, allowing the algorithm to be optimized for highly dependable performance in weakly connected structures and improving the energy efficiency of data aggregation techniques. The resulting algorithms provide an efficient, accurate solution for aggregating information in low-power systems, supporting the evaluation, retrieval, and aggregation of accurate and reliable information through memory blocks, adaptive tuning, and efficient learning rate scheduling.
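To make the ideas in the abstract concrete, the sketch below shows a single LSTM memory-block update (the gated cell-state mechanism that retains long-range temporal dependencies) together with a simple step-decay learning rate schedule of the kind the abstract alludes to. This is a minimal illustrative implementation in NumPy, not the paper's actual model; the dimensions, weight initialization, and the `step_decay` schedule are assumptions chosen only for demonstration.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, b):
    """One LSTM memory-block update: the gates decide what to forget
    from the cell state, what new information to write, and what to
    expose as the hidden output."""
    z = W @ np.concatenate([x, h_prev]) + b   # joint pre-activation for all gates
    H = h_prev.size
    f = sigmoid(z[0:H])            # forget gate
    i = sigmoid(z[H:2 * H])        # input gate
    g = np.tanh(z[2 * H:3 * H])    # candidate cell update
    o = sigmoid(z[3 * H:4 * H])    # output gate
    c = f * c_prev + i * g         # new cell state (long-term memory)
    h = o * np.tanh(c)             # new hidden state (short-term output)
    return h, c

def step_decay(lr0, epoch, drop=0.5, every=10):
    """Illustrative learning-rate schedule: halve the rate every 10 epochs."""
    return lr0 * drop ** (epoch // every)

# Aggregate a short synthetic sensor sequence through the memory block.
rng = np.random.default_rng(0)
X_DIM, H_DIM = 3, 4
W = rng.standard_normal((4 * H_DIM, X_DIM + H_DIM)) * 0.1
b = np.zeros(4 * H_DIM)

h, c = np.zeros(H_DIM), np.zeros(H_DIM)
for t in range(5):
    x_t = rng.standard_normal(X_DIM)
    h, c = lstm_step(x_t, h, c, W, b)
```

Because the output gate and `tanh` both bound their factors, every component of the hidden state stays in (-1, 1), which is one reason the memory-block formulation remains numerically stable over long sequences.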


Keywords


Data Aggregation, Low Power, Electricity, Overfitting, Information Retrieval.



Acknowledgements


This work was supported by Dongseo University, "Dongseo Frontier Project" Research Fund of 2023.


Funding


No funding was received to assist with the preparation of this manuscript.


Ethics declarations


Conflict of interest

The authors have no conflicts of interest to declare that are relevant to the content of this article.


Availability of data and materials


No data are available for the above study.


Author information


Contributions

All authors contributed equally to the paper, and all authors have read and agreed to the published version of the manuscript.


Corresponding author


Rights and permissions


Open Access This article is licensed under a Creative Commons Attribution-NonCommercial-NoDerivs 4.0 International License, which permits redistribution of the material for non-commercial purposes, provided no changes whatsoever are made to the original work, i.e. no derivatives of the original work. To view a copy of this license, visit https://creativecommons.org/licenses/by-nc-nd/4.0/


Cite this article


Gi Hwan Oh, “Exploring Long Short Term Memory Algorithms for Low Energy Data Aggregation”, Journal of Machine and Computing, pp. 071-082, January 2024. doi: 10.53759/7669/jmc202404008.


Copyright


© 2024 Gi Hwan Oh. This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.