Journal of Computational Intelligence in Materials Science

Machine Learning Approaches for Evaluating the Properties of Materials


Received On : 08 March 2023

Revised On : 08 April 2023

Accepted On : 25 May 2023

Published On : 10 June 2023

Volume 01, 2023

Pages : 067-076


Machine Learning for Materials Science is a primer on the subject that also examines where ML can be applied in materials science research. Focusing on where to collect data and on common pitfalls when choosing a strategy, this article presents example approaches for ML applied to experiments and modeling, including the first steps in constructing an ML solution for a materials science problem. The lengthy development cycles, inefficiencies, and high costs of conventional material-discovery techniques, such as density functional theory calculations and empirical trial-and-error approaches, prevent materials research from keeping pace with modern advances. Machine learning is therefore widely employed in material detection, material design, and material analysis because of its low computational cost and fast development cycle, combined with strong data-processing capability and good predictive performance. This article summarizes recent applications of ML algorithms across different fields of materials science, discusses the advances needed for their widespread adoption, and details the critical operational procedures involved in evaluating the properties of materials using ML.
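The workflow summarized above, representing materials as numerical features and training a model to predict a target property, can be sketched with a minimal example. This is an illustrative sketch only, not the authors' method: the features, target values, and the choice of a k-nearest-neighbours regressor are assumptions made for demonstration.

```python
import math

# Toy training set: each material is described by two hypothetical
# descriptors (e.g. mean atomic radius, electronegativity difference)
# and a target property (e.g. band gap in eV). Values are illustrative.
train_X = [(1.2, 0.5), (1.4, 0.7), (0.9, 0.3), (1.1, 0.6)]
train_y = [1.1, 1.5, 0.8, 1.2]

def knn_predict(x, X, y, k=3):
    """Predict a property as the mean target of the k nearest neighbours."""
    # Rank training materials by Euclidean distance in descriptor space.
    dists = sorted((math.dist(x, xi), yi) for xi, yi in zip(X, y))
    nearest = dists[:k]
    return sum(yi for _, yi in nearest) / len(nearest)

# Predict the property of an unseen candidate material.
pred = knn_predict((1.0, 0.5), train_X, train_y)
```

In practice the same featurize-train-predict loop is carried out with library models (random forests, gradient boosting, neural networks) and far richer descriptor sets, but the structure of the pipeline is the same.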


Machine Learning (ML), Artificial Intelligence (AI), Density Functional Theory (DFT), Artificial Neural Networks (ANN).



The authors thank the reviewers for taking the time and effort necessary to review the manuscript.


No funding was received to assist with the preparation of this manuscript.

Ethics declarations

Conflict of interest

The authors have no conflicts of interest to declare that are relevant to the content of this article.

Availability of data and materials

No data are available for the above study.

Author information


All authors contributed equally to this paper, and all authors have read and agreed to the published version of the manuscript.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution-NoDerivs license, a more restrictive license that allows you to redistribute the material, commercially or non-commercially, but does not permit any changes to the original work, i.e., no derivatives. To view a copy of this license, visit

Cite this article

Nanna Ahlmann Ahm, “Machine Learning Approaches for Evaluating the Properties of Materials”, Journal of Computational Intelligence in Materials Science, vol. 1, pp. 067-076, 2023. doi: 10.53759/832X/JCIMS202301007.


© 2023 Nanna Ahlmann Ahm. This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.