An Efficient Neural Architecture Search Algorithm for AutoEncoder Optimization - A Systematic Literature Review

DOI:

https://doi.org/10.56919/2543.002

Keywords:

Algorithm optimization, Autoencoder, Machine learning, Neural Architecture Search

Abstract

Autoencoders have advanced considerably in recent years, and most machine learning (ML) methods depend on the quality of their input features to produce good models. Larger datasets bring the challenge of increased dimensionality, which significantly reduces ML efficiency. To reduce high-dimensional data, researchers have developed feature reduction and selection techniques such as Principal Component Analysis (PCA) and autoencoders in their various forms, including convolutional, sparse, denoising, contractive, variational, and deep autoencoders. This paper discusses neural architecture search (NAS) structures, evolutionary methods, and the different autoencoder types; it also reviews a number of optimization techniques for increasing training accuracy and decreasing training time. The study adopted a thorough, systematic review: literature on the subject was gathered from five scholarly databases, screened and recorded using a PRISMA flow diagram, and synthesized under appropriate eligibility criteria to evaluate the best papers found. The findings show that several NAS architectures and autoencoders used for data reconstruction have improved, including Venkataraman's convolutional autoencoders, Baier et al.'s Self-Supervised Siamese Autoencoders, Lazebnik & Simon-Keren's Knowledge-integrated AutoEncoder Model, Chen et al.'s Dirichlet Neural Architecture Search (DrNAS), and Zhang et al.'s Graph HyperNetworks for Neural Architecture Search.
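The dimensionality-reduction idea the abstract describes can be sketched minimally. The snippet below is an illustrative example, not code from the paper under review: it projects synthetic 10-dimensional data that lies near a 2-dimensional subspace onto its top-2 principal components via SVD and reconstructs it (`pca_reduce` is a hypothetical helper name introduced here). A linear autoencoder with a 2-unit bottleneck trained on squared reconstruction error learns the same subspace.

```python
import numpy as np

def pca_reduce(X, k):
    """Project X onto its top-k principal components and reconstruct it.

    Returns (Z, X_hat): the k-dimensional codes and the reconstruction
    in the original feature space.
    """
    mu = X.mean(axis=0)
    Xc = X - mu
    # SVD of the centered data; rows of Vt are the principal directions.
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    W = Vt[:k].T                # (d, k) projection matrix ("encoder")
    Z = Xc @ W                  # low-dimensional codes
    X_hat = Z @ W.T + mu        # reconstruction ("decoder")
    return Z, X_hat

rng = np.random.default_rng(0)
# Synthetic 10-D data that actually lives near a 2-D subspace, plus small noise.
latent = rng.normal(size=(200, 2))
mixing = rng.normal(size=(2, 10))
X = latent @ mixing + 0.01 * rng.normal(size=(200, 10))

Z, X_hat = pca_reduce(X, k=2)
err = float(np.mean((X - X_hat) ** 2))
```

Because the data is generated from a rank-2 source, two components capture nearly all of the variance and the reconstruction error `err` is on the order of the injected noise.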

References

Aamir, M., Mohd Nawi, N., Wahid, F., & Mahdin, H. (2021). A deep contractive autoencoder for solving multiclass classification problems. Evolutionary Intelligence, 14(4), 1619–1633. https://doi.org/10.1007/s12065-020-00424-6

Aliyu, A. A., Ibrahim, M., & Abdulkadir, S. (2025). A Blockchain Enhanced Deep Learning Approach for Intrusion Detection in Trusted Execution Environments. Digital Technologies Research and Applications, 4(1), 135–157. https://doi.org/10.54963/dtra.v4i1.962

Baier, F., Mair, S., & Fadel, S. G. (2023). Self-Supervised Siamese Autoencoders. http://arxiv.org/abs/2304.02549

Bank, D., Koenigstein, N., & Giryes, R. (2020). Autoencoders. http://arxiv.org/abs/2003.05991

Barwey, S., Shankar, V., Viswanathan, V., & Maulik, R. (2023). Multiscale Graph Neural Network Autoencoders for Interpretable Scientific Machine Learning. http://arxiv.org/abs/2302.06186. https://doi.org/10.1016/j.jcp.2023.112537

Bello, I., Zoph, B., Vasudevan, V., & Le, Q. V. (2017). Neural Optimizer Search with Reinforcement Learning. http://arxiv.org/abs/1709.07417

Bunker, J., Girolami, M., Lambley, H., Stuart, A. M., & Sullivan, T. J. (2024). Autoencoders in Function Space. http://arxiv.org/abs/2408.01362

Charte, D., Charte, F., del Jesus, M. J., & Herrera, F. (2020). A Showcase of the Use of Autoencoders in Feature Learning Applications. In Advances in Intelligent Systems and Computing (Vol. 1000, pp. 445–456). Springer. https://doi.org/10.1007/978-3-030-19651-6_40

Charte, D., Charte, F., García, S., del Jesus, M. J., & Herrera, F. (2018). A practical tutorial on autoencoders for nonlinear feature fusion: Taxonomy, models, software and guidelines. Information Fusion, 41, 37–52. https://doi.org/10.1016/j.inffus.2017.12.007

Charte, D., Charte, F., García, S., & Herrera, F. (2019). A snapshot on nonstandard supervised learning problems: taxonomy, relationships, problem transformations and algorithm adaptations. Progress in Artificial Intelligence, 8(1), 1–14. https://doi.org/10.1007/s13748-018-00167-7

Charte, F., Rivera, A. J., Martínez, F., & del Jesus, M. J. (2023). EvoAAA: An evolutionary methodology for automated neural autoencoder architecture search. Integrated Computer-Aided Engineering, 30(1), 85–102. https://doi.org/10.3233/ICA-200619

Chen, X., Wang, R., Cheng, M., Tang, X., & Hsieh, C.-J. (2020). DrNAS: Dirichlet Neural Architecture Search. http://arxiv.org/abs/2006.10355

Dehghani, M., Montazeri, Z., Dhiman, G., Malik, O. P., Morales-Menendez, R., Ramirez-Mendoza, R. A., Dehghani, A., Guerrero, J. M., & Parra-Arroyo, L. (2020). A spring search algorithm applied to engineering optimization problems. Applied Sciences, 10(18), 6173. https://doi.org/10.3390/app10186173

Felhi, G. (2023). Interpretable Sentence Representation with Variational Autoencoders and Attention. http://arxiv.org/abs/2305.02810

Greenacre, M., Groenen, P. J. F., Hastie, T., Iodice D'Enza, A., Markos, A., & Tuzhilina, E. (2022). Principal component analysis. Nature Reviews Methods Primers, 2(1), 100. https://doi.org/10.1038/s43586-022-00184-w

Heiland, J., & Kim, Y. (2024). Polytopic Autoencoders with Smooth Clustering for Reduced-order Modelling of Flows. http://arxiv.org/abs/2401.10620. https://doi.org/10.1016/j.jcp.2024.113526

Heuillet, A., Nasser, A., Arioui, H., & Tabia, H. (2023). Efficient Automation of Neural Network Design: A Survey on Differentiable Neural Architecture Search. http://arxiv.org/abs/2304.05405

Hu, Y., Chu, X., & Zhang, B. (2023). Masked Autoencoders Are Robust Neural Architecture Search Learners. http://arxiv.org/abs/2311.12086

Lazebnik, T., & Simon-Keren, L. (2024). Knowledge-integrated AutoEncoder Model. Expert Systems with Applications, 249, 124108. https://doi.org/10.1016/j.eswa.2024.124108

Liu, Y., Sun, Y., Xue, B., Zhang, M., Yen, G. G., & Tan, K. C. (2020). A Survey on Evolutionary Neural Architecture Search. IEEE Transactions on Neural Networks and Learning Systems, 32(5), 1949–1968. https://doi.org/10.1109/TNNLS.2021.3100554

Mahesh, B. (2020). Machine Learning Algorithms - A Review. International Journal of Science and Research, 9(1), 381–386. https://doi.org/10.21275/ART20203995

Mai, F., & Henderson, J. (2021). Bag-of-Vectors Autoencoders for Unsupervised Conditional Text Generation. http://arxiv.org/abs/2110.07002

Noroozi, M., Mohammadi, H., Efatinasab, E., Lashgari, A., Eslami, M., & Khan, B. (2022). Golden Search Optimization Algorithm. IEEE Access, 10, 37515–37532. https://doi.org/10.1109/ACCESS.2022.3162853

Popov, A. A., Sarshar, A., Chennault, A., & Sandu, A. (2022). A Meta-learning Formulation of the Autoencoder Problem for Non-linear Dimensionality Reduction. http://arxiv.org/abs/2207.06676

PRISMA 2020 - Creating a PRISMA flow diagram - LibGuides at University of North Carolina at Chapel Hill. (n.d.). Retrieved March 7, 2025, from https://guides.lib.unc.edu/prisma

Pulgar, F. J., Charte, F., Rivera, A. J., & del Jesus, M. J. (2020). Choosing the proper autoencoder for feature fusion based on data complexity and classifiers: Analysis, tips and guidelines. Information Fusion, 54, 44–60. https://doi.org/10.1016/j.inffus.2019.07.004

Seidman, J. H., Kissas, G., Pappas, G. J., & Perdikaris, P. (2023). Variational Autoencoding Neural Operators. http://arxiv.org/abs/2302.10351

Shrestha, A., & Mahmood, A. (2019). Review of deep learning algorithms and architectures. IEEE Access, 7, 53040–53065. https://doi.org/10.1109/ACCESS.2019.2912200

Singh, J., Azamfar, M., Li, F., & Lee, J. (2021). A systematic review of machine learning algorithms for prognostics and health management of rolling element bearings: fundamentals, concepts and applications. Measurement Science and Technology, 32(1), 012001. https://doi.org/10.1088/1361-6501/ab8df9

Venkataraman, P. (2022). Image Denoising Using Convolutional Autoencoder. http://arxiv.org/abs/2207.11771

Wei, C., Tang, Y., Niu, C., Hu, H., Wang, Y., & Liang, J. (2020). Self-supervised Representation Learning for Evolutionary Neural Architecture Search. http://arxiv.org/abs/2011.00186

Wistuba, M., Rawat, A., & Pedapati, T. (2019). A Survey on Neural Architecture Search. http://arxiv.org/abs/1905.01392

Wu, S., Beaulac, C., & Cao, J. (2024). Functional Autoencoder for Smoothing and Representation Learning. http://arxiv.org/abs/2401.09499. https://doi.org/10.1007/s11222-024-10501-w

Xia, L., Huang, C., Xu, Y., Xu, H., Li, X., & Zhang, W. (2022). Collaborative Reflection-Augmented Autoencoder Network for Recommender Systems. ACM Transactions on Information Systems, 40(1), 1–28. https://doi.org/10.1145/3467023

Ye, Y., Xia, L., & Huang, C. (2023). Graph Masked Autoencoder for Sequential Recommendation. In SIGIR 2023 - Proceedings of the 46th International ACM SIGIR Conference on Research and Development in Information Retrieval (pp. 321–330). https://doi.org/10.1145/3539618.3591692

Zhang, C., Ren, M., & Urtasun, R. (2018). Graph HyperNetworks for Neural Architecture Search. http://arxiv.org/abs/1810.05749

Published

2025-09-30

Section

Articles

How to Cite

Ogbe, S. M., & Abubakar, A. A. (2025). An Efficient Neural Architecture Search Algorithm for AutoEncoder Optimization - A Systematic Literature Review. UMYU Scientifica, 4(3), 9-18. https://doi.org/10.56919/2543.002
