Title: Deep learning regularization in imbalanced data
Authors: Kamalov, Firuz; Leung, Ho Hon
Type: Conference Paper
Date of publication: 2020-11-03
Date added to repository: 2021-02-21
Citation: Kamalov, F., & Leung, H. H. (2020, November). Deep learning regularization in imbalanced data. In 2020 International Conference on Communications, Computing, Cybersecurity, and Informatics (CCCI) (pp. 1-5). IEEE. https://doi.org/10.1109/CCCI49893.2020.9256674
ISBN: 978-172812035-5
Handle: http://hdl.handle.net/20.500.12519/339
Availability note: This conference paper is not available in the CUD collection. The version of scholarly record is published in 2020 International Conference on Communications, Computing, Cybersecurity, and Informatics (CCCI) (2020), available online at: https://doi.org/10.1109/CCCI49893.2020.9256674
Abstract: Deep neural networks are known to have a large number of parameters, which can lead to overfitting. As a result, various regularization methods designed to mitigate model overfitting have become an indispensable part of many neural network architectures. However, it remains unclear which regularization methods are the most effective. In this paper, we examine the impact of regularization on neural network performance in the context of imbalanced data. We consider three main regularization approaches: L1, L2, and dropout regularization. Numerical experiments reveal that the L1 regularization method can be an effective tool to prevent overfitting in neural network models for imbalanced data. © 2020 IEEE.
Index Terms: regularization, neural networks, imbalanced data
Language: English
Keywords: imbalanced data; neural networks; regularization
Rights note: Permission to reuse the abstract has been secured from the Institute of Electrical and Electronics Engineers Inc.
Copyright: © 2020 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.
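The three regularization approaches named in the abstract (L1, L2, and dropout) can be illustrated in isolation. The sketch below is not taken from the paper; it is a minimal pure-Python illustration of the standard formulations, with arbitrary example weights and rates:

```python
import random

def l1_penalty(weights, lam):
    # L1 regularization adds lam * sum(|w|) to the loss,
    # which pushes individual weights toward exactly zero (sparsity).
    return lam * sum(abs(w) for w in weights)

def l2_penalty(weights, lam):
    # L2 regularization adds lam * sum(w^2) to the loss,
    # which shrinks weights smoothly without zeroing them out.
    return lam * sum(w * w for w in weights)

def dropout(activations, rate, rng):
    # Inverted dropout: during training, each activation is zeroed with
    # probability `rate`; survivors are scaled by 1/(1-rate) so the
    # expected activation is unchanged at inference time.
    keep = 1.0 - rate
    return [a / keep if rng.random() < keep else 0.0 for a in activations]

# Example with arbitrary weights and a small regularization strength.
weights = [0.5, -1.0, 2.0]
print(l1_penalty(weights, 0.01))   # ~0.035  (0.01 * 3.5)
print(l2_penalty(weights, 0.01))   # ~0.0525 (0.01 * 5.25)
print(dropout([1.0, 2.0, 3.0], 0.5, random.Random(42)))
```

In practice these penalties are added to the training loss and dropout is applied only during training; the paper's finding is that the L1 variant was the most effective of the three on imbalanced data.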