Comparative analysis of activation functions in neural networks

Date

2021

Publisher

Institute of Electrical and Electronics Engineers Inc.

Abstract

Although the impact of activations on the accuracy of neural networks has been covered in the literature, there is little discussion about the relationship between the activations and the geometry of the neural network model. In this paper, we examine the effects of various activation functions on the geometry of the model within the feature space. In particular, we investigate the relationship between the activations in the hidden and output layers, the geometry of the trained neural network model, and the model performance. We present visualizations of the trained neural network models to help researchers better understand and intuit the effects of activation functions on the models. © 2021 IEEE.
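
The paper itself provides the visualizations; as a purely illustrative sketch of the kind of comparison the abstract describes (ReLU versus sigmoid hidden activations and the resulting decision geometry in a 2D feature space), the snippet below fits a small network with each activation on a toy dataset. The dataset, layer size, and hyperparameters are assumptions for illustration, not the authors' experimental setup.

```python
# Hypothetical sketch (not the authors' setup): compare how ReLU and sigmoid
# hidden activations shape the decision regions of a small network on a toy
# 2D dataset, using scikit-learn's MLPClassifier.
import numpy as np
import matplotlib.pyplot as plt
from sklearn.datasets import make_moons
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

X, y = make_moons(n_samples=500, noise=0.2, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

fig, axes = plt.subplots(1, 2, figsize=(10, 4))
for ax, activation in zip(axes, ["relu", "logistic"]):  # "logistic" = sigmoid
    clf = MLPClassifier(hidden_layer_sizes=(16,), activation=activation,
                        max_iter=2000, random_state=0).fit(X_train, y_train)
    acc = clf.score(X_test, y_test)
    # Plot the learned decision regions over the 2D feature space.
    xx, yy = np.meshgrid(np.linspace(X[:, 0].min() - 1, X[:, 0].max() + 1, 300),
                         np.linspace(X[:, 1].min() - 1, X[:, 1].max() + 1, 300))
    zz = clf.predict(np.c_[xx.ravel(), yy.ravel()]).reshape(xx.shape)
    ax.contourf(xx, yy, zz, alpha=0.3)
    ax.scatter(X[:, 0], X[:, 1], c=y, s=10)
    ax.set_title(f"{activation} hidden layer (test acc = {acc:.2f})")
plt.tight_layout()
plt.show()
```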

Description

This conference paper is not available in the CUD collection. The version of scholarly record of this paper is published in the proceedings of the 2021 28th IEEE International Conference on Electronics, Circuits, and Systems (ICECS), available online at: https://doi.org/10.1109/ICECS53924.2021.9665646

Keywords

activation function, loss function, neural networks, ReLU, sigmoid

Citation

Kamalov, F., Nazir, A., Safaraliev, M., Cherukuri, A. K., & Zgheib, R. (2021). Comparative analysis of activation functions in neural networks. 2021 28th IEEE International Conference on Electronics, Circuits, and Systems (ICECS). https://doi.org/10.1109/ICECS53924.2021.9665646

DOI

https://doi.org/10.1109/ICECS53924.2021.9665646