ReLU-based activations: analysis and experimental study for deep learning
- Research areas:
- Uncategorized
- Year:
- 2021
- Type of Publication:
- In Proceedings
- Keywords:
- activation analysis, ReLU, ReLU activations, deep learning
- Authors:
- Volume:
- 12882
- Book title:
- Proceedings of the XIX Conference of the Spanish Association for Artificial Intelligence (CAEPIA)
- Series:
- Lecture Notes in Artificial Intelligence (LNAI)
- Pages:
- 33-43
- Organization:
- Malaga, Spain
- Month:
- 22nd-24th September
- ISBN:
- 978-3-030-85712-7
- ISSN:
- 0302-9743
- BibTex:
- Abstract:
- Activation functions are used in neural networks as a tool to introduce non-linear transformations into the model and, thus, enhance its representation capabilities. They also determine the output range of the hidden layers and of the final output. Traditionally, artificial neural networks mainly used the sigmoid activation function, as the depth of the networks was limited. Nevertheless, this function tends to saturate the gradients when the number of hidden layers increases. For that reason, in recent years, most published works related to deep learning and convolutional networks have used the Rectified Linear Unit (ReLU), given that it provides good convergence properties and speeds up the training process thanks to the simplicity of its derivative. However, this function has some known drawbacks that gave rise to new proposals of alternative activation functions based on ReLU. In this work, we describe, analyse and compare several recently proposed alternatives to test whether these functions improve the performance of deep learning models with respect to the standard ReLU.
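The abstract contrasts the standard ReLU with ReLU-based alternatives proposed to mitigate its drawbacks (e.g. "dying" units with zero gradient for negative inputs). As an illustrative sketch, not taken from the paper itself, the definitions of ReLU and two well-known variants, Leaky ReLU and ELU, can be written in NumPy:

```python
import numpy as np

def relu(x):
    # Standard Rectified Linear Unit: max(0, x).
    # Gradient is exactly zero for negative inputs ("dying ReLU" issue).
    return np.maximum(0.0, x)

def leaky_relu(x, alpha=0.01):
    # Leaky ReLU: keeps a small slope alpha on the negative side,
    # so units never have an identically zero gradient.
    return np.where(x > 0, x, alpha * x)

def elu(x, alpha=1.0):
    # Exponential Linear Unit: smooth curve that saturates at -alpha
    # for large negative inputs, pushing mean activations toward zero.
    return np.where(x > 0, x, alpha * (np.exp(x) - 1.0))

# Small demonstration on a few sample pre-activations.
x = np.array([-2.0, -0.5, 0.0, 1.5])
print("ReLU:      ", relu(x))
print("Leaky ReLU:", leaky_relu(x))
print("ELU:       ", elu(x))
```

All three functions are identity-like for positive inputs and differ only in how they treat negative inputs, which is precisely the design space the paper surveys.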