Bulletin of Electrical Engineering and Informatics
Vol 13, No 5: October 2024

A comparative analysis of activation functions in neural networks: unveiling categories

Bouraya, Sara
Belangour, Abdessamad



Article Info

Publish Date
01 Oct 2024

Abstract

Activation functions (AFs) play a critical role in artificial neural networks, enabling the modeling of complex, non-linear relationships in data. In this review, we survey and compare the AFs most commonly used in deep learning and artificial neural networks, aiming to provide insight into the strengths and weaknesses of each AF and guidance on selecting an appropriate AF for different types of problems. We evaluate the most widely used AFs, including sigmoid, tanh, rectified linear units (ReLUs) and their variants, the exponential linear unit (ELU), and softmax. For each activation category, we discuss its properties, mathematical formulation (MF), and its benefits and drawbacks in terms of its ability to model complex, non-linear relationships in data. In conclusion, this comparative study provides a comprehensive overview of the properties and performance of different AFs and serves as a valuable resource for researchers and practitioners in deep learning and artificial neural networks.
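As a minimal sketch of the functions surveyed in the abstract (assuming NumPy; the exact formulations discussed in the paper itself may differ in detail), the standard definitions of sigmoid, tanh, ReLU, ELU, and softmax can be written as:

```python
import numpy as np

def sigmoid(x):
    # Squashes inputs into (0, 1); saturates for large |x|.
    return 1.0 / (1.0 + np.exp(-x))

def tanh(x):
    # Zero-centered sibling of sigmoid with range (-1, 1).
    return np.tanh(x)

def relu(x):
    # Passes positive inputs unchanged and zeroes out negatives.
    return np.maximum(0.0, x)

def elu(x, alpha=1.0):
    # Smooth negative branch: alpha * (exp(x) - 1) for x <= 0.
    return np.where(x > 0, x, alpha * (np.exp(x) - 1.0))

def softmax(x):
    # Maps a score vector to a probability distribution;
    # subtracting the max improves numerical stability.
    shifted = x - np.max(x)
    exps = np.exp(shifted)
    return exps / np.sum(exps)
```

These illustrate the trade-offs the paper compares: sigmoid and tanh saturate (vanishing gradients), ReLU avoids saturation for positive inputs but can "die" at zero, ELU keeps a smooth gradient for negative inputs, and softmax is typically reserved for multi-class output layers.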

Copyrights © 2024






Journal Info

Abbrev

EEI

Subject

Electrical & Electronics Engineering

Description

Bulletin of Electrical Engineering and Informatics (Buletin Teknik Elektro dan Informatika) ISSN: 2089-3191, e-ISSN: 2302-9285 is open to submission from scholars and experts in the wide areas of electrical, electronics, instrumentation, control, telecommunication and computer engineering from the ...