Browsing by Author "Elnagar, Ashraf"
Now showing 1 - 5 of 5
1. Deep learning for Covid-19 forecasting: State-of-the-art review (Elsevier B.V., 2022-10-28)
   Kamalov, Firuz; Rajab, Khairan; Cherukuri, Aswani Kumar; Elnagar, Ashraf; Safaraliev, Murodbek

2. Emotion Recognition from Speech Using Convolutional Neural Networks (Springer Science and Business Media Deutschland GmbH, 2023)
   Mahfood, Bayan; Elnagar, Ashraf; Kamalov, Firuz
   The human voice carries a great deal of useful information that can be utilized in areas such as call centers, security, and medicine, among many others. This work implements a speech emotion recognition system that recognizes the speaker's emotion using a deep learning neural network based on features extracted from audio clips. Features were extracted from several datasets, including RAVDESS, EMO-DB, TESS, and an Emirati-based dataset, and fed into a convolutional deep neural network for emotion classification. Several models were implemented based on the features extracted from each dataset, and the top three models that produced the best results are reported. © 2023, The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.

3. Ensemble Learning with Resampling for Imbalanced Data (Springer Science and Business Media Deutschland GmbH, 2021)
   Kamalov, Firuz; Elnagar, Ashraf; Leung, Ho Hon
   Imbalanced class distribution is an issue that appears in various applications. In this paper, we undertake a comprehensive study of the effects of sampling on the performance of bootstrap aggregating (bagging) in the context of imbalanced data. Concretely, we compare sampling methods applied to single and ensemble classifiers. The experiments are conducted on simulated and real-life data using a range of sampling methods. The contributions of the paper are twofold: i) we demonstrate the effectiveness of ensemble techniques based on resampled data over a single base classifier, and ii) we compare the effectiveness of different resampling techniques when used during the bagging stage of ensemble classifiers. The results reveal that ensemble methods overwhelmingly outperform single classifiers trained on resampled data. In addition, we find that NearMiss and random oversampling (ROS) are the optimal sampling algorithms for ensemble learning. © 2021, Springer Nature Switzerland AG.

4. Evaluation of Arabic-Based Contextualized Word Embedding Models (Institute of Electrical and Electronics Engineers Inc., 2021)
   Yagi, Sane Mo; Mansour, Youssef; Kamalov, Firuz; Elnagar, Ashraf
   The distributed representation of words, as in Word2Vec, FastText, and GloVe, produces a single vector for each word type regardless of the polysemy or homonymy that many words have. Context-sensitive representations, as implemented in deep learning neural networks, on the other hand, produce different vectors for the multiple senses of a word. Several contextualized word embeddings have been produced for the Arabic language (e.g., AraBERT, QARiB, AraGPT). Most of these were tested on a few NLP tasks, but there has been no direct comparison between them; as a result, we do not know which of them is most efficient and for which tasks. This paper is a first step toward establishing evaluation criteria for them. It describes 24 such embeddings, then conducts exploratory intrinsic and extrinsic evaluations of them. It then tests the relational knowledge encoded in them, covering four semantic relations: colors of fruits, capitals of countries, causation, and general information. It also evaluates the utility of these models on Named Entity Recognition and Sentiment Analysis tasks. The results demonstrate that AraBERTv02 and MARBERT are the best on both types of evaluation; therefore, both are recommended for fine-tuning on Arabic NLP tasks. The ultimate conclusion is that it is feasible to test higher-order reasoning relations in these embeddings. © 2021 IEEE

5. Kernel density estimation-based sampling for neural network classification (Institute of Electrical and Electronics Engineers Inc., 2021)
   Kamalov, Firuz; Elnagar, Ashraf
   Imbalanced data occurs in a wide range of scenarios. The skewed distribution of the target variable elicits bias in machine learning algorithms. One popular method to combat imbalanced data is to artificially balance the data through resampling. In this paper, we study the efficacy of a recently proposed kernel density estimation (KDE) sampling technique in the context of artificial neural networks. We benchmark the KDE sampling method against two baseline sampling techniques in comparative experiments using 8 datasets and 3 neural network architectures. The results show that KDE sampling produces the best performance on 6 out of the 8 datasets, although it must be used with caution on image datasets. We conclude that KDE sampling can significantly improve the performance of neural networks. © 2021 IEEE.
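The KDE sampling idea described in the last item can be illustrated with a minimal sketch. This assumes the method amounts to fitting a kernel density estimate to the minority class and drawing synthetic points from it until the classes are balanced; the function name, the Gaussian kernel, and the default bandwidth are illustrative choices, not details taken from the paper.

```python
import numpy as np
from scipy.stats import gaussian_kde

def kde_oversample(X, y, minority_label=1, random_state=0):
    """Balance a binary dataset by drawing synthetic minority samples from a KDE.

    Hypothetical helper: fits a Gaussian KDE to the minority class and
    resamples from it until both classes have equal counts.
    """
    X_min = X[y == minority_label]
    n_needed = int((y != minority_label).sum()) - X_min.shape[0]
    if n_needed <= 0:
        return X, y  # already balanced (or minority is actually larger)
    # gaussian_kde expects data with shape (n_features, n_samples)
    kde = gaussian_kde(X_min.T)
    X_new = kde.resample(n_needed, seed=random_state).T
    X_bal = np.vstack([X, X_new])
    y_bal = np.concatenate([y, np.full(n_needed, minority_label)])
    return X_bal, y_bal
```

A classifier would then be trained on the balanced `(X_bal, y_bal)` instead of the raw data; the abstract's caveat about image datasets suggests KDE fitting in high-dimensional raw-pixel space is where the approach becomes fragile.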