Please use this identifier to cite or link to this item: https://dspace.univ-ouargla.dz/jspui/handle/123456789/37004
Title: Adam Variants Optimizers
Other Titles: A Comparative Study
Authors: KHADRA BOUANANE
DAHIMI, KHAOULA
Keywords: Adam
Adam variants
optimizers
comparative study
Issue Date: 2024
Publisher: Kasdi Merbah University OUARGLA ALGERIA
Citation: FACULTY OF NEW INFORMATION AND COMMUNICATION TECHNOLOGIES
Abstract: The Adam optimizer is widely regarded as a highly effective algorithm for training deep learning models. However, there are instances where Adam does not perform optimally and can lead to poor generalization. To address these limitations, several variants of Adam have been introduced, such as AdaMod, AdaBound, R-Adam, and N-Adam. This thesis aims to identify the best optimizer, and the Adam limitations whose mitigation most affects performance, by providing a comparative study of these variants, including AdaMod, AdaBound, R-Adam, AdaMax, AMSGrad, AdaBelief, E-Adam, YOGI, AdamW, N-Adam, ND-Adam, MSVAG, T-Adam, and Ro-Adam. The results show that when a basic CNN and a ResNet34 are trained on the MNIST and CIFAR10 datasets respectively, AdaMod and AMSGrad achieve the highest performance, with accuracy scores of 98.8% and 73.4% respectively. These optimizers behave similarly and achieve nearly the same results. To better understand and confirm the selection of the best optimizer, we used an LSTM architecture to train a sentence completion model on the Penn Treebank dataset and a time series forecasting model on the Amazon stock dataset. We found that AdamW and T-Adam outperformed the other optimizers.
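For context on the abstract above, the following is a minimal, framework-free sketch of the standard Adam update rule and its AMSGrad variant (one of the variants compared in the thesis). The code and the toy 1-D objective are illustrative assumptions by this summary, not material from the thesis itself; the update equations follow the published definitions of Adam and AMSGrad.

```python
import math

def adam_minimize(grad, x0, lr=0.01, beta1=0.9, beta2=0.999,
                  eps=1e-8, steps=2000, amsgrad=False):
    """Minimize a 1-D function given its gradient, using Adam.

    With amsgrad=True, the AMSGrad variant is used: it keeps the running
    maximum of the bias-corrected second-moment estimate, so the effective
    per-coordinate step size can never increase.
    """
    x, m, v, v_max = x0, 0.0, 0.0, 0.0
    for t in range(1, steps + 1):
        g = grad(x)
        m = beta1 * m + (1 - beta1) * g       # first moment (running mean)
        v = beta2 * v + (1 - beta2) * g * g   # second moment (running sq. mean)
        m_hat = m / (1 - beta1 ** t)          # bias correction
        v_hat = v / (1 - beta2 ** t)
        if amsgrad:
            v_max = max(v_max, v_hat)         # AMSGrad: non-increasing step size
            v_hat = v_max
        x -= lr * m_hat / (math.sqrt(v_hat) + eps)
    return x

# Toy objective: f(x) = (x - 3)^2, with gradient 2(x - 3).
x_adam = adam_minimize(lambda x: 2 * (x - 3), x0=0.0)
x_ams = adam_minimize(lambda x: 2 * (x - 3), x0=0.0, amsgrad=True)
```

Both runs converge to the minimizer x = 3 on this convex toy problem; the variants studied in the thesis differ mainly in how they adapt or bound the second-moment term `v_hat` in updates like this one.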
Description: ARTIFICIAL INTELLIGENCE AND DATA SCIENCE
URI: https://dspace.univ-ouargla.dz/jspui/handle/123456789/37004
Appears in Collections:Département d'informatique et technologie de l'information - Master

Files in This Item:
File: DAHIMI.pdf — ARTIFICIAL INTELLIGENCE AND DATA SCIENCE — 4,07 MB — Adobe PDF


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.