Please use this identifier to cite or link to this item: https://dspace.univ-ouargla.dz/jspui/handle/123456789/37004
Full metadata record
DC Field  Value  Language
dc.contributor.advisor  KHADRA BOUANANE  -
dc.contributor.author  DAHIMI, KHAOULA  -
dc.date.accessioned  2024-10-01T14:44:15Z  -
dc.date.available  2024-10-01T14:44:15Z  -
dc.date.issued  2024  -
dc.identifier.citation  FACULTY OF NEW INFORMATION AND COMMUNICATION TECHNOLOGIES  en_US
dc.identifier.uri  https://dspace.univ-ouargla.dz/jspui/handle/123456789/37004  -
dc.description  ARTIFICIAL INTELLIGENCE AND DATA SCIENCE  en_US
dc.description.abstract  The Adam optimizer is widely regarded as a highly effective algorithm for training deep learning models. However, there are instances where Adam does not perform optimally and can lead to poor generalization. Several variants of Adam have been introduced to mitigate these drawbacks, for example AdaMod, AdaBound, R-Adam and N-Adam. This thesis aims to identify the best optimizer, and which of Adam's limitations it addresses that affect performance, by providing a comparative study of these variants, including AdaMod, AdaBound, R-Adam, AdaMax, AMSGrad, AdaBelief, E-Adam, YOGI, AdamW, N-Adam, ND-Adam, MSVAG, T-Adam, and Ro-Adam. The results show that, when we employ a basic CNN and a ResNet34 trained on the MNIST and CIFAR10 datasets respectively, AdaMod and AMSGrad achieve the highest performance, with accuracy scores of 98.8% and 73.4% respectively. These optimizers behave similarly and achieve nearly the same results. To better understand and confirm the selection of the best optimizer, we used an LSTM architecture to train a sentence completion model on the Penn Treebank dataset and a time series forecasting model on the Amazon stock dataset. We found that AdamW and T-Adam outperformed the other optimizers. (An illustrative training-loop sketch follows the metadata record below.)  en_US
dc.description.sponsorship  DEPARTMENT OF COMPUTER SCIENCE AND INFORMATION TECHNOLOGY  en_US
dc.language.iso  en  en_US
dc.publisher  Kasdi Merbah University OUARGLA ALGERIA  en_US
dc.subject  Adam  en_US
dc.subject  Adam variants  en_US
dc.subject  optimizers  en_US
dc.subject  comparative study  en_US
dc.title  ADAM VARIANTS OPTIMIZERS  en_US
dc.title.alternative  A COMPARATIVE STUDY  en_US
dc.type  Thesis  en_US
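
The abstract above describes comparing Adam variants by training a basic CNN on MNIST. The following is a minimal sketch of that kind of experiment, assuming PyTorch and torchvision are available and using only the variants shipped in torch.optim (Adam, AMSGrad via amsgrad=True, AdamW, NAdam, Adamax); the network, learning rates, and single-epoch schedule are illustrative assumptions, not the thesis's exact setup. Variants such as AdaMod, AdaBound, AdaBelief, or T-Adam would require third-party implementations.

    # Sketch only: illustrates swapping optimizers on a small MNIST CNN.
    import torch
    import torch.nn as nn
    from torch.utils.data import DataLoader
    from torchvision import datasets, transforms

    def make_cnn():
        # Small CNN for 28x28 grayscale MNIST digits (illustrative, not the thesis architecture).
        return nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Flatten(), nn.Linear(32 * 7 * 7, 10),
        )

    def evaluate(model, loader, device):
        # Fraction of correctly classified test digits.
        model.eval()
        correct = total = 0
        with torch.no_grad():
            for x, y in loader:
                x, y = x.to(device), y.to(device)
                correct += (model(x).argmax(1) == y).sum().item()
                total += y.size(0)
        return correct / total

    def main():
        device = "cuda" if torch.cuda.is_available() else "cpu"
        tfm = transforms.ToTensor()
        train = datasets.MNIST("data", train=True, download=True, transform=tfm)
        test = datasets.MNIST("data", train=False, download=True, transform=tfm)
        train_loader = DataLoader(train, batch_size=128, shuffle=True)
        test_loader = DataLoader(test, batch_size=256)

        # Only variants built into torch.optim are listed; hyperparameters are assumptions.
        optimizers = {
            "Adam": lambda p: torch.optim.Adam(p, lr=1e-3),
            "AMSGrad": lambda p: torch.optim.Adam(p, lr=1e-3, amsgrad=True),
            "AdamW": lambda p: torch.optim.AdamW(p, lr=1e-3, weight_decay=1e-2),
            "NAdam": lambda p: torch.optim.NAdam(p, lr=1e-3),
            "Adamax": lambda p: torch.optim.Adamax(p, lr=1e-3),
        }
        loss_fn = nn.CrossEntropyLoss()

        for name, make_opt in optimizers.items():
            model = make_cnn().to(device)
            opt = make_opt(model.parameters())
            model.train()
            for epoch in range(1):  # one epoch keeps the sketch short
                for x, y in train_loader:
                    x, y = x.to(device), y.to(device)
                    opt.zero_grad()
                    loss_fn(model(x), y).backward()
                    opt.step()
            print(f"{name}: test accuracy = {evaluate(model, test_loader, device):.3f}")

    if __name__ == "__main__":
        main()

The same loop structure extends to the ResNet34/CIFAR10 and LSTM experiments mentioned in the abstract by swapping the model constructor and dataset.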
Appears in Collections: Département d'informatique et technologie de l'information - Master

Files in This Item:
File  Description  Size  Format
DAHIMI.pdf  ARTIFICIAL INTELLIGENCE AND DATA SCIENCE  4,07 MB  Adobe PDF


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.