Please use this identifier to cite or link to this item:
https://dspace.univ-ouargla.dz/jspui/handle/123456789/37004
Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.advisor | KHADRA BOUANANE | - |
dc.contributor.author | DAHIMI, KHAOULA | - |
dc.date.accessioned | 2024-10-01T14:44:15Z | - |
dc.date.available | 2024-10-01T14:44:15Z | - |
dc.date.issued | 2024 | - |
dc.identifier.citation | FACULTY OF NEW INFORMATION AND COMMUNICATION TECHNOLOGIES | en_US |
dc.identifier.uri | https://dspace.univ-ouargla.dz/jspui/handle/123456789/37004 | - |
dc.description | ARTIFICIAL INTELLIGENCE AND DATA SCIENCE | en_US |
dc.description.abstract | The Adam optimizer is widely regarded as a highly effective algorithm for training deep learning models. However, there are instances where Adam performs suboptimally and leads to poor generalization. To address these limitations, several variants of Adam have been introduced, such as AdaMod, AdaBound, R-Adam, and N-Adam. This thesis aims to identify the best-performing optimizer and to determine which of Adam's limitations most affect its performance, by providing a comparative study of these variants, including AdaMod, AdaBound, R-Adam, AdaMax, AMSGrad, AdaBelief, E-Adam, YOGI, AdamW, N-Adam, ND-Adam, MSVAG, T-Adam, and Ro-Adam. The results show that, with a basic CNN and a ResNet34 trained on the MNIST and CIFAR10 datasets respectively, AdaMod and AMSGrad achieve the highest performance, with accuracy scores of 98.8% and 73.4% respectively. These optimizers behave similarly and achieve nearly the same results. To better understand and confirm the selection of the best optimizer, we used an LSTM architecture to train a sentence completion model on the Penn Treebank dataset and a time series forecasting model on the Amazon stock dataset. We found that AdamW and T-Adam outperformed the other optimizers. | en_US |
dc.description.sponsorship | DEPARTMENT OF COMPUTER SCIENCE AND INFORMATION TECHNOLOGY | en_US |
dc.language.iso | en | en_US |
dc.publisher | Kasdi Merbah University OUARGLA ALGERIA | en_US |
dc.subject | Adam | en_US |
dc.subject | Adam variants | en_US |
dc.subject | optimizers | en_US |
dc.subject | comparative study | en_US |
dc.title | ADAM VARIANTS OPTIMIZERS | en_US |
dc.title.alternative | A COMPARATIVE STUDY | en_US |
dc.type | Thesis | en_US |
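The abstract above compares variants built on the Adam update rule. For reference, a minimal sketch of the base Adam step (first/second moment estimates with bias correction) is shown below; the hyperparameter values are the common defaults, not values taken from this thesis, and the toy quadratic objective is purely illustrative.

```python
# Illustrative sketch of the standard Adam update rule that the compared
# variants (AdaMod, AMSGrad, AdamW, ...) modify in different ways.
import numpy as np

def adam_step(theta, grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update: exponential moment estimates, bias correction, step."""
    m = beta1 * m + (1 - beta1) * grad        # first moment (mean of gradients)
    v = beta2 * v + (1 - beta2) * grad ** 2   # second moment (uncentered variance)
    m_hat = m / (1 - beta1 ** t)              # bias correction for early steps
    v_hat = v / (1 - beta2 ** t)
    theta = theta - lr * m_hat / (np.sqrt(v_hat) + eps)
    return theta, m, v

# Toy usage: minimize f(theta) = theta^2 from theta = 5
theta = np.array([5.0])
m = np.zeros_like(theta)
v = np.zeros_like(theta)
for t in range(1, 2001):
    grad = 2 * theta                          # gradient of theta^2
    theta, m, v = adam_step(theta, grad, m, v, t, lr=0.05)
# theta is driven toward the minimum at 0
```

Variants such as AMSGrad replace `v_hat` with a running maximum of past second moments, while AdamW decouples weight decay from this gradient-based step.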
Appears in Collections: | Département d'informatique et technologie de l'information - Master |
Files in This Item:
File | Description | Size | Format | |
---|---|---|---|---|
DAHIMI .pdf | ARTIFICIAL INTELLIGENCE AND DATA SCIENCE | 4,07 MB | Adobe PDF | View/Open |
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.