Effectiveness of Optimization Algorithms in Deep Image Classification

From MaRDI portal

arXiv: 2110.01598 · MaRDI QID: Q6379352

Author name not available

Publication date: 4 October 2021

Abstract: Adam is widely used to train neural networks, and many Adam variants with different features have emerged. Recently, two new Adam-type optimizers, AdaBelief and Padam, were introduced to the community. We analyze these two optimizers and compare them with conventional optimizers (Adam, SGD with momentum) in the setting of image classification. We evaluate the performance of these optimization algorithms on AlexNet and on simplified versions of VGGNet and ResNet using the EMNIST dataset. (Benchmark code is available at https://github.com/chuiyunjun/projectCSC413.)
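The optimizers the abstract names differ mainly in how they scale the gradient step. A minimal scalar sketch of the three update rules follows; this is illustrative only (hyperparameter defaults are assumptions, and Padam's optional AMSGrad-style running maximum is omitted), not the paper's benchmark code, which lives in the linked repository.

```python
import math

def adam_step(theta, g, m, v, t, lr=1e-3, b1=0.9, b2=0.999, eps=1e-8):
    """One Adam update on scalar parameter theta with gradient g at step t."""
    m = b1 * m + (1 - b1) * g        # first moment: EMA of gradients
    v = b2 * v + (1 - b2) * g * g    # second moment: EMA of squared gradients
    m_hat = m / (1 - b1 ** t)        # bias correction
    v_hat = v / (1 - b2 ** t)
    return theta - lr * m_hat / (math.sqrt(v_hat) + eps), m, v

def adabelief_step(theta, g, m, s, t, lr=1e-3, b1=0.9, b2=0.999, eps=1e-8):
    """One AdaBelief update: the second moment tracks (g - m)^2, the
    deviation of the gradient from its EMA (the 'belief' in the gradient)."""
    m = b1 * m + (1 - b1) * g
    s = b2 * s + (1 - b2) * (g - m) ** 2 + eps
    m_hat = m / (1 - b1 ** t)
    s_hat = s / (1 - b2 ** t)
    return theta - lr * m_hat / (math.sqrt(s_hat) + eps), m, s

def padam_step(theta, g, m, v, t, lr=1e-3, b1=0.9, b2=0.999, eps=1e-8,
               p=0.125):
    """One Padam update: a partially adaptive power p in (0, 1/2]
    interpolates between SGD with momentum (p -> 0) and Adam (p = 1/2)."""
    m = b1 * m + (1 - b1) * g
    v = b2 * v + (1 - b2) * g * g
    m_hat = m / (1 - b1 ** t)
    v_hat = v / (1 - b2 ** t)
    return theta - lr * m_hat / (v_hat ** p + eps), m, v
```

For example, running any of the three rules on the quadratic f(θ) = θ² (gradient g = 2θ) drives θ from 1.0 toward 0 within a couple hundred steps at lr = 0.1.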




Has companion code repository: https://github.com/chuiyunjun/projectCSC413








