| Authors | مهسا سهیل شمائی, سجاد فتحی هفشجانی, زینب سعیدیان طریی |
| Journal | Frontiers of Computer Science |
| IF | 4.2 |
| Paper Type | Full Paper |
| Published At | 2024-03-14 |
| Journal Grade | Scientific - research |
| Journal Type | Electronic |
| Journal Country | Iran, Islamic Republic Of |
| Journal Index | JCR, SCOPUS |
Abstract
In this paper, we propose a novel warm restart
technique using a new logarithmic step size for the stochastic
gradient descent (SGD) approach. For smooth and non-convex
functions, we establish a convergence rate for
SGD. We conduct a comprehensive experimental evaluation to
demonstrate the efficiency of the newly proposed step size on
the FashionMNIST, CIFAR10, and CIFAR100 datasets.
Moreover, we compare our results with nine existing
approaches and demonstrate that the new logarithmic step size
improves test accuracy by 0.9% for the CIFAR100 dataset
when we utilize a convolutional neural network (CNN) model.
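The abstract does not state the exact form of the logarithmic step size, so the following is a minimal sketch under an assumed decay rule: within each warm-restart cycle of length T, the step size decays from eta0 as eta0 * (1 - ln t / ln(T + 1)), and resets to eta0 at the start of the next cycle. The function names and the formula itself are illustrative assumptions, not the authors' published schedule.

```python
import math

def log_step_size(t, T, eta0=0.1):
    """Assumed logarithmic decay within one cycle (t = 1..T).

    At t = 1 this returns eta0 (since ln 1 = 0); as t approaches T
    the step size shrinks toward eta0 * (1 - ln T / ln(T + 1)) > 0.
    """
    return eta0 * (1.0 - math.log(t) / math.log(T + 1))

def warm_restart_schedule(total_steps, cycle_len, eta0=0.1):
    """Warm restarts: restart the logarithmic decay at every cycle boundary."""
    etas = []
    for step in range(total_steps):
        t = step % cycle_len + 1  # 1-based position inside the current cycle
        etas.append(log_step_size(t, cycle_len, eta0))
    return etas
```

Each entry of the returned list would be used as the learning rate for the corresponding SGD update; the reset at every cycle boundary is what makes this a warm-restart schedule rather than a monotone decay.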