SGDR: Stochastic Gradient Descent with Warm Restarts
Date: 2016-05-03
Abstract
Restart techniques are common in gradient-free optimization to deal with multimodal functions. Partial warm restarts are also gaining popularity in gradient-based optimization to improve the rate of convergence in accelerated gradient schemes to deal with ill-conditioned functions. In this paper, we propose a simple warm restart technique for stochastic gradient descent to improve its anytime performance when training deep neural networks. We empirically study its performance on the CIFAR-10 and CIFAR-100 datasets, where we demonstrate new state-of-the-art results at 3.14% and 16.21%, respectively. We also demonstrate its advantages on a dataset of EEG recordings and on a downsampled version of the ImageNet dataset.
The research paper below links to its GitHub repository.
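As a rough illustration of the warm-restart idea described in the abstract, the sketch below implements a cosine learning-rate schedule that is periodically reset ("warm restarted"): the rate decays from a maximum to a minimum over a period of epochs, then jumps back to the maximum, with each period optionally longer than the last. The function and parameter names (eta_min, eta_max, T_0, T_mult) are illustrative defaults, not values taken from this page.

```python
import math

def sgdr_learning_rate(epoch, eta_min=0.0, eta_max=0.1, T_0=10, T_mult=2):
    """Cosine-annealed learning rate with warm restarts.

    The rate decays from eta_max to eta_min over a period of T_i epochs,
    then jumps back to eta_max (a 'warm restart'); each new period is
    T_mult times longer than the previous one.
    """
    T_i, t = T_0, epoch
    # Find which restart period the current epoch falls into.
    while t >= T_i:
        t -= T_i
        T_i *= T_mult
    # Cosine annealing within the current period.
    return eta_min + 0.5 * (eta_max - eta_min) * (1 + math.cos(math.pi * t / T_i))

# Example: print the schedule for the first 30 epochs.
for epoch in range(30):
    print(epoch, round(sgdr_learning_rate(epoch), 4))
```

In practice, a schedule of this form is also available off the shelf, for example as torch.optim.lr_scheduler.CosineAnnealingWarmRestarts in PyTorch.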