File name:
AdaBatch: Efficient Gradient Aggregation.pdf
Development tool:
File size: 393 KB
Downloads: 0
Upload time: 2020-02-27
Description: We study a new aggregation operator for gradients coming from a mini-batch for stochastic gradient (SG) methods that allows a significant speed-up in the case of sparse optimization problems. We call this method AdaBatch; it only requires a few lines of code change compared to regular mini-batch SGD algorithms. We provide theoretical insight into how this new class of algorithms performs and show that it is equivalent to an implicit per-coordinate rescaling of the gradients, similar to what Adagrad methods do. In theory and in practice, this new aggregation makes it possible to keep the same sample efficiency as SG methods while increasing the batch size. Experimentally, we also show that in the case of smooth convex optimization, our procedure can even obtain a better loss when increasing the batch size for a fixed number of samples. We then apply this new algorithm to obtain a parallelizable stochastic gradient method that is synchronous but allows speed-ups on par with Hogwild! methods, as convergence does not deteriorate with the increase of the batch size. The same approach can be used to make mini-batches provably efficient for variance-reduced SG methods such as SVRG.
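The abstract only hints at what the "few lines of code change" look like. The sketch below is one reading of the per-coordinate aggregation it alludes to, not the paper's reference implementation: instead of dividing the summed mini-batch gradient by the batch size, each coordinate is divided by the number of samples whose gradient is nonzero in that coordinate, which produces the implicit per-coordinate rescaling mentioned above. The function names and the least-squares setup are illustrative assumptions.

```python
import numpy as np

def adabatch_aggregate(per_sample_grads):
    """AdaBatch-style aggregation of a mini-batch of sparse gradients (sketch).

    per_sample_grads: array of shape (B, d), one gradient per sample.
    Each coordinate of the summed gradient is divided by the number of
    samples active (nonzero) in that coordinate, rather than by the full
    batch size B, so rarely active coordinates are not shrunk by B.
    """
    summed = per_sample_grads.sum(axis=0)             # shape (d,)
    counts = (per_sample_grads != 0).sum(axis=0)      # active samples per coordinate
    return summed / np.maximum(counts, 1)             # avoid division by zero

def minibatch_step(w, X_batch, y_batch, lr=0.1):
    """One SGD step for least squares on a sparse batch, using the
    AdaBatch-style aggregation above (illustrative setup)."""
    # Per-sample gradient of 0.5 * (x.w - y)^2 is (x.w - y) * x.
    residuals = X_batch @ w - y_batch                 # shape (B,)
    per_sample_grads = residuals[:, None] * X_batch   # shape (B, d)
    return w - lr * adabatch_aggregate(per_sample_grads)
```

Swapping `adabatch_aggregate` for the usual mean over the batch is the only change relative to plain mini-batch SGD in this sketch, which matches the abstract's claim that only a few lines differ; the exact step-size and rescaling details of the published algorithm may differ from this reading.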