Sparsified SGD with Memory #7

Open
nocotan opened this issue May 25, 2021 · 0 comments

In a nutshell

Derives convergence theory for gradient sparsification with an error-correction term.

Paper link

https://papers.nips.cc/paper/2018/hash/b440509a0106086a67bc2ea9df0a1dab-Abstract.html

Authors / Affiliations

Sebastian U. Stich, Jean-Baptiste Cordonnier, Martin Jaggi
Machine Learning and Optimization Laboratory (MLO), EPFL, Switzerland

Venue

NeurIPS 2018

Overview

For the theoretical analysis, the following k-contraction property is assumed to hold:

(screenshot: definition of the k-contraction property)
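As a reminder of the definition (my own transcription, so worth double-checking against the paper): a compression operator $\mathrm{comp}_k : \mathbb{R}^d \to \mathbb{R}^d$ is a $k$-contraction if for every $x \in \mathbb{R}^d$,

$$\mathbb{E}\,\lVert x - \mathrm{comp}_k(x) \rVert^2 \le \left(1 - \frac{k}{d}\right) \lVert x \rVert^2 .$$

Both top-$k$ selection and random-$k$ selection satisfy this property.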

Under this assumption, the paper proves that gradient sparsification with an error-correction term converges at a rate matching that of the original SGD.

Novelty / Differences

  • Provides a theoretical convergence guarantee for MEM-SGD, which retains the error-correction information from gradient compression.

Method

SGD with Memory

(screenshot: the SGD with Memory algorithm)

Consider the following optimization algorithm, which incorporates an error-correction term m:

(screenshot: MEM-SGD update equations)
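The update can be sketched as follows (my transcription; $\mathrm{comp}_k$ is a $k$-contraction operator, $\gamma_t$ the step size, $\nabla f_{i_t}$ a stochastic gradient):

$$
\begin{aligned}
x_{t+1} &= x_t - \mathrm{comp}_k\bigl(m_t + \gamma_t \nabla f_{i_t}(x_t)\bigr),\\
m_{t+1} &= m_t + \gamma_t \nabla f_{i_t}(x_t) - \mathrm{comp}_k\bigl(m_t + \gamma_t \nabla f_{i_t}(x_t)\bigr).
\end{aligned}
$$

The memory $m_t$ accumulates exactly the coordinates that the compressor dropped, so nothing is lost permanently.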

For this algorithm, the following convergence theorem is obtained:

(screenshots: convergence theorem and resulting rate)

Comparing orders, the dominant term in the convergence rate matches that of vanilla SGD.
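The scheme above can be sketched in a few lines of NumPy. This is my own toy illustration (not the authors' code): top-k magnitude selection as the compressor, a memory vector for the dropped mass, and a noisy quadratic as the objective.

```python
import numpy as np

def top_k(v, k):
    """Keep the k largest-magnitude entries of v, zero out the rest."""
    out = np.zeros_like(v)
    idx = np.argsort(np.abs(v))[-k:]
    out[idx] = v[idx]
    return out

def mem_sgd(grad, x0, lr, k, steps, rng):
    """Sketch of SGD with memory (error feedback) under top-k sparsification."""
    x = x0.copy()
    m = np.zeros_like(x0)            # error-correction memory
    for _ in range(steps):
        update = m + lr * grad(x, rng)  # add back previously dropped mass
        sparse = top_k(update, k)       # only k coordinates are applied
        x = x - sparse
        m = update - sparse             # remember what was dropped
    return x

# Toy problem: minimize f(x) = 0.5 * ||x - b||^2 with noisy gradients.
rng = np.random.default_rng(0)
b = np.arange(1.0, 11.0)             # optimum
grad = lambda x, rng: (x - b) + 0.01 * rng.standard_normal(x.size)
x = mem_sgd(grad, np.zeros(10), lr=0.1, k=2, steps=2000, rng=rng)
print(np.round(x, 2))                # ends up close to b
```

Even though only 2 of 10 coordinates are transmitted per step, the memory term ensures every coordinate is eventually applied, so the iterates still approach the optimum, which is the qualitative content of the theorem.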

Results

(screenshots: experimental results)

Comments

@nocotan nocotan self-assigned this May 25, 2021
@nocotan nocotan changed the title [WIP] Sparsified SGD with Memory Sparsified SGD with Memory May 26, 2021