
Asynchronous Methods in Gradient Descent


dc.contributor.author Ghosh, Koushik
dc.date.accessioned 2022-03-24T05:01:58Z
dc.date.available 2022-03-24T05:01:58Z
dc.date.issued 2021-07
dc.identifier.citation 33p. en_US
dc.identifier.uri http://hdl.handle.net/10263/7303
dc.description Dissertation under the supervision of Swagatam Das en_US
dc.description.abstract Su, Boyd and Candès (2014) [1] showed that, in the limit of vanishing step size, Nesterov's Accelerated Gradient Descent converges to a second-order ODE. Separately, Arjevani has recently proved convergence results for delayed vanilla gradient descent. Our idea is to take a delayed version of Nesterov's Accelerated Gradient Descent, derive its corresponding ODE, and prove convergence in the convex case (see the illustrative sketch after this record). en_US
dc.language.iso en en_US
dc.publisher Indian Statistical Institute, Kolkata. en_US
dc.relation.ispartofseries Dissertation;CS-1906
dc.subject Nesterov Accelerated Gradient Descent en_US
dc.subject Asynchronous en_US
dc.subject Hogwild en_US
dc.subject DownPour SGD en_US
dc.title Asynchronous Methods in Gradient Descent en_US
dc.type Other en_US
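
To make the abstract concrete, the Python sketch below combines its two ingredients: the Su-Boyd-Candès form of Nesterov's method, whose small-step-size limit is the ODE X'' + (3/t) X' + grad f(X) = 0, and a fixed gradient delay. The function name nag_delayed, the delay model (evaluating the gradient at a stale y-iterate), and the quadratic test problem are illustrative assumptions, not the dissertation's exact scheme.

import numpy as np

def nag_delayed(grad, x0, step=0.05, delay=0, iters=2000):
    # Su-Boyd-Candes form of Nesterov's accelerated gradient:
    #   x_k = y_{k-1} - s * grad(y_{k-1})
    #   y_k = x_k + (k - 1)/(k + 2) * (x_k - x_{k-1})
    # With delay = tau, the gradient is evaluated at the stale iterate
    # y_{k-1-tau}, a simple stand-in for asynchronous updates
    # (an assumption for illustration only).
    x_prev = np.asarray(x0, dtype=float)
    y = x_prev.copy()
    ys = [y.copy()]  # history of y-iterates, used to look up stale points
    for k in range(1, iters + 1):
        stale = ys[max(0, len(ys) - 1 - delay)]
        x = y - step * grad(stale)
        y = x + (k - 1) / (k + 2) * (x - x_prev)
        x_prev = x
        ys.append(y.copy())
    return x_prev

# Convex quadratic test: f(x) = 0.5 * x^T A x, minimized at the origin.
A = np.diag([1.0, 10.0])
grad_f = lambda x: A @ x
print(nag_delayed(grad_f, x0=[5.0, 5.0], step=0.05, delay=3))

With delay=0 this reduces to standard NAG; increasing the delay mimics the staleness that asynchronous schemes such as Hogwild or DownPour SGD introduce, which is the regime the dissertation studies.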

