Please use this identifier to cite or link to this item:
http://hdl.handle.net/10263/7303
Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Ghosh, Koushik | - |
dc.date.accessioned | 2022-03-24T05:01:58Z | - |
dc.date.available | 2022-03-24T05:01:58Z | - |
dc.date.issued | 2021-07 | - |
dc.identifier.citation | 33p. | en_US |
dc.identifier.uri | http://hdl.handle.net/10263/7303 | - |
dc.description | Dissertation under the supervision of Swagatam Das | en_US |
dc.description.abstract | Su, Boyd and Candès (2014) [1] showed that, as the step size tends to zero, Nesterov's Accelerated Gradient Descent converges to a second-order ODE. More recently, Arjevani has proved convergence results for delayed vanilla gradient descent. Our idea is to take a delayed version of Nesterov's Accelerated Gradient Descent, derive its corresponding ODE, and prove convergence in the convex case (the recursion and its limiting ODE are sketched below this record). | en_US
dc.language.iso | en | en_US |
dc.publisher | Indian Statistical Institute, Kolkata. | en_US |
dc.relation.ispartofseries | Dissertation;CS-1906 | - |
dc.subject | Nesterov Accelerated Gradient Descent | en_US |
dc.subject | Asynchronous | en_US |
dc.subject | Hogwild | en_US |
dc.subject | DownPour SGD | en_US |
dc.title | Asynchronous Methods in Gradient Descent | en_US |
dc.type | Other | en_US |
Appears in Collections: Dissertations - M Tech (CS)
Files in This Item:
File | Description | Size | Format
---|---|---|---
Koushik Ghosh-cs-19-21.pdf | | 659.03 kB | Adobe PDF