Please use this identifier to cite or link to this item: http://hdl.handle.net/10263/7303
Full metadata record
DC Field | Value | Language
dc.contributor.author | Ghosh, Koushik | -
dc.date.accessioned | 2022-03-24T05:01:58Z | -
dc.date.available | 2022-03-24T05:01:58Z | -
dc.date.issued | 2021-07 | -
dc.identifier.citation | 33p. | en_US
dc.identifier.uri | http://hdl.handle.net/10263/7303 | -
dc.description | Dissertation under the supervision of Swagatam Das | en_US
dc.description.abstract | Su, Boyd and Candès '14 [1] showed that, as the step sizes shrink to zero, Nesterov Accelerated Gradient Descent converges to a second-order ODE. On the other hand, Arjevani has recently shown some convergence results for delayed vanilla gradient descent. Our idea is to take a delayed version of Nesterov Accelerated Gradient Descent, derive its corresponding ODE, and prove convergence in the convex case. | en_US
dc.language.iso | en | en_US
dc.publisher | Indian Statistical Institute, Kolkata. | en_US
dc.relation.ispartofseries | Dissertation;CS-1906 | -
dc.subject | Nesterov Accelerated Gradient Descent | en_US
dc.subject | Asynchronous | en_US
dc.subject | Hogwild | en_US
dc.subject | DownPour SGD | en_US
dc.title | Asynchronous Methods in Gradient Descent | en_US
dc.type | Other | en_US
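The abstract above refers to Nesterov Accelerated Gradient Descent and its small-step-size ODE limit. As a minimal sketch (not taken from the dissertation; the quadratic objective, step size, and iteration count are assumptions for illustration), the standard NAG iteration with the k/(k+3) momentum schedule looks like this:

```python
import numpy as np

def nesterov_agd(grad, x0, step, iters):
    """Nesterov Accelerated Gradient Descent for a convex objective.

    With the momentum schedule k/(k+3), as the step size tends to zero
    the iterates trace the second-order ODE
        X'' + (3/t) X' + grad f(X) = 0
    studied by Su, Boyd and Candes '14.
    """
    x = np.asarray(x0, dtype=float)
    y = x.copy()  # lookahead (extrapolated) point
    for k in range(iters):
        x_next = y - step * grad(y)                 # gradient step at the lookahead point
        y = x_next + (k / (k + 3)) * (x_next - x)   # momentum extrapolation
        x = x_next
    return x

# Assumed toy problem: minimize f(x) = ||x||^2 / 2, whose gradient is x.
x_star = nesterov_agd(lambda x: x, x0=[5.0, -3.0], step=0.1, iters=500)
```

The delayed variant the abstract proposes would evaluate `grad` at a stale iterate (e.g. `y` from several steps earlier) rather than the current lookahead point.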
Appears in Collections: Dissertations - M Tech (CS)

Files in This Item:
File | Description | Size | Format
Koushik Ghosh-cs-19-21.pdf | | 659.03 kB | Adobe PDF


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.