Please use this identifier to cite or link to this item: http://hdl.handle.net/10263/7155
Full metadata record
DC Field                      Value                                                           Language
dc.contributor.author         BAG, ARPAN KUMAR                                                -
dc.date.accessioned           2021-05-13T08:18:45Z                                            -
dc.date.available             2021-05-13T08:18:45Z                                            -
dc.date.issued                2020-08                                                         -
dc.identifier.citation        28p.                                                            en_US
dc.identifier.uri             http://hdl.handle.net/10263/7155                                -
dc.description                Dissertation under the supervision of Dr. UJJWAL BHATTACHARYA   en_US
dc.description.abstract       In the past few years, deep models have made a huge impact in the field of computer vision. Among these deep models, Residual Networks, or ResNets, have become particularly popular for their simple architecture and efficient performance. Despite this achievement, the skip connection that made the training of very deep models possible has also been considered a drawback of the model. Some studies have compared the performance of various types of skip connections. Inspired by recent work on skip connections, which proposed using ReLU with group normalization as an alternative to the identity skip connection and obtained better performance than the traditional ResNet, we have explored the use of various activation functions. In this thesis, we propose a different transformation to be used together with the ReLU Group Normalization (RG) connection to improve the performance of Residual Networks. We report our results on the CIFAR-10 and CIFAR-100 datasets. The code developed as part of this study is available at https://github.com/Arpan142/Arpan dissertation.   en_US
dc.language.iso               en                                                              en_US
dc.publisher                  Indian Statistical Institute, Kolkata                           en_US
dc.relation.ispartofseries    Dissertation;;2020-7                                            -
dc.subject                    deep networks                                                   en_US
dc.subject                    Residual Networks                                               en_US
dc.title                      AN EFFICIENT ALTERNATIVE OF THE IDENTITY MAPPING OF THE ResNet ARCHITECTURE   en_US
dc.type                       Other                                                           en_US
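
The abstract describes replacing the identity mapping in a ResNet skip connection with a ReLU followed by Group Normalization (the "RG" connection). The thesis text is not reproduced on this page, so the following is only a minimal PyTorch-style sketch of that idea under stated assumptions, not the author's implementation (which is at the GitHub link in the abstract); the block layout, channel counts, and number of groups are illustrative choices.

import torch
import torch.nn as nn


class RGResidualBlock(nn.Module):
    """Residual block whose skip branch applies ReLU + GroupNorm instead of identity."""

    def __init__(self, channels: int, groups: int = 8):
        super().__init__()
        # conventional two-convolution residual branch
        self.conv1 = nn.Conv2d(channels, channels, kernel_size=3, padding=1, bias=False)
        self.bn1 = nn.BatchNorm2d(channels)
        self.conv2 = nn.Conv2d(channels, channels, kernel_size=3, padding=1, bias=False)
        self.bn2 = nn.BatchNorm2d(channels)
        self.relu = nn.ReLU()
        # skip branch: ReLU followed by GroupNorm (the "RG" connection described in the abstract)
        self.skip_norm = nn.GroupNorm(num_groups=groups, num_channels=channels)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        residual = self.bn2(self.conv2(self.relu(self.bn1(self.conv1(x)))))
        skip = self.skip_norm(self.relu(x))  # replaces the plain identity mapping
        return self.relu(residual + skip)


if __name__ == "__main__":
    # quick shape check on a CIFAR-sized feature map (assumed 64 channels after a stem)
    block = RGResidualBlock(channels=64)
    out = block(torch.randn(2, 64, 32, 32))
    print(out.shape)  # torch.Size([2, 64, 32, 32])

The thesis additionally proposes a different transformation to be combined with this RG connection; that transformation is not specified on this page, so the sketch above only shows the RG skip path the abstract starts from.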
Appears in Collections: Dissertations - M Tech (CS)

Files in This Item:
File                                          Description    Size       Format
Arpan_Kumar_Bag_CS1808_MTCSthesis2020.pdf                    1.21 MB    Adobe PDF


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.