Abstract:
In the past few years, deep models have made a huge impact in the field of computer
vision. Among these deep models, Residual Networks (ResNets) have become par-
ticularly popular for their simple architecture and efficient performance. Despite this
success, the skip connection that made the training of very deep models possible has
also been considered a drawback of the architecture. Several studies have compared
the performance of various types of skip connections. Inspired by recent work on skip
connections, which proposed the use of ReLU with group normalization as an
alternative to the identity skip connection and reported better performance than the
traditional ResNet, we explore the use of various activation functions. In this thesis, we
propose a different transformation to be used together with the ReLU Group Normaliza-
tion (RG) connection to improve the performance of Residual Networks. We evaluated
our approach on the CIFAR-10 and CIFAR-100 datasets. The code developed as part of this
study is available at https://github.com/Arpan142/Arpan_dissertation.