Please use this identifier to cite or link to this item: http://hdl.handle.net/10263/7154
Full metadata record
DC Field | Value | Language
dc.contributor.author | SAHA, ARNAB | -
dc.date.accessioned | 2021-05-13T08:11:18Z | -
dc.date.available | 2021-05-13T08:11:18Z | -
dc.date.issued | 2020-07 | -
dc.identifier.citation | 25p. | en_US
dc.identifier.uri | http://hdl.handle.net/10263/7154 | -
dc.description | Dissertation under the supervision of Dr. N.R. Pal | en_US
dc.description.abstract | A GAN, or Generative Adversarial Network, is a combination of two deep neural networks: one acts as a generator, while the other acts as a discriminator that differentiates between real samples and generated (fake) samples. There are many variants of GAN, and for every variant two deep neural networks must be trained simultaneously; this training is the hardest part of working with GANs. During training, many GAN models suffer from major problems such as non-convergence, mode collapse, high sensitivity to the choice of hyper-parameters, and vanishing gradients. In this project we address the problem of mode collapse, in which the generator produces only one or a limited variety of samples irrespective of the input. | en_US
dc.language.iso | en | en_US
dc.publisher | Indian Statistical Institute, Kolkata | en_US
dc.relation.ispartofseries | Dissertation;;2020-6 | -
dc.subject | Generative Adversarial Networks | en_US
dc.subject | Mode Collapse | en_US
dc.title | Efficient Learning of GAN | en_US
dc.type | Other | en_US
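
The abstract above describes the standard adversarial setup, a generator and a discriminator trained simultaneously, and the mode-collapse failure the dissertation targets. Below is a minimal, illustrative PyTorch sketch of that vanilla GAN training loop on toy data; the network sizes, toy dataset, and hyper-parameters are assumptions made for illustration and are not taken from the dissertation itself.

# Minimal sketch of the standard GAN training loop on toy 2-D data.
# All architectures, data, and hyper-parameters here are illustrative
# assumptions, not the method of the dissertation.
import torch
import torch.nn as nn

latent_dim, data_dim, batch = 8, 2, 64

# Generator: maps a latent noise vector z to a fake sample.
G = nn.Sequential(nn.Linear(latent_dim, 32), nn.ReLU(), nn.Linear(32, data_dim))
# Discriminator: outputs a logit for "real" vs. "fake".
D = nn.Sequential(nn.Linear(data_dim, 32), nn.ReLU(), nn.Linear(32, 1))

opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)
bce = nn.BCEWithLogitsLoss()

for step in range(1000):
    real = torch.randn(batch, data_dim) + 3.0   # toy "real" distribution
    z = torch.randn(batch, latent_dim)
    fake = G(z)

    # Discriminator step: push real samples toward label 1, fakes toward 0.
    d_loss = (bce(D(real), torch.ones(batch, 1)) +
              bce(D(fake.detach()), torch.zeros(batch, 1)))
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()

    # Generator step: try to make the discriminator label fakes as real.
    g_loss = bce(D(fake), torch.ones(batch, 1))
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()

    # Mode collapse shows up when G(z) becomes nearly constant for all z,
    # i.e. the spread of the fake batch shrinks toward zero.
    if step % 200 == 0:
        print(step, d_loss.item(), g_loss.item(), fake.std(dim=0).mean().item())

The printed per-batch standard deviation of the fake samples is one simple diagnostic for the collapse behaviour the abstract mentions: a healthy generator keeps it comparable to the spread of the real data, while a collapsed generator drives it toward zero.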
Appears in Collections:Dissertations - M Tech (CS)

Files in This Item:
File | Description | Size | Format
Arnab_Saha_CS1821_MTCSthesis2020.pdf |  | 2.37 MB | Adobe PDF

